Space the Nation: Some reasons to be alarmed about autonomous robots

Oct 2, 2018, 3:03 PM EDT

The headlines: FAA cracks down on rogue drones
In-flight charging gives drones unlimited autonomous range
Government may gain new power to track, shoot down drones

Of all the predictive narratives toyed with by genre creators, it’s speculation about the dangers and possibilities of drones that feels the most eerily accurate — and the most stubbornly ignored.

Perhaps it’s worth defining “drone” to realize how ubiquitous they’ve become, and how little thought has been put into their presence in our lives. The word “drone” in news articles usually refers to some sort of remotely operated flying robot that has one of three uses: delivering packages, taking pictures, or killing people. (As a vernacular reference, it’s only the latter: i.e., “Don’t drone me, bro!”) But think just a little more expansively about what “drone” really describes: Is the mode of transport limited to flying? Not in Washington, D.C., where “wheeled coolers” have been approved to deliver groceries. And are drones’ operations limited to some human back at HQ making the real-time decisions? Nope: “Autonomous drones” are already mapping underground caverns and available to consumers as self-guided “witnesses” to — in the words of an enthusiastic review — “a run, or a party, or a vacation memory.” As for uses beyond delivery/photography/murder, well: By this somewhat broader definition, we’re already using drones to vacuum our floors, drive taxis, and perform medical procedures. Maybe the best definition of a drone is any machine that performs a specific service remotely so that a human doesn’t have to. So, yes, the Roomba is a drone. So is the Terminator.

And all this is happening without much oversight. The Federal Aviation Administration only requires the registration of flying drones that weigh more than half a pound, which means that most hobbyist camera drones don’t need it. I doubt anyone has thought to license or regulate Roombas at all, though it doesn’t take much imagination or technological know-how to turn a Roomba into a weapon or a surveillance device. (They are arguably surveillance devices already, depending on who can hack into them.) Driverless cars were on the road before anyone figured out exactly how to handle the inevitable injuries and fatalities. Indeed, because the Arizona government has intentionally created loose regulations around autonomous vehicles in order to encourage businesses to invest there, literally no one was held responsible for the first death of a pedestrian hit by a self-driving car: a homeless woman in Tempe named Elaine Herzberg.

But even before driverless cars hit the road, people at least worried about pedestrian deaths. The rise of drones has created whole new categories of concern — or allowed existing concerns to intensify. Drones can be used by stalkers. Hobbyist drones could be engineered to carry deadly materials, sure — but what about the panic that could be caused by just rigging one to disperse baking powder over a stadium? (One person already papered a 49ers crowd with leaflets.) The world’s most popular drone manufacturer — DJI — is based in China and has been accused by the U.S. government of sending information gathered by its drones sold in the U.S. back to Chinese officials.

Laws that do exist around drones target the most obvious of hazards: the physical. So there are regulations about the altitude you can fly at and about flying near airports or over groups of people — you’re also not supposed to fly under the influence or during emergency response efforts. These aren’t bad regulations, but they seem exceedingly naive. First of all, there are no FAA cops — no one patrolling the skies enforcing these regulations (though perhaps that is something other drones could do). Even more important: We don’t yet have a set of social norms around drone use, the kind of ingrained adherence to intangible and unspoken rules that, for instance, keeps most of us from parking across two spaces even if there’s no one else in the lot, or from rolling through stop signs even if it’s 3 AM in the middle of nowhere. Part of the reason we do those things is that we worry about getting caught, sure, but a bigger part is the social conditioning that tells us yellow lines and red octagons mean something within our community — a meaning that had to be created and tended to, by the way, because people also started driving cars (auto-mobiles!) before we had many rules about them.

The parallel between the introduction of the car and the drone is instructive: In both cases, you had a fantastical (the horseless carriage!) and fascinating piece of consumer technology with obviously valuable uses that was both easy to operate and potentially fatal. Detroit, where the commercial car was born, felt the impact of its arrival — as it were — the earliest and the hardest. In 1911, Detroit had 12 police officers in its “Traffic Squad,” and they patrolled a single intersection in four-man shifts. Five years later, a quarter of its force — 250 officers — was devoted to managing traffic. This didn’t cap the awful death toll that cars came with, however: In 1917, Detroit and its suburbs had 65,000 cars on the road, resulting in 7,171 accidents and 168 fatalities. That’s one death for every 386 cars, which may sound manageable until you do the conversion: it works out to roughly 2,600 deaths per million vehicles, while today’s rate, nationwide, is about 30 deaths per million registered vehicles — more than 80 times lower.

With every other metropolitan area experiencing something close to Detroit’s body count, the nation was wracked by concern over this violent intrusion into everyday life. Lawmakers struggled with how to address the menace. In Georgia, the state court of appeals ruled, “Automobiles are to be classed with ferocious animals and … the law relating to the duty of owners of such animals is to be applied.” Cities held “safety parades” to raise awareness, featuring children who had survived accidents waving from (presumably slow-moving) cars, and in Washington, D.C., 10,000 children marched dressed as ghosts, one for each death that year.

Keep in mind, this was in response to a fatality rate of less than 1 in 400. Herzberg’s death in Arizona puts that state’s self-driving fatality rate at 1 death per 600 cars. I wonder when, if ever, we’ll get to a parade.

And that death rate covers only self-driving cars and entirely accidental deaths. But perhaps military drones deserve the same kind of public outcry — forget the civil rights issues around due process for the “enemy combatants,” think of the civilians, the equivalent of pedestrians minding their own business. Warfare has given flying drones their appropriately menacing reputation, and despite some free-floating notion equating robotics with precision, 11 percent of those killed by United States military drones are non-combatants. (That was under Obama and Bush, by the way. The Trump administration has both increased drone strikes and declined to report on civilian casualties.)

Numbers like that have caused marches and uprisings in the context of police shootings — and companies have recalled entire product lines over rates like the one in Arizona — but we all seem incurious and unbothered by the mass introduction of potentially lethal and mostly unaccountable robots into our homes and skies. We have a sense that some of it is creepy, but we have yet to come to an agreement about what is wrong. Ethicists have pointed out that manufacturers of autonomous devices seem singularly unprepared to create the algorithms that decide how many people might die in a given situation (“Let’s say the only way the car can avoid a collision with another car is by hitting a pedestrian … Is the pedestrian a child? Is the alternative to swerve away from the child and into an S.U.V.? What if the S.U.V. has just one occupant? What if it has six?”).

On the regulatory front: The crackdowns appear to be coming in the form of increasing the government’s ability to track, intercept, and/or destroy civilian drones without due process, rather than determining the bounds for drone use in general. Here the car metaphor breaks down: Imagine if speeding meant that the government could simply take your car away, or if it could take your car away for no reason at all. The American Civil Liberties Union suggests this approach could be used to keep journalists from reporting on disaster zones or law enforcement actions.

And none of this is happening in a public conversation: we don’t wonder about the metaphorical absence of parking lines in the sky, and few people think about who might be responsible for an unruly Roomba. Both of those blind spots are wound up in a different sort of unspoken social trust: faith in authority. I think most of us just assume that decisions about regulations will get made somehow, that the commercial enterprises that want us to buy drones care enough about profits to keep us safe, that the imagination of the bad folks will be matched by the imagination of the good.

I don’t think we need science fiction to teach otherwise. Just look around.
