
San Francisco police propose allowing robots to use deadly force against people

So much for the laws of robotics...

By Cassidy Ward

Early in Edward Neumeier’s career, he was in the art department for Blade Runner and started wondering what it might be like if a robot cop lived in that dystopian cyberpunk future world. The result of those musings eventually became RoboCop, which spawned a long-lived franchise including two sequels, a cartoon, and a live-action TV series (now streaming on Peacock!).

While Alex Murphy, a.k.a. RoboCop, maintains some of his prior humanity, the same can’t be said for his fully robotic co-workers. The ED-209 law enforcement droid designed by the fictional mega-corporation Omni Consumer Products doesn’t have any problem killing people, even when it’s not supposed to. Now the San Francisco Police Department appears to be taking a page out of OCP’s playbook, according to a recently released equipment proposal.

In addition to armored vehicles, battering rams, explosives, loudspeaker systems, and firearms, the equipment proposal, if accepted, would also change the way the SFPD uses robots when interacting with the public. An early draft of the proposal included language explicitly stating that robots would not be used as a use of force against any person. During deliberations, however, that language was crossed out and replaced with new guidelines. The proposal as it stands today reads, “Robots will only be used as a deadly force option when risk of loss of life to members of the public or officers is imminent and outweighs any other force option available to SFPD.”

That wording follows a comprehensive list of the robots the SFPD has at its disposal, which includes surveillance robots and bomb disposal robots. The department currently has seventeen robots, twelve of which are operational as of this writing. According to The Verge, the change in policy was ultimately accepted by Aaron Peskin, the Dean of the city’s Board of Supervisors, because he was convinced there were situations in which a robot using deadly force against a person might be the only option. That version now goes to the full Board of Supervisors for approval.

The idea that police robots might need to use deadly force didn’t come out of nowhere. In 2016, after a shooter killed five police officers in Dallas, police there cornered the suspect in a parking garage, where a standoff ensued. Hours later, Dallas police deployed a bomb disposal robot with an explosive attached and detonated it in the garage, killing the alleged shooter. In a press statement after the incident, Dallas Police Chief David Brown said, “we saw no other option but to use our bomb robot and place a device on its extension for it to detonate where the suspect was. Other options would have exposed our officers to grave danger. The suspect is deceased as a result of detonating the bomb.”

While that incident proves that bomb disposal robots can be used (albeit not as intended) to deliver deadly force to a person, police departments need not go to all that trouble to use robots to kill. According to the SFPD’s equipment proposal, the department currently possesses a QinetiQ TALON, a robot capable of bomb disposal, reconnaissance, communications, rescue missions, HAZMAT response, and more. Importantly, the United States military presently uses a version of the very same robot equipped with grenade launchers, machine guns, and other weapons.

In addition to deadly force, the proposal also grants the SFPD the ability to use robots in “training and simulations, criminal apprehensions, critical incidents, exigent circumstances, executing a warrant or during suspicious device assessments.”

That last bit sounds like the sort of thing you do with a bomb disposal robot, but “critical incidents” is defined broadly enough that nearly anyone could conceivably find themselves on the wrong side of a weaponized robot.

Of course, the SFPD isn’t the only player in the debate over the use of robots. Six major robotics companies, including Boston Dynamics, released an open letter earlier this year pledging not to weaponize their robotic systems and calling on others to show the same restraint. The letter reads, in part, “We believe that adding weapons to robots that are remotely or autonomously operated, widely available to the public, and capable of navigating to previously inaccessible locations where people live and work, raises new risks of harm and serious ethical issues. Weaponized applications of these newly-capable robots will also harm public trust in the technology in ways that damage the tremendous benefits they will bring to society.”

Since the arrival of drone warfare, much has been made of the potential pitfalls of technologizing violence. Drone pilots report a surreal feeling as they stalk and kill targets thousands of miles away from the safety of a station in Virginia. Then they live with the trauma of having facilitated violence, even through a screen, sometimes for the rest of their lives. One can’t help but wonder whether the same failings we saw in Afghanistan and elsewhere will play out in California. If the Board of Supervisors accepts the proposal, we might find out what RoboCop looks like in the real world.