San Francisco considers allowing law enforcement robots to use lethal force

Law enforcement has used robots to investigate suspicious packages. Now, the San Francisco Board of Supervisors is considering a policy proposal that would allow SFPD's robots to use deadly force against a suspect. (Joe Raedle / Getty Images)

Should robots working alongside law enforcement be used to deploy deadly force?

The San Francisco Board of Supervisors is weighing that question this week as it considers a policy proposal that would allow the San Francisco Police Department (SFPD) to use robots to apply deadly force against a suspect.

A new California law that took effect this year requires every municipality in the state to list and define the authorized uses of all military-grade equipment held by its local law enforcement agencies.

The original draft of SFPD's policy was silent on the matter of robots.

Aaron Peskin, a member of the city's Board of Supervisors, added a line to SFPD's original draft policy that stated, "Robots shall not be used as a Use of Force against any person."

The SFPD crossed out that sentence with a red line and returned the draft.

Their altered proposal outlines that "robots will only be used as a deadly force option when risk of loss of life to members of the public or officers are imminent and outweigh any other force option available to the SFPD."

The SFPD currently has 12 functioning robots. They are remote controlled and typically used to gain situational awareness and survey specific areas officers may not be able to reach. They are also used to investigate and defuse potential bombs, or aid in hostage negotiations.

Peskin says much of the military-grade equipment sold to cities for police departments to use was issued by the federal government, but there's not a lot of regulation surrounding how robots are to be used. "It would be lovely if the federal government had instructions or guidance. Meanwhile, we are doing our best to get up to speed."

The idea of robots being legally allowed to kill has garnered some controversy. In October, a number of robotics companies — including Hyundai's Boston Dynamics — signed an open letter, saying that general purpose robots should not be weaponized.

Ryan Calo is a professor of law and information science at the University of Washington who also studies robotics. He says he's long been concerned about the increasing militarization of police forces, but that police units across the country might be attracted to using robots because "it permits officers to incapacitate a dangerous individual without putting themselves in harm's way."

Robots could also keep suspects safe, Calo points out. When officers use lethal force at their own discretion, the justification is often that the officer felt unsafe and perceived a threat. But he notes, "you send robots into a situation and there just isn't any reason to use lethal force because no one is actually endangered."

The first reported use of a robot by law enforcement as deadly force in the United States came in 2016, when the Dallas Police Department used a bomb-disposal robot armed with an explosive device to kill a suspect who had shot and killed five police officers.

In an email statement to NPR, SFPD public information officer Allison Maxie wrote, "the SFPD does not own or operate robots outfitted with lethal force options and the Department has no plans to outfit robots with any type of firearm." The statement noted that robots can be equipped with explosive charges to breach certain structures, but said they would be used only in extreme circumstances. It continued, "No policy can anticipate every conceivable situation or exceptional circumstance which officers may face. The SFPD must be prepared, and have the ability, to respond proportionally."

Paul Scharre is the author of the book Army of None: Autonomous Weapons and the Future of War. He helped create the U.S. policy for autonomous weapons used in war.

Scharre notes there is an important distinction between how robots are used in the military versus law enforcement. For one, robots used by law enforcement are not autonomous, meaning they are still controlled by a human.

"For the military, they're used in combat against an enemy and the purpose of that is to kill the enemy. That is not and should not be the purpose for police forces," Scharre says. "They're there to protect citizens, and there may be situations where they need to use deadly force, but those should be absolutely a last resort."

What is concerning about SFPD's proposal, Scharre says, is that it doesn't seem to be well thought out.

"Once you've authorized this kind of use, it can be very hard to walk that back." He says this proposal sets up a false choice between using a robot for deadly force or putting law enforcement officers at risk. Scharre suggests that robots could instead be sent in with a nonlethal weapon to incapacitate a person without endangering officers.

As someone who studies robotics, Ryan Calo says that the idea of "killer robots" is a launchpad for a bigger discussion about our relationship to technology and AI.

When it comes to robots being out in the field, Calo thinks about what happens if the technology fails and a robot accidentally kills or injures a person.

"It becomes very difficult to disentangle who is responsible. Is it the people using the technology? Is it the people that design the technology?" Calo asks.

With people, we can unpack the social and cultural dynamics of a situation, something we can't do with a robot.

"They feel like entities to us in a way that other technology doesn't," Calo says. "And so when you have a robot in the mix, all of a sudden not only do you have this question about who is responsible, which humans, you also have this strong sense that the robot is a participant."

Even if robots could be used to keep humans safe, Calo raises one more question: "We have to ask ourselves do we want to be in a society where police kill people with robots? It feels so deeply dehumanizing and militaristic."

The San Francisco Board of Supervisors meets Tuesday to discuss how robots could be used by the SFPD.

This story has been updated to include portions of an email statement to NPR by the SFPD.

Copyright 2022 NPR. To see more, visit https://www.npr.org.

Ari Shapiro has been one of the hosts of All Things Considered, NPR's award-winning afternoon newsmagazine, since 2015. During his first two years on the program, listenership to All Things Considered grew at an unprecedented rate, with more people tuning in during a typical quarter-hour than any other program on the radio.
Brianna Scott is currently a producer at the Consider This podcast.