By David M. Greenwald
Executive Editor
San Francisco, CA – It may sound absurd, and as one drills down into the issue, perhaps more so, but the San Francisco Board of Supervisors approved implementing robots that can kill on an 8-3 vote, with just Dean Preston, Hillary Ronen, and Shamann Walton opposed.
Supervisor Preston told the Vanguard this week, “I don’t think most people in San Francisco support this,” and said he believed that, if placed on the ballot, the policy would go down overwhelmingly, just as an initiative for unrestricted Taser use went down in 2018, 62-38.
“I’m getting tons of communications of people who are against what the board did (Tuesday),” Preston said.
For Preston this vote “speaks more to the nonstop fear-mongering from more conservative forces here—from the police, from the mayor, from the local media—that suggests more police, more weapons is the answer to everything.”
He noted that last budget cycle the Board of Supervisors voted to give a $50 million increase to the San Francisco Police Department.
The proposal came directly from the mayor, but was supported by the police.
“It was a proposal from the Mayor and the police department,” Preston explained. He added, “I can’t believe we were seriously asked to give this authorization. Obviously eight of my colleagues felt differently.”
He further explained that the policy would authorize the police to use robots to deliver deadly force.
“They have clarified that they’re not seeking the ability to have that robot fire a weapon. They’ve pushed back against this sort of RoboCop sci-fi movie type image of the robot firing a weapon,” he said. But then they end up in a more absurd place: “Instead, they have said that what they’re talking about is a robot that could carry a device that could be detonated.”
Preston said, “In other words, like a bomb-bearing robot—that doesn’t give me much comfort.”
He added, “They seem to attack me for somehow not honestly portraying what they’re proposing here. I’ve called it, as many advocates and concerned citizens have called it, killer robots.”
He said, “They’re not seeking permission for a robot that would fire a gun-type weapon, but instead one that would carry an explosive.” But he added, “It still seems absolutely reckless to give this authority to the police department, and there’s been no real showing of the need for it anyway.”
A big question many have is under what conditions there would be a practical use for such technology and authorization.
Preston explained that “they throw out a lot of hypothetical situations, but when you actually examine those situations, they don’t make any sense as a justification for this policy.”
He pointed to one such example.
He said, “The police officers association tweeted a whole thread attacking me yesterday. And one of their arguments was that by denying them this technology, we are denying them technology that would’ve been used to stop the sniper shooting in the Mandalay Bay, in the Vegas killings.”
“You have the same police department that is saying they want to strap explosives to a robot saying that this would be used to go after a sniper shooting from a room in a fully occupied hotel. Like how does that add up?”
He continued, “I mean, what are you going to do, blow up the fully occupied hotel? Like, I mean it doesn’t even make internal sense. So they can’t cite, and haven’t even tried to cite, a single incident in San Francisco or in California where this technology would’ve been useful or would have been appropriate to use.”
Preston called it “classic fear-mongering.”
He said, “Yesterday, with all due respect to some of my colleagues, there were just fear-mongering speeches about the horrors of people firing assault weapons and mass shootings and killing people, and all that’s real. But none of them tied it to why you would need this technology.”
Preston added, “I think their line of reasoning is that because there are horrible incidents in which shooters are using military grade weapons, therefore any grant of authority or weaponry to the police is justified.”
Supervisor Preston noted that there are situations when having a robot is good policy.
“Nobody disagrees that a robot may be warranted to check a suspicious package or something,” he said. “Or try to scope out a bomb or potential bomb that’s out there.”
But he said, “That’s not the argument here.”
The problem, as he sees it: “There is no scenario that’s being presented where you need to have robots inflicting and using deadly force on people in San Francisco.” Moreover, he added, “Unfortunately we are in a political climate where the police aren’t even really pressed to make that case, because you have elected officials who are willing to grant anything that the police department requests, whether that’s unjustified budget increases from the budget process or robots that can kill.”
A real concern is not only what this means for San Francisco, but that San Francisco will now be used as justification for the next city to implement this kind of policy.
“I think it was important when Oakland said no to this. That sent a good message,” he said. “I am very concerned in San Francisco, both about the harm this policy will inflict and about the message that it sends to other cities that are grappling with this issue.”
Supervisor Preston noted that this policy will likely disproportionately harm communities of color.
“We have not addressed the racial disparities in use of force in San Francisco. So we have every reason to believe that additional weaponry will be used disproportionately against Black and brown communities in San Francisco,” Preston explained.
He also noted the history of using such bombs across the country.
He cited the MOVE bombing in Philadelphia, which killed 11 people, including five children, and burned down over 60 homes.
He also cited the Los Angeles Police Department’s negligent use of explosives, which blew up half a neighborhood.
“We had a history in San Francisco of a bomb that was detonated outside of the Mayor’s house during an extremely intense dispute with the San Francisco Police Department.”
Preston said he was hopeful that community pushback would cause his colleagues to rethink their vote before the second reading, although he acknowledged he was not optimistic.
The community’s recourse would then be to put the policy to a vote.
“I don’t think that a majority of people want robots that kill people under the discretion of the San Francisco Police Department,” he said.