By Genesis Guzman and Stephanie Boulos
DETROIT, MI – A Detroit man who was falsely charged with shoplifting from a Shinola store in 2019, after troubling facial recognition software misidentified him, is now suing the Detroit police with the ACLU’s help.
He was arrested after facial recognition software falsely matched his driver’s license picture to the blurry image of the man shoplifting in a surveillance video.
In 2019, Detroit police were investigating a theft that had occurred at a Shinola store in 2018, with the stolen property totaling about $3,800. The surveillance video showed a blurry image of a Black man wearing a Cardinals hat.
After months without leads, police ran the image through a facial recognition system, which matched it to Robert Williams’ driver’s license photo.
His photo was then shown to the shop’s security guard, who had not witnessed the theft in person and picked Williams out of the lineup based only on the grainy surveillance video.
In his testimony to a congressional subcommittee, Williams said, “My picture was used in some type of line up and I’ve never been in trouble.”
Eighteen months after the theft, Detroit police called Williams and told him to report to be arrested, without providing any further information. When he asked for details, he was threatened with arrest at his workplace.
Believing it to be a prank call, Williams drove home from work, where his wife, two young daughters, and neighbors watched as he was arrested on his front lawn.
When Williams’ wife asked where he was being taken the officer responded that she should “Google it.”
He was detained for almost 30 hours. Williams testified that when he was shown the image of the man in the video, he told the officer, “I hope you don’t think all Black people look alike.”
The officer interrogating him eventually acknowledged that the software must have made a mistake, but instead of being released, Williams was arraigned and held until after dark. When he was finally released, he was left outside alone in the dark to wait for his wife to pick him up.
Later, at the probable cause conference, the prosecutor dropped the charges without prejudice, meaning the case could be refiled and, as Williams saw it, the police could still question and “harass” his family again.
Williams has sued the Detroit Police and has taken his fight to the House Subcommittee on Crime, Terrorism, and Homeland Security, where he testified about his arrest and his experience with facial recognition software.
Concern over facial recognition technology has been rising for years, especially in the Detroit area, according to local news station Fox 2. James Craig, Detroit’s former police chief, has described the technology as one of many tools used in investigations.
Williams’ testimony, however, was a plea for Congress to pass legislation against the use of facial recognition technology because of its room for error, as his own case shows.
His testimony was just a small part of the larger hearing held by the House subcommittee.
The Facial Recognition and Biometric Technology Moratorium Act, recently introduced in Congress, would indefinitely ban the use of the technology by law enforcement.
Williams’ case, however, is not an isolated one; others have also been falsely arrested based on faulty matches. The technology’s errors appear to disproportionately affect women and people of color.
As Vice reported on facial recognition, “research has repeatedly shown that facial recognition technology is fundamentally biased against women and people of color, leading to errors like this. Even when working properly, privacy advocates have argued facial recognition systems disproportionately target communities of color, creating further pretext for police intervention.”
Critics in Congress argue that the technology alone is not enough to stop crime, and Isaac Robinson, the late Michigan state representative, even pushed legislation to ban its use until more research is done.
Say WHAT? How is that a lineup?
That’s not a line up. What they are saying is that he saw a grainy security video and on that basis picked Williams out of the line up.
yeah I know it’s not a line up, but how do you base a lineup on identifying someone from a photograph that you never saw? Why not have a cop, or a random passerby, look at the photograph and then do the lineup? Makes as much sense.
I am right with you on that…
This doesn’t surprise me and I think there’s a reason for it (the people of color part, not the women part). I was photographing a friend once with extremely dark skin who was in shadow. I asked him to turn so he’d be better lit and I said, “black people can be hard to photograph”. He kept this stagnant look on his face, turned to the person next to him and said, “Was that racist?”. Then he started laughing his arse off. (Highbeam knows the people involved 😉 )
Similarly, I try to take an ‘equitable’ balance of pictures of my cats. But one is jet black while the other is a mix of complex patterns of three light fur colors. The black cat is so hard to photograph, because the black fur absorbs the light, so I get a lot of pictures that look like someone spilled an ink blob. I would think it would be very hard to definitively identify my cat from any other black cat by a photograph, whereas my lighter cat has detail that could positively identify it (were it to get into the chicken coop, for example). I have found that silhouettes can make interesting cat photographs, and usually the picture is OK if I can get the eyes. The point is, from a scientific point of view, dark skin absorbs light and therefore the features/details aren’t as clear to optical machines relative to lighter skinned people, especially when out of direct light. So I think there’s a scientific reason for this, not that cameras are inherently ‘racist’ by design.
And sexist? It may be that women in general wear a lot more accessories and change their look, sometimes daily, relative to men, so this could contribute to differing patterns that make the software produce fewer positive and/or correct IDs.
Seriously, ID software at best should be used as a tool to lead the way to *possible* suspects, but never as a tool for positive identification.