Facial Recognition Software Finally Gets Around To Getting An Innocent Person Arrested

from the great-job-everyone dept

Well, it’s happened. The thing people have been warning about for years. A person lost some of their freedom due to a facial recognition mismatch. It may have only been 30 hours, but it should have been zero. And it might have been zero hours if investigators had bothered to read the disclaimers attached to the software’s search results.

According to the New York Times report, this is the first time a false positive has led to someone being arrested. Or, at least, the first time the public’s been made aware of it. A few years ago, the FBI and a local law enforcement agency used “facial analysis” performed by humans to arrest the wrong man twice for two separate robberies. This time, it was software. And it took 30 hours away from an innocent person.

On a Thursday afternoon in January, Robert Julian-Borchak Williams was in his office at an automotive supply company when he got a call from the Detroit Police Department telling him to come to the station to be arrested. He thought at first that it was a prank.

An hour later, when he pulled into his driveway in a quiet subdivision in Farmington Hills, Mich., a police car pulled up behind, blocking him in. Two officers got out and handcuffed Mr. Williams on his front lawn, in front of his wife and two young daughters, who were distraught. The police wouldn’t say why he was being arrested, only showing him a piece of paper with his photo and the words “felony warrant” and “larceny.”

His wife, Melissa, asked where he was being taken. “Google it,” she recalls an officer replying.

Googling it would not have helped. Williams was taken away by cops and held for 30 hours, accused of shoplifting watches from an upscale store nearly two years earlier. All the police had to work with were a blurry, lo-res screengrab from the store’s CCTV camera and facial recognition software provided by DataWorks Plus. DataWorks tests the algorithms its contractors create by running searches using low-quality images, but it doesn’t provide any measurements of those algorithms’ accuracy. It apparently just packages up its collection of algorithms and sells access to government agencies.

Five months after the crime was committed, Michigan State Police digital image examiner Jennifer Coulson uploaded the image captured by the store’s camera. But investigators apparently ignored the big, bold warning attached to the top of the search results.

After Ms. Coulson, of the state police, ran her search of the probe image, the system would have provided a row of results generated by NEC and a row from Rank One, along with confidence scores. Mr. Williams’s driver’s license photo was among the matches. Ms. Coulson sent it to the Detroit police as an “Investigative Lead Report.”

“This document is not a positive identification,” the file says in bold capital letters at the top. “It is an investigative lead only and is not probable cause for arrest.”

The State Police still maintain investigators did nothing wrong. They didn’t rely solely on the facial recognition search results. They also showed Williams’ driver’s license photo (as part of a “six-pack” of facial photos) to the loss prevention person at the store, who supposedly identified him.

The contractors that supplied the AI system to the State Police seem skeptical of everything that happened here — even their own software’s participation in the arrest of Williams.

[Brendan] Klare, of Rank One, found fault with [the loss prevention person’s] role in the process. “I am not sure if this qualifies them as an eyewitness, or gives their experience any more weight than other persons who may have viewed that same video after the fact,” he said. John Wise, a spokesman for NEC, said: “A match using facial recognition alone is not a means for positive identification.”

One of the officers who questioned Williams said — after viewing the surveillance video again with Williams in the room — that he “guessed the computer got it wrong.” Even so, Williams was held for several more hours before being released on $1,000 bond. According to his first-person account of the incident, the first 18 hours of his detainment were spent in an overcrowded holding cell without any interaction with investigators or the officers who arrested him.

Two weeks later, prosecutors dropped the charges, but left themselves the option to refile them in the future. And the prosecutor’s office appears to think that’s a possibility. Its spokesperson says there’s another witness investigators are interviewing, and cops might try to run Williams in again for a crime he didn’t commit.

And Williams has an alibi. He posted a video to his Instagram account during his commute home from work, at the same time the store was being robbed. But investigators never bothered to check. And this investigation — which appears to have stemmed almost solely from a “match” generated from a lo-res CCTV screengrab — resulted in an innocent man having his life severely disrupted.

Filed Under: arrests, detroit, facial recognition, robert williams, wrong guy