Published in AI

Racially-biased facial recognition led to bloke's arrest

25 June 2020


Bad enough when humans do it

In what is being touted as a world first, a faulty facial recognition match led to a Michigan man's arrest for a crime he did not commit, because the AI thought all black people looked alike.

Robert Julian-Borchak Williams was in his office at an automotive supply company when he got a call from the Detroit Police Department telling him to come to the station to be arrested.

He thought it was a prank until an hour later, when a police car pulled up behind him, blocking him in. Two officers got out and handcuffed Williams on his front lawn, in front of his wife and two young daughters, who were distraught. The police wouldn't say why he was being arrested, only showing him a piece of paper with his photo and the words "felony warrant" and "larceny".

His wife, Melissa, asked where he was being taken. "Google it", she recalls an officer replying. The police drove Williams to a detention centre. He had his mug shot, fingerprints and DNA taken, and was held overnight.

The next day, two detectives took him to an interrogation room and showed him a still image from a surveillance video of a heavyset man, dressed in black and wearing a red St. Louis Cardinals cap, standing in front of a watch display. Five timepieces, worth $3,800, had been shoplifted.

A close-up image showed that it was clearly not Williams. Williams picked up the image and held it next to his face. "No, this is not me... you think all black men look alike?" 

Williams knew that he had not committed the crime in question. What he could not have known was that his case may be the first known account of an American being wrongfully arrested based on a flawed match from a facial recognition algorithm.

Facial recognition systems have been used by police forces for more than two decades. Recent studies by MIT and the National Institute of Standards and Technology, or NIST, have found that while the technology works relatively well on white men, the results are less accurate for other demographics, in part because of a lack of diversity in the images used to develop the underlying databases.

In this case the state's technology is supplied for $5.5 million by a company called DataWorks Plus. Founded in South Carolina in 2000, the company first offered mug shot management software, said Todd Pastorini, a general manager. In 2005, the firm began to expand the product, adding face recognition tools developed by outside vendors.

At the moment there is a huge amount of evidence showing that AI-based facial recognition systems are pants, and the big players have withdrawn their products and will not sell them to cops, because doing so is likely to get someone killed.

But that has not stopped smaller companies from trying to sell facial recognition systems to law enforcement with all sorts of claims, including being able to predict whether someone is going to commit a crime based on what they look like.
