Police facial recognition is rubbish

05 July 2019


More than 80 percent inaccurate

Four out of five people identified by the Metropolitan Police's facial recognition technology as suspects were innocent, according to an independent report.

Researchers found that the controversial system is 81 percent inaccurate - meaning that, in most cases, it flagged people to police who were not on a wanted list.

However, that flies in the face of the Met's own figures, which claim the system makes a mistake in only one in 1,000 cases.

The report, revealed by Sky News and The Guardian, raises "significant concerns" about Scotland Yard's use of the technology, and calls for the facial recognition programme to be halted.

Citing a range of technical, operational, and legal issues, the report concludes that it is "highly possible" the Met's usage of the system would be found unlawful if challenged in court.

The Met has been monitoring crowds with live facial recognition (LFR) since August 2016, when it used the technology at Notting Hill Carnival.

It has conducted 10 trials at locations including Leicester Square, Westfield Stratford, and Whitehall during the 2017 Remembrance Sunday commemorations.

The first independent evaluation of the scheme was commissioned by Scotland Yard and conducted by academics from the University of Essex.

Professor Pete Fussey and Dr Daragh Murray evaluated the technology's accuracy at six of the 10 police trials. They found that, of 42 matches, only eight were verified as correct - an error rate of 81 percent. Four of the 42 people flagged were never located because they were absorbed into the crowd, so those matches could not be verified.

The Met prefers to measure accuracy by comparing successful and unsuccessful matches with the total number of faces processed by the facial recognition system. According to this metric, the error rate was just 0.1 percent.
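
To illustrate how the same trial data can produce both figures, here is a minimal sketch of the two calculations. The match counts are those reported above; the total number of faces processed is a hypothetical placeholder, since the report's exact figure is not given in this article.

    # A minimal sketch of the two competing accuracy metrics described above.
    # Match counts are those reported by the Essex researchers; the total
    # number of faces processed is a hypothetical figure for illustration only.

    total_matches = 42       # alerts generated across the six evaluated trials
    verified_correct = 8     # matches the researchers confirmed as genuine

    # Researchers' metric: share of alerts that were not verified as correct.
    researcher_error_rate = (total_matches - verified_correct) / total_matches
    print(f"Researchers' error rate: {researcher_error_rate:.0%}")  # ~81%

    # Met's preferred metric: incorrect matches as a share of all faces scanned.
    total_faces_processed = 34_000   # hypothetical; not taken from the report
    met_error_rate = (total_matches - verified_correct) / total_faces_processed
    print(f"Met's error rate: {met_error_rate:.1%}")                # ~0.1%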

Professor Fussey and Dr Murray claimed the Met's use of facial recognition during these trials lacked "an explicit legal basis" and failed to consider how this technology infringed fundamental human rights.

Professor Fussey told Sky News: "Our report conducted a detailed, academic, legal analysis of the documentation the Met Police used as a basis for the face recognition trials. There are some shortcomings and if [the Met] was taken to court, there is a good chance that would be successfully challenged."

 
