According to Ars Technica, South Wales Police began using automated facial recognition technology on a trial basis in 2017, overtly deploying a system called AFR Locate at several dozen major events, such as soccer matches. Police matched the scans against watchlists of known individuals to identify persons who were wanted, had open warrants against them, or were otherwise persons of interest.
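The article does not describe AFR Locate's internals, but watchlist matching of this kind is commonly implemented by comparing face embedding vectors against a similarity threshold. The sketch below is a generic illustration of that approach, not the actual system; the function names, 128-dimensional embeddings, and 0.6 threshold are all assumptions.

```python
import numpy as np

# Illustrative sketch only: a generic embedding-plus-threshold approach to
# watchlist matching, not the actual AFR Locate system. The embedding size
# and threshold value are assumptions for demonstration.

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def find_matches(live_embedding: np.ndarray, watchlist: dict, threshold: float = 0.6):
    """Return (identity, score) pairs whose similarity clears the threshold."""
    return [
        (identity, score)
        for identity, reference in watchlist.items()
        if (score := cosine_similarity(live_embedding, reference)) >= threshold
    ]

# Toy usage: random vectors stand in for embeddings a face-recognition
# model would produce from camera frames and watchlist photos.
rng = np.random.default_rng(seed=0)
watchlist = {"person_of_interest_1": rng.normal(size=128)}
live_scan = rng.normal(size=128)
print(find_matches(live_scan, watchlist))
```

In a real deployment the threshold trades off false positives against missed detections, which is why the error statistics discussed below matter.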
In 2019, Cardiff resident Ed Bridges, backed by the UK civil rights organization Liberty, filed suit against the police, alleging that having his face scanned in 2017 and 2018 violated his legal rights. Bridges lost that suit, but the Court of Appeal today overturned the ruling, finding that South Wales Police's facial recognition programme was unlawful.
The ruling said that too much discretion is left to individual police officers: it is unclear who can be placed on a watchlist, and it is equally unclear whether any criteria exist for determining where AFR can be deployed.
The court added that the police did not sufficiently investigate whether the software in use exhibited racial or gender bias.
In 2018, South Wales Police released data admitting that about 2,300 of the nearly 2,500 matches the software made at a 2017 event, roughly 92 percent, were false positives.

The ruling does not ban the use of facial recognition technology in the UK outright, but it narrows the scope of what is permissible and spells out what law enforcement agencies must do to comply with human rights law. Other UK police forces that deploy facial recognition technology will have to meet the standard set by today's ruling, including the Metropolitan Police in London, which deployed a similar system earlier this year.
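As a sanity check on that figure, here is the arithmetic using the article's rounded numbers (the exact deployment counts may differ slightly):

```python
# False positive rate among AFR matches at the 2017 event, using the
# article's rounded figures rather than exact deployment data.
false_positives = 2_300  # matches later found to be incorrect
total_matches = 2_500    # all matches the software flagged

rate = false_positives / total_matches
print(f"False positive rate: {rate:.0%}")  # prints "False positive rate: 92%"
```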