Published in AI

Clearview AI sued in California

11 March 2021


Stop hunting for Sarah Connor


Civil liberties groups have sued Clearview AI claiming its automatic scraping of people's images and its extraction of their unique biometric information violate privacy and chill protected political speech and activity.

For those who came in late, Clearview AI has amassed a database of more than three billion photos of individuals by scraping sites such as Facebook, Twitter, Google and Venmo. It's bigger than any other known facial recognition database in the US, including the FBI's.

The New York company uses algorithms to map the pictures it stockpiles, measuring, for example, the distance between an individual's eyes to construct a "faceprint". This technology appeals to law enforcement agencies across the country, which use it in real time to determine people's identities.

Not surprisingly, civil liberties groups have problems with this. The plaintiffs -- four individual civil liberties activists and the groups Mijente and NorCal Resist -- allege Clearview AI "engages in the widespread collection of California residents' images and biometric information without notice or consent".

This is important for proponents of immigration or police reform, whose political speech may be critical of law enforcement, and for community members who have historically been over-policed and targeted by surveillance tactics.

Clearview AI enhances law enforcement agencies' efforts to monitor these activists, as well as immigrants, people of colour and those perceived as "dissidents", such as Black Lives Matter activists, and can potentially discourage their engagement in protected political speech as a result, the plaintiffs say.

The plaintiffs are seeking an injunction that would force the company to stop collecting biometric information in California. They also seek the permanent deletion of all images and biometric data or personal information in its databases.
