Apparently, he is one of the billions of people whose faces have been turned into search terms without their consent. Clearview AI scraped billions of photos from the internet to create a huge database of faces. By uploading a single photo, Clearview’s clients, which include law enforcement agencies, can use the company’s facial recognition technology to unearth other online photos featuring the same face.
Marx emailed Clearview to ask whether the company had used his photos, and because he lives in the EU, Clearview was obliged to show him. The pictures were around a decade old, but both showed Marx, looking fresh-faced in a blue T-shirt, taking part in a Google competition for engineers. Marx knew the pictures existed. But unlike Clearview, he did not know a photographer was selling them on the stock photo website Alamy without his permission.
Marx realized that he was not in control of what people did with his data. In February 2020 he filed a complaint with his local privacy regulator in Hamburg. The case has still not been closed, and Marx says he has not been notified of any outcome.
“It’s almost been two and a half years since I complained about Clearview AI, and the case is still open,” says Marx, who works as a security researcher at the IT security company Security Research Labs. “That is too slow, even if you take into account that it’s the first case of its kind.”
Across Europe, millions of people’s faces are appearing in search engines operated by companies like Clearview. The region might boast the world's strictest privacy laws, but European regulators, including in Hamburg, are struggling to enforce them.
In October, the French data protection authority became the third EU regulator to fine Clearview 20 million euros ($19 million) for violating European privacy rules. Yet Clearview has not removed EU residents’ faces from its platform, and similar fines issued by regulators in Italy and Greece remain unpaid.
Marx does not believe it’s technically possible for Clearview to permanently delete a face. He believes that Clearview’s technology, which is constantly crawling the internet for faces, would simply find and catalog him all over again.
Each of these platforms reveals deeply personal information. “You can tell where I study, which political party I like,” Marx says. Together, the pictures these companies have collected of him point to an industry that reveals vastly more information than any social media profile.
When Marx started pulling on this thread back in 2020, all he wanted was for one company to stop collecting pictures of his face. Now it’s bigger than that. Today, he’s calling for regulators to stop the industry from collecting pictures of Europeans altogether. For that to happen, regulators will have to make an example of Clearview. The question is, can they?