Dubbed Take It Down, the tool is operated by the National Center for Missing and Exploited Children, and funded in part by Meta Platforms, the owner of Facebook and Instagram.
The site lets anyone anonymously – and without uploading any actual images – create a digital fingerprint of the image. This fingerprint (a “hash”) then goes into a database, and the tech companies that have agreed to participate in the project remove matching images from their services.
The only participating platforms are Facebook, Instagram, Yubo, OnlyFans and Pornhub. If the image is on another site, or if it is sent on an encrypted platform such as WhatsApp, it will not be taken down.
If someone alters the original image – for instance, cropping it, adding an emoji or turning it into a meme – it becomes a new image and thus needs a new hash.
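The hash-matching flow described above can be sketched in a few lines. This is a minimal illustration using a SHA-256 hash of the raw file bytes; the actual fingerprinting algorithm Take It Down uses is not specified in the article and may well differ (the byte strings below are placeholders, not real image data):

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest serving as the image's 'fingerprint' (hash)."""
    return hashlib.sha256(image_bytes).hexdigest()

# A participating platform keeps a database (here, a set) of reported hashes.
reported_hashes = {fingerprint(b"original-image-bytes")}

# An exact copy of the image produces the same hash, so it matches.
exact_copy = b"original-image-bytes"
print(fingerprint(exact_copy) in reported_hashes)   # True

# Any alteration -- a crop, an emoji, a meme caption -- changes the bytes,
# so the hash no longer matches and the image needs a new report.
altered = b"original-image-bytes-with-emoji"
print(fingerprint(altered) in reported_hashes)      # False
```

This also shows why no actual image ever has to leave the user's device: only the hex digest is submitted, and the digest cannot be reversed back into the picture.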
NCMEC spokesperson Gavin Portnoy said: “Take It Down is made specifically for people who have an image that they have reason to believe is already out on the web somewhere, or that it could be. You’re a teen and you’re dating someone and you share the image. Or somebody extorted you and they said, ‘If you don’t give me an image, or another image of you, I’m going to do xyz.’”
Portnoy said teens may feel more comfortable using a website than involving law enforcement, which, for one thing, would not be anonymous.
Meta, when it was still Facebook, attempted to create a similar tool for adults in 2017. It was not well received, because the site asked people to send their encrypted nudes to Facebook – a request many found alarming given the sensitivity of the material.
The company tested the service in Australia for a brief period but did not expand it to other countries. In 2021, it helped launch a tool for adults called StopNCII – for nonconsensual intimate images, also known as revenge porn. That site is run by a UK non-profit, the UK Revenge Porn Helpline, but anyone around the globe can use it.
Antigone Davis, Meta’s global head of safety, said Take It Down is one of many tools the company uses to address child abuse and exploitation on its platforms.
“In addition to supporting the development of this tool and having reporting and blocking systems on our platform, we also do a number of different things to try to prevent these kinds of situations from happening in the first place. So, for example, we don’t allow unconnected adults to message minors,” she said.