
Apple scans iCloud for child sexual abuse

06 August 2021


Finally catches up with other cloudy services 

Fruity cargo cult Apple is finally catching up with other cloud companies like Dropbox, Google, and Microsoft by regularly scanning files for kiddie porn.

Jobs' Mob has dragged its feet on doing this, mostly because it does not want to be seen scanning users' files, and it even offered Apple fanboys the option to encrypt their files before they reach its cloud. While this might be good for user privacy, it effectively made iCloud a perv's paradise.

Dropbox, Google, and Microsoft already scan user files for content that might violate their terms of service or be potentially illegal, like kiddie porn.  Apple is not going that far yet. 

Apple said its technology will allow it to detect and report known child sexual abuse material to law enforcement in a way that, it claims, will preserve user privacy.

The detection of child sexual abuse material (CSAM) is one of several new features aimed at better protecting the children who use its services from online harm, including filters to block potentially sexually explicit photos sent and received through a child's iMessage account. Another feature will intervene when a user tries to search for CSAM-related terms through Siri and Search.

Apple said its new CSAM detection technology, NeuralHash, instead works on a user's device and can identify if a user uploads known child abuse imagery to iCloud, without decrypting the images until a threshold is met and a sequence of checks to verify the content is cleared.
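As a rough illustration of how such on-device, threshold-gated matching could work, here is a minimal Python sketch. It is not Apple's actual implementation: the neural_hash() stub, the hash list and the threshold value are all hypothetical placeholders used purely to show the idea of hashing locally and only escalating once enough matches accumulate.

import hashlib
from typing import Iterable, Set

MATCH_THRESHOLD = 30  # hypothetical: nothing is flagged below this count

KNOWN_HASHES: Set[str] = {
    # hypothetical database of hashes of known abuse imagery,
    # shipped to the device rather than the images themselves
    "placeholder_hash_1",
    "placeholder_hash_2",
}

def neural_hash(image_bytes: bytes) -> str:
    # Placeholder for a perceptual hash; a real one maps visually
    # similar images to the same digest. Stubbed with SHA-256 here.
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(uploads: Iterable[bytes]) -> int:
    # Hash each image on the device and count hits against the known
    # list; the image contents are not inspected at this stage.
    return sum(1 for img in uploads if neural_hash(img) in KNOWN_HASHES)

def should_escalate(uploads: Iterable[bytes]) -> bool:
    # Only once the match count crosses the threshold would the
    # matching images be decrypted and passed on for verification.
    return count_matches(uploads) >= MATCH_THRESHOLD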

Matthew Green, a cryptography professor at Johns Hopkins University, revealed the existence of the technology in a series of tweets. The news was met with resistance from security experts and privacy advocates, as well as from users accustomed to an approach to security and privacy that most other companies don't take.
