
Apple shelves spying plans

16 December 2021


Surprised that users object to mass surveillance

Fruity cargo cult Apple has quietly shelved plans to scan everyone’s photo libraries looking for kiddie porn.

Jobs’ Mob has quietly nixed all mentions of CSAM from its Child Safety webpage, which is the closest we will get to an admission that Apple has abandoned its controversial plan to detect child sexual abuse images on iPhones and iPads.

For those who came in late, Apple announced a planned suite of new child safety features, including scanning users’ iCloud Photos libraries for Child Sexual Abuse Material (CSAM), a Communication Safety feature to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.

Apple was surprised that users who were not paedophiles had a problem with its autocratic monitoring of their data.

The features were criticized by a wide range of individuals and organizations, including security researchers, NSA whistleblower Edward Snowden, the Electronic Frontier Foundation (EFF), Facebook's former security chief, politicians, policy groups, university researchers, and even some Apple employees.

Apple has yet to announce officially that the plan has been abandoned, but the scheme seems unlikely to go ahead now.
