Apple Removes All References to Controversial CSAM Scanning Feature From Its Child Safety Webpage


Apple has quietly nixed all mentions of CSAM from its Child Safety webpage, suggesting its controversial plan to detect child sexual abuse images on iPhones and iPads may hang in the balance following significant criticism of its methods.

Apple in August announced a planned suite of new child safety features, including scanning users’ iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.

Following their announcement, the features were criticized by a wide range of individuals and organizations, including security researchers, the privacy whistleblower Edward Snowden, the Electronic Frontier Foundation (EFF), Facebook’s former security chief, politicians, policy groups, university researchers, and even some Apple employees.

The majority of criticism was leveled at Apple's planned on-device CSAM detection, which researchers lambasted as relying on dangerous technology bordering on surveillance and derided as ineffective at identifying images of child sexual abuse.

[…]
