Apple Kills Controversial CSAM iCloud Photo Scanning Feature

After much controversy, Apple is no longer moving forward with it.

Apple's iCloud photo scanning feature, designed to detect child sexual abuse material (CSAM), is no longer moving forward.

The iPhone maker has confirmed that it scrapped its plans to roll out the security feature, which had drawn massive controversy to the tech giant. As such, the detection tool will never see the light of day.


Apple's CSAM iCloud Photo Scanning Feature Is No Longer Rolling Out

As per a report by MacRumors, Apple previously announced that it was working on new security features to address child safety.

The proposed safety features included CSAM detection for iCloud Photos, which would scan users' uploaded photos for known child abuse imagery. The iPhone maker originally planned to release it with iOS 15 and iPadOS 15.
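The core idea behind such systems is perceptual hashing: a photo is reduced to a compact fingerprint that stays similar under resizing or recompression, then compared against a database of fingerprints of already-known illegal images, so only matches against known material, not novel content, get flagged. Below is a heavily simplified Python sketch of that matching step using the open-source imagehash library; the file path, hash values, and threshold are placeholders, and Apple's actual proposal (its NeuralHash model plus private set intersection and a multi-match threshold) was far more elaborate than this.

```python
# Simplified illustration of perceptual-hash matching.
# NOT Apple's actual NeuralHash / private set intersection protocol.
# Requires: pip install Pillow imagehash
from PIL import Image
import imagehash

# Hypothetical database of perceptual hashes of known flagged images
# (the hex value here is a meaningless placeholder).
KNOWN_HASHES = {imagehash.hex_to_hash("d1d1b1a1e1f10101")}

MATCH_THRESHOLD = 4  # max Hamming distance to count as a match (illustrative)

def matches_known_image(path: str) -> bool:
    """Hash a photo and compare it against the known-hash set."""
    photo_hash = imagehash.phash(Image.open(path))
    # Subtracting two ImageHash objects yields their Hamming distance.
    return any(photo_hash - known <= MATCH_THRESHOLD for known in KNOWN_HASHES)

if __name__ == "__main__":
    print(matches_known_image("photo.jpg"))  # placeholder path
```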

However, iOS 16 has since rolled out, and the feature never arrived.


It is worth noting that the feature drew heavy criticism as soon as it was announced. Security researchers and even some of the tech giant's own employees warned against it.

In response to that feedback, the Cupertino-based tech giant postponed the rollout. The firm had initially planned to release the detection feature before the end of 2021.

Since then, the tech behemoth kept mum about the feature for nearly a year. Now, Apple has confirmed in a statement to Wired that it no longer plans to move forward with it.

Why is Apple Killing its CSAM iCloud Photo Detection Feature?

The tech giant says that it has "decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos."

Apple now says that "children can be protected without companies combing through personal data."

Despite that, the iPhone maker says it still has child safety in mind.

"We will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all, Apple says.

Wired notes in its report that the iPhone maker confirmed the CSAM detection feature's demise shortly after announcing expanded end-to-end encryption for iCloud.

The iOS 16.2 update extends end-to-end encryption to iCloud backups and photos through the new, optional Advanced Data Protection setting, further enhancing user privacy.
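End-to-end encryption means data is encrypted on the user's own device before it is uploaded, so the cloud provider only ever stores ciphertext it cannot read. Here is a minimal sketch of that principle using Python's cryptography package; the key handling is deliberately simplified for illustration and bears no relation to Apple's actual iCloud key-management design.

```python
# Minimal sketch of client-side (end-to-end) encryption: the server
# only ever sees ciphertext. NOT Apple's actual iCloud implementation.
# Requires: pip install cryptography
from cryptography.fernet import Fernet

# In a real E2E design the key never leaves the user's devices;
# here we simply generate one locally for illustration.
device_key = Fernet.generate_key()
cipher = Fernet(device_key)

photo_bytes = b"raw photo data"           # stand-in for a real file
ciphertext = cipher.encrypt(photo_bytes)  # what the cloud would store

# The provider holds `ciphertext` but cannot decrypt it without the key.
assert cipher.decrypt(ciphertext) == photo_bytes
```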

Critics found the CSAM detection feature doing the opposite. Some cybersecurity experts warned that it could serve as a backdoor for law enforcement to surveil users.

But now, Apple has completely abandoned its plans to roll out the scanning feature.

Teejay Boris
Tech Times