Apple's iCloud photo scanning feature for detecting child sexual abuse material (CSAM) is no longer moving forward.
The iPhone maker confirmed that it has killed its plans to roll out the security feature, which drew massive controversy for the renowned tech giant. As such, the detection tool will no longer see the light of day.
Apple's CSAM iCloud Photo Scanning Feature Is No Longer Rolling Out
As per the latest report by MacRumors, Apple previously announced that it was working on new security features to address child safety.
The proposed safety features included CSAM detection in iCloud Photos, which would scan users' iCloud photos for potential child abuse imagery. The iPhone maker planned to release it with iOS 15 and iPadOS 15.
However, iOS 16 has since rolled out, and the feature has yet to appear.
It is worth noting that the feature was heavily criticized even before Apple could roll it out. Security researchers and even some of the tech giant's own employees warned against it.
Due to that feedback, the Cupertino-based tech giant postponed the rollout. Initially, the firm had planned to release the detection feature before the end of 2021.
Since then, the tech behemoth kept mum about the feature for almost a year. Apple has now confirmed in a statement to Wired that it no longer plans to move forward with it.
Why Is Apple Killing Its CSAM iCloud Photo Detection Feature?
The tech giant says that it has "decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos."
Apple now says that "children can be protected without companies combing through personal data."
Despite that, the iPhone maker says it still has child safety in mind.
"We will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all," Apple says.
Wired notes in its report that the iPhone maker confirmed the CSAM detection feature is dead shortly after rolling out expanded end-to-end encryption for iCloud.
The iOS 16.2 update brings end-to-end encryption to iCloud backups and photos, which should further enhance user privacy.
Critics argued that the CSAM tool would do the opposite. Some cybersecurity experts warned that the detection feature could be used by law enforcement as a backdoor to surveil users.
Now, Apple has completely abandoned its plans to roll out the scanning feature.