Apple is adding new security measures to help protect its users from nude content. The features will be enabled for kids and optional for adults.
At Apple's Worldwide Developers Conference 2023 in Cupertino on Monday, the company debuted expansions to its "Communication Safety" features for kids, along with a related opt-in feature for adults. Communication Safety reportedly scans messages on children's devices to flag nudity in content they send or receive in iOS Messages.
Apple announced during the event that the feature will also expand to FaceTime video messages, Contact Posters, AirDrop, and the Photos picker tool, where users choose images or videos to send, Wired reported.
Apple Users Can Get a Feature That Allows On-Device Detection of Harmful Content
According to Wired, harmful content is detected on the user's device itself, rather than by Apple scanning iCloud for illegal content. The feature will reportedly flag inappropriate content for kids, while adults can turn on an opt-in filter. Last December, the company announced that it was abandoning its planned iCloud photo-scanning tool aimed at combating child sexual abuse material (CSAM).
The Rape, Abuse & Incest National Network (RAINN) and other anti-sexual-violence organizations have stopped using the term "child pornography" in favor of CSAM, which they consider more accurate because such material is evidence of child sexual abuse.
"Just as kids can't legally consent to sex, they can't consent to having images of their abuse recorded and distributed. Every explicit photo or video of a kid is actually evidence that the child has been a victim of sexual abuse," RAINN explained.
Apple Feature Is Activated Automatically for Children Under 13
This on-device nudity detection feature is part of the company's initiative to make its devices and ecosystem safer for children without compromising their privacy. Apple originally designed Communication Safety to protect children, but it has also released a version of the feature for adults.
The feature is automatically turned on for all children under 13 enrolled in a Family Sharing plan, and only parents can manually disable it on their children's devices.
Apple is Investing in an API to Make the Feature Easier for iOS Developers to Integrate
Erik Neuenschwander, Apple's head of user privacy, said the Communication Safety feature allows parents to protect their children from potential harm. Apple also announced in December that it would release an application programming interface (API) that lets third-party developers integrate the feature into their apps.
Once third-party developers integrate the Communication Safety API, the feature can be enabled in their apps to help protect children by detecting CSAM.
Developers Can Add the API to Bring the On-Device Feature to Their iOS Apps
The API will reportedly be known as the Sensitive Content Analysis framework and will be available to iOS developers. Platforms like Discord say they plan to incorporate it into their iOS apps to help protect their users without compromising their privacy.
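To illustrate what integration could look like, here is a minimal Swift sketch based on Apple's published Sensitive Content Analysis framework for iOS 17. The `shouldBlurImage` helper is hypothetical, and apps reportedly need a special Apple-granted entitlement to use the analyzer; treat the details as an assumption rather than a definitive implementation.

```swift
import Foundation
import SensitiveContentAnalysis

// Hypothetical helper: decide whether a received image should be blurred.
// Requires the com.apple.developer.sensitivecontentanalysis.client
// entitlement; analysis runs entirely on-device.
func shouldBlurImage(at url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // Respect the user's settings: if neither Communication Safety nor
    // Sensitive Content Warning is enabled, the policy is .disabled.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        // The image is analyzed locally and never leaves the device.
        let result = try await analyzer.analyzeImage(at: url)
        return result.isSensitive
    } catch {
        // On analysis failure, fall back to showing the image unblurred
        // (an app could instead choose to fail closed).
        return false
    }
}
```

An app like Discord could call a helper of this kind before rendering an incoming attachment and overlay a blur when it returns true.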
However, anti-CSAM advocates note that there are still situations in which Communication Safety will be unable to flag harmful content and protect children. That is why Apple is trying to make the feature's API easier for apps to integrate.
Apple Users Can Access the Feature From the iOS Privacy & Security Settings Menu
Apple noted that it received feedback from adults who wanted the feature as well, prompting it to create a version they can enable. The firm thus launched Sensitive Content Warning, which also uses local scanning to flag and blur photos and videos containing nudity. It is reportedly designed to be more subtle than Communication Safety for kids.
The Sensitive Content Warning feature can be enabled and controlled through the iOS Privacy & Security settings menu. Once activated, it will help protect adult users from seeing unwanted content.