Apple's Latest iOS Beta Adds A Feature to Protect Children From Inappropriate Imagery

iPhone (Photo: Paul Hanaoka / Unsplash)

Apple's latest iOS 15.2 beta adds an opt-in communication safety feature designed to warn children when they send or receive photos that contain nudity.

Apple's Communication Safety Feature

The new Messages feature is part of Apple's child safety initiatives that the tech giant announced back in August, according to Apple Insider.

The feature is separate from Apple's controversial CSAM detection system, which the company created to detect child sexual abuse material stored in iCloud.

The feature is not enabled by default; a child's parent or guardian can turn it on through the Family Sharing plan, according to The Verge.

When enabled, the feature automatically detects nudity in images sent or received in Messages, blurs them, and displays a warning.

Apple stated that children would be provided with resources and reassured that it is alright not to view the image if they do not want to.


A similar process applies if a child tries to send photos that contain nudity. In either case, the child is given the option to contact someone they trust for help.

Unlike the previously announced version of the iMessage feature, parents and guardians will not be notified if the system detects that a photo sent to a child contains nudity.

Apple stated that parental notification could put a child at risk and might even trigger a threat of physical violence or abuse.

Apple stated that nudity detection happens entirely on the device, so the analysis never leaves the phone and the end-to-end encryption of Messages is not compromised.

The feature is opt-in and currently available only in the iOS beta, which means it has not yet rolled out to everyone.

Apple did not provide a timeline for when the feature could reach its final stage, and there is still a chance it will not be included in the official iOS release.

Apple's CSAM Detection Feature

The feature is also distinct from Apple's controversial CSAM detection system, which the company announced months ago and was forced to delay due to public backlash, according to CNBC.

Apple had initially said that the CSAM detection feature would make its official debut before the end of 2021.

Now, the tech giant says it needs more time to collect input from the public and privacy experts and to make the necessary changes.

Apple's CSAM detection feature was slammed by critics who believe it could falsely accuse people of storing child sexual abuse material on their devices.

Critics also pointed out that the system is not polished, and that any mistake could cause inconvenience and embarrassment for users.

There is still no indication of when the system will roll out. However, the tech giant stated that it would provide more guidance to children and parents through Search and Siri in case they have questions about the feature.

In updates to iOS 15 and its other operating systems in 2021, Apple will intervene when users try to search for topics related to child exploitation and explain why such content is harmful.

This article is owned by Tech Times

Written by Sophie Webster

ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.