Apple's Feature That Blurs Explicit Images To Be Rolled Out to Other Countries

(Photo: Bagus Hernawan / Unsplash)

Apple's feature that automatically blurs explicit images sent to minors is now rolling out to additional countries.

Apple's Anti-Nudity Feature Is Coming to Other Countries

The "Communication Safety in Messages" feature was launched in the United States in 2021. The feature is now coming to the Messages apps on iOS, macOS, and iPadOS for users in the United Kingdom, Canada, Australia, and New Zealand.

The launch schedule for each country is unclear, but according to The Guardian, the feature is coming to the United Kingdom "very soon."

The scanning happens on-device, and it does not affect the end-to-end encryption of messages, according to The Verge.

Instructions for enabling the feature, which is integrated into Apple's existing Family Sharing system, can be found on Apple's website.

The opt-in feature scans incoming and outgoing pictures for sexually explicit content to protect minors. If such an image is detected, it is immediately blurred and the child is shown guidance on where to find help.

The child will also see a message reassuring them that it is okay not to view the image and to leave the conversation if it makes them uncomfortable.

The pop-up will also encourage the user to seek help from someone they trust or from trained professionals, and it gives them the option to block the person who sent the explicit image.

As with its initial release in the United States, minors will have the option of messaging an adult they trust about a flagged image.

When Apple announced the feature in August 2021, it suggested that this notification would happen automatically. Critics were quick to point out that the original approach risked outing queer children to their parents and exposing them to abuse.

Apple also plans to expand a related safeguard to Siri, Spotlight, and Safari searches, pointing users toward safety resources when they search for topics related to child sexual abuse.

CSAM Feature

Aside from these two child safety features, Apple originally announced a third initiative in August 2021 that involved scanning pictures for child sexual abuse material (CSAM) before they are uploaded to a user's iCloud account.

However, the feature drew massive backlash from privacy advocates, who argued that it risked introducing a backdoor that would undermine users' security, according to 9to5Mac.

As designed, the CSAM feature would automatically report a user to the authorities once it detected such material saved in their iCloud account.

Privacy advocates are concerned that the feature could misidentify an image and wrongly accuse a user.

The advocates also raised concerns about other scenarios, such as a user saving innocent photos of their own child to their iCloud account.

The tech company later announced that it would delay the release of all three features while it addressed the public's concerns.

Having released the first two features, Apple has yet to provide an update on when the CSAM detection feature will be available on its devices.

Related Article: Apple CSAM Detection: How to Stop it from Scanning Your iPhone, iPad Photos

This article is owned by Tech Times

Written by Sophie Webster

ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.