Apple's Plan to Scan Images to Detect Child Abuse Is Met with Criticism as Privacy Concerns Rise

iPhone 12 (Photo: Pexels/Pixabay)

Apple revealed its plan to scan the iPhones of its US customers for images of child sexual abuse. Although the move drew praise from child protection groups, it raised concern among security advocates and researchers who believe the system could easily be misused.

Apple's Child Abuse System Under Fire

Researchers warned that the system could be weaponized against individuals, making it far easier to invade their privacy, and could also be repurposed by governments seeking tools to surveil their citizens.

Apple designed a tool called NeuralMatch that will detect known images of child sexual abuse. It will scan images before they get uploaded to iCloud.

If the system finds a match, the image will be forwarded to a review team. If the image is confirmed as child pornography, the user's account will be disabled.

Apple will also notify the National Center for Missing and Exploited Children, or NCMEC, which will then contact law enforcement, according to Gizmodo.

The tech company also plans to scan users' encrypted messages for sexually explicit content as part of its child safety measures, a move that privacy advocates are unhappy about.

What Does NeuralMatch Flag?

NeuralMatch will only flag pictures that are already in NCMEC's database. Photographs that parents take of their children in the bathtub, at the swimming pool, or playing in the rain will not be flagged.

However, privacy researchers cautioned that the matching tool, which never "sees" images and only compares their mathematical fingerprints, could be put to improper purposes by third parties.
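To make the fingerprint idea concrete, here is a minimal, hypothetical Python sketch of hash-based matching. It is not Apple's implementation: the reported system relies on perceptual hashes that survive resizing and re-compression, plus additional safeguards and review thresholds, whereas this example uses a plain SHA-256 digest, and the names (KNOWN_FINGERPRINTS, fingerprint, should_flag) are invented purely for illustration.

```python
import hashlib
from pathlib import Path

# Hypothetical database of fingerprints of known abusive images,
# standing in for (and far simpler than) the NCMEC hash list.
KNOWN_FINGERPRINTS = {
    "placeholder-fingerprint-1",
    "placeholder-fingerprint-2",
}

def fingerprint(image_path: Path) -> str:
    """Return a fingerprint of the image file's bytes.

    A real system would use a perceptual hash that is robust to resizing
    and re-compression; an exact SHA-256 digest is used here only to keep
    the example self-contained.
    """
    return hashlib.sha256(image_path.read_bytes()).hexdigest()

def should_flag(image_path: Path) -> bool:
    """Flag an image only if its fingerprint is already in the database."""
    return fingerprint(image_path) in KNOWN_FINGERPRINTS
```

The key point the sketch illustrates is that the matcher compares opaque fingerprints against a fixed list of known material; it never classifies or "looks at" the content of new photos.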

Matthew Green, a cryptography researcher at Johns Hopkins University, warned that the system could be used against innocent people by sending them images designed to trigger matches for child abuse imagery, which could fool the algorithm and alert law enforcement.

Major tech companies such as Google and Facebook have used similar scanning tools for years, according to the BBC.

Apple has used those tools to scan files stored in its iCloud service, which is not securely encrypted, as well as users' emails, for child pornography; even so, the scans have sometimes missed images that should have been reported.

Apple's Efforts to Tighten Security

The tech giant has been under pressure for years, with the government pushing it to increase its surveillance of encrypted data, according to MIT Technology Review.

Introducing the new measures is a balancing act: Apple needs to crack down on child sexual abuse imagery while keeping users' privacy protected.

Hany Farid, a researcher at the University of California, Berkeley, said it is possible to protect devices from abusive material while preventing such a tool from being used for other purposes.

WhatsApp, for example, uses end-to-end encryption to protect the privacy of its users while still detecting malware in the app. Apple is one of the first major tech companies to adopt end-to-end encryption, in which messages are scrambled so that only the sender and the intended recipient can read them.
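For readers unfamiliar with the concept, the following toy Python sketch shows the core idea of end-to-end encryption using the PyNaCl library. It is not the protocol WhatsApp or Apple actually use, and the key names are illustrative; it only demonstrates that ciphertext is readable solely by the holders of the right keys.

```python
# Toy illustration of end-to-end encryption (pip install pynacl).
from nacl.public import PrivateKey, Box

# Each party generates a key pair on their own device; only the
# public halves are ever shared with the other side.
sender_key = PrivateKey.generate()
recipient_key = PrivateKey.generate()

# The sender encrypts with their private key and the recipient's public key.
ciphertext = Box(sender_key, recipient_key.public_key).encrypt(b"Meet at noon.")

# Anyone relaying the message, including the service provider, sees only
# unreadable bytes; decryption requires the recipient's private key.
plaintext = Box(recipient_key, sender_key.public_key).decrypt(ciphertext)
assert plaintext == b"Meet at noon."
```

Because the service provider never holds the private keys, it cannot read the messages in transit, which is precisely why scanning encrypted content is so contentious.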

However, law enforcement has pressured the company for access to that encrypted information to further investigate crimes such as terrorism and child exploitation.

The tech company said the changes will roll out later this year with the launch of its latest operating system, iOS 15.

This article is owned by Tech Times

Written by Sophie Webster

ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.