Apple Faces $1.2B Lawsuit From CSAM Victims After Its Decision to Skip Adding Detection Tools

Apple's CSAM efforts are flailing, and victims are not letting the company off easily.

Medhat Dawoud on Unsplash

Thousands of CSAM (child sexual abuse material) victims are now taking the fight to Apple after the company ultimately decided against adding tools that would help detect such content on its platforms. Many companies and services already offer CSAM detection technology, matching content against regularly updated databases of known material, with flagged items then verified by human reviewers.
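The detection approach described above is essentially hash matching: services compare uploaded files against regularly maintained databases of hashes of known material and route any matches to trained human reviewers. The sketch below is a minimal, hypothetical illustration of that workflow, not Apple's or any vendor's actual implementation; real systems rely on perceptual hashing rather than plain cryptographic hashes, and the KNOWN_HASHES set and flag_for_human_review helper are assumed placeholders.

```python
# Minimal sketch of database-driven detection: hash incoming files,
# compare them against a set of known-material hashes, and queue any
# matches for human review. Illustrative only; real deployments use
# perceptual hashing and vetted hash databases, not hard-coded sets.
import hashlib
from pathlib import Path

# Placeholder: in practice this set would be synced from a hash database
# maintained by child-safety organizations.
KNOWN_HASHES: set[str] = set()


def file_hash(path: Path) -> str:
    """Return a hex digest of the file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def flag_for_human_review(path: Path) -> None:
    """Placeholder: enqueue the file so a trained reviewer verifies the match."""
    print(f"queued for review: {path}")


def scan(paths: list[Path]) -> None:
    for path in paths:
        if file_hash(path) in KNOWN_HASHES:
            # Automated matching only flags candidates; humans confirm.
            flag_for_human_review(path)
```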

Apple has reversed course on this matter: the company first announced that it would deploy a CSAM scanning tool across its services, but it later withdrew the plan, arguing that the technology could create new risks of its own.

Apple Faces $1.2B Lawsuit From CSAM Victims in Latest Development

According to a report by Ars Technica, CSAM victims recently came together to file a lawsuit against Apple Inc., centered on the company's efforts, or lack thereof, to detect and stop the spread of CSAM. Apple remains one of the biggest companies in the world, particularly in consumer tech, yet instead of deploying the detection measures it had announced, the company backed out.

Thousands of victims have banded together in a proposed class action lawsuit against Apple, and the company could be ordered to pay more than $1.2 billion in damages to the plaintiffs should Cupertino lose.

Apple, however, previously argued that adding these CSAM tools could be abused by watchdogs, especially government entities, and could open the door to unlawful surveillance of the company and its users.

Apple Went Back on Its Word on New CSAM Tools

Apple previously agreed to apply CSAM scanning tools to its services but later reversed that commitment, arguing that the technology could also cause harm to the company. Apple went back on its word to deploy the new CSAM tools even as it claims to comply with the laws of certain countries, including those centering on scanning capabilities.

Should Apple lose the case, the $1.2 billion in damages could be accompanied by a court order requiring the company to bring back the CSAM monitoring tools it failed to deploy.

CSAM Tools and Prevention Efforts

Many authorities and government organizations are campaigning against the rampant availability of CSAM on the internet and have lobbied for measures to fight the problem. One of the most prominent moves came from the European Union, which previously introduced a CSAM scanning program that has since been criticized for the sweeping account monitoring it would require.

Moreover, the rise of generative AI has given bad actors new and more dangerous ways to create deepfaked child sexual abuse imagery online, and authorities are already cracking down on the practice. In response, prominent AI companies including OpenAI, Microsoft, and Meta have joined forces to uphold child safety on their platforms and prevent the misuse of generative AI to produce such material.

Apple has faced its share of CSAM monitoring requirements from authorities in various regions, including the United Kingdom's Online Safety Bill and its focus on client-side scanning capabilities. However, Apple went back on its word to add a CSAM scanning tool that could have helped protect children from being victimized, and the company now faces a $1.2 billion lawsuit over that change of heart.
