Australia's Internet Watchdog to Draft Industry Standards for Tech Giants to Combat Online Child Abuse, Pro-Terror Material

The draft standards address the challenges posed by "synthetic" child sexual abuse material and pro-terror content.

Australia's eSafety Commissioner has begun drafting industry standards designed to push major technology companies, including Meta, Apple, and Google, to take more decisive action against online child abuse material and "pro-terror content."

The standards, currently open for public consultation and awaiting parliamentary approval, cover Designated Internet Services such as apps, websites, and file storage services, as well as Relevant Electronic Services, encompassing messaging services, online dating services, and gaming.


Draft Standards

The draft standards address the challenges posed by "synthetic" child sexual abuse material and pro-terror content generated through open-source software and generative AI. These efforts fall under the framework of Australia's Online Safety Act, which came into effect in January 2022.

Industry associations were initially tasked with formulating enforceable codes for eight sectors of the online industry. Six draft codes were approved earlier this year, but the remaining two, covering Designated Internet Services and Relevant Electronic Services, fell short of providing adequate safeguards.

As a result, the eSafety Commissioner has moved to develop mandatory standards for those two sectors instead.

These standards, now in the consultation phase, encompass a range of obligations, from proactive measures for detecting and deterring unlawful content to processes for handling reports and complaints.

They also require tools and information that help end users stay safe and reduce their exposure to harmful content online.

Worst-of-the-Worst Online Content

Julie Inman Grant, the eSafety Commissioner, urged industry stakeholders and interested parties to participate in the consultation process. She emphasized the significance of these codes and standards in addressing the worst-of-the-worst online content, such as child sexual abuse material and pro-terror content.

The commissioner clarified that the standards do not require companies to compromise end-to-end encryption or introduce systemic vulnerabilities into their encrypted services.

However, operating an end-to-end encrypted service does not absolve companies of responsibility, and the standards aim to ensure that meaningful steps are taken to curb the proliferation of seriously harmful content, particularly child sexual abuse material.

Various industry interventions were highlighted, with Meta's end-to-end encrypted WhatsApp messaging service cited as an example. The service scans non-encrypted parts of its platform to identify potential indicators of child sexual abuse material, contributing to approximately one million reports of child sexual exploitation and abuse annually, according to Inman Grant.

The eSafety Commissioner will review all submissions received during the consultation process to prepare the final versions of the standards for presentation in Parliament. The proposed timeline suggests that the standards would come into force six months after their registration.

The industry codes and standards come with enforcement powers that the eSafety Commissioner will utilize as needed to ensure compliance. All stakeholders are encouraged to thoroughly review the discussion paper and accompanying fact sheets before preparing their submissions.
