EU Urges X to Explain Cut to Content Moderation Resources Amid Disinformation Fears

The inquiry comes amid fears of disinformation ahead of June's European elections.

The European Union (EU) has called on Elon Musk's social network X (formerly Twitter) to clarify a reduction in content moderation resources, citing concerns about disinformation ahead of the upcoming European elections in June.

(Photo: KIRILL KUDRYAVTSEV/AFP via Getty Images)

EU Requests More Information About X's Reduction in Content Moderation Resources

The move is part of the EU's investigation into X, opened in December 2023 under the bloc's legislation to combat illegal online content.

Similarly, the EU has launched a probe into Meta's Facebook and Instagram platforms over worries that they are not doing enough to combat disinformation.

The European Commission has expressed the need for more information from X regarding its content moderation activities and resources.

The request follows X's transparency report of March 2024, which revealed a reduction of almost 20% in its content moderation team compared with the October 2023 report.

Additionally, X has decreased linguistic coverage within the EU from 11 languages to seven.

The Commission has requested that X provide comprehensive internal documents and information about these changes.

Risk Assessments on X's Generative AI Use

According to the EU's press release, the Commission is also asking for the platform's risk assessments related to the implementation of generative AI tools within the EU and other relevant areas under investigation.

Specifically, the Commission seeks detailed insights and internal documents on X's content moderation resources, particularly the cut to its content moderation team reported in the latest transparency report.

Moreover, the Commission is interested in understanding the risk assessments and mitigation measures associated with generative AI tools concerning electoral processes, illegal content dissemination, and rights protection.

The investigation into X, initiated in December 2023, aims to assess potential breaches of the Digital Services Act (DSA) related to risk management, dark patterns, advertising transparency, content moderation, and data access for scholars.

The latest request for information marks a further step in the ongoing investigation, building on the evidence gathered and analyzed so far, including X's transparency report of March 2024 and the company's replies to earlier requests for information concerning its mitigation measures for risks linked to generative AI.

X must provide the requested information on content moderation resources and generative AI to the Commission by May 17, 2024, and answer the remaining questions by May 27.

Under Article 74(2) of the DSA, the Commission can impose fines for incorrect, incomplete, or misleading information submitted in response to these requests.

If X fails to reply by these deadlines, the Commission may formally request the information by decision; failure to comply with such a decision can lead to periodic penalty payments.
