The European Commission has recently taken a decisive step in investigating TikTok's compliance with the Digital Services Act (DSA) in light of growing concerns about child safety.
As first reported by Reuters, the Commission has opened formal proceedings to assess whether the popular short-form video app has breached DSA obligations related to the protection of minors, advertising transparency, data access for researchers, and the management of addictive design and harmful content.
If TikTok is found to have violated its obligations under the DSA, it could face fines of up to 6% of its global annual turnover.
EU Commission's Formal Proceedings Against TikTok
According to the Commission, the investigation will look into various issues, including the effectiveness of TikTok's measures for assessing and mitigating systemic risks posed by its algorithmic systems.
The platform, which is especially popular among younger users, will also be examined on the effectiveness of its measures to prevent behavioral addictions and so-called 'rabbit hole effects,' which could negatively affect users' well-being.
Furthermore, the Commission will investigate TikTok's efforts to ensure a high level of privacy, safety, and security for minors, particularly its default privacy settings and the operation of its recommender systems. Transparency in advertising practices and the availability of data for researchers will also be scrutinized.
This move is part of the European Union's broader effort to regulate online platforms and protect users, particularly children, from potential harm. TikTok, along with several other major online platforms, was designated as a Very Large Online Platform (VLOP) under the DSA in April 2023, subjecting it to stringent compliance requirements.
VLOPs such as Alibaba's AliExpress, Amazon Store, and Apple's App Store, alongside platforms like Booking.com, Facebook, and Google services including Play, Maps, and Shopping, are among the 19 online services designated by the European Commission.
The DSA requires these platforms to ensure algorithmic transparency, give users greater control over their data, including the option to opt out of recommendation systems, and make it easier to report illegal content.
What's Next?
Since February 17, 2024, the DSA has applied to all online intermediaries operating in the EU, ushering in a new era of regulation aimed at increasing digital safety and accountability.
While TikTok has yet to respond publicly to the Commission's formal proceedings, the investigation highlights the growing regulatory pressures on tech companies in the wake of rising concerns about online harms, misinformation, and privacy violations.
The Commission's next steps will be to gather additional evidence, conduct interviews, and potentially impose interim measures or adopt non-compliance decisions. The duration of the investigation is not fixed, however, and will depend on factors such as the complexity of the case and TikTok's cooperation.
With TikTok's immense influence over millions of users, particularly young people, the Commission's investigation could have far-reaching implications for global online safety standards and regulatory frameworks.
Stay posted here at Tech Times.