EU Launches Investigation Into Facebook, Instagram Over Child Safety Concerns, Scrutinizing Algorithmic Systems

In the wake of growing concerns about online safety for children, Facebook and Instagram find themselves once again under the microscope of European Union regulators.

This fresh wave of scrutiny comes amid mounting pressure for social media platforms to enhance their measures for protecting young users from harmful content and predatory behavior.

A sign is posted in front of Meta headquarters on April 28, 2022 in Menlo Park, California. Justin Sullivan/Getty Images

Facing New EU Digital Scrutiny

The European Union has initiated new probes into Facebook and Instagram, alleging their inadequate protection of minors online violates the bloc's stringent digital regulations for social media platforms.

This marks another phase of examination for Meta Platforms, the parent company of Facebook and Instagram, within the framework of the European Union's Digital Services Act.

The legislation, which took effect last year across the EU's 27 member states, aims to enhance oversight of online platforms and safeguard internet users.

The European Commission, the EU's administrative arm, voiced concerns regarding the algorithmic mechanisms Facebook and Instagram use to recommend content like videos and posts.

There's unease that these mechanisms might take advantage of children's vulnerabilities and lack of experience, potentially fostering addictive behavior.

The Associated Press reported concerns that these mechanisms may exacerbate the "rabbit hole" effect, steering users toward progressively distressing content.

The commission is also investigating Meta's implementation of age verification tools to prevent young children from accessing Facebook or Instagram and encountering unsuitable content.

Meta requires users who create an account to be at least 13 years old; the commission will examine whether these tools are effective and whether the company adheres to DSA regulations that demand a high level of privacy, safety, and security for minors.

Meta's Response

In a statement, Meta emphasized its decade-long efforts in crafting over 50 tools and policies to ensure safe and age-appropriate online experiences for young people.

The company expressed readiness to discuss its initiatives with the European Commission, highlighting that child protection is a shared challenge across the industry.

These cases represent the latest instances of child protection being scrutinized under the DSA, which mandates platforms implement rigorous measures to safeguard minors.

Earlier this year, the commission initiated two investigations into TikTok over concerns regarding potential risks to children.

European Commissioner Thierry Breton expressed skepticism about Meta's compliance with DSA obligations in a recent social media post, citing concerns that Facebook and Instagram could harm the physical and mental well-being of young Europeans.

These newly announced cases add to the existing investigations into Facebook and Instagram under the DSA, mainly focusing on their efforts to combat foreign disinformation ahead of the upcoming EU elections.

Additionally, investigations are underway concerning the compliance of social media platform X and e-commerce site AliExpress with EU regulations.

There is no set deadline for the conclusion of these investigations, and potential violations can result in fines of up to 6% of a company's annual global revenue.

Written by Inno Flores
Tech Times
ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.