A charity established by the family of Molly Russell, a London teenager who took her own life after viewing harmful content on social media, has denounced Mark Zuckerberg's recent pledge to prioritize kindness on the newly launched Threads app.
The Guardian reports that the Molly Rose Foundation argues Zuckerberg's commitment contradicts the troubling reality of Instagram, a platform that contributed significantly to Molly's suicide.
As discussions around online safety intensify, the gap between stated aspirations and actual practice in social media moderation remains a pressing concern.
Inquest Findings and the Foundation's Criticism of Instagram
Molly Russell, just 14 years old, died in 2017 after encountering distressing material on various social media platforms, including Instagram.
An inquest into her death, concluded last year, found that harmful online content related to self-harm, suicide, and depression had played a contributory role.
The Molly Rose Foundation, created by Molly's family, expresses deep concern over the association between Threads and Instagram.
Notably, users of Threads are required to have an Instagram account to access the new app, and most of the rules governing Instagram carry over to Threads.
Highlighting Instagram's Unsafe Platform
The Foundation accuses Instagram, owned by Meta (formerly Facebook), of a poor track record in protecting users from harm.
The Foundation argues that inconsistent content moderation and careless platform design choices have exposed children to dangerous content.
A spokesperson for the Foundation criticizes Meta's marketing efforts, highlighting the significant disparity between promoting Instagram as a safe platform for families and the reality experienced by users.
Labeling Zuckerberg's prioritization of kindness over profit as "absurd and contrary to reality," the Foundation asserts that Meta has much work to do to ensure the safety of its users on Instagram, Facebook, and WhatsApp.
The Online Safety Bill
The Guardian also reports that children's digital safety campaigner Beeban Kidron adds her voice to the chorus of criticism against Zuckerberg and Meta.
Kidron, who was influential in shaping the forthcoming Online Safety Bill, accuses Meta of hubris for positioning itself as a provider of kind services while neglecting its responsibility to address the toxic elements of its existing platforms.
The Online Safety Bill will impose a duty of care on tech companies, compelling them to protect children from harmful content or face severe consequences, including substantial fines and even jail time for executives.
Meta's Response
In response to the mounting concerns, a spokesperson from Meta emphasizes the company's commitment to safety.
They assert that Threads will enforce Instagram's community guidelines for both content and interactions.
The spokesperson highlights Meta's substantial investments of over $16 billion since 2016, specifically aimed at building teams and technologies to protect users.
Meta has also carried safety features over from Instagram to Threads, such as filtering out replies that contain certain words and automatically blocking accounts already blocked on Instagram.