Social Media Platforms That Fail to Ban Suicide, Self-Harm Content Might Face Multi-Million Fines

Following the 2017 death of a 14-year-old teenager in northwest London, the government of the United Kingdom is considering fining social media platforms such as Facebook and Instagram for failing to tackle sensitive content, such as posts that encourage suicide and self-harm.

Social Media Platforms Could Face Multi-Million Fines

According to a report by the Daily Mail, social media platforms that fail to deal with these types of content could be fined millions of pounds under the proposed law.

Moreover, a bill likely to be introduced early next year also proposes that any frequently offending platform could be blocked in the UK altogether, to help protect young people in the country who spend much of their time online.

The threat of multi-million-pound fines, as well as a possible ban, will likely force social media platforms to get rid of "illegal harms on their platforms" and take these matters seriously.

This follows the death of Molly Russell, 14, who killed herself in 2017 after seeing images of self-harm on popular social media site Instagram, which led her father to campaign for better protection for social media users.

"Molly's suicide smashed like a wrecking ball into my family's life. I am in no doubt that the graphic self-harm content and suicide-encouraging memes on Molly's social media feeds helped kill her," Russell's father, Ian, said.

He added that it is hard to know how content on Facebook, Instagram, Twitter, Snapchat, and similar sites could lead young people to self-harm, "or worse, take their own lives," because of a lack of research using data from these companies.

Studies Conclude the Same Thing

According to a 2019 study published in The Lancet, young people who frequently use social media are 40% more likely to experience mental health issues of some sort.

Several other studies reach the same conclusion: people are more likely to develop depression, anxiety, eating disorders, and body dysmorphia because of the content they see on social media, such as on the photo-sharing site Instagram.

Often, people feel the urge to follow influencers who post only "perfect" selfies and lifestyles online, leading young people to feel self-conscious about their own bodies and self-image.

Unfortunately, the Royal College of Psychiatrists said in January of this year that it is hard to assess the full scale of the problem because many sites refuse to hand over anonymized data.

Acting on Sensitive Topics

"A child seeing, and worse, being 'suggested,' self-harm material is every parent's nightmare. It's right we act on this and make sure tech firms are in no doubt this stuff must not be on their platforms," a government source said.

At the same time, the government is reportedly keen on preventing self-harm survivor stories and suicide support groups from being accidentally deleted by the companies.

The exact amounts of the fines are not yet specified, but they could mean colossal losses even for multi-billion-dollar companies such as Facebook, especially if they are banned from a country entirely.

If you or a loved one is going through rough times, the Suicide Prevention Lifeline hotline is open 24/7. Please call 1-800-273-8255. In the UK, confidential support is available from Samaritans on 116 123.

This article is owned by Tech Times

Written by: Nhx Tingson

ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.