TikTok is Working on Removing Dangerous Challenges Before They Spread

(Image: TikTok challenge, Unsplash / Solen Feyissa)

TikTok is now trying to strengthen the detection and enforcement of rules against dangerous online challenges and hoaxes.

One in five teenagers has already participated in an online TikTok challenge, according to a survey commissioned by the platform.

TikTok to Stop Dangerous Challenges

The survey looked at teenagers' broad online experience, without focusing on any one platform. It also revealed that 1 in 50 teenagers had taken part in a risky and dangerous challenge, while fewer than 1 in 300 had taken part in a really dangerous challenge.

There has been widespread concern about the proliferation of potentially harmful online challenges across numerous platforms. In 2020, the "skull-breaker" challenge that was shared on TikTok was linked to several severe injuries.

Earlier this year, doctors warned of the risk to life and limb of the viral "milk-crate challenge" on TikTok which invited people to climb pyramids of milk crates.

TikTok also banned the "dry scooping challenge" after doctors warned of its health risks.

However, online challenges can also be positive and can promote causes, like the "ice-bucket challenge," which helped raise awareness of amyotrophic lateral sclerosis, or ALS, according to Social Media Today.

Violating Content

The independent report, titled "Exploring Effective Prevention Education Responses to Dangerous Online Challenges," which TikTok commissioned, draws on a survey of parents, teachers, and 5,400 teenagers aged 13 to 19 in the United States, the United Kingdom, Germany, Italy, Australia, Mexico, Brazil, Vietnam, Indonesia, and Argentina.

In response to the findings, TikTok said technology that alerts its safety teams to sudden increases in violating content linked to hashtags would be expanded to also capture potentially dangerous behavior, according to BBC.

For example, if a hashtag like #foodchallenge, normally used to share recipes, suddenly saw an increase in interest apparently connected to videos breaking the rules, the TikTok team would investigate.
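The kind of spike detection described above can be illustrated with a simple heuristic: flag a hashtag when its latest daily activity jumps well beyond its recent baseline. The sketch below is purely illustrative (the function name, threshold, and data are assumptions, not TikTok's actual system):

```python
from statistics import mean

def is_spiking(daily_counts, threshold=3.0):
    """Flag a spike when the latest day's count exceeds `threshold`
    times the average of the preceding days.

    daily_counts: daily video counts for one hashtag, oldest first.
    Illustrative heuristic only, not TikTok's real detection logic.
    """
    if len(daily_counts) < 2:
        return False
    *history, latest = daily_counts
    baseline = mean(history)
    return baseline > 0 and latest > threshold * baseline

# A hashtag like #foodchallenge holding steady, then surging:
print(is_spiking([120, 130, 110, 125, 118, 520]))  # True -> worth human review
print(is_spiking([120, 130, 110, 125, 118, 130]))  # False -> normal activity
```

In a real system the flag would not remove anything by itself; as the article describes, it would route the hashtag to a human safety team for investigation.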

TikTok already has a policy of removing content that promotes or glorifies dangerous acts, according to Newsbreak.

Self-harm Hoaxes

Experts contributing to the report noted that adolescence is a period that has always been associated with heightened risk-taking.

But the report comes at a time of heightened public debate about the impact of social media on teenagers, after Facebook whistleblower Frances Haugen revealed internal research into the effect Instagram had on teenagers' mental health.

The TikTok research also looked at self-harm and suicide hoaxes. Some schools warned parents about Momo, a sinister character with bulging eyes said to set children dangerous challenges such as harming themselves.

Even though experts said it was a hoax, the survey indicated that Momo still affected children.

Alarmist Warnings

Of those who had seen a hoax on TikTok, 31% said it had a negative impact, and 63% of that group said the impact had been on their mental health.

TikTok said that hoaxes like these usually share similar characteristics; in previous cases, false warnings circulated suggesting that children were being encouraged to take part in games that often resulted in self-harm.

The hoaxes spread through warning messages, further encouraging others to alert as many people as possible to avoid perceived negative consequences.

As well as removing the hoaxes themselves, TikTok would now also begin removing alarmist warnings about them, as such warnings can cause harm by treating a self-harm hoax as real.

This article is owned by Tech Times

Written by Sophie Webster

ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.