A new study claims that the TikTok For You Page (FYP) automatically shows self-harm videos to its users.
The research, conducted by the international consumer watchdog group Eko, stated that the video platform readily surfaces violent content. Because of this, Eko warned that TikTok may be harmful, especially to young users.
The new study, "Suicide, Incels, and Drugs: How TikTok's deadly algorithm harms kids," revealed how quickly TikTok serves harmful content.
New Study Claims TikTok FYP Automatically Shows Self-Harm Videos
According to News Nation's latest report, Eko created 13-year-old accounts to see if TikTok promotes harmful content to young users.
Shockingly, the researchers were able to trigger TikTok's algorithm into targeting these accounts with videos about suicide and violence.
Eko said that this activity violates TikTok's own Community Guidelines.
"According to TikTok, the platform does not allow content depicting, promoting, normalizing, or glorifying activities that could lead to suicide, self-harm, or disordered eating," said the watchdog group.
TikTok Suicide Hashtag List
Eko's new study shows just how alarming the suicide videos and violent content on TikTok really are.
The watchdog organization even provided a list of suicide-related hashtags that have racked up hundreds of millions to billions of views:
- #sh (6 billion views, over 920,000 posts)
- #imdone (1 billion views, over 230,000 posts)
- #hurting (More than 725 million views, over 120,000 posts)
- #realrx (More than 300 million views, over 20,000 posts)
- #sadslideshow (More than 270 million views, over 17,000 posts)
These five are just some of the self-harm hashtags on TikTok that attract massive view counts and appear on hundreds of thousands of posts.
If you want to see more, the full list is available in Eko's official study.
As of writing, it is hard to say whether TikTok's new Community Guidelines can solve the suicide content issue on the video platform. However, the social media giant said the new rules include removing violent content and restricting mature videos from appearing on young users' FYPs.
The updated Community Guidelines are expected to roll out on Apr. 21.
In other news, the BBC decided to mandate that employees remove the TikTok app from their corporate devices. We also reported on the potential buyers in case ByteDance really sells TikTok.
For more news updates about TikTok and other social media platforms, keep your tabs open here at TechTimes.