Researchers Claim TikTok's Algorithms Are Promoting Self-Harm, Eating Disorder Content Among Young Users

Researchers created fictitious teen TikTok accounts and claimed that the app was feeding them harmful content.

According to a report released on Wednesday, Dec. 14, TikTok's algorithms are pushing videos about self-harm and eating disorders to young users.

AP reports that researchers at the charity Center for Countering Digital Hate (CCDH) created TikTok accounts posing as fictitious teenagers in the United States, United Kingdom, Canada, and Australia.

The Chinese social networking service TikTok's logo on a tablet screen, pictured in Moscow on November 11, 2021. KIRILL KUDRYAVTSEV/AFP via Getty Images

Testing TikTok's Algorithm


To test how TikTok's algorithm might respond, the researchers behind the accounts liked videos about self-harm and eating disorders.

The team claims that within minutes, the short-form video app began recommending weight-loss and self-harm videos, including ones featuring photos of models, images of razor blades, and content about suicide.

The accounts received even more harmful material when the researchers created profiles with user names suggesting a particular vulnerability to eating disorders, such as names that included the words "lose weight."

Social media algorithms work by identifying topics and content a user engages with and then serving them more of the same, so that they spend as much time on the platform as possible.
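To illustrate the general idea only, here is a minimal sketch of an engagement-driven recommendation loop in Python. This is not TikTok's actual system; the topic labels, scoring rule, and function names are hypothetical assumptions chosen purely to show how repeated "likes" can narrow a feed toward one topic.

```python
from collections import Counter

# Hypothetical catalog of candidate videos, each tagged with a single topic.
VIDEOS = [
    {"id": 1, "topic": "cooking"},
    {"id": 2, "topic": "comedy"},
    {"id": 3, "topic": "weight loss"},
    {"id": 4, "topic": "fitness"},
]

def recommend(liked_topics: Counter, candidates: list, k: int = 2) -> list:
    """Rank candidates by how often the user has engaged with their topic."""
    return sorted(candidates, key=lambda v: liked_topics[v["topic"]], reverse=True)[:k]

# Simulate a user who repeatedly likes the same kind of content.
likes = Counter()
for _ in range(5):
    likes["weight loss"] += 1  # each like reinforces the same topic

# The top recommendations quickly converge on that single interest.
print(recommend(likes, VIDEOS))
```

In this toy model, a handful of likes is enough to push every related video to the top of the ranking, which mirrors the feedback loop the researchers describe, albeit in a drastically simplified form.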

However, critics argue that the same algorithms that surface content tied to a particular interest can also steer users toward harmful material.

Josh Golin, executive director of Fairplay, told AP that younger users are more susceptible to peer pressure, bullying, and harmful content as they spend more time on online platforms.

He also claimed that TikTok was not the only platform that failed to shield young users from inappropriate content and intrusive data collection.

TikTok Denies Claims

TikTok disputed the researchers' claims in a statement from a company spokesperson, saying the results were skewed because the researchers did not use the platform the way typical users do.

The social media company also noted that the kind of content a person receives should not be influenced by their account name.

It is worth noting that users under the age of 13 are not permitted on TikTok, and videos that promote eating disorders or suicide are prohibited as per the platform's official guidelines.

TikTok users in the US who look for content about eating disorders are presented with a prompt containing links to mental health websites and the National Eating Disorder Association's contact details.

However, CCDH researchers found that videos about eating disorders had amassed billions of views on TikTok.

They found that young TikTok users occasionally used coded language for eating disorders to bypass the app's content moderation.

Imran Ahmed, the CEO of the CCDH, said the findings show that TikTok's self-regulation has failed. He added that federal rules must be enforced to ensure the online safety of children.

Ahmed pointed out that the version of TikTok available to domestic Chinese audiences is designed to steer young users toward math- and science-related content and imposes time limits for 13- and 14-year-olds.

A bill currently in the works in Congress would impose new regulations limiting the information social media companies can gather about young users and would establish a new office within the Federal Trade Commission dedicated to safeguarding children's online privacy.
