Instagram is reportedly pushing content from adult sex-content creators to 13-year-olds within as little as three minutes of account creation and login. Meta rejected the test results as not indicative of young users' typical experiences.
The Wall Street Journal conducted the experiment, in which testers created new accounts registered to 13-year-olds and scrolled through Instagram Reels.
When a test account skipped past other content and watched sexually suggestive videos through to the end, its feed became dominated by sexualized content in less than 20 minutes.
Similar procedures were followed in independent testing conducted by the Journal and by Laura Edelson, a professor of computer science at Northeastern University. The studies, which spanned seven months and ended in June, show that the platform has continued serving adult-oriented content to minors.
According to the experimenters, in one instance a brand-new 13-year-old test account that only watched videos of women recommended by Instagram began receiving videos about anal sex after just thirty minutes.
TikTok and Snap on Sexual Content
Similar experiments on TikTok's and Snapchat's short-video features did not produce the same sexualized content for younger users.
According to Edelson, all three platforms differ in what content they recommend to teens; she noted that even the adult experience on TikTok appears to contain significantly less inappropriate material than the teen experience on Reels.
The study's results contradict statements made in January by Instagram's parent company, Meta Platforms, that it would begin restricting sensitive content, such as sexually suggestive or graphic images, for users younger than 16.
Legislation on Social Media Algorithms
The report coincides with the introduction of bills in several states to control the content that children see on social media. According to recent reports, New York may soon prohibit social media companies from using algorithmic content feeds for teenagers.
The state has moved to shield children from automated feeds, which it has long considered harmful. The legislators' draft agreement seeks to restrict social media algorithms for minors, with or without parental consent. Social media platforms have recently come under scrutiny over their addictive design and negative consequences for youth.
New York City Mayor Eric Adams revealed in February that his administration had brought legal action against Facebook and Instagram, two social media platforms owned by Meta Platforms, alleging that they had contributed to an epidemic of adolescent mental health problems.
In January, the city declared that social networking apps are a "hazard" to public health and an environmental "toxin," citing the mental health of young people as a key reason. The mayor emphasized that his city is the first of its size in the country to take such action and warned of the dangers of social media.
Adams said the city will now treat social media like other public health hazards, holding digital firms accountable, just as the surgeon general has treated tobacco and guns.