People who searched YouTube for completely innocuous topics and ended up getting questionable recommendations share their stories on Mozilla's "YouTube Regrets."
On Tuesday, Oct. 15, the foundation, best known for the Firefox web browser, published 28 instances of YouTube's algorithm pushing anti-LGBT+ sentiments, white supremacist propaganda, misogynistic and incel rants, conspiracy theories, and other extremist and sensitive content.
The goal of the project is to pressure the Google-owned video site into making its algorithm more accessible to independent researchers, who might be able to help refine its recommendation system.
From Tap Dancing To Body-Harming Videos
One parent recounted how a simple search for "tap dance" on YouTube led to their 10-year-old daughter being recommended extreme dance and contortionist videos containing body-harming and body-image-damaging advice. The impressionable young girl now restricts how much food she eats to maintain a slender appearance.
Another person, who followed a drag queen posting positive-affirmation and confidence-building videos, began receiving recommendations for hateful content, including anti-LGBT+ sentiment.
"It got to the point where I stopped watching their content and still regretted it, as the recommendations followed me for ages after," the individual who remained anonymous stated.
Mozilla said that more than 2,000 stories of bad recommendations were submitted within days of the foundation announcing the project.
"The stories we collected show that this powerful force can and does promote harmful content," wrote Ashley Boyd, vice president of advocacy at Mozilla. "Further, the number of responses we received suggests this is an experience many YouTube users can relate to."
The stories published by the nonprofit were not independently verified. The project is not meant to be scientific, but it gives "a flavor for the way in which people are experiencing [YouTube]," Boyd added.
This is also not the first time that the popular site has been in trouble for pushing extremist content. A story published by The New York Times in August revealed how the YouTube algorithm "radicalized Brazil."
A Better YouTube
Making sure that users do not stumble into sensitive or toxic content on YouTube is crucial because the site attracts 2 billion users every month. About 70 percent of the time people spend on the platform is driven by its recommendation system.
YouTube has adopted new efforts to clean up the site. In September, the company said that over 100,000 videos and 17,000 channels had been removed under a stricter anti-hate speech policy. Earlier this year, the site also claimed that recommendations of "borderline content and harmful misinformation" had been reduced by 50 percent.
However, Mozilla thinks that YouTube should do more to resolve the issue. The nonprofit proposed three concrete steps, including giving independent researchers access to recommendation data and a historical archive of videos.