YouTube's recommendation algorithm repeatedly surfaces false information and inappropriate videos to its users, according to a study by Mozilla.
The non-profit organization behind Firefox analyzed data collected through RegretsReporter, a browser extension it released in 2020.
Volunteers used the extension to report YouTube videos over a ten-month period.
Google's video platform occupies a significant place in most people's lives. YouTube draws 2 billion viewers a month, as per CNET. Not to mention that its users collectively spend 1 billion hours watching videos in a single day.
However, Mozilla's study suggests that the algorithm telling billions of users which videos to watch next has a colossal flaw.
YouTube Recommendation Algorithm Favors False Info and Inappropriate Videos
According to The Wall Street Journal, the Mozilla Foundation found in its study that YouTube's recommendation tool directs viewers to videos that spread misinformation and inappropriate content.
To be precise, AdWeek reported that the bulk of the reported videos contained politically driven misinformation or "wildly inappropriate" animation that attracts young viewers.
It turns out that about 71% of the videos the study's volunteers reported had been recommended by YouTube's own algorithm. Notably, around 9% of the reported videos (roughly 200 clips that together garnered 160 million views) were eventually taken down by the platform.
Mozilla confirmed that these deleted videos, which had initially been recommended to users, violated the platform's own policies.
As such, Brandi Geurkink, Mozilla's Senior Manager of Advocacy, said that "the recommendation algorithm was actually working against their own abilities to police the platform."
She further dubbed it "bizarre."
Mozilla added that the rate of reported videos was 60% higher in non-English-speaking countries than in locations where English is the primary language.
YouTube Algorithm and Clickbait
TechCrunch noted that Mozilla's new research echoes the long-standing concern that YouTube's algorithm still favors videos that trigger users to click, even when much of that content spreads misinformation.
The outlet further opined that the problematic system may stem from a business model that equates views with advertising revenue.
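That dynamic can be illustrated with a toy example. The sketch below is purely hypothetical and does not reflect YouTube's actual ranking system; the Candidate shape, the scores, and the weights are all invented for illustration. It shows how ranking solely on predicted click-through rate puts the sensational option first, while blending in an accuracy signal changes the outcome.

```typescript
// Toy illustration only; this is NOT YouTube's actual ranking system.
// It shows why ranking purely on predicted click-through rate (CTR) can
// surface sensational or misleading videos: CTR rewards curiosity, not accuracy.

interface Candidate {
  title: string;
  predictedCtr: number;   // hypothetical model output in [0, 1]
  accuracyScore: number;  // hypothetical trust/accuracy signal in [0, 1]
}

const candidates: Candidate[] = [
  { title: "Measured, accurate explainer", predictedCtr: 0.04, accuracyScore: 0.95 },
  { title: "Shocking conspiracy clip", predictedCtr: 0.12, accuracyScore: 0.10 },
];

// Ranking on engagement alone puts the misleading clip first.
const byCtr = [...candidates].sort((a, b) => b.predictedCtr - a.predictedCtr);

// Blending in an accuracy signal flips the order.
const blended = [...candidates].sort(
  (a, b) =>
    0.5 * b.predictedCtr + 0.5 * b.accuracyScore -
    (0.5 * a.predictedCtr + 0.5 * a.accuracyScore)
);

console.log(byCtr[0].title);    // "Shocking conspiracy clip"
console.log(blended[0].title);  // "Measured, accurate explainer"
```

The specific numbers here do not matter; the point is that whatever signal a recommender is built to optimize is the kind of content it will keep surfacing.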
Mozilla's RegretsReporter
Mozilla built the extension to collect user feedback on regrettable YouTube recommendations. It was initially created to help researchers further examine the video platform's algorithms.
The extension is available for both Google Chrome and Firefox.
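To give a rough sense of how an extension like this can work, here is a minimal hypothetical sketch. It is not Mozilla's published RegretsReporter source; the RegretReport shape and the submitReport helper are invented for illustration, and only the standard WebExtension messaging call (chrome.runtime.sendMessage) is a real API.

```typescript
// Hypothetical sketch only; NOT Mozilla's actual RegretsReporter code.
// A content script like this could capture a viewer's report about the
// video currently playing on a YouTube watch page.

// Minimal declaration so the sketch compiles without @types/chrome installed.
declare const chrome: {
  runtime: { sendMessage(message: unknown): void };
};

interface RegretReport {
  videoId: string;                                   // parsed from the watch-page URL
  title: string;                                     // page title, minus the " - YouTube" suffix
  reachedVia: "recommendation" | "search" | "direct";
  reportedAt: string;                                // ISO 8601 timestamp
}

// Extract the video ID from a URL like https://www.youtube.com/watch?v=abc123
function currentVideoId(): string | null {
  return new URL(location.href).searchParams.get("v");
}

// Build a report for the current video and forward it to the extension's
// background script, which could batch and submit reports for analysis.
function submitReport(reachedVia: RegretReport["reachedVia"]): void {
  const videoId = currentVideoId();
  if (!videoId) return; // not on a watch page
  const report: RegretReport = {
    videoId,
    title: document.title.replace(/ - YouTube$/, ""),
    reachedVia,
    reportedAt: new Date().toISOString(),
  };
  chrome.runtime.sendMessage({ type: "regret-report", report });
}
```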
On September 17, 2020, YouTube was caught recommending videos that could negatively affect younger audiences.
YouTube Recommendations: What Can Users Do?
According to the non-profit organization, YouTube users can instead browse their viewing history and delete the videos that might prompt the algorithm to recommend more regrettable content.
This article is owned by Tech Times
Written by Teejay Boris