You'll See Fewer And Fewer Conspiracy Theory Videos, Says YouTube

YouTube has now promised to make adjustments so that the site recommends fewer conspiracy theory videos, or for that matter, any content rife with misinformation.

It's a significant step toward reducing the service's tendency to steer people into watching extremist content. As part of a limited test in the United States, YouTube said it would stop recommending "borderline content" — videos that skirt its policies and community guidelines.

YouTube Recommendations

"We'll begin reducing recommendations of borderline content and content that could misinform users in harmful ways," YouTube said, "like videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11."

The company downplayed just how much content the move could affect, saying it would impact less than 1 percent of videos available on the platform. But there has been a surge of complaints lodged against the site's recommendation algorithm in recent years.

Extremist Content

The scenarios vary, but the common denominator among the complaints is that users often see extremist content recommended to them even if such videos are far from their interests. As such, the move could introduce significant changes for people who encounter these recommendations on a regular basis.

The move comes at a time when YouTube continues to be criticized for pushing extremist content, something critics have pointed out it has been doing for years. Plenty of editorials discussing this phenomenon note that some people who are otherwise not well-versed in extremist culture actually begin their induction by watching these sorts of videos on the site.

Worse yet, a recently published BuzzFeed report found that YouTube's recommendation engine could lead a new account from a nonpartisan political video to extremist content after as few as six recommended videos.

There is much to consider about YouTube's announced move, and although it's ostensibly a positive change, one has to wonder exactly which videos will be classified as "borderline content" and which will be left alone.

More importantly, the deciding factor ultimately comes down to a combination of machine learning and human moderators, according to YouTube. This could go two ways: either the system works and flags extremist content successfully, or it works too aggressively and ends up flagging videos that shouldn't be. Such an outcome has happened before: many content creators have complained about their videos being taken down or unfairly flagged.

In any case, YouTube plans to roll out the changes more broadly if the tests are successful. But it's already clear that things are not as simple as they seem, and there's a lot of nuance the company will have to pay attention to before deploying the changes at scale.
