On Feb. 16, YouTube outlined its new plans to tackle misinformation on the platform. According to YouTube's chief product officer, Neal Mohan, the three areas of focus are stopping misinformation before it goes viral, addressing misinformation in languages other than English, and limiting cross-platform sharing of misinformation.
YouTube to Tackle Misinformation
YouTube's focus on cross-platform sharing would limit the reach of videos deemed problematic under the platform's current misinformation guidelines, according to The Times.
The video streaming company says adjustments to its recommendation system have reduced the consumption of these borderline videos, but traffic from other sites embedding and linking to these videos remains a problem.
The possible fixes include disabling the share button on the platform or breaking links to videos that have already been suppressed on YouTube.
Warnings that a video could contain misinformation are another possible fix, a measure the platform already employs for graphic and age-restricted content.
In order to stop misinformation from spreading, YouTube is considering larger and more knowledgeable teams and partnerships with non-governmental organizations and local experts.
The platform may also add new labels to videos on emerging, fast-developing topics like natural disasters.
The measures are the latest window into YouTube's attempt to balance safety and the suppression of misinformation with freedom of expression, an issue that has been in the public eye throughout the COVID-19 pandemic.
The platform is the biggest source of online video, with more than 2 billion monthly users. Mohan said the platform will be careful in limiting the spread of potentially harmful misinformation while allowing space for discussion of, and education about, sensitive and controversial topics.
Critics argue that the platform is not doing enough to address the misinformation issue. Earlier this year, more than 80 fact-checking groups sent a letter to YouTube CEO Susan Wojcicki demanding greater action on misinformation on the platform, according to Protocol.
COVID-19 Misinformation
Over the past two years, YouTube has removed more than 1 million videos containing harmful information about the coronavirus, such as false cures or claims that the pandemic is a hoax, according to CNET.
Putting the number in context is difficult because of the gargantuan scale of Google's service, the internet's biggest source of video, with more than 2 billion monthly users.
That figure is double the tally YouTube had reported earlier in the pandemic, when it said it had removed more than 500,000 videos for COVID-19 misinformation.
But YouTube does not disclose how many videos are uploaded to its library, and it has not updated its total video-removal statistics for the last five months, obscuring how coronavirus-related removals stack up against other kinds. The platform removes almost 10 million videos every three months.
YouTube, like other social media platforms such as Reddit, Facebook, and Twitter that give users a place to post their own content, has grappled with how to balance freedom of expression with effective policing of the worst content posted on its site.
Over the years, YouTube has had issues with different kinds of misinformation, discrimination, conspiracy theories, hate, harassment, exploitation, child abuse, and videos about mass murder, all at an unprecedented global scale.