A recent study by data experts at the University of Illinois Urbana-Champaign, in collaboration with the United Nations Global Pulse, examined whether YouTube promoted anti-vaccine sentiments during the COVID-19 pandemic.

According to MedicalXpress, researchers conducted an algorithmic audit to investigate whether YouTube's recommendation system inadvertently directed users seeking vaccine-related content toward anti-vaccine material.

During the COVID-19 pandemic, anti-vaccine sentiment surged alongside a proliferation of medical misinformation, fueled by pre-existing vaccine skepticism, the rapid spread of information through social media, and a general atmosphere of uncertainty and fear.

Finding Anti-Vaccine Content on YouTube

The study involved participants trained by the World Health Organization (WHO) and individuals from Amazon Mechanical Turk, who were tasked with intentionally finding an anti-vaccine video starting from an initial WHO COVID-19 informational video.

The team compared these real-user trajectories with recommendations retrieved through YouTube's application programming interface (API) and with the Up-Next recommendations served to clean browsers devoid of user-identifying data.
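To make the audit setup concrete, here is a minimal sketch of a recommendation-trajectory crawl in Python. The toy UP_NEXT graph and video IDs are invented stand-ins for YouTube's real Up-Next lists; this is an illustration of the general technique, not the study's actual harness or data.

```python
# Sketch of a recommendation-trajectory crawl. The static graph below
# simulates Up-Next lists; a real audit would scrape or query YouTube.
import random

# Toy recommendation graph: video ID -> list of Up-Next video IDs.
UP_NEXT = {
    "who_covid_intro": ["health_tips", "vaccine_qna"],
    "health_tips": ["fitness_video", "vaccine_qna"],
    "vaccine_qna": ["who_covid_intro", "fitness_video"],
    "fitness_video": ["health_tips"],
}

def walk_trajectory(seed_id: str, steps: int = 10) -> list[str]:
    """Follow Up-Next recommendations for a fixed number of hops,
    mimicking a user who always clicks a recommended video."""
    trajectory = [seed_id]
    current = seed_id
    for _ in range(steps):
        candidates = UP_NEXT.get(current, [])
        if not candidates:
            break
        current = random.choice(candidates)  # pick one recommendation
        trajectory.append(current)
    return trajectory

print(walk_trajectory("who_covid_intro", steps=5))
```

Repeating such walks many times, from both clean browsers and real user sessions, yields the recommendation trajectories that can then be compared and classified.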

Employing machine learning techniques, the researchers analyzed over 27,000 video recommendations on YouTube to classify anti-vaccine content.
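The article does not specify which model the researchers used, so the following is only a minimal sketch of how video metadata might be classified, assuming a TF-IDF text representation with logistic regression and invented example labels.

```python
# Illustrative classifier for flagging anti-vaccine video metadata.
# The training texts and labels are made up for demonstration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "COVID-19 vaccines explained by WHO experts",              # informational
    "How mRNA vaccines work: a doctor's overview",             # informational
    "The hidden dangers they won't tell you about vaccines",   # anti-vaccine
    "Why I will never take the experimental jab",              # anti-vaccine
]
labels = [0, 0, 1, 1]  # 0 = not anti-vaccine, 1 = anti-vaccine

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(texts, labels)

# Score the metadata of a newly recommended video.
print(clf.predict(["New study on vaccine side effects you must see"]))
```

In practice, a labeled set of this size would be far too small; the point is only to show how each of the 27,000 recommendations could be assigned a label automatically.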

Lead author of the study, Margaret Yee Man Ng, an Illinois journalism professor, stated, "We found no evidence that YouTube promotes anti-vaccine content to its users. The average share of anti-vaccine or vaccine hesitancy videos remained below 6% at all steps in users' recommendation trajectories."
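To unpack what "share at each step" means, the snippet below averages an anti-vaccine flag across many trajectories at each recommendation hop; the trajectories and labels here are fabricated for illustration, not the study's data.

```python
# Per-step share of flagged videos across trajectories (toy data).
trajectories = [
    [0, 0, 1, 0],  # 1 = video classified as anti-vaccine/hesitancy
    [0, 0, 0, 0],
    [0, 1, 0, 0],
]
for step in range(len(trajectories[0])):
    share = sum(t[step] for t in trajectories) / len(trajectories)
    print(f"step {step}: {share:.0%} flagged")
```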

The study initially sought insight into YouTube's content recommendation techniques, aiming to determine whether they unintentionally guided users toward anti-vaccine sentiment and vaccine hesitancy.

Contrary to public perception, the study reveals that YouTube primarily recommended health-related content not explicitly tied to vaccination.

Ng noted that "the videos that users were directed to were longer and contained more popular content, and attempted to push a blockbuster strategy to engage users by promoting other reliably successful content across the platform."

YouTube's Video Recommendations

The research also sheds light on how users' real-world, personalized experiences differed from the depersonalized recommendations obtained through YouTube's programming interface. This is significant because it highlights the impact of user-specific browsing histories on video recommendations.

The study emphasizes that understanding recommendation systems is crucial for transparency and accountability, allowing users to comprehend the choices being made for them by platform designers.

In summary, the study finds no substantial evidence suggesting that YouTube actively promoted anti-vaccine content during the pandemic.

The platform's recommendation system primarily directed users towards broader health-related content rather than specific vaccination-related material. The team's findings were recently published in the Journal of Medical Internet Research. 
