No proof YouTube promoted anti-vaccine content amid Covid-19 pandemic: Study
San Francisco, Sep 18: Researchers found no strong evidence that YouTube promoted anti-vaccine sentiment during the Covid-19 pandemic.
The study, published in the Journal of Medical Internet Research, examined whether YouTube’s recommendation system acted as a “rabbit hole,” leading users searching for vaccine-related videos toward anti-vaccine content.
For the study, the researchers asked trained participants to intentionally find an anti-vaccine video with as few clicks as possible, starting from an initial informational Covid-19 video posted by the World Health Organization (WHO).
They compared the recommendations seen by these users with related videos obtained from the YouTube application programming interface (API) and with YouTube’s Up-Next recommendations shown to clean browsers carrying no user-identifying cookies.
The team analysed more than 27,000 video recommendations made by YouTube, using machine learning methods to classify anti-vaccine content.
“We found no evidence that YouTube promotes anti-vaccine content to its users,” said Margaret Yee Man Ng, an Illinois journalism professor with an appointment in the Institute of Communications Research and lead author of the study.
“The average share of anti-vaccine or vaccine-hesitancy videos remained below 6 per cent at all steps in users’ recommendation trajectories,” Ng said.
Initially, the researchers simply wanted to understand YouTube’s famously opaque content-recommendation techniques and whether they funnel users toward anti-vaccine sentiment and vaccine hesitancy.
“We wanted to learn about how different entities were using the platform to disseminate their content so that we could develop recommendations for how YouTube could do a better job of not pushing misinformation,” said UN Global Pulse researcher Katherine Hoffmann Pham, a co-author of the study.
“Contrary to public belief, YouTube wasn’t promoting anti-vaccine content. The study reveals that YouTube’s algorithms instead recommended other health-related content that was not explicitly related to vaccination,” Pham added.