A new study reveals that many YouTube videos still contain misinformation about the coronavirus pandemic.
Of the 69 videos analyzed in the study, about one in four (27.5%) contained misleading information about the coronavirus. Combined, these misleading videos racked up more than 62 million views.
The study, published Thursday in the BMJ Global Health journal, reviewed videos that showed up after entering the keywords “coronavirus” and “COVID-19” into YouTube’s search bar. Researchers from the University of Ottawa looked at the most viewed videos out of the search results during a single day in March.
Videos were considered non-factual if they contained one or more false statements about the coronavirus's transmission, typical symptoms, prevention strategies, potential treatments, or epidemiology.
Examples of false claims found in these videos included government conspiracy theories, assertions that the coronavirus affects only immunocompromised people, cancer patients, and older adults, and claims that pharmaceutical companies have a cure but withhold it to make money.
Of the 19 videos containing misleading information, six were from entertainment news outlets, five were from network news, five were from internet news, and three were from consumer videos.
The study also revealed that videos from professional and government organizations had the most informative content, but were the least viewed.
“Although YouTube is a powerful educational tool that healthcare professionals can mobilize to disseminate information and influence public behavior, if used inappropriately, it can simultaneously be a source of misleading information that can work significantly against these efforts,” the report says.
A YouTube spokesperson told Digital Trends that the study draws broad conclusions.
“We are always interested to see research and exploring ways to partner with researchers even more closely. However it’s hard to draw broad conclusions from research that uses very small sample sizes and the study itself recognizes the limitations of the sample,” the spokesperson said. “We’re committed to providing timely and helpful information at this critical time. To date we’ve removed thousands and thousands of videos for violating our COVID-19 policies and directed tens of billions of impressions to global and local health organizations from our home page and information panels.”
According to YouTube, it has removed thousands of videos from the platform for violating coronavirus misinformation policies and ensures that general coronavirus search results point users to authoritative sources such as credible news organizations and health institutions.
YouTube recently announced that it would be expanding its fact-checking efforts, targeting coronavirus misinformation “that comes up quickly as part of a fast-moving news cycle, where unfounded claims and uncertainty about facts are common.”
Even with these ramped-up fact-checking efforts, a viral coronavirus conspiracy video called “Plandemic” made the rounds on both Facebook and YouTube last week, racking up 1.8 million views, along with 17,000 comments and nearly 150,000 shares.
The misleading video speculates that the spread of COVID-19 was planned by billionaires to enforce worldwide vaccinations. It also criticizes Dr. Anthony Fauci, a leading member of the White House Coronavirus Task Force, and his response to the pandemic, taking comments he made about the virus’s ability to mutate out of context.
Both Facebook and YouTube have removed the video from their platforms.