YouTube has long had a problem with conspiracy theories and other misinformation spreading across its platform. Even YouTube Kids, the platform’s supposedly child-friendly version, has recommended David Icke’s lizard-people conspiracy videos to young viewers. In 2018, after drawing criticism when a conspiracy theory video about the mass shooting at Marjory Stoneman Douglas High School in Parkland, Florida, appeared in the site’s trending section, YouTube attempted to address the issue by adding factual information links from Wikipedia to videos discussing known conspiracy theories.
Now YouTube is planning further action: it has announced changes to how recommendations work, “reducing recommendations of borderline content and content that could misinform users in harmful ways — such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11.” Content of this kind does not quite violate YouTube’s community guidelines, so it does not have to be removed from the platform entirely, but YouTube has recognized that it is not what most viewers want to see and that it leaves them with a bad impression of the site.
YouTube is generally very secretive about the algorithms behind features like recommendations, so it has not specified exactly what the change will entail. But it has said that “this change relies on a combination of machine learning and real people” and that it will “work with human evaluators and experts from all over the United States to help train the machine learning systems that generate recommendations.” The involvement of human evaluators is welcome, given the well-established problems with relying on artificial intelligence alone to enforce content standards.
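YouTube has not published any details, but the general pattern it describes — human evaluators labeling examples, and those labels training a model whose scores inform what gets recommended — might look roughly like the sketch below. Everything here (the sample data, the text features, the 0.5 threshold) is an illustrative assumption, not YouTube’s actual system.

```python
# A minimal sketch of the approach described: human-evaluator labels
# train a classifier that scores videos as "borderline", and high-scoring
# videos are demoted from recommendations rather than removed.
# All data, features, and thresholds are illustrative assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labels from human evaluators: 1 = borderline, 0 = acceptable.
titles = [
    "miracle cure reverses serious illness overnight",
    "proof the earth is flat hidden from you",
    "how vaccines are developed and tested",
    "9/11 memorial service highlights",
]
labels = [1, 1, 0, 0]

# Train a simple text classifier on the human-labeled examples.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(titles, labels)

# Score a new candidate video. Above the threshold, the video is demoted
# from recommendations; it stays available on the platform either way.
score = model.predict_proba(["flat earth evidence they won't show you"])[0][1]
if score > 0.5:
    print(f"demote from recommendations (borderline score {score:.2f})")
else:
    print(f"eligible for recommendation (score {score:.2f})")
```

In a system at YouTube’s scale the model would of course use far richer signals than video titles, but the division of labor is the point: people supply the judgment calls, and the machine learning generalizes them across the catalog.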
Hopefully this change will mean less false information being spread through YouTube recommendations, although such content will remain on the platform, appearing in recommendations for channel subscribers and in search results. “While this shift will apply to less than one percent of the content on YouTube, we believe that limiting the recommendation of these types of videos will mean a better experience for the YouTube community,” the statement said.