Google is bringing back human moderators to oversee YouTube content, taking over from automated systems that were given more responsibilities at the onset of the COVID-19 pandemic.
YouTube revealed in late August that, in the three months prior, 11.4 million videos had been removed from the platform for violating its Community Guidelines. This is the highest number of videos taken down from YouTube in a three-month period since the service launched in 2005, and the company attributed it to a greater reliance on A.I. while the pandemic prevented human reviewers from going to work.
YouTube admitted, however, that some of those videos had been removed in error.
“One of the decisions we made [at the beginning of the pandemic] when it came to machines who couldn’t be as precise as humans, we were going to err on the side of making sure that our users were protected, even though that might have resulted in a slightly higher number of videos coming down,” YouTube’s chief product officer, Neal Mohan, told the Financial Times.
The Google-owned company has since reinstated 160,000 of the removed videos, the Financial Times reported. Normally, less than 25% of appeals are successful, but under A.I. moderation, the share of successful appeals rose to 50%.
However, while Mohan says that more human moderators will resume overseeing YouTube content, it remains unclear how that will happen amid the ongoing pandemic. Digital Trends has reached out to Google for additional details on the moderators’ working arrangements, and we will update this article as soon as we hear back.