About 75 percent of extremist videos on YouTube are now removed by an artificial intelligence program before a human ever flags them, and the platform will now also penalize content that falls into a gray area: videos that do not violate the usage policies but still contain controversial hate speech or extremism. On Tuesday, in an update on its effort to fight terror-related content, YouTube shared progress on its current efforts as well as where it is headed next.
In July, the company introduced four new focus areas to help remove extremist content, a tall order considering around 400 hours of video are uploaded to YouTube every minute. Those areas included both more software detection through AI and more human evaluators. While the changes were implemented less than two months ago, the AI software has already improved in both speed and efficiency, YouTube says, in many cases removing content before the video received a flag from a viewer. That same software has helped double the number of videos removed from YouTube for extremist content.
YouTube says it is continuing to hire more evaluators as well as refine the software.
YouTube now also has more than 15 organizations on an advisory board to help with any new policies regarding what is allowed on YouTube and what is not. That number of organizations will continue to grow, the company says.
Videos that fall into that gray area also cannot be monetized, part of a change to YouTube's content guidelines announced in June. The change was prompted, in part, by several advertisers leaving the platform after their ads were placed alongside offensive content.
The changes do not target only the creators of extremist videos, however. When a user searches for keywords related to those extremist videos, YouTube now instead plays a curated playlist of videos that debunk the extremist messages.
“Altogether, we have taken significant steps over the last month in our fight against online terrorism,” the official blog post reads. “But this is not the end. We know there is always more work to be done. With the help of new machine learning technology, deep partnerships, ongoing collaborations with other companies through the Global Internet Forum, and our vigilant community we are confident we can continue to make progress against this ever-changing threat.”