YouTube said that it removed more than 30,000 videos last month that contained hate speech content.
In a blog post published on Tuesday, September 3, the video platform also announced it plans to update its current harassment policy, which will represent a “fundamental shift in our policies.”
“We’ve been removing harmful content since YouTube started, but our investment in this work has accelerated in recent years,” YouTube said. “Because of this ongoing work, over the last 18 months we’ve reduced views on videos that are later removed for violating our policies by 80%, and we’re continuously working to reduce this number further.”
YouTube’s latest updates, which the company said are “coming soon,” focus on removing content, raising authoritative voices, rewarding trusted creators, and reducing the spread of material that violates its policies.
“We go to great lengths to make sure content that breaks our rules isn’t widely viewed, or even viewed at all, before it’s removed,” YouTube said in Tuesday’s post.
YouTube initially updated its anti-hate speech policy in June. Under the updated policy, the platform removes videos that feature supremacist views, as well as videos that deny the existence of “well-documented violent events, like the Holocaust or the shooting at Sandy Hook Elementary.”
According to YouTube, the videos containing hate speech content that were removed over the last month represented only about 3% of the views that videos about knitting had during the same time frame.
The video platform has had a year of policy updates prompted by a variety of issues. In April, YouTube updated its harassment policy because of creator-on-creator harassment occurring on the platform.
In June, the Wall Street Journal reported that YouTube was considering significant changes to its recommendations algorithm regarding videos aimed at children. According to reports, the Federal Trade Commission (FTC) was in the late stages of investigating the platform’s treatment of kids.
Then in August, a group of LGBTQ YouTube creators sued YouTube over alleged discrimination against the LGBTQ community, according to The Verge.
Digital Trends reached out to YouTube to find out when the new updates will officially be implemented on the platform, but we haven’t received a response.