Reddit gave insight on Thursday into how much hateful content is taken down from its platform and how it’s ramping up the fight against hate speech.
Since June, Reddit has banned about 7,000 subreddits that contained or promoted hateful content like harassment, bullying, and violent threats. Reddit said about 365,000 users saw the material before it was taken down. The subreddit removals were a result of new policies put into place that same month against hateful content.
While 7,000 subreddits is a lot to take down, the platform said it is still working to learn what constitutes “hate” within these communities.
“Defining hate at scale is fraught with challenges. Sometimes hate can be very overt, other times it can be more subtle,” the company said in a blog post. “In other circumstances, historically marginalized groups may reclaim language and use it in a way that is acceptable for them, but unacceptable for others to use. Additionally, people are weirdly creative about how to be mean to each other.”
Reddit has seen progress in defining and finding hate on its platform, reporting an 18% drop in users posting hateful content since those policies went into effect. Before banning this type of content, Reddit saw about 40,000 pieces of hateful content on the platform per day.
To build on this progress, Reddit said it’s developing new moderator tools that automatically detect hateful content, rather than relying on subreddit moderators to remove it manually.
Aside from Reddit, Facebook also said it’s getting better at detecting hate speech. According to the social network’s latest Community Standards Enforcement Report, Facebook’s proactive detection rate for hate speech is now 95%.