Facebook apologizes after report shows inconsistencies in removing hate speech

Facebook has taken plenty of criticism over the algorithms designed to keep content within its community guidelines, but a new round of investigative reporting suggests the company's team of human reviewers could use some improvement, too. In a study of 900 posts, ProPublica reports that Facebook's review staff handled posts containing hate speech inconsistently, removing some but leaving others with similar content in place.

Facebook apologized for some of those decisions, saying that of the 49 posts highlighted by the nonprofit investigative organization, reviewers made the wrong call on 22. The social media platform defended its handling of 19 other instances, while eight were excluded because of incorrect flags, user deletions, or a lack of information. The study was crowdsourced, with Facebook users sharing the posts with the organization.

Justin Osofsky, Facebook's vice president of Global Operations and Media Partnerships, said the social media platform will expand its review staff to 20,000 people next year. “We’re sorry for the mistakes we have made — they do not reflect the community we want to help build,” he said in response to the ProPublica investigation. “We must do better.”

ProPublica said Facebook is inconsistent in its treatment of hate speech, citing two different statements that each essentially wished death on an entire group of people; only one was removed after being flagged. The second post was taken down only after the ProPublica investigation.

“Based on this small fraction of Facebook posts, its content reviewers often make different calls on items with similar content, and don’t always abide by the company’s complex guidelines,” ProPublica said. “Even when they do follow the rules, racist or sexist language may survive scrutiny because it is not sufficiently derogatory or violent to meet Facebook’s definition of hate speech.”

On the flip side, the report also found posts that were removed but shouldn't have been. In one example, the image contained a swastika, but the caption asked viewers to stand up against a hate group.

The study is far from the first time ProPublica has called out Facebook's practices this year. This fall, Facebook changed its ad targeting after a ProPublica report showed that, when enough users typed answers into their own bio fields, racial slurs could become a category for a targeted ad. Just a week ago, ProPublica demonstrated that employers could discriminate by age using those same ad tools. In the first case, Facebook apologized and immediately paused the ad tool until the slip-up could be fully corrected; in the second, Facebook defended its practices.

Monitoring content on the largest social media network, with more than 2 billion monthly active users, isn't an easy task, and it's one Facebook approaches with both artificial intelligence algorithms and human reviewers. Social media networks generally attempt to strike a balance between removing hateful content and protecting free speech. Osofsky says the platform deletes 66,000 instances of hate speech every week.

The move to a review staff of 20,000 is fairly significant: when Facebook announced in May that it would add 3,000 more review staff members, that expansion brought the team to 7,500 people.

ProPublica says the investigation is important “because hate groups use the world’s largest social network to attract followers and organize demonstrations.”
