Facebook’s content moderators break their silence on terrifying work conditions

The moderators in charge of policing inappropriate content on Facebook have begun to speak out over work conditions that have caused them to fear for their lives and left them unable to enforce Facebook’s rules.

The conditions detailed in a shocking report by The Verge are so bad that one moderator collapsed at his desk while on the clock and died of a heart attack. Three former moderators spoke out about their experiences despite nondisclosure agreements.

Facebook’s content moderators are the unsung heroes of the platform, working long hours to police graphic content and delete it from Facebook users’ timelines.

These employees aren’t the ones working at Facebook’s campus headquarters in Menlo Park, California, which has a burrito bar, treadmill workstations, meditation rooms, and a 9-acre rooftop garden deck. Instead, they work for Cognizant, a firm contracted by Facebook, which has significantly less glitzy offices in places like Tampa, Florida, and Phoenix, Arizona.

The worker who died at his desk last year worked the overnight shift at the Tampa site. According to the report, moderators receive only two 15-minute breaks, a 30-minute lunch break, and a 9-minute “wellness” break per shift. Not only are content moderators overworked, but they are also forced to view images and videos of graphic violence, child pornography, hate speech, conspiracy theories, and even murder day in and day out, all so they can delete awful content before it reaches Facebook’s billions of users.

According to Glassdoor, the average salary for a Facebook intern in the San Jose area is $6,625 per month, or $79,500 a year — a stark contrast to the $28,800 yearly salary that content moderators reportedly make.

Aside from moderating mentally taxing content and receiving low pay, these employees have also dealt with a bed bug infestation at their office, bodily waste appearing at workstations, frequent sexual harassment, and unsanitary bathroom conditions.

Without these employees, we would likely see unspeakable things on our Facebook timelines and (even more) fake news. In a statement, a Facebook spokesperson said the company works with its content review partners “to provide a level of support and compensation that leads the industry.”

“There will inevitably be employee challenges or dissatisfaction that call our commitment to this work and our partners’ employees into question,” the spokesperson said. “When the circumstances warrant action on the part of management, we make sure it happens.”

Cognizant did not immediately respond to a request for comment, but a spokesperson told The Verge that it takes “allegations such as this very seriously” and “strives to create a safe and empowering workplace.”

Aside from the human impact, the harrowing work conditions have left some of the moderators’ offices unable to meet their targets for enforcing Facebook’s policies. That means that more disturbing content could slip through the cracks — and into your feed.
