The moderators in charge of policing inappropriate content on Facebook have begun to speak out over work conditions that have caused them to fear for their lives and left them unable to enforce Facebook’s rules.
The conditions detailed in a shocking report by The Verge are so bad that one moderator collapsed at his desk while on the clock and died of a heart attack. Three former moderators spoke out about their experiences despite nondisclosure agreements.
The content moderators of Facebook are the unsung heroes of the platform, working long hours to ensure graphic content is policed or deleted from Facebook users’ timelines.
These employees aren’t the ones working at Facebook’s campus headquarters in Menlo Park, California, which has a burrito bar, treadmill workstations, meditation rooms, and a 9-acre rooftop garden deck. Instead, they work for a firm contracted by Facebook, Cognizant, which has significantly less glitzy offices in places like Tampa, Florida, and Phoenix, Arizona.
The worker who died at his desk last year worked the overnight shift at the Tampa site. According to the report, moderators receive only two 15-minute breaks, a 30-minute lunch break, and a 9-minute “wellness” break. In addition to being overworked, content moderators are forced to see images or videos of graphic violence, child pornography, hate speech, conspiracy theories, and even murder day in and day out, all so they can delete awful content before it reaches Facebook’s billions of users.
According to Glassdoor, the average salary for a Facebook intern in the San Jose area is $6,625 per month, or $79,500 a year — a stark difference from the reported $28,800 yearly salary that content moderators make.
Aside from moderating mentally taxing content and receiving low pay, these employees have also dealt with a bed bug infestation at their office, bodily waste appearing at workstations, frequent sexual harassment, and unsanitary bathroom conditions.
Without these employees, we would likely see unspeakable things on our Facebook timelines and (even more) fake news. In a statement, a Facebook spokesperson said the company works with its content review partners “to provide a level of support and compensation that leads the industry.”
“There will inevitably be employee challenges or dissatisfaction that call our commitment to this work and our partners’ employees into question,” the spokesperson said. “When the circumstances warrant action on the part of management, we make sure it happens.”
Cognizant did not immediately respond to a request for comment, but a spokesperson told The Verge that it takes “allegations such as this very seriously” and “strives to create a safe and empowering workplace.”
Aside from the human impact, the harrowing work conditions have left some of the moderators’ offices unable to meet their targets for enforcing Facebook’s policies. That means that more disturbing content could slip through the cracks — and into your feed.