
Microsoft employees sue over trauma they claim was caused by viewing extreme content

Two Microsoft employees are suing the company for emotional trauma they claim to have suffered while employed as part of its online safety team. Henry Soto and Greg Blauert are alleging negligence, disability discrimination, and violations of the Consumer Protection Act.

The suit alleges that Soto and Blauert were exposed to “videos and photographs designed to entertain the most twisted and sick-minded people in the world” during the time they spent as part of the team. Their role was to vet potentially offensive content, deciding what should be removed and what should be reported to law enforcement.

They were expected to view extreme content including bestiality, videos of people dying, and even child pornography. The lawsuit alleges that they were not given proper warning about the extent to which the job could affect their psyche, according to a report from Courthouse News.

Soto reportedly suffered from nightmares and disturbing hallucinations as a result of his work. Blauert experienced heightened anger as well as nightmares, and eventually had a physical and mental breakdown. The Washington State Department of Labor and Industries apparently denied both men’s claims for workers’ compensation.

Rather than offering members of the online safety team access to trained therapists, Microsoft is said to have implemented its own wellness program. Employees were encouraged to take part in activities like taking walks, stepping outside for smoke breaks, and playing video games to distract themselves from the disturbing content they were being exposed to.

The plaintiffs would like the team to receive support and protection similar to that afforded to Microsoft’s Digital Crimes Unit, which has access to a larger budget. Some of the changes suggested by Soto and Blauert include mandatory rotations out of the program, mandatory weekly meetings with a psychologist, and a spousal wellness program.


Brad Jones
Former Digital Trends Contributor
Brad is an English-born writer currently splitting his time between Edinburgh and Pennsylvania. You can find him on Twitter…