
Ex-employees are suing Microsoft because their jobs of monitoring for child porn and horrific violence may have given them PTSD

Matt Weinberger   


A woman walking by Microsoft. Robert Giroux/Stringer/Getty Images

Two former employees of Microsoft's Online Safety Team are suing the tech titan. They say their jobs - which involved monitoring the company's online services, like email and Bing, for illegal content such as child pornography and violent videos - may have given them post-traumatic stress disorder (PTSD).

You can read the full complaint, filed in December 2016, here, as originally reported by Courthouse News.

But the upshot is that Henry Soto and Greg Blauert, who say they were employees of the Online Safety Team starting in 2008, claim they were inadequately prepared by Microsoft for the stresses and horrors that they would be exposed to every day.

The complaint also alleges that both men were denied workers' compensation on the grounds that their PTSD wasn't "an occupational disease."

The men are seeking damages, as well as a commitment from Microsoft to improve how it handles employee wellness in this line of work, including rotating workers out and providing spousal counseling services.

Microsoft does not agree with the claims and says the health of its employees, particularly in this department, is a top priority.

The complaint

The role of the Online Safety Team is to facilitate cooperation between Microsoft and law enforcement, in accordance with legislation passed in 2007. The law in question states that if an online service provider comes across any illegal content, they have to delete the offending account and report it to child protection authorities. Service providers like Microsoft also scan for content that violates their own terms of service.

To that end, the complaint alleges that the Online Safety Team was given "God like [sic]" powers to go into any user's private communications and storage at any time and scan for offending content.

According to the complaint, Soto says he was reassigned to the Online Safety Team without his consent and initially performed the job. But over time, it became too much - and the counseling that Microsoft provided to members of the team was inadequate for the levels of stress he was experiencing, Soto's lawyers allege.

At first, the complaint says, he suffered from anxiety, insomnia, and nightmares, among other symptoms, though he eventually went to a psychiatrist who prescribed him helpful medication. After watching a video of a young girl being assaulted and murdered, the complaint says, he started having "auditory hallucinations," which built to the point where he finally requested a transfer and eventually ended up taking medical leave.

Now, his lawyers say, he exhibits signs of PTSD triggered by something as simple as going near a computer. That has hurt his ability to hold down a job, which in turn has put a strain on his wife and child.

Blauert says he had a similar experience, though as part of the company-mandated "Wellness Plan," he was allowed to go home early to help cope with the pressure. But leaving early ended up hurting his performance reviews, the complaint says. Blauert ultimately suffered what his lawyers call a "physical and mental breakdown," and he allegedly shows symptoms of PTSD as a result.

Here's Microsoft's full response to the allegations:

We disagree with the plaintiffs' claims. Microsoft takes seriously its responsibility to remove and report imagery of child sexual exploitation and abuse being shared on its services, as well as the health and resiliency of the employees who do this important work.

Microsoft applies industry-leading technology to help detect and classify illegal imagery of child abuse and exploitation that are shared by users on Microsoft Services. Once verified by a specially trained employee, the company removes the imagery, reports it to the National Center for Missing & Exploited Children, and bans the users who shared the imagery from our services.

This work is difficult, but critically important to a safer and more trusted internet. The health and safety of our employees who do this difficult work is a top priority. Microsoft works with the input of our employees, mental health professionals, and the latest research on robust wellness and resilience programs to ensure those who handle this material have the resources and support they need, including an individual wellness plan. We view it as a process, always learning and applying the newest research about what we can do to help support our employees even more.
