Two employees are suing Microsoft, alleging their jobs gave them PTSD
Two Microsoft employees claim the company made them look at photos and videos "designed to entertain some of the most twisted and sick minded people in the world." Now they're suing.
Courthouse News reports that Henry Soto and Greg Blauert were part of Microsoft's online safety team, whose job was to determine which online content should be taken down and when it should be reported to police.
In that position, Soto and Blauert say they had to look at images of child pornography, murder, bestiality, and "indescribable sexual assault." They filed a lawsuit against Microsoft last month, accusing the company of negligence, disability discrimination, and violation of the Consumer Protection Act.
Both Soto and Blauert claim they're suffering from PTSD. Soto says Microsoft transferred him to the online safety team in 2008, and company policy forced him to stay there for a year and a half.
He says he had nightmares and hallucinations after seeing a girl "being abused and killed." Blauert says he had a breakdown in 2013 and is still being treated for "acute and debilitating PTSD." They claim Microsoft didn't warn them about what to expect in the job and didn't provide psychological support.
While Microsoft's digital crimes unit was given protections and support, Soto, Blauert, and the rest of the online safety team were simply told to go for a walk, take a smoke break, or play video games to clear their heads, the lawsuit states.
The men say their suggestions for improvements were ignored, per the Daily Beast. Microsoft responds: "We have put in place robust wellness programs to ensure the employees who handle this material have the resources and support they need."
In a statement, Microsoft said it disagrees with the plaintiffs' claims. "Microsoft takes seriously its responsibility to remove and report imagery of child sexual exploitation and abuse being shared on its services, as well as the health and resiliency of the employees who do this important work."
The tech giant says it uses "industry-leading technology" to detect and classify illegal imagery of child abuse and exploitation that is shared by users on Microsoft services. "Once verified by a specially trained employee, the company removes the imagery, reports it to the National Center for Missing & Exploited Children, and bans the users who shared the imagery from our services," it added.
Microsoft says that while the work is difficult, it is crucial to a safer Internet. "The health and safety of our employees who do this difficult work is a top priority," it explained in its statement. "Microsoft works with the input of our employees, mental health professionals, and the latest research on robust wellness and resilience programs to ensure those who handle this material have the resources and support they need, including an individual wellness plan. We view it as a process, always learning and applying the newest research about what we can do to help support our employees even more."
This story has been updated with Microsoft's response.
This article originally appeared on Newser: Microsoft Made Employees Watch Child Porn, Murder: Suit