Facebook moderators have PTSD-like symptoms from horrifying and violent images, fringe content


Facebook's low-paid army of content moderators, who often work in poor conditions, suffer PTSD-like symptoms from daily exposure to some of the vilest and most extreme content posted to the social network, according to a scathing new investigative report by The Verge.

The tech publication opens with Chloe, a trainee content moderator at Phoenix, Ariz.-based Cognizant, where 1,000 people make rapid decisions under intense pressure about whether flagged content violates Facebook's rules. As part of her training, Chloe must moderate posts in front of her fellow soon-to-be moderators.

"The video depicts a man being murdered. Someone is stabbing him, dozens of times, while he screams and begs for his life. Chloe’s job is to tell the room whether this post should be removed. She knows that section 13 of the Facebook community standards prohibits videos that depict the murder of one or more people. When Chloe explains this to the class, she hears her voice shaking," The Verge reports, adding that she later leaves the room and cries so hard she can barely breathe.


Facebook, which has faced criticism from all corners for its content moderation mistakes and for the massive rulebook that guides its moderators, had more than 30,000 employees working on safety and security by the end of last year. Of those, about half are content moderators, and the tech giant relies on contract labor for most of that work. In the face of a never-ending firehose of content, moderators are expected to maintain a 95 percent accuracy rate while reviewing more than 1,000 posts per week to see if they violate Facebook's community standards.

The Verge's report, which is based on interviews with a dozen former and current Cognizant employees, depicts a soul-crushing, morbid environment where workers joke about self-harm, do drugs on the job, develop severe anxiety or have panic attacks because of the horrifying content they're forced to view. Most of the moderators interviewed quit after one year.

The Phoenix moderators, according to the report, make about $28,000 per year, while the average Facebook full-time employee earns $240,000. In contrast to the perk-filled life at Facebook's Frank Gehry-designed Menlo Park, Calif. headquarters, moderators in Phoenix are closely surveilled by managers and allotted very short breaks for using the bathroom or so-called wellness time.

In addition, moderators told the tech news site that some colleagues have even embraced the fringe, conspiracy-laden views of the memes and posts they're forced to view each day.


Mark Zuckerberg, chief executive officer and founder of Facebook Inc., attends the Viva Tech start-up and technology gathering at Parc des Expositions Porte de Versailles on May 24, 2018, in Paris. (Getty Images)

Both Cognizant and Facebook, which is led by CEO Mark Zuckerberg, pushed back on some aspects of The Verge's reporting.

Bob Duncan, who oversees Cognizant’s content moderation operations in North America, told The Verge that recruiters carefully explain the graphic nature of the job to applicants. “The intention of all that is to ensure people understand it. And if they don’t feel that work is potentially suited for them based on their situation, they can make those decisions as appropriate.”


Later in the reporting process, Facebook allowed The Verge's reporter to visit the Phoenix site after telling her that the experiences described don't reflect those of most contractors, either in Phoenix or worldwide. Posters with positive messages had been newly put up, and several content moderators who spoke to The Verge expressed satisfaction with their jobs and how they're treated, saying that the most graphic, violent content is only a small fraction of what they review.

When the reporter asks one of the on-site counselors about the potential for workers to develop PTSD, he tells her about something called "post-traumatic growth."

The Verge concludes that the "call center model of content moderation is taking an ugly toll on many of its workers. As first responders on platforms with billions of users, they are performing a critical function of modern civil society, while being paid less than half as much as many others who work on the front lines. They do the work as long as they can — and when they leave, a nondisclosure agreement ensures that they retreat even further into the shadows."


A former contract content moderator sued Facebook in September, claiming that her work for the tech giant left her with PTSD.
