Viewing thousands of murder and rape videos takes its toll on the workers, the report says.
Facebook content moderators resort to drinking alcohol, taking drugs and even having sex at work in an attempt to cope with the stress of viewing the countless violent videos and photos people keep posting on the social media platform, according to a report by The Verge.
The moderators are employed by Cognizant, a contractor that operates a facility in Arizona. The people working there are paid $15 an hour — $4 more than Arizona’s minimum wage — to view a constant stream of offensive jokes, possible threats, and videos of beheadings and murders, the report says.
According to The Verge, the moderators are allowed two 15-minute breaks, one 30-minute lunch break and nine minutes of “wellness time” during a workday. However, most of that time is usually spent waiting in long lines for a bathroom, since there are only three bathroom stalls for hundreds of workers.
Even using the bathroom is complicated, however, since many workers occupy the stalls to have sex, as a way of coping with the stress of viewing hours of offensive material. Others prefer the facility’s nursing rooms; the practice has reportedly become so widespread that the company has removed the locks from those rooms.
Having a quickie in a closet is not the only way Facebook moderators blow off steam, The Verge writes. Some workers also reportedly admitted to drinking alcohol and using drugs at work “to forget”.
The constant flow of social media posts leads some moderators to gradually embrace the very ideas they are supposed to moderate. One former employee told The Verge that he no longer believes 9/11 was a terrorist attack. He has been diagnosed with post-traumatic stress disorder and generalized anxiety, he said in an interview.
Employees in training are effectively forced to watch the content, unable to turn off the audio or pause the video, the report says. Such privileges only become available to full-time employees.
The employees are subject to strict workplace restrictions: neither phones nor even the smallest scraps of paper are allowed in the work area, for fear that a moderator might write down a user’s personal data. All permitted items — such as hand lotion bottles — must be kept in transparent plastic bags so that supervisors can see them.
The rules that govern moderation change constantly, the workers say. Sometimes the existing rules make little sense: for instance, “my favourite ni***” comments are allowed as “explicitly positive content,” and “autistic people should be sterilised” is permitted because autism does not count as a “protected characteristic,” the way gender or race does.
Sometimes the supervisors cannot reach a consensus on a particular topic, telling some moderators to delete a video related to a given event while telling others to keep it.
Meanwhile, falling below a 95% rate of accurate moderation decisions can get a moderator fired. Appealing the decision of a quality assurance (QA) manager is prohibitively complicated. As a result, terminated moderators often threaten the QAs with physical violence over their rulings. One QA manager told The Verge he used to bring a gun to work because, by the end of the day, there were always terminated moderators waiting for him at his car.
Cognizant, however, denies that its employees risk developing PTSD. Instead, the company says they are likely to experience “post-traumatic growth” and become emotionally stronger, according to a report by The Independent.
A Facebook spokesperson released a statement in response to The Verge’s report, saying the company maintains standards around “well-being and support”.
“We value the hard work of content reviewers and have certain standards around their well-being and support,” the statement read. “We work with only highly reputable global partners that have standards for their workforce, and we jointly enforce these standards with regular touch points to ensure the work environment is safe and supportive, and that the most appropriate resources are in place.”
In a separate statement, Cognizant said it would investigate the allegations made in the report and take “appropriate action”.