Facebook has stepped up its efforts to remove posts depicting violent crime, violent pornography and hate speech.
Image: 123RF/ymgerman

Facebook moderators who review disturbing content on the site are suffering panic attacks and mental breakdowns due to the stress of the job, according to a new report. 

The moderators, who are mostly contract labour on low salaries, are coping with the stress by taking drugs, drinking alcohol, making offensive jokes, and having sex in the workplace, an investigation by The Verge claims.

Speaking to the technology site, employees said the therapeutic activities and counselling Facebook provides to help them cope with exposure to disturbing content were inadequate.

The report added that one moderator had been diagnosed with PTSD and now sleeps with a gun by his side after being traumatised by a video of a man being stabbed to death.

Other former contractors said that repeated exposure to conspiracy theory content on Facebook had made them more likely to believe in those theories.

After reading related content on Facebook, some moderators came to believe conspiracy theories, including that the Holocaust was fake and that the 9/11 terrorist attacks in New York were part of a conspiracy.


The Harvard Digital Journal of Law & Technology wrote last year that the increasing reliance on moderation contractors was “concerning.”

“One of the biggest problems in evaluating the existing systems is that we have very little information about them. The companies are intentionally opaque and resist any attempt by others to investigate the existing procedures.”

“There is a growing body of evidence that content moderation, as currently constituted, entails considerable psychological risks to the employee,” it said.

Facebook has stepped up its efforts to remove posts depicting violent crime, violent pornography and hate speech, relying on specialist partner firms to sift through the deluge of potentially offending content and decide whether it needs to be taken down.

It provides moderation contractors with copies of its rules on which content should be removed and which posts are allowed to remain on the social network.

However, The Verge reported that these rules often leave it unclear whether particular content is allowed on the site.

Facebook has since admitted that it needs to do more to support the wellness of moderators who remove harmful content from the social network.

"We are committed to working with our partners to demand a high level of support for their employees; that's our responsibility and we take it seriously," said Justin Osofsky, Facebook's vice president of global operations, following the report.

"We've done a lot of work in this area and there's a lot we still need to do."

Osofsky detailed how Facebook has already taken steps to ensure moderators are provided enough support, including making it explicitly clear in contracts that good facilities and wellness breaks are on offer, and making regular visits to partner sites to help address any issues.

"Given the size at which we operate and how quickly we've grown over the past couple of years, we will inevitably encounter issues we need to address on an ongoing basis," he continued.

Facebook has recently expanded its reliance on US-headquartered moderation firms after years spent relying on similar businesses in countries such as the Philippines.

It’s thought that Facebook is keen to expand the number of moderators in the US as contractors are then more likely to understand country-specific issues including local politics. - The Telegraph
