Facebook addresses concerns around mental health of its content moderators

These content moderators aren't Facebook employees per se, but employees of organisations that Facebook has outsourced this work to

Facebook employs around 15,000 content moderators to review abusive content on its platform. It is their job to keep Facebook free of content that violates the platform's Community Standards, which can include everything from misinformation and incitement to hatred to often extremely graphic videos. These content moderators aren't Facebook employees per se, but employees of organisations that Facebook has outsourced this work to, such as Accenture, Cognizant, Genpact and other IT consultancy firms.

A detailed report in The Verge has highlighted the conditions under which many of these content moderators work at the Arizona, US office of Cognizant, one of the IT companies Facebook has partnered with to keep the platform clean.

Another report, by Bloomberg, which spoke to content moderators from Accenture, describes the random nature of the content the moderators have to look at daily. According to one moderator the site spoke to, there is a constant strain on employees, as there is not only nudity or pornography to look at, but also beheadings and murder. The employees also periodically discussed 'body counts', i.e. the number of dead bodies they saw on Facebook during their shift.

"But the secrecy also insulates Cognizant and Facebook from criticism about their working conditions, moderators told me. They are pressured not to discuss the emotional toll that their job takes on them, even with loved ones, leading to increased feelings of isolation and anxiety," said The Verge report.

A Facebook panel is seen during the Cannes Lions International Festival of Creativity, in Cannes, France, June 20, 2018. Image: REUTERS/Eric Gaillard

Some of the key takeaways from this investigative story include:

  • Content moderators have to account for every minor break they take. In fact, two Muslim employees were ordered not to pray during their allotted 'wellness time.'
  • There is a constant fear among contractors of being fired even if they make a few errors a week.
  • Dark jokes about suicide, smoking weed during breaks and being high during work hours are common ways of coping with the mental trauma of moderating graphic content.
  • Some employees have started believing the very conspiracy theories they are supposed to be moderating.

Within hours, Facebook's VP of global operations, Justin Osofsky, put out a post on the Facebook Newsroom highlighting the efforts the company is taking to ensure the mental wellbeing of its content moderators. Osofsky did not refer to The Verge or Bloomberg reports at any point in his write-up, saying only that he was aware of the misunderstandings and accusations around Facebook's content review practices.

Facebook has acknowledged that it partners with organisations such as Accenture, Cognizant and Genpact, among others, to outsource content moderation work, as these organisations are known for their employee care standards.

"These partnerships are important because they allow us to work with established companies who have a core competency in this type of work and who are able to help us ramp with location and language support quickly," said Osofsky defending Facebook's stand on outsourcing this critical work.

He went on to mention some of the core mechanisms in place for content moderation partners. These include agreements requiring partners to provide employees with good working facilities, including wellness breaks and mental health resources. Facebook claims it also conducts weekly calls, regular visits, and monthly and quarterly business reviews with its partners. Osofsky also listed measures Facebook is taking to ensure that its partners get all the support they require from Facebook to maintain a healthy working environment.

"We encourage all partner employees to raise any concerns with their employers’ HR teams. Additionally, they can anonymously raise concerns directly to Facebook via our whistleblower hotline, and Facebook will follow up on the issue appropriately," read the post.
