Facebook's rulebook for content moderation leaked, shows little political understanding

Moderators have to check each reported post against simple yes/no rules; if a post fails, the content is taken down


2018 has certainly been a bad year for Facebook as far as its public image is concerned. Sure, monetarily it remains one of the top social networks, and it will likely continue on that trajectory thanks to its sheer number of users. But if the latest reports are to be believed, Facebook is not doing too well when it comes to moderating questionable content on its platform.

The New York Times has gained access to over 1,400 pages of a leaked Facebook rulebook that moderators use to review reported content on the platform. A quick reading of the report reveals that Facebook's team of more than 7,500 moderators does not have enough time to ponder posts that could be politically devastating, has little understanding of the nuances of language, and that there is a shortage of content moderators working in native languages around the world.

Representational image. Reuters

In a lot of instances, moderators end up taking decisions that can be considered biased. The decisions and the rules added to the rulebook are made in Menlo Park by a group of engineers and lawyers who may not necessarily understand the situations they are ruling on, such as political matters in India, Myanmar or Pakistan.

These rules are then sent to third-party companies that hire the moderators, most of them from call centres or other low-paying jobs. They are expected to internalise the rules and reduce every piece of reported content to a simple yes/no decision; if the answer is no, the content is taken down. This, according to sources who have spoken to the NYT, gives Facebook tremendous power to act as a content gatekeeper. Even emojis have been categorised to identify signals such as condemnation, bullying, sexualised text and so on.

Think about it. A moderator first has to memorise all the rules on what violates Facebook's community standards, rules which keep changing or getting updated regularly. They then have around 8-10 seconds per post to recall those rules and take action on a piece of content. According to moderators, it is an exercise in frustration.

Just one of the many Facebook rules from the leaked documents. Image: New York Times

In the Indian context, one Facebook slide tells moderators that any post degrading an entire religion violates Indian law and should be flagged for removal. The reality, though, is different: Indian law prohibits blasphemy only in certain conditions, such as when the speaker intends to incite violence. Facebook says it is being extra cautious on such matters, but it is, in a way, regulating free speech, something that is well beyond its mandate.

Facebook has a user base of over 2 billion. Around 7,500 content moderators is far too few for a platform that large. Sure, we hear Facebook talking about using artificial intelligence to scale these efforts. But AI is notoriously poor at understanding nuance, and it is all too easy to introduce bias into AI algorithms.

With general elections coming up in India, the issue of content moderation will rear its ugly head again. If Facebook thinks it can prevent abuse of its platform with rules drafted by people sitting in Menlo Park, then it has nothing but more controversies heading its way.

