Here's everything you need to know about Facebook's controversial rulebook for moderators

Facebook has been in the spotlight for all the wrong reasons lately, and a newly leaked set of documents revealing the social network's guidelines for moderators gives its critics plenty more to talk about. The material comes from The Guardian in the form of 100 internal training manuals, spreadsheets and flowcharts, offering an insight into how the company handles reported posts and what outcome users can expect when they flag content to the moderation team.

The topics are sensitive ones: violence, terrorism, pornography, racism and self-harm, with stranger entries such as cannibalism and match-fixing.

Indeed, Facebook seems to have a flowchart or a rulebook for every topic out there. And it should, because many today would consider it a publisher. Even if the social network does not want to see itself in that category, it does, after all, make money off its services, so those rules and regulations apply to it as they would to any such company or business.

Coming to the rulebook, there is plenty to see and ponder over. It will make you wonder, and perhaps leave you awe-struck, at how Facebook (or its moderators) view the posts that go up on the social network.

Sexual content

Facebook's rulebook has a fairly clear idea of how to deal with revenge porn. The current policy asks moderators to check whether the image was produced in a private setting, whether the "person is nude, near nude, or sexually active", and whether the caption, comments or page title indicate a lack of consent.

While this works in most scenarios, the section on Holocaust nudity is odd at best. Facebook's rulebook allows the "posting of image of adult nudity in the context of the Holocaust." In short, Facebook is OK with such adult images, yet it had removed the iconic Pulitzer Prize-winning photograph "Napalm Girl" from the Vietnam War because the nine-year-old girl in it was naked. After an online outcry, Facebook changed these rules and now allows some exceptions under its "terror of war" guidelines.

"Handmade" art showing nudity and sexual activity is allowed but digital art show sexual activity is not. Indeed it is a thin line that moderators have to understand and take decisions on.

Violence

Censoring content on a social media network as large as Facebook is tough, and the same can be said about violence, which different communities view differently. The Guardian pointed out that Facebook says a post with the line "Someone shoot Trump" should be deleted, but that it is OK to say, "To snap a b$t*%@s neck, make sure to apply all your pressure to the middle of her throat", since the latter is not regarded as a credible threat.

The same goes for video content depicting violence. Videos of violent accidents or deaths need not always be deleted, because they can supposedly create awareness around the subject.

As for animal abuse, Facebook allows "imagery of animal abuse" to be shared on the network. However, "extremely disturbing imagery" needs to be marked as "disturbing" to warn viewers before they tap the play button. At the same time, "Sadism and celebration restrictions apply to all imagery of animal abuse."

Child abuse

Facebook takes more of a "crowd-sourced" approach to animal abuse and non-sexual child abuse. The rulebook goes deep into detail to explain to moderators how to identify child abuse: it clearly defines who qualifies as a child, a minor, an adult or a toddler, and it even tries to explain what counts as abuse in general (physical force and so on) before moving on to graphic violence.

As reported by The Guardian, Facebook's thinking here is similar to its approach to violence: it wants such content to remain visible as an example for other users, while banning the most horrific images. Moderators are to remove graphic content related to abuse only when it is reported by users. At the same time, if a live video mocks the victim or celebrates the abuse, it will be taken down.

Self Harm

This is where Facebook's take on the topic is rather interesting: Facebook will allow livestreams that show self-harm or people attempting suicide.

Why on earth would anyone want to see that? Well, there is a good reason. Facebook allows the livestream and will not censor such content because people in distress stand a chance of being helped. Facebook wants to allow for real-life support (which could come from a neighbour or a friend), which is why the stream will run live until there is no longer "an opportunity to help the person".

This is, however, a touchy topic, and Facebook is well aware that a show like 13 Reasons Why (on Netflix) could inspire others to attempt suicide. Add to this Facebook's response team, which will alert the authorities so that attempts can be made to save the person in distress. So yes, this is one of the broader topics where Facebook seems to have things covered.

It is hard to monitor so many posts popping up all around the world at once. Facebook may be trying its best, but as a publisher and a business (one that makes its money from ad revenue) it needs to be responsible for what shows up on its website.

Facebook has been working on various fronts to counteract the hurdles that come its way. Monitoring every post on its social network is a herculean task, but it is one Facebook will have to get a grasp on, else there is a long list of organisations and institutions waiting to give it the boot if it does not clean up the mess... soon.