Facebook's threshold for deleting Pages and Groups revealed in leaked documents

Facebook should publicly share at least a basic minimum set of rules for taking down Pages and Groups

We know that Facebook is a data goldmine and that some of its practices are not particularly transparent. One of those opaque areas is the reasoning behind banning certain Pages or Groups. Facebook does publish guidelines on what can lead to the removal of content and on how it identifies objectionable content, but there is also a threshold of violations it tolerates before it takes down or bans Pages and Groups.

In recently leaked documents accessed by Motherboard, these thresholds are given specific numbers. According to a training manual for Facebook moderators, in a section dedicated to bands, organisations, public figures and business Pages, a Page is to be deleted if its admin receives 5 strikes within 90 days.

Facebook moderators are also asked to remove a Page if at least 30 percent of the content posted by its members within 90 days violates Facebook's Community Standards. The same policy applies to Groups.

When it comes to user profiles, the document states that a profile has to be taken down if it has 5 or more pieces of content that point to hate speech, hate propaganda, or other violations.

Pages or Groups found to be soliciting sexual favours are also on the radar. Moderators are expected to unpublish a Page or Group if at least two of its elements, such as the description, title, photo or pinned posts, contain explicit material like nude imagery, or solicit sex by asking users to share their location or contact details.
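Taken together, the figures reported above amount to a small set of threshold rules. The following Python sketch is purely illustrative: the function names, parameters and structure are our own assumptions, not anything taken from the leaked manual, and it only restates the numbers described in this article.

```python
# Illustrative sketch of the reported thresholds (hypothetical names/structure,
# not from the leaked training manual).

def should_remove_page(admin_strikes_90d: int,
                       total_posts_90d: int,
                       violating_posts_90d: int) -> bool:
    """Reported rules: 5 admin strikes in 90 days, or at least 30 percent of
    content posted on the Page/Group in 90 days violating Community Standards."""
    if admin_strikes_90d >= 5:
        return True
    if total_posts_90d and violating_posts_90d / total_posts_90d >= 0.30:
        return True
    return False


def should_remove_profile(violating_items: int) -> bool:
    """Reported rule: 5 or more pieces of violating content on a user profile."""
    return violating_items >= 5


def should_unpublish_for_solicitation(explicit_elements: int) -> bool:
    """Reported rule: at least two elements (description, title, photo,
    pinned posts) containing explicit material or soliciting sex."""
    return explicit_elements >= 2
```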

Considering Facebook changes its policies on a regular basis, this set of documents may not be set in stone. For instance, according to Facebook, anyone who shares content related to child exploitation is banned immediately. Facebook has also stated that in certain instances it will take a call on a case-by-case basis, depending on user behaviour and history.

At a recent Congressional hearing, Facebook's vice president of Global Policy Management, Monika Bickert, was questioned about InfoWars, an alt-right publication known for its conspiracy theories. Asked how many strikes a Page that spreads conspiracy theories attacking grieving parents and student survivors of mass shootings would get, Bickert said that Facebook would continue to remove any violations from the InfoWars Pages.

"If they posted sufficient content that violated our threshold, that page would come down. That threshold varies, depending on the severity of different types of violations," said Bickert.

While Facebook can take a call on its thresholds internally, the opaque process works to no one's benefit. Bickert says the threshold varies depending on severity; in that case, at least a basic minimum set of rules for taking down Pages and Groups should be shared publicly, just as Facebook has shared its content moderation policies in the past. Its opaque practices in these matters could certainly prove troublesome.




