Facebook needs to democratise conversations, not play big brother

Facebook has multiple standards when it comes to deleting content from the social network.


Posting content on Facebook is a hit-and-miss affair. Content is taken down automatically by algorithms. Facebook enforces a content policy where blood and violence are acceptable, but nudity is not. Content and communities are taken down on the basis of government requests. At times, Facebook blames missing content on bugs and glitches. For the end user, content disappears seemingly at random, and there is no uniform enforcement of the content policy.

https://twitter.com/baalegibreel/status/752259098737979392

In Kashmir, activist Huma Dar's account was banned by Facebook. The ban came in the midst of a series of bans on pro-Kashmir accounts and pages. Individual users who had set Burhan Wani as their profile picture were also banned. It is possible that Facebook reacted to multiple reports against these profile pictures. Affected users can find out why their accounts were disabled, and attempt to recover them.

Tech2 reached out to Facebook for comment, and at the time, Facebook said, "Our Community Standards prohibit content that praises or supports terrorists, terrorist organisations or terrorism, and we remove it as soon as we’re made aware of it. We welcome discussion on these subjects but any terrorist content has to be clearly put in a context which condemns these organisations or their violent activities."

In Pakistan, at the request of government authorities, the Facebook page of the progressive rock group Laal (Red) was banned. "Taalibansarezalimans", an anti-Taliban forum, and Pakistani.meem, a secular community platform, were also banned. An official of the Pakistan Telecom Authority confirmed that the government had asked Facebook to take down these pages. The leader of the band told the media that they were not informed of the decision.

Facebook has taken down over seventy percent of pro-Palestinian content flagged at the request of the Israeli government. The Israeli government made these requests on the basis of a direct relationship it found between social media posts and on-the-ground violence. According to Facebook, the deletions were in line with its content policy on terrorism.

In Falcon Heights, Minnesota, law enforcement officers pulled over a vehicle for a broken tail light. Diamond Reynolds started live streaming the interaction on Facebook. The encounter escalated and ended with Reynolds' companion being shot four times in the arm. The video was graphic, and showed police officers firing over a traffic violation. As the video went viral, Facebook took it down, blaming a bug in its system for the removal.

In Chicago, a man was shot and killed while live streaming on Facebook. That video, however, remains on Facebook with a warning about its graphic nature. This points to a double standard in how Facebook handles graphic content.

Facebook certainly has more tolerance for graphic content than for nudity. Videos and images showing blood and gore are allowed to stay. However, Facebook purges content related to sex and women's bodies. Images of breastfeeding women are among those Facebook has controversially removed. Even hand-drawn nudity is not allowed. The same standards are not applied to male bodies: men can show off as many protruding nipples as they want, and Facebook will not do anything about it.

Facebook banned a photo of a plus-sized model in a bikini. The reason cited by Facebook was that its content policy does not allow undesirable photos of body parts. Soon after, Facebook admitted that it had made an error and reinstated the picture. Facebook stressed that it did not want to carry ads depicting either "perfect" or "undesirable" bodies; it intends to carry ads relevant to the event or activity being advertised, such as running or walking, instead of images that may make viewers feel bad about themselves.

Facebook also scrubbed the site of the image of a young Vietnamese girl fleeing a napalm bombing. The image, known as "Napalm Girl", was removed from a number of accounts around the world, including that of Norwegian Prime Minister Erna Solberg. The image was removed on grounds of nudity, but was restored after Solberg accused Facebook of censorship and of attempting to edit history.

Perhaps the most bizarre instance was Facebook banning the phrase "Everyone Will Know" from being posted on the site. The ban was apparently the result of a bug in the spam detection system, which is constantly updated by Facebook engineers; the phrase erroneously slipped into the spam filters. The issue has since been fixed.


Religious bigotry, offensive image macros, spammy game requests, and off-colour text porn are all types of content that Facebook does not ban. Content of this type continues to thrive on Facebook and, unless reported by many people, persists on the site.

After Facebook banned a page advocating Sikh separatism, a US court ruled that Facebook can ban any kind of content on the social media platform without having to justify its reasons for doing so. This gives Facebook the freedom to pick and choose what content it takes down, without having to explain why to the people affected.
