Is Facebook really working to keep its users safe from fake, abusive content?

The TV report on Channel 4 in the UK raised questions about Facebook's policies and processes.

Facebook has lately been in the media for all the wrong reasons. From the Cambridge Analytica data scandal to all sorts of user data privacy issues, Facebook has eroded the trust of many of its users. Still, more than 1.4 billion people around the world use the social media platform to interact with each other and share ideas.

Recently, Facebook has been trying to make amends for creating a space that encroached upon the privacy of its users, fostered a hostile environment online and offline, and became one of the major platforms for spreading fake news. But is it too late now to say sorry?

A recent investigation by Channel 4's Dispatches, a documentary series that sent an undercover reporter to work as a content moderator at a Dublin-based Facebook contractor, brought many of Facebook's moderation problems to the surface.

Facebook Content reviewers in Essen, Germany. Image: Facebook Newsroom

Monika Bickert, Facebook's Vice President of Global Policy Management, says in a report that Facebook has drawn up clear rules setting out what's acceptable on the platform, but that it doesn't always get it right.

The TV report on Channel 4 in the UK raised important questions about those policies and processes, including the guidance given during training sessions in Dublin.

Responding to the report, Bickert said, "We take these mistakes incredibly seriously and are grateful to the journalists who brought them to our attention. We have been investigating exactly what happened so we can prevent these issues from happening again. For example, we immediately required all trainers in Dublin to do a re-training session — and are preparing to do the same globally. We also reviewed the policy questions and enforcement actions that the reporter raised and fixed the mistakes we found."

Facebook also states that it gathers input from experts such as NGOs, academics, and lawyers, including through the three "Facebook Forums" it hosted in Europe in May.

The social media platform also says that its content review teams, drawn from various companies, work 24/7 across the globe, and that it plans to double the number of people working on its safety and security teams this year to 20,000.

Facebook says it has also started using technology to route reports to reviewers with the right expertise, to cut out duplicate reports, and to help detect and remove terrorist propaganda and child sexual abuse images before they've even been reported.

But the question that remains is how Facebook plans to handle, at the grass-roots level, the hard judgement calls involved in deciding whether to take down posts on complex issues, from bullying, hate speech, and terrorism to war crimes.
