Lok Sabha polls: Facebook, WhatsApp should boost digital media literacy to supplement their efforts against fake news

Facebook has made its first big move in the war against fake and manipulative content, shutting down pages linked to the Congress and the BJP.

While the bulk of these accounts, 687 of the 700+, were linked to the Congress, it can be argued that the BJP was hit harder, because one of the primary BJP-linked informal pages, The Indian Eye, with 26 lakh followers, was among those banned. In comparison, the Congress-linked pages had, as Facebook reported in its blog post, only around 2 lakh followers in total.


Facebook's stand is encouraging, but there is scope for improvement

This is encouraging for a few reasons. One is that Facebook has used a solution that does not involve checking every post for authenticity, a time-consuming and, given the volumes we are speaking of, a frankly impossible task. Instead, it is ignoring the content posted by the pages and focusing on the behaviour of the accounts concerned. Facebook is calling this ‘coordinated inauthentic behaviour’. Even if the accounts are posting accurate content, they may be lying about where they are based, or what their motivations are. Apart from this, Facebook is attacking ‘fake accounts or multiple accounts with the same names; impersonating someone else; posting links to malware; posting massive amounts of content across a network of Groups and Pages in order to drive traffic to websites’.
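To see what behaviour-based screening means in practice, here is a minimal, purely illustrative sketch in Python: it flags accounts on signals such as many accounts sharing one display name, or one account pushing the same link into a large number of groups, without ever judging whether a post is true. The data model, signals and thresholds are my assumptions for illustration, not a description of Facebook's actual systems.

```python
from collections import defaultdict
from dataclasses import dataclass
from typing import Optional

# Hypothetical post record; fields are assumptions for illustration only.
@dataclass
class Post:
    account_id: str
    display_name: str
    group_id: str
    link: Optional[str] = None

def flag_coordinated_accounts(posts, max_groups_per_link=20, max_accounts_per_name=3):
    """Flag accounts by how they behave, not by whether what they post is true."""
    link_groups = defaultdict(lambda: defaultdict(set))  # account -> link -> groups it was posted to
    name_accounts = defaultdict(set)                     # display name -> accounts using it

    for p in posts:
        if p.link:
            link_groups[p.account_id][p.link].add(p.group_id)
        name_accounts[p.display_name].add(p.account_id)

    flagged = set()

    # Signal 1: one account blasting the same link into many groups to drive traffic.
    for account, links in link_groups.items():
        if any(len(groups) > max_groups_per_link for groups in links.values()):
            flagged.add(account)

    # Signal 2: many accounts sharing one display name, a hint of fake or duplicate accounts.
    for name, accounts in name_accounts.items():
        if len(accounts) > max_accounts_per_name:
            flagged.update(accounts)

    return flagged
```

Note that nothing in this sketch reads the text of a post; that is the whole point of targeting coordinated inauthentic behaviour rather than content.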

Another reason it is encouraging is that Facebook is finally using its smarts to stay ahead of the tactics evolved by unscrupulous users. From the very beginning, the gaming of Facebook by commercial marketers, and now by a new breed of political marketers, has been an abuse of the platform. That Facebook is fighting back is welcome. I hope it has learnt its lesson and now has a permanent team on the job, because marketers will certainly find new ways of gaming whatever system Facebook puts in place. This war will evolve quickly.

A woman checks the Facebook page of India's ruling Bharatiya Janata Party. Image: AP

Then again, at one level, it already has.

Amit Malviya, the head of the BJP's IT cell, has called the 2019 elections 'WhatsApp elections', not 'Facebook elections'. The landscape on WhatsApp is arguably much harder to tackle, given that messages are end-to-end encrypted, including those posted in large groups. WhatsApp's new tipline, which lets users report potential fake news and check whether something is true, is a start. However, as recent studies in the US have shown, belief in fact-checking is lower among the low-knowledge citizens who need it most (a dismal 29 percent among Republicans, and just 36 percent among Democrats), and is not very high even among high-knowledge citizens (just 34 percent of Republicans and 59 percent of Democrats). In India, WhatsApp has released TV commercials, is staging street plays and has even partnered with organisations such as NASSCOM and the Digital Empowerment Foundation to educate community leaders on the right way of using WhatsApp.

The thing is, mankind takes time to adjust to every new technology. Social media has been revolutionary in the way it disseminates information; users’ media literacy has not kept pace.

Facebook and WhatsApp have the user base and the eyeballs: perhaps they should help this community they have built to develop the knowledge and skills required to use the platform well?

All technology apps have tutorials. Can’t there be a tutorial walking users through how to verify what they read on the platform?

In his 6 March post, Mark Zuckerberg referred to Facebook as ‘the digital equivalent of a town square’. The trouble is, Facebook and WhatsApp posts resemble media news stories both in their use of pictures and video and in the language used. They don’t have the air of a rumour you overhear in the town square. Should not Facebook make this difference explicit when onboarding users? Should it not go out to its existing users and state categorically that they should not believe everything they read on the platform? And should it not have a way to verify websites and news sources that can be held legally accountable for misreporting facts, and should it not withhold this verification from informal and illegitimate websites and sources?

Facebook and WhatsApp have tried doing that through full-page advertisements, but these have been reactive in nature. The need of the hour is to be more proactive on the platform itself.

Facebook CEO Mark Zuckerberg. Image: Reuters

Digital media literacy should start at school

In fact, it is time for schools to start training children on how to verify information on the Internet. A course in basic digital skills and digital literacy is vital, far more so than learning programming or the names of a computer's hardware components. Digital corporations like Facebook have the tools to develop such a course and provide it free of charge. They could even create digital certificates to reward children with. Maybe they can offer children a second certificate if they go home and teach their parents as well!

There are a million viral tests telling you which Hogwarts house you belong to and testing your typing speed. Can’t there be a viral test that measures how much people know about basic digital literacy? Can’t we award ‘Truthseeker’ and ‘Liebuster’ badges?

Political marketing tactics will continue to evolve

Certainly, helping citizens keep up with political marketing is an uphill task. We know this from commercial marketing, where false promises and marketing frauds are everywhere. Large corporations have consistently gamed laws against false advertising, from speeding through the statutory warning to investors in mutual fund ads, to the entire surrogate advertising industry promoting Kingfisher mineral water and the whole product category of 'pan masala'. While not perfect, commercial marketing at least has some barriers, some hoops for marketers to jump through. And the stakes are much lower in commercial marketing than in political propaganda, which can incite actual violence and monopolise power.

Facebook and WhatsApp had, until now, kept it ridiculously easy to spread this kind of high-stakes misinformation. Any human system involves a battle between those who enforce the norms society needs to defend and those who try to work around them for an advantage. By assuming it did not need to perform this function, Facebook was living in a la-la land it is slowly realising was never sustainable. Hopefully, its self-policing measures will deliver adequate results and make spreading misinformation hard enough to turn it from an epidemic into small, localised outbreaks. Equally, one hopes its teams will work to stay one step ahead of their adversaries, the highly-paid political marketing and technology wizards employed by politicians.

The task of citizens, activists and journalists alive to the dangers of misinformation will also continue. Facebook has shown that it only responds to widespread public pressure. That pressure must be sustained to demand more from the corporation. Facebook has 2.3 billion ‘citizens’ of its ‘digital nation’. It’s time for those citizens to hold it accountable.


Updated Date: Apr 04, 2019 14:19:57 IST