Earlier this week, Facebook deleted over 700 Pages in India for “coordinated inauthentic behaviour” and “spam”. Most of these Pages were related to the Congress and the BJP. The action, evidently taken with an eye on the upcoming elections, targeted what Facebook described as deliberate attempts to mislead. Facebook’s role in cleaning up mala fide actions antithetical to electoral democracy needs to be understood better.

The Internet has made coordinated action easier, thanks to the anonymity it allows, and social media has brought new dimensions to the social engineering of thought and action. Digital virality has transformed how news and views propagate. Law enforcement agencies have had to turn to social media companies ever so often to restore ‘order’. In 2012, when rumours of violence triggered a mass exodus of people from the North East out of Bengaluru, social media companies were asked to cooperate with the government to screen websites spreading those rumours. Similarly, after its platform was used to fuel mob lynchings, WhatsApp made some design changes following government directions.

However, it has not always been straightforward to resolve issues of misinformation or ‘coordinated action to mislead’ on digital platforms. Anti-caste activists
have pointed out
that even as Facebook unfairly removes posts of those who speak out against caste oppression, it does not act against those who use casteist slurs, nor does it tackle trolls who target women in public life.

In the case at hand, Facebook took suo motu action to identify and delete the Pages, ostensibly acting in accordance with its own policies against spam. Facebook says it took down the Pages for behaviour, not content. This behaviour, according to Facebook, includes deceiving people about the identities, locations or motivations of those running the Pages. By Facebook’s categorisation, therefore, the deletion of the Pages of political parties is distinct from the case of the lynchings, or from what happened in Myanmar, where Facebook was used to spread hate speech and rumours. This claim of policing behaviour while leaving content untouched needs some unpacking.
Facebook as a public sphere infrastructure
Appearing to cause trouble in elections does not augur well for Facebook’s reputation, and so it does have a business interest in maintaining, or appearing to maintain, fairness. And while Facebook’s actions might be legitimate according to its own policies, we must not uncritically accept its self-appointed role as the custodian of Indian democracy.

The platforms that Facebook controls, including WhatsApp, have now become an infrastructural service provided to countries around the world. Yet Facebook is technically not accountable to the people of these countries for how it manages this infrastructure. Facebook is a company incorporated in the US, and as such is accountable only to the US government. Because its service is infrastructural in nature, and because the platform co-creates a public sphere encompassing not just ordinary people but their political representatives and the lobbies that make and break political power, no individual or political party can afford to ignore it. Even if Facebook faces flak, as it did after the infamous Cambridge Analytica scandal, any reputational damage it suffers in real terms is largely contained, thanks to its monopoly power. Facebook could, therefore, act in allegedly fair and noble ways, or not; it could choose to censor that which is inauthentic, or not.

In addition to its lack of accountability to the Indian electorate, Facebook’s actions also lack transparency. We do not know whether Facebook is telling the truth about why it took down these Pages, who controlled them, and whether the behaviour was really inauthentic and coordinated. More importantly, we also do not know which other Pages were allowed to stay on despite exhibiting coordinated inauthentic behaviour. We cannot know, because Facebook controls how much data it releases about these actions.

These glaring lacunae in the accountability and transparency of the infrastructure scaffolding the public sphere become more worrisome because they directly impact Indian democracy. This issue is not limited to elections; one can easily imagine other activity on Facebook that endangers Indian democracy, such as coordinated, vicious hate speech.
Who must decide authenticity?
In the digital age, social media platforms are co-implicated in how society’s democratic contours evolve. Decisions by platform companies deeming some activity authentic and other activity inauthentic raise the question of who gets to determine what is good for democracy. Facebook’s acts of selective commission and omission are part of a pick-and-mix game in which it applies shifting standards as best suits its business, in utter disregard of local laws. The fine line it draws between content and behaviour may amount to nothing significant for the social good, and may completely ignore democratic consensus on these very questions.

In the Kathua rape case, the Delhi High Court issued notices to Facebook, Google, Twitter and YouTube for disclosing the identity of the victim, in contravention of the law. In the wake of the Christchurch shootings, the Privacy Commissioner of New Zealand asked Facebook to provide the police with the account details of everyone who had shared the gunman’s video, calling the sharing an “egregious” breach of the victims’ privacy. Facebook refused to hand over the names, and its Global Policy VP, Monika Bickert, held that she was following the law, presumably US law.