On 8 November 2016, the Indian Government demonetised Rs 500 and Rs 1000 banknotes, rendering them invalid currency overnight and leaving much of the Indian population, 93 per cent of which works in the informal sector, high and dry. The move was justified on populist grounds: curbing corruption, choking terrorist funding, and resolving the separatist conflict in Kashmir. What followed was increased media and public scrutiny of the aftermath of the GoI's decision. Amid what was primarily bad press, there were reports of paid tweets condoning the move, and absurd rumours about the scientific and security capabilities of the replacement notes. These messages were disseminated through various social media, but also propagated by select news outlets, notably Zee News and Aaj Tak. The content of this reportage was generally aligned with the messages in PM Modi's Mann Ki Baat on demonetisation, designed to resonate with the aspirations of the voting urban middle-class demographic.
Propaganda is not a novel political tool, particularly in the post-truth political era we seem to be living in. Historically, governments of diverse socio-political philosophies have used propaganda to direct public discourse. Lately, the Aam Aadmi Party and the Congress have upped their game in employing social media to disseminate their ideology, and to shift their pre-election mudslinging to a virtual platform. Public opinion can be manipulated through media by direct or indirect censorship, fake news outlets, suppression of facts or relevant context, historical revisionism, and so on. Absolute media objectivity might be unrealistic, considering media houses are composed of human beings, each a collection of conscious and unconscious biases and agendas; some measure of objectivity could nonetheless be achieved by investigating the primary sources of regurgitated news items, and by declaring sources of funding, ideological leanings, political agendas, and the affiliations of top leadership.
There is an important difference between encountering deliberately false information (disinformation) and too much information of any kind (an information overload). Most recently, Donald Trump's glib lies before and during the elections derailed conversations away from facts and serious discussion of his proposed policies, into the abyss of emotionally volatile rhetoric. Trump, a master rhetorician, even compelled the White House to release Obama's birth certificate as evidence that he was born on American ('Nation of Immigrants') soil. In the ensuing battle to disprove Trump, attention was diverted from his slandering of immigrants, Muslims, African-Americans, Democrats, and women as a collective 'Other'. His speeches resonated with middle-class white Americans already exasperated with the neo-progressive left. The predominantly liberal media has been easily baited by many of his outrageous lies, some of them supposedly predicated on "alternative facts".
The dangers lie not just in an epidemic of disinformation, but also in information overload. One consequence of navigating data in the digital age is that we are constantly inundated with a vast stream of information: online articles, tweets, trends, memes, soundbites, videos, satirical sketches, talk shows, news, debates, radio and so on. Outrageous soundbites make for great headlines, but subsequent retractions and corrections don't. This matters because emotionally loaded headlines sway our judgements about politics and policy more strongly than neutral ones do. Research demonstrates that anger and outrage elicit more potent responses than sadness, pity, or positively loaded emotional headlines. Subsequent perceptions of the issue are coloured by this fairly persistent affective judgement. Additionally, lies repeated often enough begin to seem factual, even to people who know the facts; this is called the Illusory Truth Effect. Being inundated with a constant stream of lies exhausts our brain, which already functions on a cognitive budget. Unable to handle the cognitive load of sifting through a large stream of information and then verifying its accuracy, our brains begin to rely on mental short-cuts, or heuristics (such as stereotypes), which aren't necessarily well reasoned. Being overwhelmed by emotionally loaded headlines and content, never mind their accuracy, can eventually cause outrage fatigue and information avoidance. Essentially, your brain gives up.
So how does one survive in a world where accurate information is scarce but poorly verified reports thrive? To compress an expansive to-do list into a single suggestion: engage in evidence-based thinking.
First, bear in mind that majority consensus does not signal the veracity of facts: even large numbers of people can be mistaken. Social media algorithms sift through your personalised web searches and past activity to serve up more of what you find agreeable. This corporatised system of data consumption creates an ideological echo chamber, fostering the perception that your stance is commonly held and hence valid. This is exacerbated by people's tendency to assimilate information congruent with their preexisting beliefs faster than information that challenges them. This confirmation bias persists even in the face of evidence to the contrary. Such an illusory consensus might even inflate a group's sense of ideological solidarity, and influence important decisions.
Second, the factual correctness of an online article does not depend on how widely it has been shared or cited. In the age of click-bait, trends and covertly sponsored marketing content masquerading as news, virality is treated as a measure of public attention. This is misleading, since various social media marketing agencies offer services to 'trend' opinions, creating an illusory consensus, as the demonetisation case demonstrated.
Third, as difficult and counter-intuitive as it may seem, it is important to seek out viewpoints ideologically different from your own. Merely consuming non-partisan content is not sufficient to remain objective about all the information we receive. Since we don't control the quality of information, particularly information whose correctness falls into a grey area, it makes sense to accumulate as much data on the problem at hand as possible, including perspectives ideologically different from your own. The idea is to form an alternative hypothesis to your perceived truth, rather than falling for the confirmation bias. Researchers have found that people who were more scientifically curious were more likely to circumvent the trappings of partisan ideology and arrive at the factually correct conclusion than those who were not.
Fourth, there is a lot of merit in the precise use of language in online and offline discussions. Facts and medicine do not have the luxury of being 'alternative'. Claims to the contrary engage in the subjectivist fallacy, a form of circular logic premised on "my truth is different from your truth." As inclusive and open-minded as that sounds, it is unhelpful in drawing conclusions, particularly in governance and public policy, where large populations are better off being well informed. Deliberately ambiguous phrasing, analogous to the Orwellian concept of doublespeak, has culminated in phrases like "alternative facts", "post-truth" and "fake news" that essentially mean misinformation, or bullshit.
Finally, the trouble with such euphemistic nomenclature is that it dilutes the truth by legitimising conjecture as a possibility. The implication is that you, the reader, must not only verify the veracity of what you are reading, but also rate it on a scale of 'truthiness'. Lies should not be normalised under the guise of alternative facts; opinions and facts are not interchangeable.
To conclude, it is worth quoting Orwell on the deceptive nature of political language, “Political language — and with variations this is true of all political parties, from Conservatives to Anarchists — is designed to make lies sound truthful and murder respectable, and to give an appearance of solidity to pure wind. One cannot change this all in a moment, but one can at least change one's own habits, and from time to time one can even, if one jeers loudly enough, send some worn-out and useless phrase — some jackboot, Achilles’ heel, hotbed, melting pot, acid test, veritable inferno, or other lump of verbal refuse — into the dustbin where it belongs.”
Saloni Diwakar is a junior research assistant at the Department of Psychology, Monk Prayogshala, a not-for-profit academic research organisation based in Mumbai.
Published Date: Feb 19, 2017 08:15 am