Fake news and the wild web: Regulatory measures needed to make platforms accountable

Combating the rising instances of fake news on the internet has emerged as a significant challenge for governments and regulators across the world.

In a bid to curb fake news, Indian lawmakers are debating regulatory measures to make digital platforms more accountable. This has been spurred by numerous instances of incendiary messages being circulated through WhatsApp and social media posts inciting mob violence.

Further, with the 2019 elections looming large, political parties and propagandists have begun smear campaigns on social media and messaging platforms. In this scenario, the extent of responsibility of platforms which serve as a medium for such information needs to be examined closely.


The Information Technology Act, 2000, defines an ‘intermediary’ as an entity that, on behalf of another entity, ‘receives, stores or transmits’ an electronic message or provides services in relation to it. Certain social media applications and instant messaging platforms, although not explicitly included in this definition, may be interpreted as intermediaries under the law. They are exempted from liability if their role is limited to being a medium for information — that is, if they do not initiate, modify or select the receiver of the information, if they have complied with the due-diligence obligations laid down in subordinate legislation, and if they do not possess ‘actual knowledge’ of any unlawful content being hosted by them.

It is important to understand that digital platforms such as social media websites, chat rooms and instant messaging applications are content aggregators, not publishers: they do not exercise editorial control over the content shared on their platforms. Accordingly, the liability of any digital intermediary should be limited, reflecting its role as a mere channel for communication.

Further, any legislative or policy intervention should not transgress the fundamental right to free speech and privacy of individuals. Stringent regulations and sanctions will push platforms to self-censor.

For instance, Germany has enacted the Network Enforcement Act, which requires social media platforms to remove patently illegal content from their websites within 24 hours of receiving a notification. Non-compliance may lead to penalties running into millions of dollars. This has led to a situation where intermediaries are over-cautious and users are afraid to publish information that may run the risk of being taken down. Imposing a general obligation on intermediaries to monitor, edit or take down content on their platforms, including private communications, would amount to censorship and invite political misuse. In cases where instant messaging platforms such as WhatsApp are misused by individuals to further their own political agenda or incite violence, it would be impracticable to oblige the platform to monitor person-to-person communication or communication on WhatsApp groups. Further, any regulatory solution proposed by the Government must not discount critical technological features of some of these platforms, such as end-to-end encryption of messages and other privacy-enhancing tools.

A demonstrator holds placards during a protest against what demonstrators say was the recent mob lynching of Muslims and Dalits accused of possessing beef, in Ahmedabad, India. Image: Reuters

The spread of misinformation and hatred on the internet is a collective failure of society. Regulatory measures to curb it should take into account the roles and responsibilities of individuals creating content, intermediaries hosting such content and/or aggressively propagating it, advertisers and third parties supporting such initiatives, and ultimately, the consumers of the content.

The law must be careful to impose obligations proportionate to the role of passive intermediaries such as social media websites and instant messaging platforms. To this end, policy measures to promote accountability of platforms may be considered: ensuring that individuals using platforms such as Twitter, Facebook and WhatsApp are not operating through fake accounts; providing more accessible options for individuals using WhatsApp to flag or report messages that are incendiary or fake; and running widespread awareness campaigns to educate individuals about the repercussions of fake news and incendiary messages. Some of these measures are already being implemented in India.

Tim Berners-Lee, the inventor of the World Wide Web, is of the opinion that today's tech giants are not programmed to maximise social good. He is hopeful that a legal or regulatory system cognisant of social objectives will go a long way towards ensuring the ethical and accountable participation of tech giants in a digital society.

The author is a lawyer working with the Vidhi Centre for Legal Policy, New Delhi
