Concretising the Right to Privacy in digital times: A technological reading of the SC judgment

The nine-judge bench of the Supreme Court has just ruled that “privacy is intrinsic to freedom of life and personal liberty” guaranteed in Article 21 of the Constitution of India and qualifies as “a primal natural right”. The Court has underscored that its task was to “impart constitutional meaning to individual liberty in an interconnected world”. The judgment seeks to elaborate an idea of privacy with special attention to “technological progress” that has rendered our “lives open to electronic scrutiny.”


The lengthy text engages with the moral and legal question of privacy. But law in digital times must also grapple with the material aspects of privacy: the social and economic consequences of specific technological architectures, and how they materialise, respect, or violate privacy. Privacy, or the lack of it, is built into the design of technological artefacts and becomes implicated in particular ways with particular configurations of digital paraphernalia.

The Honourable Court has acknowledged that “electronic tracks contain powerful means of information” and hence that privacy is a serious concern “in the age of information”. However, the judgment is only a beginning. The de facto manifestations of the right to privacy in the digital age will depend very much on the norms the government develops, and the rulings courts deliver, in respect of the concrete design of the digital. Even as we celebrate the historic judgment, we need to remind ourselves that the real task lies ahead.

Privacy and the materiality of technology design

Particular design features of a technology open up particular normative and ethical issues and challenges pertaining to privacy.

The ad-based model of the Internet rests on an exploitative ‘access-for-data’ order that sits at the base of the global surveillance regime. It gives users free-of-cost access to online information and communication services, while incessantly devouring every bit of their personal information. Platforms and their terms of service regulate the degree of privacy on offer, invariably claiming access to user data by default. A research study of 26 safety apps for women found that, ironically, almost all of them lacked privacy policies or terms of use, exposing users to a high risk of data and identity theft and egregious surveillance.

Every development in digital technology that becomes part of our social fabric forces us to confront a new question about the material design of privacy. The partnership between Google’s DeepMind and the UK’s National Health Service shows that centralised information systems remain vulnerable to breaches, despite data protection law and use limitation rules. The advent of personal digital assistants has opened back-door pathways through which recorded conversations are beamed back to hardware manufacturers. Drones deployed for home deliveries scan customers’ residences to map new retail opportunities, and, thanks to smart data and IoT developments, devices talk to each other without human mediation. In this information-age reality, privacy policies and laws built on principles such as ‘notice and consent’ are rendered obsolete.


The hard code of digital technologies also takes the debate on the very idea of privacy to new thresholds. Advances in Big Data analysis enabled by cognitive computing heighten the risk of social profiling and discrimination for members of marginalised communities. As some scholars put it, “the sheer number and richness of databases and the increasing sophistication of algorithms” heightens not just individual vulnerability to state and corporate surveillance, but also jeopardises the informational security of entire social groups. Yet we are still far from having laws to govern algorithms or Artificial Intelligence.

Privacy must be reimagined also as a collective right (PDF), and not just an individual one. Through the specific techno-design choices made for releasing rations to the poor via biometric verification, the current Aadhaar-based system takes away the control an individual has over their own biometric data. Instead, it exposes them to the risk of identity theft, a risk borne out by many accounts from different parts of the country. As has been pointed out, it was quite possible to opt for a more decentralised design based on smart cards (PDF), with all personal data held by the individual beneficiaries themselves. But the debate is often presented as a fait accompli – to be poor, it would seem, is to forgo the right to privacy.

Ensuring privacy by techno-design

Clearly, a generalised right to privacy is not enough unless it is encoded in guidelines for techno-design. In the examples above, databases do not need to be centralised, and personal digital assistants or apps can well be designed to uphold privacy by default.


A privacy law for the “age of information” requires that we build public norms to reach this frontier. Efforts in this direction should account for the fact that the right to privacy must be available not only against state overreach, but equally against corporate surveillance, something the judgment also acknowledges.

The future of our constitutional right to privacy in digital times depends upon a two-pronged approach.

Attention to classes of technologies
Privacy and data protection guidelines framed by the executive cannot be undifferentiated; they have to respond to the specific concerns emanating from the material design of specific classes of technologies. For example, messaging apps must be bound by guidelines that mandate end-to-end encryption (as guaranteed, for instance, in Telegram’s ‘secret chats’). Similarly, IoT-based technologies will need a new set of design guidelines, as will personal data storage systems, and so on. One area in which the Government of India is making progress in this direction is cloud computing, as evidenced by TRAI’s latest consultation paper on the issue.

Interpreting the ‘proportionality of interference’ principle
The Supreme Court’s judgment refers repeatedly to protection from arbitrary interference, citing Article 17 of the ICCPR. What, then, may be construed as “necessary and proportionate interference”? With respect to electronic data flows, in the Sabu Mathew George case, the Court drew a distinction between advertisements for sex determination and general content available online, holding that pre-emptive blocking was to be carried out only for the former. The ruling reflects due cognisance of the proportionality principle, addressing the intent of the law through governance of techno-design. The rationale for exempting general content was that pre-emptive blocking, as a technological instrument, would be too blunt, resulting in over-censorship and unnecessary restrictions on the freedom of information.

At the same time, with respect to data retention, the judicial response has been less evolved. Consider the petition, currently before the Supreme Court, challenging WhatsApp’s decision to share all its subscriber data with Facebook. The petitioners argue that WhatsApp’s sharing of the subscriber data it retains violates the privacy of its 160 million Indian users. Initially, the Court nearly declined to take up the case, with the Bench expressing the view that a right to privacy argument may not apply to users of a free service who consent to part with their personal data. Fortunately, the Court appears to have revised its stance in subsequent hearings, acknowledging the quasi-public character of WhatsApp as a communication service that must be subject to user data protection principles.

The Supreme Court’s ruling is, no doubt, a significant milestone in norms-making for digitally mediated life. But the law can be adequate to its purpose only if we acknowledge that our lives are made and remade in our encounters with the digital. Managing this complexity requires understanding not just the pervasiveness of electronic scrutiny, but also the devil in the detail. The material design of the digital must be governed if the right to privacy is to be concrete and inalienable.

The authors are with IT for Change, an NGO working on technology and social change.


Published Date: Aug 25, 2017 10:33 am | Updated Date: Aug 25, 2017 01:16 pm