Understanding the Data Protection White Paper Part V: Clicking on the ‘I Accept’ button before proceeding to use a service or app

This article is Part 5 of a multi-part series explaining the recently issued white paper on data protection in India. The responses to the white paper will help in the formulation of India’s future data protection laws. You can read Part 1, Part 2, Part 3 and Part 4.

Consent has always been the cornerstone of data protection legislation. The general assumption is that individuals should have the greatest say in what is done with their data. Yet, today, clicking on the ‘I Accept’ button for apps, software, websites and other services has become an automated, default move. People are unaware that with this simple click they have handed over troves of extremely private information about themselves to the app or service provider. This raises questions about the effectiveness of consent and notice as a means of protecting privacy.

Representational Image. Pexels

Generic notices are unclear
The line you see when you install a new app, "By clicking on 'I Accept' you agree to the Terms and Conditions", constitutes a notice. By clicking on ‘I Accept’, you are consenting. Yet this notice clearly tells you very little about what exactly you have consented to. To ensure better privacy, companies like Apple and Google insist that apps provide a notice each time they collect data. However, an app may provide only a generic notice, such as one seeking ‘Access to camera’. This hardly informs you that the camera could be on continuously, recording your every move.

Data sharing you’ve consented to
Privacy violations with such technologies are more obvious when you consider, say, video footage of a woman in a public place being used in a commercial advertisement without her permission. They are less obvious when you have actively consented to the recording and sharing of your data. Consider Face ID, where you have allowed Apple to collect and use your facial recognition data for authentication. Recent reports have raised concerns that Apple could easily share this data with other apps; it is already used for seemingly harmless things like Animoji and 3D-printed models of yourself, but it could potentially enable major privacy violations. Apple claims that all the 3D face-map data is stored on the device itself and is not shared with any third party.

The increasing ubiquity of facial recognition technology is particularly worrying, since it can now be used to monitor and record your every facial expression, can identify racial details, sexual orientation and other personal characteristics, and can identify and track your movements in stores or while you walk down the street. Most CCTV cameras can easily be equipped with such facial recognition technology.

Consider also the processing of medical and health-related data. At-home DNA testing kits, for instance, use a simple cheek swab and a mail-in kit to determine a person’s genetic information, yet the security and privacy implications of entrusting this data to such companies are unclear. For example, some insurance companies and workplaces decide whether to provide health insurance or workplace rewards on the basis of genetic information. Similarly, consider wearable devices like activity trackers and fitness bands, and what would happen if that information were shared with your health insurance provider. Hospitals, too, are increasingly processing patient data on a large scale.

iPhone X comes with a TrueDepth camera module

E-contracts are valid even if you haven’t read them
When considering notice and consent as a means of guaranteeing informational privacy, therefore, the effectiveness of the notice provided and the legitimacy of the consent taken have to be examined. Taking consent essentially establishes an e-contract between the user and the company. As a result, notices today often mix informing people about the use of their data with disclaiming liability, producing complicated legalese that is difficult for the user to understand.

The increasing penalties imposed on companies for privacy violations make this move understandable, but it leaves people with no real idea of what they are authorising. In fact, e-contracts are enforceable under the law even if it is proved that the person never actually read or understood the terms, so long as they were aware that terms existed.

Consent fatigue
One solution to this problem is to separate privacy notices from the notices that establish an e-contract between the two parties. However, this solution gives rise to yet another problem: consent fatigue. This is what happens when people are overwhelmed with consent notices at every stage and every move, making it impossible for them to give meaningful consent to any of them.

Take-it-or-leave-it approach
Another issue with this consent is that it takes the form of a standard-form contract. People have to either take it or leave it, ie, they either consent and use the service, or they do without it. Again, the result is that people do not have a meaningful choice when consenting to the terms, and their lack of bargaining power calls into question the validity and enforceability of the contract being entered into. As a general rule, however, this format is perfectly legal, even under the Indian IT (Sensitive Personal Data) Rules, 2011. An example is the WhatsApp-Facebook case, where WhatsApp gave people the option of either agreeing to the sharing of their data with Facebook or discontinuing use of the service.

Consent from children
Other issues relating to consent also arise. One is the validity of consent taken from children. The general rule under Indian and other contract law is that a contract with a minor, ie, anyone below the age of 18 years, is not enforceable. Yet many children use Google, Facebook and other such websites. Consideration must be given to how to protect this extremely vulnerable group. Requiring parental consent for every use can have a chilling effect on a child's use of the internet, so alternative solutions also need to be found, such as varying consent requirements for different age groups. Prohibiting uses like tracking, profiling and marketing should be considered, and the collection and disclosure of such data should be strictly limited.

Alternative grounds to consent
The last issue that arises is alternative grounds, apart from consent, for some uses of data. One example is processing data for the performance of a contract: when you buy insurance, the insurance company has the authority to process your data for the purpose of providing that insurance. Another is the fulfilment of a legal obligation, such as a bank filing suspicious transaction reports on an individual as mandated under anti-money laundering laws. A third, more limited ground is the protection of vital interests, ie, sharing data when there is a threat to the life of the individual. These grounds will have to be carefully considered before they are included.

Some other grounds are public interest and legitimate interest, but these, in the view of the White Paper, are subjective and difficult to enforce.

Key questions raised in the White Paper
Despite these issues, consent and notice remain one of the best ways to give people autonomy over deciding how their data should be used. Thus, a solution is required to ensure that people understand better what they are consenting to and how their data is being used. In view of these issues, the White Paper has sought comments on the following key questions with respect to notice and consent:

  • Should consent be the primary ground of processing?
  • What are the conditions for valid consent (unambiguous, freely given, etc)?
  • How can consent fatigue be dealt with?
  • Should there be different standards of consent in different contexts?
  • How can notice be made more comprehensible to users?
  • Should the law prescribe notice formats? How can data controllers be incentivised to develop more effective notices?
  • Should privacy impact assessments and consent dashboards be included in the law?
  • How should children’s data be protected? Should there be a separate provision for this?
  • Is the child capable of giving consent? Should parental consent be mandated?
  • Should processing of children’s data be prohibited? Would purpose based restrictions on collection be effective?
  • Should data controllers have a higher degree of responsibility for children’s data?
  • What other grounds for processing, apart from consent, should be permitted?
  • Should residuary grounds for processing be determined on a case-by-case basis?

Part I of the series explores the definitions of personal data and sensitive personal data, Part II of the series examines the jurisdiction and territorial scope of data protection laws, Part III of the series explores cross-border data flows and data localisation and Part IV deals with exemptions to data protection law.

The author is a lawyer and author specialising in technology laws. She is also a certified information privacy professional.

Published Date: Dec 05, 2017 11:28 am | Updated Date: Dec 05, 2017 11:28 am