Understanding the Data Protection White Paper Part X: Governing diverse industries through a co-regulatory approach

Is the co-regulatory approach appropriate for India? What about other models like self-regulation and ‘command and control’?

This article is Part 10 of a multi-part series explaining the recently issued white paper on data protection in India. The responses to the white paper will help in the formulation of India’s future data protection laws. You can read Part 1, Part 2, Part 3, Part 4, Part 5, Part 6, Part 7, Part 8 and Part 9.

Regulating and enforcing data protection faces two key challenges — the wide range of entities, sectors and industries which are subject to it and the rapid pace at which technology is developing. In view of the divergent needs of each, a suitable regulatory model needs to be adopted. Once established, it is essential to support this regulation with suitable enforcement mechanisms.


Customising regulation for diverse sectors

The diverse needs of various sectors demonstrate the need for customised regulation. For example, in the telecom sector, ensuring adequate encryption and other security requirements for the various networks is essential to make sure that there is no snooping on people's communications. For the financial sector, it is crucial to maintain the confidentiality and security of the data in the possession of financial entities. This data will still need to be disclosed, routinely, to the apex financial regulator to ensure the smooth running of the industry.

Data in the medical sector is even more sensitive but still needs to be shareable with relatives, other health institutes and for research purposes. This data may even need to be shared without consent to save the life of the individual. For data with social media, protecting people’s personal information like photographs and messages from hackers and the government is one consideration and dealing with the privacy implications of a new technology like live streaming is another.

International regulatory models

When considering the regulation of data protection, different countries have adopted different models. One is the European model, i.e., the General Data Protection Regulation (GDPR), an all-inclusive regulation applicable to all entities, whether private or public. The second is the US model, which encourages self-regulation through numerous state-level and sectoral regulations, along with a few federal regulations. Another is the Singapore model, with one data protection regulation and several guidelines issued through a public consultation process, much like that of TRAI.

The co-regulatory approach for India

These models can be broadly classified into the ‘command and control’ approach, the self-regulatory approach, and the co-regulatory approach. The White Paper recommends the co-regulatory model for India, an approach that was previously also recommended by the Justice AP Shah Committee. This combines an overarching data protection regulation, along with sector specific self-regulation.

The benefit of this approach is that different industries can customise data protection regulations based on the kind of data they handle. Any aspects or industries that are left out will still be protected under the main regulation.

Inadequate enforcement mechanisms in India

Once a regulatory approach is decided, the next consideration is enforcing it. Prescribing privacy regulations can be effective only when they are backed by adequate enforcement mechanisms. This is, in fact, one of the primary drawbacks of India's current privacy laws. The IT (Sensitive Personal Data or Information) Rules, for instance, embody most of the privacy principles, such as consent and notice, purpose specification, use limitation, and so on. However, there are next to no enforcement mechanisms.

The IT (SPDI) Rules require adequate security practices, require a regular audit of these practices, and require the body corporate to be able to demonstrate at any time that it has adequate security standards. This is, in fact, similar to the approach of the GDPR. There is, however, no system of checks to ensure implementation, nor is there any consequence or penalty for non-adoption. Thus, despite the rules being in place, the lack of enforcement limits their implementation.


Accountability under Indian laws

The only consequence follows under Section 43A of the Information Technology Act, when the lack of adequate security practices results in a failure to protect sensitive data, and this leads to a 'wrongful loss' to a person.

The link between the lack of security practices, on the one hand, and the need to prove 'wrongful loss', such as financial loss, on the other, further limits the scope of this section. For instance, many a time, the effects of a loss of data, say a hack of a social media site, are felt much later. This can take the form of another cyber attack, say a spearphishing e-mail targeting the head of a business, sent later using details acquired from the social media hack.

Ineffectiveness of Section 43A

For example, consider the major breach of ATM card details in India in 2016. A person whose debit card was compromised as a result of this hack would not be entitled to compensation under this provision for the hack itself, for the loss of their data, or for any mental agony that results. They would only be entitled to compensation if actual financial loss resulted, such as funds actually being stolen from their account.

The bank itself will also not be subject to penalties for any negligence under this provision. Since this example concerns the highly regulated financial industry, there are penalties under financial laws, such as the Banking Regulation Act. The example nevertheless demonstrates the failure of Section 43A as a privacy law.

Who should be held accountable?

When considering accountability under the new law, resolving this issue with Section 43A is crucial. A key consideration here is to define what kind of liability must be established under the new law.

Consider a mobile wallet — this involves a number of different parties: the app developer, the telecom service provider (TSP) providing the communication channels, the m-wallet provider itself, the bank that the m-wallet has tied up with, and the cloud service provider whose services the m-wallet provider is using. The operating system of the phone being used may also be transferring data. For a data breach from the m-wallet app, who should be held responsible — the m-wallet company itself, the app developer for a flaw in the app, or the OS manufacturer for any contributory vulnerability? Should liability apply to all the entities, jointly and severally?

Another option is to establish strict liability, where the data controller is liable for any loss of data, regardless of the adoption of security practices. This is recommended by the White Paper for inherently risky businesses. This may be considered, for example, for the central identities data repository (CIDR), which stores crucial biometric data.

Yet another option to be considered is the common practice of contracting to establish liability among various parties. The RBI, for instance, allows financial entities to enter into contractual arrangements with outsourcing providers to fix liability, but where the agents of the financial entity are concerned, the financial entity will remain responsible for any violations by them.

Determining security standards and codes of practices

The IT (SPDI) Rules require the adoption of security practices 'commensurate' with the nature of the business and the data in its possession. An international standard such as IS/ISO/IEC 27001 has been recommended.

This approach has its advantages, avoiding, in effect, a one-size-fits-all approach. Industries are thus encouraged to self-regulate, and small businesses are allowed a lower standard of security. Sector-specific regulations, such as the Information Security Guidelines of the RBI, or its Cybersecurity Framework for Banks, follow a similar format, where an extensive set of security standards has been prescribed, but each organisation is expected to customise these to its specific needs. A bank, thus, will have a different set of security practices from, say, a payment bank or a peer-to-peer lending platform.

This process also has drawbacks to be considered. TRAI's latest consultation paper on data privacy, for instance, appears to be a case of regulatory overreach, unless its scope is restricted to the telecom sector. Another issue is that of overlapping obligations from various regulators, as seen for m-wallets, with RBI directions on prepaid payment instruments (PPIs), MeitY's IT (SPDI) Rules, and the proposed MeitY rules for PPIs specifically, all of which have privacy provisions. Such overlaps can create compliance issues for the governed entities.

Advertisements of Paytm, a digital wallet company, are seen placed at stalls of roadside vegetable vendors in Mumbai, India, November 19, 2016. REUTERS/Shailesh Andrade

Privacy by design

The drawbacks of the notice-and-consent system for privacy, particularly in view of the overwhelming amounts of data usage for which consent is required, have led to the 'privacy by design' approach. This imposes an obligation on businesses to implement 'privacy by design' or 'privacy by default'. Through this, a business is obliged to consider data privacy at the initial design stages of a project, as well as throughout the life cycle of the relevant data processing.

This approach is supported in the GDPR but is again not without issues. For example, OS manufacturers release common APIs based on which apps are developed. A flaw in this can lead to flaws in the whole system.

Consider the Apple HomeKit flaw, which left smart gadgets developed using the software development kit vulnerable to attacks. Even OSes like Android and iOS have built-in privacy requirements, such as which data an app can and cannot access, raising issues with their conformity to prescribed privacy requirements.

Key questions raised in the White Paper

In view of these issues, the White Paper has presently sought comments on the following key questions with respect to the regulatory approach and enforcement mechanisms to be adopted:

  • Is the co-regulatory approach appropriate for India? What about other models like self-regulation and ‘command and control’?
  • What are your views on the principles of accountability?
  • What organisational measures and standards must be prescribed? Who will determine these standards?
  • Should lack of organisational measures be linked to the harm caused by establishing liability?
  • Should all data controllers be jointly and severally liable for harms caused, or can they have contractual arrangements?
  • Should there be strict liability for specific categories of processing?
  • Should data controllers be required to take out insurance policies to meet liability requirements?
  • What are subject matters for which codes of practice can be developed within a data protection law? What is the process of their preparation? Should stakeholders be mandatorily consulted?
  • Who should issue the codes of practice and how should they be enforced? What are the consequences for violation?
  • Any other views

Part I of the series explores the definitions of personal data and sensitive personal data, Part II of the series examines the jurisdiction and territorial scope of data protection laws, Part III of the series explores cross-border data flows and data localisation, Part IV deals with exemptions to data protection law, Part V deals with notice and consent, Part VI deals with the big data challenge to privacy principles, Part VII deals with processing of sensitive personal data, Part VIII deals with ensuring data quality and Part IX deals with new rights against discriminatory AI decisions, marketing, etc.

The author is a lawyer and author specialising in technology laws. She is also a certified information privacy professional.
