TikTok ban: The case highlighted the inability of Indian laws to prevent the exploitation of children

We need to take a holistic approach to kids' safety, and a law that addresses this would be welcome.

The ban imposed by the Madras High Court on the immensely popular app TikTok was lifted last week, though the app is yet to be available again on app stores. The ban itself had raised concerns about content regulation in India, making the Court’s decision to lift it a welcome one. A persisting issue, however, is the vulnerability of children online — an issue that the TikTok ban brought to the fore, and a concern which the Court reiterated even as it lifted the ban.

In particular, the case exposed the weaknesses of Indian law in safeguarding children online. To remedy this, the Court suggested the enactment of a law along the lines of the US Children’s Online Privacy Protection Act, 1998 (COPPA). Looking at the current laws in India as well as the upcoming Personal Data Protection Bill, 2018 (PDP Bill), in the context of TikTok’s case, it is clear that what is required is not just a privacy law, but an approach that protects children online as a whole.

TikTok was briefly banned in India following an interim order from the Madras High Court.

Vulnerabilities highlighted by the TikTok case

The TikTok case brought into focus certain vulnerabilities in Indian law, pointing to the need to strengthen not just privacy law but also intermediary law in this context. The first issue that emerged was the need to impose an obligation on intermediaries to prevent the exposure of children to certain types of content, such as pornographic content. This is an obligation which a privacy law like the PDP Bill does not impose. The Information Technology Act, 2000 does impose an obligation to take down content and exercise due diligence, but it lacks specific provisions for children’s safety.

A second issue in the TikTok incident was that a BBC investigation into TikTok in the UK had pointed out that while TikTok took down harmful content, it failed to act against the users uploading it in the first place. This inaction meant that though the content was removed, other children continued to be exposed to possible paedophiles. Under Indian law, though intermediaries have a right to suspend such accounts under the IT (Intermediaries Guidelines) Rules, 2011, this is not in the form of an enforceable obligation.

Yet another issue the case brought out is the need to prevent/restrict the interaction of children with strangers, another obligation that is missing under the PDP Bill or any other Indian law at present.

Children’s laws traditionally require more responsibility

Notably, safe harbour for intermediaries has been provided due to the sheer volume of content that they deal with and the difficulty in addressing all issues instantly. Intermediaries, further, are not expected to perform judicial functions in relation to the content they host. Now, intermediary liability in itself is changing, but in the case of children, even traditionally, laws impose a lot more liability.

Section 67B of the IT Act, for instance, which deals with child pornography, prohibits not only sharing but also the viewing and the creation of such content (for general pornographic content, only sharing is prohibited). Section 67B also specifically punishes acts like sexual grooming or child abuse online.

Intermediaries like TikTok, however, are protected under Section 79, with only an obligation to take down content on receiving a police or governmental order to do so. There are no enhanced obligations on intermediaries in case of the discovery or suspicion of a paedophile online. The Protection of Children from Sexual Offences Act, 2012, for instance, punishes a failure to report an instance of child abuse, but there is no clarity on whether this provision applies to intermediaries as well.

Borrowing COPPA’s conditional safe harbour for holistic safety

A privacy law like COPPA can only address some of these issues. Intermediary liability is an equally serious concern here. What is needed, perhaps, is not specifically a law that protects privacy, but a law that deals holistically with the safety of children online.

Features from COPPA can certainly be borrowed for this. For instance, COPPA adopts a safe harbour approach where intermediaries are allowed to self-regulate, and safe harbour is accorded on the adoption of adequate measures for children’s privacy. Such conditional safe harbour could be adopted for the holistic safety of children online in India as well.

The advantage of conditional safe harbour

The advantage of conditional safe harbour is, firstly, that the self-regulatory mechanism allows the adoption of norms specific to the issues that arise with each intermediary. TikTok’s concern with children interacting with strangers is different from, say, Netflix’s concern with children’s access to content, or PUBG’s concern with causing addiction. The measures each site needs would differ accordingly.

Secondly, the law would lay down certain basic norms, thus setting a minimum standard that intermediaries will need to meet. This could include, for instance, an obligation to remove content immediately, instead of waiting for a court order, and an obligation to suspend the accounts of suspected paedophiles, abusers, etc. This should, of course, be backed by proper and quick recourse for the user, to deal with mistaken or false allegations. Another obligation could be mandatory reporting of abusive and other harmful content in relation to children, along the lines of the requirement under the POCSO Act.

Uniform restrictions can impose an unfair regulatory burden on smaller companies or on different types of companies, while a flexible approach avoids this and does not hamper innovation. Many critics of COPPA, for instance, argue that when a high regulatory burden is imposed in relation to children, companies deal with it by simply barring children below a certain age (in the case of COPPA, 13 years) from using the site. This is an easy way out, since the companies escape both liability under the law and the costs of compliance. Children, however, tend to find their way around such restrictions simply by lying about their age, or even by getting their parents to do so. Flexibility would allow companies to devise a method that works for them, while the minimum standards would prevent avoidance of responsibility.

PDP Bill and some changes needed

That is not to say that the specific provisions on privacy for children are not an important step towards children’s safety online. Section 23 of the PDP Bill, in fact, requires parental consent for processing the data of all children below the age of 18 years. The provisions proposed here, however, fall short.

The PDP Bill, for instance, does propose restrictions on profiling, tracking and targeted advertising in relation to children’s data. These, however, are to apply only to ‘guardian data fiduciaries’ — a class of data fiduciaries identified as such because they direct services at children or process large volumes of children’s data. The bigger issue this brings out is that while the law aims to protect children, the safety it ensures appears to be aimed at preventing violations at an organisational level, whereas children’s safety needs more protections at an individual level.

Consider, for instance, provisions of the kidSAFE+ Certification, a COPPA approved safe harbour certification. This includes provisions such as requiring verifiable parental consent specifically for:

  • Features allowing sharing of information in a public setting or with other users (e.g., chat, in-game messaging, community features, etc.)
  • Send-to-friend features that allow the sharing of information
  • Granting access to third-party plugins that take information, etc.

This indicates, for instance, how a privacy law like the PDP Bill can be used to prevent children from interacting with strangers, as had emerged in the case of TikTok.

Taking a holistic approach to children’s safety online

On the whole, a holistic approach to children’s safety is required, and a law that addresses this would be very welcome. It is hoped that the government will act on the Madras High Court’s directions and consider enacting such a law.

The author is a lawyer specializing in technology, privacy and cyber laws.
