WhatsApp child sex abuse report: Self-regulation of social media isn't working; need govt norms

Cyber Peace Foundation report claims that WhatsApp is being used to share child sexual abuse videos. IT ministry has written to WhatsApp

Social media sites are becoming a veritable hub of sexually explicit content, including Child Sexual Abuse Material (CSAM). These platforms, which are meant for adults but allow children over 13 years of age, end up becoming hunting grounds for paedophiles. YouTube, WhatsApp, TikTok and many other platforms have dark corners where content featuring young children runs riot, and a lot of it is sexual in nature.

For instance, TikTok was banned by the Madras High Court on 3 April on the suspicion of hosting 'pornographic content'. The ban on the app was lifted on 24 April, but only after getting an assurance from TikTok's parent company ByteDance that measures would be taken to ensure obscene content wasn't hosted on the platform.

Now the Information Technology ministry has issued a notice to WhatsApp based on a report by Cyber Peace Foundation, which claimed that the platform was being used to share child sexual abuse videos. The ministry has asked the Facebook-owned company to take steps to prevent such misuse.

Representational image. Tech2

The ministry's latest move follows an investigation by the Cyber Peace Foundation, which allegedly found that chat groups on WhatsApp continue to be created and used to disseminate CSAM in India. The Foundation has also published a second report on CSAM and chat groups.

According to Cyber Peace Foundation, “The report on WhatsApp groups highlights how invite links to groups are used to share CSAM across the world. People join groups using these links and active solicitation of lewd content happens. Hence, government regulation is the need of the hour because self-regulation by platforms is not helping.”

“WhatsApp cares deeply about the safety of our users, and we have zero tolerance for child sexual abuse. We rely on the signals available to us, such as group information, to proactively detect and ban accounts suspected of sending or receiving child abuse imagery. We have carefully reviewed this report to ensure such accounts have been banned from our platform. We are constantly stepping up our capabilities to keep WhatsApp safe, including working collaboratively with other technology platforms, and we'll continue to prioritise requests from Indian law enforcement that can help confront this challenge," a WhatsApp spokesperson told Firstpost.

The Cyber Peace Foundation’s second report is titled ‘We have a BIGO problem on the Internet and it needs to be stopped’. It explores the proliferation of several types of unlawful content and activity on a live-streaming app, paving the way for several legal violations with a harmful impact on society at large.

“The report on Bigo Live is about the lawlessness and lewd content on the live streaming application. For instance, exposure to obscene content for children. Though the app is marked safe for use by people above 12 years of age, the content certainly isn’t,” according to Cyber Peace Foundation.

Cyber Law expert Dr Pavan Duggal puts the raging debate in perspective. “The IT Act’s Section 67 is clear that publishing or transmitting pornography is an offence. Publishing and transmission of child pornography or even browsing of child pornography is also an offence. Intermediaries have to exercise due diligence, and they must remove it. Since the Ministry of Electronics and Information Technology (MeitY) is a governmental agency, they are entitled to tell WhatsApp that this kind of child pornographic content is an offence under section 67, therefore they should remove or disable access over the network."

There are various kinds of content that users are obliged not to host, share or upload on platforms. These are listed under Rule 3(2) of the Information Technology (Intermediaries Guidelines) Rules, 2011. But after the Shreya Singhal vs Union of India judgement, service providers have got into a futile argument.

The Supreme Court (SC) held that intermediaries need not act until they receive a court order or instructions from a law enforcement agency to remove or disable content. As a result, intermediaries don't act even when mayhem breaks out on their networks, citing the SC order. No wonder most platforms do nothing, even in cases of child pornography, until a court order arrives.

On the subject of appointing a grievance officer, Duggal cites the rule book. “Under the IT Act Intermediaries guidelines, they are duty-bound to appoint a grievance officer. However, most of the companies haven’t done it. But, of late, they have started to appoint grievance officers. For example, WhatsApp has its grievance officer in the US."

The crux of the issue: should these social media sites be banned? Banning may not be a solution, because the menace of illegal online content is bigger than any single platform. Besides, it is technologically near-impossible to stop illegal content from being uploaded in the first place.

Be that as it may, the government has to bring these sites under the purview of some sort of regulation lest they become vehicles of illegal content.

Here's the complete PDF file of the report by Cyber Peace Foundation.

