Feb 24, 2017 10:12:29 IST
By Asheeta Regidi
The Supreme Court of India has questioned Google on whether something can be done to deal with the uploading of obscene content, such as videos of sexual violence, on YouTube. As Google argues, it is under no legal obligation to discover and monitor such illegal content. At the same time, progress in technology and cutting-edge artificial intelligence has made it possible to do just this, and to prevent the very upload of such content.
In fact, YouTube’s own Content ID system is evidence of this. While a legal obligation at this stage would impose too great a burden on intermediaries, there is huge potential in the use of artificial intelligence to resolve this problem.
No legal obligation to monitor content
The present issue arose in a case before the Supreme Court dealing with the increasing menace of rape videos on the internet in India. Legally, Google and YouTube, as intermediaries, have absolutely no obligation to ensure the legality of the content uploaded on their websites (Section 79, Information Technology Act, 2000). The obligations imposed are restricted to timely removal on being informed of illegal content, and the non-involvement of the intermediary in the upload of the content.
YouTube’s Content ID monitors copyright infringement
YouTube’s primary objection is the impossibility of monitoring the millions of videos uploaded every day, and the impediment to free speech if it were to self-police its content. YouTube, however, is already doing both of these things with respect to copyright violations. Copyright infringement suits, such as Viacom’s 2007 suit demanding $1 billion in damages, have been filed against YouTube on the grounds that it profits from illegal content on its site through ad revenues. Though YouTube won the suit, it found the need to protect itself from such suits in future. To this end, YouTube developed an innovative technological solution to the problem: Content ID.
Content ID uses a technique called digital fingerprinting. Every video, and indeed every image, generates a unique digital hash, similar to a fingerprint for a human being. Content ID keeps a library of copyrighted works, provided to it by their owners, and from this creates a catalogue of the unique hashes of these videos. Every new video that is uploaded is run against this catalogue. If the hashes match, the new video contains copyrighted content and is not uploaded. The technology has progressed so far that even edited or modified videos can be detected.
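The catalogue-and-lookup idea described above can be sketched in a few lines of Python. This is a minimal illustration, not YouTube's actual system: Content ID uses proprietary perceptual fingerprints that survive editing, whereas the cryptographic hash used here (SHA-256, as a stand-in) only catches byte-identical re-uploads. All names in the sketch are hypothetical.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Stand-in fingerprint: SHA-256 of the raw bytes.
    A real system would use a perceptual fingerprint robust to edits."""
    return hashlib.sha256(data).hexdigest()

class FingerprintCatalogue:
    """Hypothetical catalogue of known works, keyed by fingerprint."""

    def __init__(self):
        self._known = set()

    def register(self, data: bytes) -> None:
        # Add a known copyrighted (or illegal) work to the catalogue.
        self._known.add(fingerprint(data))

    def is_blocked(self, data: bytes) -> bool:
        # Run a new upload against the catalogue: a match means block it.
        return fingerprint(data) in self._known

catalogue = FingerprintCatalogue()
catalogue.register(b"original video bytes")

print(catalogue.is_blocked(b"original video bytes"))   # True: exact re-upload
print(catalogue.is_blocked(b"unrelated video bytes"))  # False: no match
```

The set lookup makes each check constant-time regardless of catalogue size, which is why this design scales to millions of uploads a day; the hard part in practice is making the fingerprint itself resilient to cropping, re-encoding and other modifications.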
Digital fingerprinting software to prevent illegal uploads
This method shows promise in detecting illegal uploads of a similar nature, including rape videos. In fact, YouTube and Facebook are already using these techniques to curb the spread of child pornography and terrorist propaganda. Software systems similar to Content ID have been developed to deal with these issues: Project Arachnid in Winnipeg, Canada, for instance, has created similar software for child pornographic videos and images, and made it available to law enforcement agencies and NGOs.
The effectiveness of this system is so far restricted to removing reuploads of videos already known to be illegal. Technology has progressed further, however: researchers at the University of Manitoba, Canada, have recently developed a new artificial intelligence software, Project Cease, which can detect even new content containing child pornography and prevent its upload.
Replication of content on the internet
The problem of the distribution, sharing and reupload of illegal content is as big as that of the upload of the original content. Nowadays, any content that is put up is instantly replicated across several websites. Even if the videos are taken from YouTube, YouTube is not legally liable for their spread, even where it failed to remove them in time. This is where software systems like Content ID are effective: once a video is discovered, such software can prevent the further spread of the content.
Legally, some changes are needed to deal with the issue of replication. As per current law, intermediaries are obligated to remove illegal content within 36 hours, but only when the request is accompanied by a court order or an official government request. Precious time can be lost in acquiring these official orders. This requirement should be removed for a limited list of aggravated offences, such as rape videos and child pornography. This would expedite the removal of such videos, and prevent the viewing, sharing and redistribution that the time lapse allows.
Implementation easier for host websites than ISPs
From an enforcement perspective, such software should ideally be implemented by the internet service providers, so that all such content is covered. Practically speaking, however, this is very difficult. The sheer volume of data that passes through the ISPs, and the fact that most of it is encrypted and thus inaccessible, are just some of the obstacles to implementing this.
At this stage, the websites which host the data, and thus have access to it (like YouTube), are in a better position to monitor it. Research needs to be directed towards developing technology advanced enough to deal with this without overburdening the websites. The Content ID system itself was not without hitches, and took over 10 years to fine-tune and reach its present stage of considerable accuracy.
Make rape videos an offence on par with child porn
Legally, some immediate steps can be taken towards resolving the issue of rape videos. One issue is that rape videos are governed under obscenity laws (Section 67A, IT Act), which punish only the publication and sharing of such content, not its browsing, viewing or downloading. The issue of rape videos needs to be brought on par with child pornography, which has more stringent laws in place (Section 67B, IT Act).
For example, anyone, including service providers, is obligated to report a discovery of child pornography (Section 21, the Protection of Children from Sexual Offences Act, 2012). These laws, both in India and internationally, are driving researchers and websites to find a technological solution for such uploads. Similar pressure needs to be applied for rape videos.
Use technology to solve the problem
It must be noted again that, legally, there is no such obligation on websites to do this. Imposing such an obligation at this stage would be highly detrimental to the free use of the internet. At the same time, the problem of rape videos, child pornography and the like on the internet is too big to be ignored.
Artificial intelligence shows promise in solving this problem, and a combined, concentrated effort from companies like YouTube, governments and research organisations needs to be directed towards it. The problem of the use and distribution of such content on the dark web and through e-mail still needs to be tackled, but a start can be made with public platforms like YouTube and similar websites.
The author is a lawyer with a specialisation in cyber laws and has co-authored books on the subject.