By Asheeta Regidi

The Tamil Nadu police are considering charging Facebook with abetment of the suicide of a 21-year-old woman in Salem. Morphed pictures of the woman had been put up on Facebook. The woman's family filed a complaint with the police on June 23. The police sent a 'Law Enforcement Online Request', Facebook's official system for requests from law enforcement authorities around the world, asking for the removal of the pictures and for the IP address of the offender. Facebook finally removed the morphed images on June 27, but, tragically, the woman had committed suicide the previous day.

Charges of abetment of suicide will not hold

Abetment of suicide is covered by Section 306 of the Indian Penal Code, 1860. An essential ingredient for proving it is 'mens rea', or a 'guilty mind'. To prove Facebook guilty of abetment of suicide, it must first be shown that Facebook deliberately instigated the victim to commit suicide. Facebook's act, namely its delay in removing the morphed pictures despite receiving a request from the police, would have to have been committed deliberately, with the intention that the victim commit suicide. It is quite clear that Facebook could not have had this intention.

Consider the case of Ajay Patodia vs State of MP, where a company refused to make a payment it owed to the victim. The victim, who was being harassed by other creditors for payment, committed suicide. The Court held that the company could not be held liable for abetment of suicide, since there was no active suggestion or direct intention on the company's part that the victim commit suicide. Similarly, Facebook did not in any way intend for the victim to commit suicide, and so these charges will not hold.
Other charges against Facebook are difficult due to intermediary immunity

This case brings to light a problem that has arisen because of a change in law and policy regarding the removal of illegal content. Just a few months ago, Facebook changed its policies on the removal of illegal content, like the morphed pictures in this case. The updated policies state that Facebook will remove content that violates local laws and its community guidelines, but that it will not remove content that does not violate its guidelines without a direct order from the government or the Indian courts.

This was based on a 2015 judgment of the Indian Supreme Court, which held that intermediaries, i.e. websites like Facebook, are obligated to remove illegal content within 36 hours only on receiving an official court or government order. Prior to this judgment, an intermediary had to remove any illegal content within 36 hours of being told about it by anyone, including the victim themselves. An intermediary that failed to do so would lose its immunity under Section 79 of the Information Technology Act, 2000. Now an official court/government order is required, which can be very time-consuming to obtain; the official police request sent in this case does not count. The result is that Facebook's immunity in this case remains intact.

Facebook breached its duty of care towards the victim

Facebook's failure to remove the content in time may not constitute the crime of abetment of suicide, but it is certainly an act of gross negligence, or a breach of the duty of care that Facebook owed to the victim. To hold Facebook responsible for this particular act, it will first have to be proved that Facebook lost its immunity. Section 79 of the IT Act imposes a responsibility of 'due diligence' on Facebook, the violation of which can lead to a loss of immunity.
The 36-hour obligation sets a time frame in the case of an official order, but that surely does not mean the intermediary can take any amount of time to remove illegal content in the absence of one. The intermediary still has a duty not to host harmful content, and the four-day delay in removing the content in this case shows a breach of that duty. If the Courts agree that this delay can result in a loss of immunity, it is possible to hold Facebook responsible for gross negligence in this particular case.

Facebook should take on greater responsibility towards its users

While holding Facebook liable for this specific act may prove difficult, that does not mean Facebook should be allowed to remain negligent about performing its general duties as an intermediary. It is very surprising that neither Facebook's page on 'Information for Law Enforcement Authorities' nor its 'Community Standards' lists any time frame for the removal of such content. This is highly unfortunate, as victims are left with no option but to wait endlessly for Facebook to choose to take action. In the cyber world we live in, every moment such an image remains published online increases the number of people who can view it, and nothing stops a viewer from retaining a permanent copy. Action against such content has to be instant.

Governments around the world should mandate that Facebook adopt greater responsibility towards its users. Facebook itself ought to either commit to a time frame for the removal of such content or make provision for emergency cases. Additionally, the Supreme Court's limitation of the 36-hour removal obligation to official court/government orders should be reconsidered, or at least expanded to include requests from law enforcement agencies.
The author is a lawyer with a specialisation in cyber laws and has co-authored books on the subject.