A few years back we'd say there's an app for everything! Well, nowadays, there is a WhatsApp group for everything, be it social connectivity or business. But who would have thought the onus on WhatsApp group admins would grow large enough to make them liable and punishable for the actions of their members?
So, if you are a group admin, as per the IT Act, you become an 'intermediary' who is liable for every message sent or shared on the group. In the past, we've seen arrests made by the Maharashtra police after an admin failed to curb content posted in a group. Similar incidents have followed in Jammu and Kashmir and Jharkhand, among other states.
Now, it is only right to curb speech that could incite violence, or defamatory statements made against a person. But the government fails to understand how WhatsApp functions and the limitations the technology brings. A law or policy cannot be made without understanding the technology being regulated.
Too many loopholes
Placing the onus on group admins for any irresponsible remark has just too many loopholes. To begin with, the IT Act in question is an age-old law, enacted in 2000 and then amended in 2008. It should be noted that WhatsApp was launched only in 2009. Understanding the limitations that come with this technology is extremely important before penning down a policy.
For one, a WhatsApp admin doesn't really have control over what is shared. He or she has no ability to delete a remark, and no editorial powers to pull down content posted by a group member. And if the admin chooses to simply leave the group, the role is automatically assigned to the next person, whether he or she likes it or not.
There is a prohibition on sharing irresponsible content, but who gets to decide what counts as anti-social? Irresponsible content spans a wide spectrum of activities, and what is offensive to one person may not be to another. Moreover, asking an admin to remove content amounts to a form of censorship, something we have been fighting against all along, haven't we?
High Court breather, and a relapse
In a significant move, the Delhi High Court had given these admins a breather. The court had said that admins won't be responsible for comments made by members. According to the New Indian Express, Justice Rajiv Endlaw had said, "I am unable to understand as to how the administrator of a group can be held liable for defamation, even if any, by the statements made by a member of the group."
Interestingly, and rightly so, the court said that making the administrator of an online platform liable for defamation is akin to making the manufacturer of newsprint used to publish defamatory statements liable for defamation. The statements were made in a case involving the admin of a group on the messaging app Telegram and a Google group.
However, a new report now claims that the quashed approach has reared its ugly head again. With concerns rising over fake news, morphed photographs and disturbing videos with fabricated local narratives being shared on social media, a joint order issued by District Magistrate Yogeshwar Ram Mishra and Senior Superintendent of Police Nitin Tiwari has made it clear that any factually incorrect, rumour-based or misleading information on a social media group could result in an FIR against the group administrator.
The order says that if a group member posts a statement that is fake, is a rumour, or may cause religious disharmony, the admin must remove that member from the group. The post must also be reported to the nearest police station.
Who decides what is defamatory or hurtful?
This would mean a WhatsApp admin has a full-time duty of monitoring the group. What if he or she hasn't visited the group in a long time and missed a few posts? The other issue is who decides what is defamatory or hurts religious sentiments. What one person finds hurtful, someone else may not.
Rumoured recall and edit features may be on the way, but that's not the solution.
With WhatsApp becoming such an integral part of our daily lives, and the virtual world we live in inching closer to reality, the new rumoured features could come to the rescue. The new features, if they make their way to users, will allow recalling and editing sent messages. Of late, we've been hearing about 'Unsend', but an unsend will only allow a user to retract his or her own message. It won't necessarily give the group admin permission to do so.
Giving admins (and just about everyone in the group) the right to edit or pull down what is shared in their group would help, but that is still not a solution. There is no concrete report on if and when such features will be rolled out. The government needs to understand how the technology works before proposing such rules. Now, will group admins keep vigil over what is being shared at all times? Well, we don't think so. To put it plainly, a person who shares defamatory content should alone be liable for the consequences, and holding anyone else accountable for it is simply unfair.