
With their hypocrisy and irresponsibility, how intermediaries hurt India

  Social media — the barrier-free communication enabler — has its dark side. The Chennai case, currently before the Supreme Court (SC), is a case in point. Two observers of jallikattu appointed by the Animal Welfare Board of India were threatened on social media. When their complaint brought no redressal, they filed two public interest litigations (PILs) asking that Aadhaar be required before a social media account is created. Besides being invasive, in my opinion, the Aadhaar/KYC requirement is a myopic approach to addressing cyber crimes, since these operate in nebulous spaces beyond sovereign boundaries. Sharing secure data also carries potentially dangerous consequences.
However, harmful content and its arbitrary circulation need regulation. Like any agency that lets illegal activity permeate its spaces, intermediaries (tech companies) too are liable for content posted if they do not take corrective action. The high court’s genuine intervention of exploring options, common in all PILs with wide ramifications, has suddenly raised questions about invasion of privacy, curtailment of freedom of speech, judicial overreach and over-regulation. Camouflaged under all these arguments is an effort to oppose any external attempt to regulate illegal content.
Since 2015, I have been arguing a PIL (In Re: Prajwala case) in the SC pertaining to the arbitrary and rampant circulation of Child Sexual Abuse Material (CSAM) and Rape and Gang Rape (RGR) imagery on social media. Through the case, and as a member of a committee appointed to explore technical solutions to pre-empt the circulation of these materials, I have closely engaged with intermediaries on the question of regulating clearly illegal content. While opposing regulation and steadfastly advocating zero accountability, they also invoke privacy and freedom of speech.
Relying on a judgment of the SC in the Shreya Singhal case, intermediaries claim no liability. This argument is completely misplaced. Section 79 of the IT Act, which grants exemption to intermediaries in some cases, makes a clear exception for illegal content, expecting the intermediary to observe “due diligence while discharging his duties under this Act and also observes such other guidelines as the Central government may prescribe in this behalf”. In the Singhal case, the SC, while refusing to strike down intermediary responsibility under Sec 79(3)(b), narrowed illegal content to include only the exceptions to freedom of speech under Article 19(2) of the Indian Constitution.
The privacy argument is ironic. On social media, surveillance is the default and privacy an option. Disregarding user privacy, significant resources are invested in developing algorithms that tap into users’ personal preferences so that advertisements appear seamlessly. Purportedly developed to enhance the quality of services, these are actually a ruse to promote the companies’ economic interests. It is appalling that they refuse to invest in or commit to anything substantial that would help identify the genesis of illegal material.
During the research for the Prajwala case, one found several initiatives across the globe to mitigate the damage, at least for CSAM. To name a few: in the US, there is a mandatory reporting process under which intermediaries report all CSAM on their portals to the National Centre for Missing and Exploited Children. In Canada, an agency uses crawler technology developed by Microsoft to weed out CSAM. Quite a few intermediaries contribute to the Internet Watch Foundation in the UK, which traces CSAM. However, for reasons one cannot fathom, they refuse to take any tangible action in India. For instance, Microsoft, which licenses its PhotoDNA technology for detecting known images free of charge, refused India the same and is instead selling similar software. While intermediaries proudly claim to mandatorily report CSAM in the US, they refuse to comply with a similar statutory requirement in India.
In the Prajwala case, almost all suggestions made to mitigate damage — such as pop-up warnings, reporting RGR content since it is not reported elsewhere, and retaining India-based data — were rejected by intermediaries on specious grounds. For instance, a small change like activating an easily accessible “report” prompt took WhatsApp many months of sittings and a few court hearings. Even today, the consequence of reporting to WhatsApp is not clear. Similarly, a suggestion that WhatsApp add to its existing automated message a line indicating that the person forwarding a message is liable for its contents, in order to discourage mechanical forwarding, was vehemently opposed. Except for making some changes to the internal reporting system after the Court compelled them, the intermediaries have taken no effective steps to mitigate the problem.
Cyber cell officers across the country have shared that it is next to impossible to get full cooperation from the intermediaries even in serious cases like human trafficking. We are dealing with a crime scene that offers an unimaginable advantage to the perpetrators. Regulation is necessary. Cooperation is key. It is time the intermediaries stopped their doublespeak and took action.
Aparna Bhat is a Supreme Court lawyer
The views expressed are personal



© All rights reserved. The South Asian, published weekly from New York.