
Does India Need Inspiration From the EU’s Proposed Regulation to Combat CSAM?

The EU's Chat Control Law, however, raises significant concerns regarding privacy and freedom of expression.


Countries around the world, including India, face increasing pressure to address and control online child sexual abuse. The scope of the problem is broad, involving various forms of exploitation and abuse, including the production and distribution of child sexual abuse material (CSAM), online grooming, and live-streamed abuse. In 2022, the National Center for Missing & Exploited Children (NCMEC) received more than 31 million reports of suspected CSAM.

During the same year, the Internet Watch Foundation (IWF) identified and addressed nearly 40,000 webpages containing CSAM. Research published in the Journal of Interpersonal Violence found that approximately 30 percent of children who were groomed online also reported experiencing physical sexual abuse, highlighting a troubling link between online grooming and contact abuse. Europol's 2021 report documented a 500 percent increase in the detection of live-streamed abuse over the previous five years, indicating a sharp rise in this form of exploitation.

In India, too, CSAM on social media has increased by 250 to 300 percent, according to the National Human Rights Commission (NHRC). There were 204,056 cases of CSAM on social media reported in 2022, 163,633 in 2021, and 17,390 in 2020.

India does not have a dedicated law to control CSAM. The European Commission, however, has proposed a regulation that establishes guidelines for preventing and combating online child sexual abuse. Specifically, it requires service providers to identify and detect known child sexual abuse material, and it advocates the establishment of a European Centre dedicated to preventing and combating child sexual abuse. The proposal mandates that messaging apps scan all images and links to detect and report child abuse material, and to identify grooming in conversations between potential offenders and minors.

This means that when users upload pictures and links through these apps, the content will be scanned, and users will be notified of this practice in the terms and conditions. Users who do not consent to the scanning will be prevented from sending pictures and links. Even highly secure apps that use end-to-end encryption, such as WhatsApp, Signal, and Messenger, would need to implement these measures. The draft proposal, however, excludes "accounts used by the State for national security purposes."

The most contested aspect of the proposal, however, is the obligation on tech companies to deploy client-side scanning (CSS) technology to scan users’ messages, including end-to-end encrypted communications on platforms such as Meta’s WhatsApp, when a risk is identified.

With the Information Technology Act, 2000 and the recent Digital Personal Data Protection Act, 2023, India has been focusing increasingly on digital regulation. India’s current laws, however, do not fully address the nuances of mandatory content scanning and its privacy trade-offs, and harmonising them with a regulation similar to the EU’s could be complex. India could nonetheless consider adopting a similar law.

Let us first examine what the law provides and the issues it raises.

Proposed Chat Control Law

The obligations under the proposed regulation apply to four types of service providers.

First, hosting service providers that store information on behalf of users, often for the purpose of making that information accessible to third parties, including social media platforms such as Twitter or Instagram. Second, interpersonal communications service providers that facilitate the direct exchange of information between designated individuals, comprising email services and instant messaging applications such as WhatsApp. Third, software application stores, which provide platforms for downloading software applications. And finally, internet access service providers that offer access to the internet.

Under this law, hosting service providers are obliged to remove specific depictions of sexualised violence, or to block access to such depictions where removal is unfeasible because the content is distributed by non-cooperative hosting services located in non-cooperative third countries. This obligation to actively seek out potentially incriminating content may be imposed through a detection order.

A detection order may pertain to content that has already been identified and catalogued as depicting sexualised violence, to new depictions of sexualised violence, or to content involving the solicitation of minors for sexual purposes. The order also applies to interpersonal communications, meaning that the authorities can oblige providers of communications services such as WhatsApp or Signal to monitor their users’ private communications.

Further, the proposed Chat Control legislation mandates that digital messaging services implement a content moderation system for uploaded content. All content, including photos, videos, and links, must be scanned prior to transmission to the recipient. Such scans involve comparing the content against a government database of known CSAM, with AI-powered algorithms identifying potential matches. Content found suspicious is flagged for further examination by human moderators. During this review, the message is withheld and remains undelivered to the intended recipient until it has been determined to be safe.
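To make the mechanism concrete, here is a minimal sketch in Python of what such client-side scanning could look like. Everything in it is an assumption for illustration: the fingerprint function, the hash database, and the review flow are hypothetical stand-ins, and real deployments use proprietary perceptual-hashing systems (such as Microsoft's PhotoDNA) with tuned matching thresholds rather than exact cryptographic hashes.

```python
import hashlib
from dataclasses import dataclass

# Hypothetical database of fingerprints of known CSAM, maintained by an
# authority such as the proposed EU Centre. Real systems use perceptual
# hashes so that re-encoded or slightly altered copies still match.
KNOWN_ABUSE_FINGERPRINTS: set[str] = set()

@dataclass
class ScanResult:
    delivered: bool
    reason: str

def fingerprint(attachment: bytes) -> str:
    """Illustrative stand-in for a perceptual hash; SHA-256 only matches
    byte-identical files, which no real deployment would rely on."""
    return hashlib.sha256(attachment).hexdigest()

def client_side_scan(attachment: bytes) -> ScanResult:
    """Sketch of the scan the regulation contemplates, run on the user's
    device *before* the message is end-to-end encrypted and sent."""
    if fingerprint(attachment) in KNOWN_ABUSE_FINGERPRINTS:
        # Match against known material: withhold the message and queue
        # it for human review, as the proposal describes.
        return ScanResult(delivered=False, reason="flagged for human review")
    # No match: the message proceeds to encryption and delivery.
    return ScanResult(delivered=True, reason="no match")
```

Even this simplified sketch makes the trade-off visible: the scan necessarily runs on the plaintext content, on the user’s own device, before end-to-end encryption takes effect.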


Issues in the Law

The central concern with the regulation is the infringement of the fundamental right to expression through indiscriminate mass surveillance. This concern arises irrespective of whether service providers monitor the content of private communications through a backdoor in encryption technology or by scanning the content on the user’s device prior to encryption. Interpersonal communication services such as WhatsApp provide end-to-end encryption, which means that the communication between the originator and the addressee of a message is secure from access by any third party.

The proposed regulation requires content to be scrutinised even before it is encrypted or sent. Such orders infringe upon the confidentiality of communications, which is safeguarded by the fundamental right to privacy. The integrity of communication is thus severely compromised, compelling users to restrain themselves while exercising their freedom of communication.

The regulation further stipulates that service providers implement age verification. This means the obligation applies to all email and messaging services that enable communication between adults and minors, a clause that severely restricts minors’ fundamental right to communication.

The regulation’s “user consent” policy, which seeks the user’s prior consent for scanning a message before it is actually sent, also appears to be a farce: a user who refuses consent is blocked from sending or receiving photos, videos, and links on the platform. Users are thus left with a Hobson’s choice. Another shortcoming of the proposed legislation is that the same technology used to track down CSAM can equally be used to censor content relating to political dissent or personal views critical of the government.

Conclusion

In conclusion, the rising tide of online CSAM presents an urgent challenge that demands innovative and effective solutions. Although India does not have a dedicated law to prevent CSAM, it does have a law, the POCSO Act, that prohibits watching and storing child pornography (as clarified in Just Rights for Children Alliance & Another v S. Harish & Others (2024)).

In fact, the Supreme Court asked Parliament to replace the term “child pornography” with “child sexual exploitative and abuse material” in the POCSO Act, and directed all courts and judicial authorities to use this term. The European Union’s proposed Chat Control Law offers a potential model for addressing these issues by mandating content scanning and reporting mechanisms across various digital platforms. However, this approach raises significant concerns regarding privacy, freedom of expression, and the potential for misuse of surveillance technologies.

(Ravi Singh Chhikara is a practising advocate at the Delhi High Court. Vaishali Chauhan is a practising advocate at the Hon’ble Supreme Court and Delhi High Court. This is an opinion piece and the views expressed above are the authors’ own. The Quint neither endorses nor is responsible for the same.)
