
An online fight where children need to be saved
The Hindu
India needs an appropriate strategy to fight the production, spread and sharing of online Child Sexual Abuse Material (CSAM)
Last month, the Central Bureau of Investigation (CBI) conducted searches across States and Union Territories as part of a pan-India operation, “Megh Chakra”. The operation, against the online circulation and sharing of Child Sexual Abuse Material (CSAM) using cloud-based storage, was reportedly based on inputs received from Interpol’s Singapore special unit, in turn based on information received from New Zealand. In November 2021, the CBI had launched a similar exercise code-named “Operation Carbon”, with many being booked under the IT Act, 2000.
In India, though viewing adult pornography in private is not an offence, seeking, browsing, downloading or exchanging child pornography is an offence punishable under the IT Act. However, Internet Service Providers (ISPs) are exempted from liability for any third-party data if they do not initiate the transmission. As public reporting of the circulation of online CSAM is very low and there is no system of automatic electronic monitoring, India’s enforcement agencies are largely dependent on foreign agencies for the requisite information.
The National Center for Missing & Exploited Children (NCMEC), a non-profit organisation in the United States, operates a programme called CyberTipline, for the public and electronic service providers (ESPs) to report instances of suspected child sexual exploitation. ISPs are mandated to report the identity and the location of individuals suspected of violating the law. NCMEC may also notify ISPs to block the transmission of online CSAM. In 2021, the CyberTipline received more than 29.3 million reports of suspected, U.S.-hosted CSAM, 99% of them from ESPs.
In the United Kingdom, the mission of the Internet Watch Foundation (IWF), a non-profit organisation established by the U.K. Internet industry to ensure a safe online environment for users with a particular focus on CSAM, includes disrupting the availability of CSAM and deleting such content hosted in the U.K. The IWF engages analysts to actively search for criminal content rather than rely solely on reports from external sources. Though the U.K. does not explicitly mandate the reporting of suspected CSAM, ISPs may be held responsible for third-party content if they host or cache such content on their servers. In 2021, the IWF assessed 3,61,062 reports, of which about 70% contained CSAM; seven in 10 of those contained “self-generated” CSAM.
INHOPE, a global network of 50 hotlines across 46 member countries, provides the public with a way to anonymously report CSAM. It provides a secure IT infrastructure, ICCAM (“I See Child Abuse Material”), hosted by Interpol, and facilitates the exchange of CSAM reports between hotlines and law enforcement agencies. ICCAM is a tool to facilitate image/video hashing and fingerprinting and to reduce the number of duplicate investigations.
In 2021, the number of exchanged content URLs stood at 9,28,278, of which 4,43,705 contained illegal content. About 72% of all illegal content URLs were removed from the Internet within three days of a notice and takedown order.
In India, the Supreme Court, in Shreya Singhal (2015), read down Section 79(3)(b) of the IT Act to mean that an ISP must remove or disable access to illegal content only upon receiving actual knowledge of a court order, or on being notified by the appropriate government. ISPs are otherwise exempt from liability for third-party information.
