We don’t have tech to automatically detect, remove non-consensual intimate images without specific URLs: Microsoft, Google
The Hindu
Microsoft and Google on Thursday informed the Delhi High Court that they do not currently have the technology to automatically detect and remove non-consensual intimate images (NCII) without specific URLs.
“AI [Artificial Intelligence] has not yet reached the level to figure out consent,” the tech giants said, while challenging an order passed by a Single Judge Bench of the High Court in April last year.
The Single Judge had then said that since social media intermediaries already possessed tools for the prevention of child pornography, these could be deployed to reduce NCII abuse.
Microsoft has developed a software tool, PhotoDNA, which is currently used to identify Child Sexual Abuse Material (CSAM) and has been adopted by platforms such as Google and Twitter. YouTube has also developed CSAI (Child Sexual Abuse Imagery) Match, which NGOs and other companies use to match content against a database of known abusive imagery.
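Tools of this kind work by comparing a fingerprint (hash) of an uploaded image against a database of hashes of previously identified abusive content, which is why they can flag known CSAM but cannot judge whether a new, never-before-seen image was shared with consent. The sketch below illustrates the general hash-matching approach using the open-source imagehash library; it is not PhotoDNA's proprietary algorithm, and the database contents and threshold are illustrative assumptions.

```python
# Minimal sketch of hash-based matching against a database of known images,
# the general technique behind tools like PhotoDNA and CSAI Match.
# Uses the open-source `imagehash` perceptual hash, NOT Microsoft's
# proprietary PhotoDNA algorithm; hashes and threshold are illustrative.
from PIL import Image
import imagehash

# Hypothetical database of perceptual hashes of previously identified
# abusive images (in practice, curated by vetted organisations).
KNOWN_HASHES = {
    imagehash.hex_to_hash("d1c4b0a89e7f6e5d"),
}

MATCH_THRESHOLD = 5  # max Hamming distance counted as a match (illustrative)

def is_known_image(path: str) -> bool:
    """Return True if the image matches a known hash within the threshold."""
    candidate = imagehash.phash(Image.open(path))
    # Perceptual hashes tolerate small edits (resizing, re-encoding),
    # so compare by Hamming distance rather than exact equality.
    return any(candidate - known <= MATCH_THRESHOLD for known in KNOWN_HASHES)
```

Note that nothing in this pipeline encodes consent: the check succeeds only for content already in the database, which is the gap the companies' counsel pointed to before the court.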
The Single Judge had cautioned social media intermediaries that they risk losing their liability protection if they fail to adhere precisely to the time frame outlined in the Information Technology Rules (IT Rules) for removing NCII.
On Thursday, counsel for the tech giants told the court, “It’s possible that in a year’s time, maybe a year and a half, AI reaches that level, but to saddle me now and say I will lose my immunity, that’s the problem area.”
The counsel stated that while such technology already exists for CSAM, it does not for ‘non-consensual sexually explicit images’.