What Are Non-Consensual Intimate Images? Can Google, Microsoft Identify & Remove Them? Case Explained

Are search engines capable of identifying and removing non-consensual intimate images (NCII)? Apparently not. Tech giants Microsoft and Google have told the Delhi High Court, in an appeal they filed last year, that Artificial Intelligence (AI) tools are not yet accurate enough and are still “in the process of being developed”.

The high court asked the tech giants to file a review petition against the single-judge order of April 26, 2023, which had directed them to “employ the already existing mechanism with relevant hash-matching technology”.

The division bench of acting Chief Justice Manmohan and Justice Manmeet Pritam Singh Arora said, “This court is of the view that the appellants should file a review petition and bring these facts to the notice of the learned single judge. In the event the appellants are aggrieved by the order passed by learned single judge in the review petitions, the appellants will be at liberty to seek revival of the present petitions.” It further clarified that if Google and Microsoft file a review petition before the single-judge bench, the petition will not be dismissed “on grounds of delay”.

What Did Microsoft and Google Tell the Court?

Senior advocate Arvind Nigam, appearing for Google, said search engines are a “different cup of tea”. Nigam clarified that he was appearing for the search engine Google and not for YouTube, which has accepted the single-judge bench’s order.

“If it is removed by the hosting platform it will not show up in the search…I’m the bridge to the place where the content is; the content is not with me…The more permanent method according to the statute is a direction to the hosting platforms to remove such content. The learned single judge has unfortunately saddled search engines with the requirement of significant social media intermediaries,” Nigam said, as quoted by The Indian Express.

Nigam explained that an automated algorithm may not recognise as identical two images that appear identical to the human eye. He said in cases of “Child Sexual Abuse Material (CSAM), the technology exists because the computer recognises the child”. But for NCII, AI is not yet capable of determining whether consent was given.

Google is “voluntarily” using its systems to detect and remove copies of such content from image search, but the photos can easily be modified to evade detection by current hash-matching technology, he added.
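Counsel’s point about evasion is easiest to see with a toy version of the technology in question. The following is a minimal sketch of perceptual hash matching in Python, assuming only the Pillow imaging library; the 8x8 average hash is a deliberately simplified stand-in for production systems such as Microsoft’s PhotoDNA, whose internals are not public, and the file names and match threshold are illustrative assumptions.

```python
# Minimal perceptual-hash sketch (average hash) using Pillow.
# An illustration only, not any platform's actual detection system.
from PIL import Image


def average_hash(path: str, size: int = 8) -> int:
    """Shrink to a size x size grayscale image, then set one bit per
    pixel depending on whether it is brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for px in pixels:
        bits = (bits << 1) | (1 if px > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Count the bits on which two hashes differ."""
    return bin(a ^ b).count("1")


if __name__ == "__main__":
    # Hypothetical files: a reported image and a candidate found in search.
    known = average_hash("reported_image.jpg")
    candidate = average_hash("candidate_image.jpg")
    # Illustrative threshold: a small distance suggests a near-duplicate.
    if hamming_distance(known, candidate) <= 5:
        print("Likely near-duplicate: flag for review")
```

A heavy crop, filter or re-encode can change enough pixels to push the distance past any fixed threshold, which is precisely the evasion problem counsel described.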

Google had previously told the court that the legality of non-consensual intimate images depends on the context in which they were captured or shared, while Child Sexual Abuse Material is universally illegal, The Hindu reported. Because of this, non-consensual intimate content may appear on the internet despite efforts to prevent it, Google clarified.

Advocate Jayant Mehta, appearing for Microsoft, said, as quoted by Scroll.in, “To say that you [search engines] are required to do it today otherwise your immunity is gone [under the Information Technology rules], that cannot be. It is work in progress. I am endeavouring to reach it. But to say that I must do it today is not fair.”

What Was the April 2023 Court Order?

The April 2023 order was passed on the plea of a woman seeking to block certain sites carrying her intimate images and to register an FIR against a man who had allegedly become acquainted with her through social media.

The April 2023 judgment stated, “They cannot be allowed to avoid their statutory obligations by stating that they do not have the necessary technology, which is patently false as has been exhibited during the course of hearing.”

The single judge had stated, “Non-Consensual Intimate Images abuse, which includes revenge porn, violates the right to privacy and causes psychological damage to the victim”.

The single judge had asserted that search engines were obligated to observe due diligence while discharging their duties under Rule 3 of the IT Rules, including making reasonable efforts to prevent the hosting, displaying, uploading or sharing of any information that is invasive of another person’s privacy or violates any law for the time being in force; otherwise, they would lose the protection from liability accorded to intermediaries under Section 79 of the IT Act.

What are Non-Consensual Intimate Images?

Non-consensual intimate images are intimate pictures of a person uploaded to the internet by a third party without the consent of the person depicted.

Google and Microsoft can stop URLs from appearing in their search results (Google Search and Bing) by de-indexing them, but this solution fails when no specific URL is available.
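To make that limitation concrete, here is a toy sketch, assuming a simple blocklist design rather than any vendor’s actual pipeline, of how URL-based de-indexing behaves. All URLs are hypothetical.

```python
# Toy de-indexing sketch: search results are filtered against a
# blocklist of reported URLs. Not any search engine's real pipeline.
DEINDEXED: set[str] = {"https://example.com/reported-page"}  # hypothetical


def filter_results(results: list[str]) -> list[str]:
    """Drop any result whose URL has been de-indexed."""
    return [url for url in results if url not in DEINDEXED]


print(filter_results([
    "https://example.com/reported-page",    # removed: URL was reported
    "https://example.net/same-image-copy",  # survives: new, unreported URL
]))
```

The failure mode follows directly: a re-upload of the same image at a new URL is not on the blocklist, so it keeps surfacing in results until someone reports that URL too.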

If an individual chooses to share their own intimate images with someone, for example, through sexting, this does not give the recipient the right to distribute the content. The non-consensual sharing of consensual images is a peer-related form of sexual exploitation and is in fact the most frequently reported type of NCII abuse.

Another motive for obtaining NCII is sextortion, in which the offender demands sexual favours, money or other benefits under the threat of sharing intimate images or videos.

Laws Against NCII

Section 354C of the Indian Penal Code (IPC) tackles voyeurism, penalising the act of watching or capturing the image of a woman engaged in a private act without her consent. The section addresses the violation of privacy in the digital age, where such acts can include placing cameras in private spaces or circulating intimate images without consent. The punishment for voyeurism can extend up to three years of imprisonment and a fine; however, the offence is bailable.

A pertinent example would be the case of Shivam Sharma v. State of MP and Anr., wherein the accused shared intimate pictures of the victim with her father without her consent. The court observed that these facts made out a prima facie offence under the Section.

As per the National Crime Records Bureau, there were 1,513 cases of voyeurism in 2021. The state with the most cases was Maharashtra (210), followed by Andhra Pradesh (159) and Odisha (148). The city with the most cases was Mumbai (73), followed by Delhi (22), Hyderabad (18) and Chennai & Kolkata (17 each).

The gender-neutral Section 66E of the Information Technology Act, 2000 penalises intentionally or knowingly capturing, publishing or transmitting the image of a private area of any person without his or her consent, under circumstances violating that person’s privacy.

‘Under circumstances violating privacy’ means circumstances in which a person can have a reasonable expectation that (i) he or she could disrobe in privacy, without being concerned that an image of his or her private area was being captured; or (ii) any part of his or her private area would not be visible to the public, regardless of whether that person is in a public or private place.