Apple will scan for images of child sexual abuse; child protection groups praise the move, but privacy advocates are worried

New Delhi: Apple Inc. has revealed a new tool that will scan US iOS phones for images of child sexual abuse. Child protection groups have welcomed the new technology, but many security researchers warn that it could be misused.

Apple also plans to scan its users’ encrypted messages for sexually explicit content as an additional child protection measure, further alarming privacy advocates.


The tool, called “NeuralMatch,” detects known images of child sexual abuse by scanning images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human; if it is confirmed to be child sexual abuse material, the user’s account will be disabled and the National Center for Missing and Exploited Children will be notified.
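The pipeline described above can be illustrated with a minimal sketch. This is not Apple’s actual algorithm: NeuralMatch uses a perceptual hash that tolerates resizing and re-encoding, whereas the ordinary cryptographic hash below only stands in for the general idea of comparing an image’s fingerprint against a database of known images before upload. All names here are hypothetical.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint for an image (stand-in: SHA-256 of the bytes)."""
    return hashlib.sha256(image_bytes).hexdigest()

def scan_before_upload(image_bytes: bytes, known_fingerprints: set) -> str:
    """Flag the image for human review if its fingerprint matches a known one.

    In the reported design, a human reviewer confirms the match before the
    account is disabled and the authorities are notified; an automated match
    alone triggers no report.
    """
    if fingerprint(image_bytes) in known_fingerprints:
        return "flag_for_human_review"
    return "upload_allowed"

# Usage: a toy database of two known fingerprints.
known = {fingerprint(b"known-image-1"), fingerprint(b"known-image-2")}
print(scan_before_upload(b"known-image-1", known))  # matches the database
print(scan_before_upload(b"holiday-photo", known))  # no match, upload proceeds
```

The key design point critics raise is precisely this matching step: an attacker who can craft an image whose fingerprint collides with a database entry can cause an innocent account to be flagged.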

The system raises privacy concerns, said Matthew Green, a top cryptography researcher at Johns Hopkins University. Innocent people could be implicated if others send them images designed to trigger the system, fooling Apple’s algorithm and alerting law enforcement. Researchers have been able to fool such systems easily, he said.

Various tech companies, including Microsoft, Google, and Facebook, have been sharing digital fingerprints of known child sexual abuse images for years. Apple has used those fingerprints to scan user files stored in its iCloud service, which is not as securely encrypted as its on-device data, for child pornography. At the same time, Apple has been under pressure from authorities for access to information to help investigate crimes such as terrorism and child sexual abuse, as it was one of the major companies to adopt “end-to-end” encryption.

Researchers are also concerned about government surveillance, especially of dissidents or protesters. The Washington-based non-profit Center for Democracy and Technology called on Apple to drop the changes, which it said effectively destroy the company’s “end-to-end encryption” guarantees. Scanning messages for sexually explicit material on a phone or computer effectively breaks their security, the AP reported the group as saying.

However, child rights activists and protection groups have praised the measures.

“Apple’s expanded protections for children are a game-changer,” John Clark, president and CEO of the National Center for Missing and Exploited Children, said in a statement, according to an Associated Press report. “With so many people using Apple products, these new safety measures have life-saving potential for kids.”

Similarly, Julia Cordua, CEO of Thorn, a nonprofit founded by actors Demi Moore and Ashton Kutcher that uses technology to help protect children from sexual abuse, said that Apple’s technology “balances the need for privacy with digital safety for children.”
