Apple will investigate iCloud photo uploads for child abuse images

Apple Inc. said Thursday it will implement a system that checks photos on iPhones in the United States for matches against known images of child sexual abuse before they are uploaded to its iCloud storage service.

If enough matching uploads are detected, Apple will initiate a human review and report the user to law enforcement officials, the company said. Apple said the system is designed so that the chance of incorrectly flagging a given account is about one in one trillion per year.
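The thresholding described above can be sketched in a few lines. This is purely illustrative: Apple has not published its review-trigger logic, and the threshold value and function name here are hypothetical.

```python
# Hypothetical sketch of threshold-gated escalation; Apple's real
# protocol and threshold are not public.
MATCH_THRESHOLD = 30  # hypothetical number of matches before review


def should_escalate(match_count: int, threshold: int = MATCH_THRESHOLD) -> bool:
    """Trigger a human review only once enough uploads have matched."""
    return match_count >= threshold


print(should_escalate(5))   # False: a few matches alone do not escalate
print(should_escalate(30))  # True: threshold reached, review begins
```

Gating on a threshold rather than a single match is one way a system like this can keep the account-level false-positive rate far below the per-image error rate.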

With the new system, Apple is trying to address two imperatives: requests from law enforcement to help prevent child sexual abuse, and the privacy and security practices the company has made a core tenet of its brand. Other companies such as Facebook Inc. use similar technology to detect and report child sexual abuse.

Here’s how Apple’s system works. Law enforcement officials maintain a database of known child sexual abuse images and translate those images into “hashes” – numerical codes that positively identify an image but cannot be used to reconstruct it.
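The one-way property of a hash can be illustrated with an ordinary cryptographic hash from Python's standard library. (The actual database uses perceptual hashes tailored to images, not SHA-256; this is only a sketch of the identify-but-don't-reconstruct idea.)

```python
import hashlib


def image_hash(image_bytes: bytes) -> str:
    # A cryptographic hash positively identifies the exact input bytes,
    # but it is one-way: the image cannot be rebuilt from the code.
    return hashlib.sha256(image_bytes).hexdigest()


a = image_hash(b"example image data")
b = image_hash(b"example image data")
assert a == b          # the same image always produces the same code
print(len(a))          # 64 hex characters, regardless of image size
```

Because the code is a fixed-size digest, comparing two hashes says whether the images match without ever exposing the image content itself.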

Apple has implemented its own version of that database using a technology called “NeuralHash,” which is designed to also catch edited images that are similar to the originals. That database will be stored on the iPhone.
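NeuralHash itself is proprietary, but the idea of a hash that tolerates edits can be sketched with a much simpler perceptual hash: one bit per pixel, set when the pixel is brighter than the image's mean. Small edits flip only a few bits, so an edited copy stays “near” the original, unlike with a cryptographic hash, where any change scrambles the whole code.

```python
def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, 1 if brighter than the mean.

    This is NOT NeuralHash; it only illustrates why perceptual hashes
    survive minor edits while cryptographic hashes do not.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]


def hamming(h1, h2):
    """Count differing bits; small distances indicate similar images."""
    return sum(b1 != b2 for b1, b2 in zip(h1, h2))


original = [[10, 200], [30, 220]]   # tiny grayscale image
edited = [[12, 198], [30, 220]]     # slightly retouched copy
print(hamming(average_hash(original), average_hash(edited)))  # 0: still a match
```

A matching system then treats any pair of hashes within a small Hamming distance as the same underlying image.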

When a user uploads an image to Apple’s iCloud storage service, the iPhone will create a hash of the image and compare it against the database. Apple said photos stored only on the phone, and never uploaded to iCloud, are not checked.
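At its simplest, the on-device check is a set-membership test run at upload time. This sketch is hypothetical: Apple's actual protocol uses blinded hashes and private set intersection so that neither the phone nor the server learns non-matching results, which this toy version does not model.

```python
# Hypothetical on-device database of hashes of known abuse images.
KNOWN_HASHES = {"hash_of_known_image_1", "hash_of_known_image_2"}


def check_before_upload(photo_hash: str) -> bool:
    """Return True if the photo's hash matches the on-device database.

    Only photos queued for iCloud upload pass through this check;
    photos kept solely on the phone are never hashed or compared.
    """
    return photo_hash in KNOWN_HASHES


print(check_before_upload("hash_of_known_image_1"))  # True: flagged
print(check_before_upload("hash_of_vacation_photo"))  # False: ignored
```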

The Financial Times had previously reported some aspects of the program.

One key aspect of the system is that Apple checks photos stored on the phone before they’re uploaded, rather than checking the photos after they arrive on the company’s servers.

On Twitter, some privacy and security experts expressed concern that the system could eventually be expanded to scan phones for prohibited content or political speech.

“Regardless of what Apple’s long-term plans are, they’ve sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content,” Matthew Green, a security researcher at Johns Hopkins University, wrote in response to the earlier reports. “Whether they turn out to be right or wrong on that point hardly matters. This will break the dam – governments will demand it from everyone.”

