Apple will scan all US iPhones for images of child sex abuse

Apple’s unveiling of plans to scan US iPhones for images of child sexual abuse drew applause from child protection groups, but raised concerns among some security researchers that the system could be misused by governments seeking to surveil their citizens. A tool designed to detect known images of child sexual abuse, called “NeuralMatch,” will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children will be notified.

Separately, Apple plans to scan users’ encrypted messages for sexually explicit content as a child safety measure, which also worries privacy advocates. The detection system will only flag images that are already in the center’s database of known child pornography. Parents snapping innocent pictures of a baby in the bath probably don’t have to worry. But researchers say the matching tool, which does not “see” such images but only the mathematical “fingerprints” that represent them, could be put to more nefarious purposes. Matthew Green, a top cryptography researcher at Johns Hopkins University, warned that the system could be used to frame innocent people by sending them innocuous-looking images designed to trigger matches for child pornography. That could fool Apple’s algorithm and alert law enforcement. “Researchers have been able to do this very easily,” he said of the ability to fool such systems.
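The fingerprint matching described above can be illustrated with a short, purely hypothetical sketch. It substitutes an ordinary cryptographic hash for the perceptual hashing that systems like NeuralMatch and PhotoDNA actually use, and the database and function names are invented for illustration rather than taken from Apple’s implementation.

```python
import hashlib

# Hypothetical database of fingerprints of known abuse images; in a real
# deployment this list is supplied by a child-safety organization, and the
# device only compares fingerprints, never the images themselves.
KNOWN_FINGERPRINTS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex fingerprint of the raw image bytes.

    Perceptual hashes used in practice are designed to survive resizing and
    re-encoding; SHA-256 stands in here only to keep the sketch short.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def flag_for_human_review(image_bytes: bytes) -> bool:
    """Return True if the image's fingerprint matches a known entry, in which
    case the workflow described above hands the image to a human reviewer."""
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS
```

Because a perceptual hash intentionally maps visually similar images to the same fingerprint, an attacker who can craft an innocuous-looking image whose fingerprint collides with a database entry could trigger a false match, which is the scenario Green warns about.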

Other abuses could include government surveillance of dissidents or protesters. “What happens when the Chinese government says, ‘Here’s a list of files we want you to scan,’” Green asked. “Does Apple say no? I hope they say no, but their technology won’t say no.” Tech companies including Microsoft, Google, Facebook and others have been sharing digital fingerprints of known child sexual abuse images for years. Apple has used those to scan user files stored in its iCloud service, which is not as securely encrypted as its on-device data, for child pornography.

Apple has been under government pressure for years to allow increased surveillance of encrypted data. Coming up with the new security measures required Apple to strike a delicate balance between cracking down on the exploitation of children and maintaining its high-profile commitment to protecting the privacy of its users. But a frustrated Electronic Frontier Foundation, the online civil liberties pioneer, called Apple’s compromise on privacy protections “a shocking about-face for users who have relied on the company’s leadership in privacy and security.”

Meanwhile, the computer scientist who invented PhotoDNA more than a decade ago, a technology used by law enforcement to identify child pornography online, acknowledged the potential for abuse of Apple’s system, but said it was far outweighed by the imperative of combating child sexual abuse.

“Is that possible? Of course. But is it something I’m worried about? No,” said Hany Farid, a researcher at the University of California at Berkeley, who argues that plenty of other programs designed to secure devices from various threats have not seen “this type of mission creep.” For example, WhatsApp offers users end-to-end encryption to protect their privacy, but also employs a system to detect malware and warn users not to click on harmful links. Apple was one of the first major companies to adopt “end-to-end” encryption, in which messages are scrambled so that only their senders and recipients can read them. However, law enforcement has long pressed the company for access to that information to investigate crimes such as terrorism or child sexual abuse.

Apple said the latest changes will roll out this year as part of an update to the operating software for the iPhone, Mac and Apple Watch.

“Apple’s expanded protections for children are a game changer,” John Clark, president and CEO of the National Center for Missing and Exploited Children, said in a statement. “With so many people using Apple products, these new safety measures have lifesaving potential for children.” Thorn CEO Julia Cordua said Apple’s technology balances “the need for privacy with digital security for children.” Thorn, a nonprofit founded by Demi Moore and Ashton Kutcher, uses technology to help protect children from sexual abuse by identifying victims and working with tech platforms.

But in a blistering critique, the Washington-based nonprofit Center for Democracy and Technology called on Apple to drop the changes, which it said would effectively destroy the company’s “end-to-end encryption” guarantees. Scanning messages for sexually explicit content on phones or computers effectively breaks that security, it said.

The organization also questioned Apple’s technology for distinguishing between dangerous content and something as benign as art or a meme. Such technologies are notoriously error-prone, CDT said in an emailed statement. Apple denies that the changes amount to a backdoor that degrades its encryption. It says they are carefully considered innovations that do not disturb user privacy but rather strongly protect it.

Separately, Apple said its messaging app will use on-device machine learning to identify and blur sexually explicit photos on children’s phones, and can also alert the parents of younger children via text message. It also said its software will “intervene” when users try to search for topics related to child sexual abuse.

To receive warnings about sexually explicit images on their children’s devices, parents must enroll their child’s phone. Children over the age of 13 can unenroll, which means parents of teenagers will not receive notifications.
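As a minimal sketch of how the enrollment and age rules above might translate into logic, the hypothetical code below assumes a made-up ChildDevice record and an externally supplied verdict that an image is explicit; Apple has not published its actual on-device implementation.

```python
from dataclasses import dataclass

@dataclass
class ChildDevice:
    age: int
    enrolled_by_parent: bool  # parent opted this phone in to the warnings
    unenrolled: bool          # children over 13 may cancel the enrollment

def should_notify_parent(device: ChildDevice, image_is_explicit: bool) -> bool:
    """Apply the rules described in the article: the photo is blurred on the
    child's device regardless, but a text alert to parents is sent only for
    enrolled devices whose owner has not, being over 13, opted back out."""
    if not image_is_explicit or not device.enrolled_by_parent:
        return False
    if device.age > 13 and device.unenrolled:
        return False
    return True
```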

Apple said neither feature would compromise the security of private communications nor notify police.
