New iPhone child safety feature is ‘dangerous’: 7 reasons experts are telling Apple

Apple last week announced a new child safety feature that it intends to roll out with the upcoming iOS 15 update. The feature will allow Apple to identify child porn images stored on an iPhone and on other devices connected to the user’s iCloud account, such as iPad, Apple Watch and Mac. The new child safety feature will initially be rolled out in the US. Apple has said it may be rolled out in other countries in the future, depending on local regulations and whether governments want such a tracking system.
If the system is implemented, Apple will be able to manually intervene and block a user’s Apple ID if child porn is detected. In the US, Apple can also alert the police and the National Center for Missing and Exploited Children (NCMEC) about potential abuse. If child porn material is found on a child’s iPhone, their parents will be notified.
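For readers curious what scanning “on the user’s device” means in practice, here is a deliberately simplified sketch in Swift of on-device matching against a database of known images. This is an illustration under stated assumptions, not Apple’s actual method: the real system reportedly uses a perceptual “NeuralHash”, private set intersection and a match threshold, whereas the plain SHA-256 lookup and all names below are hypothetical.

```swift
import Foundation
import CryptoKit

// Deliberately simplified, hypothetical sketch of "client-side scanning".
// Apple's real design reportedly uses a perceptual NeuralHash, private set
// intersection and a reporting threshold; the plain SHA-256 lookup here only
// illustrates the on-device matching step that critics object to.

/// Hypothetical on-device database of digests of known abusive images.
/// In a real deployment such a blocklist would ship inside the OS image.
let knownDigests: Set<String> = []

/// Returns true if the photo's digest appears in the on-device blocklist.
func matchesKnownImage(_ photoData: Data) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownDigests.contains(hex)
}

/// Scans a photo library and returns the indices of flagged photos,
/// which a provider could then review before reporting.
func scanLibrary(_ library: [Data]) -> [Int] {
    library.indices.filter { matchesKnownImage(library[$0]) }
}
```

The experts’ core objection follows directly from this shape: the matching runs on the user’s own device against a database the user cannot inspect, so nothing technical limits what that database contains.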
While Apple has good intentions, there is a big concern about what will happen if people are falsely accused, or if local governments pressure Apple into using the same technology to spy on potential political opponents, protesters and whistleblowers.
Security and privacy experts, cryptographers, researchers, professors, legal experts and other individuals have signed an open letter urging Apple not to implement the feature, as it would undermine user privacy and end-to-end encryption. The letter states that the biggest concern is that because “both checks will be performed on the user’s device, they have the potential to bypass any end-to-end encryption that would otherwise protect the user’s privacy.”
Here are some reasons why experts don’t want Apple to implement the new child safety feature
Apple is opening the door to widespread abuse: Electronic Frontier Foundation
“It is impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses. […] That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change,” it said.
Apple’s changes actually pose new risks to children and all users: Center for Democracy & Technology
“Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship, which will be vulnerable to abuse and scope-creep not only in the US, but around the world,” says Greg Nojeim, Co-Director of CDT’s Security & Surveillance Project. “Apple should abandon these changes and restore its users’ faith in the security and integrity of their data on Apple devices and services.”
What if child porn tracking expands to state-specific censorship: Open Privacy Research Society
“If Apple succeeds in introducing this, how long do you think it will be before the same is expected of other providers? Before walled gardens ban apps that don’t do it? Before it is enshrined in law? How long do you think it will be before the database expands to include ‘terrorist’ content? ‘Harmful-but-legal’ content? State-specific censorship?” warned Open Privacy Research Society executive director Sarah Jamie Lewis.
‘Apple is already bowing to local laws’
“Apple sells iPhones without FaceTime in Saudi Arabia, because local regulation prohibits encrypted phone calls. This is just one example of the many where Apple has bent to local pressure. What happens when local regulation in Saudi Arabia mandates that messages be scanned not for child sexual abuse, but for homosexuality or for offences against the monarchy?” said Dr. Nadim Kobeissi, a researcher on security and privacy issues.
‘Apple’s new child safety feature will lead to abuse around the world’
In a statement, the Electronic Frontier Foundation said, “Take the example of India, where recently passed rules include dangerous requirements for platforms to identify the origins of messages and pre-screen content. New laws in Ethiopia requiring content takedowns of ‘misinformation’ in 24 hours may apply to messaging services. And many other countries, often those with authoritarian governments, have passed similar laws. Apple’s changes would enable such screening, takedown, and reporting in its end-to-end messaging. The abuse cases are easy to imagine: governments that outlaw homosexuality might require the classifier to be trained to restrict apparent LGBTQ+ content, or an authoritarian regime might demand the classifier be able to spot popular satirical images or protest flyers.”
Apple is rolling out mass surveillance around the world: Snowden
“No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this. Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow. They turned a trillion dollars of devices into iNarcs, without asking,” tweeted Edward Snowden.
What if spyware companies find a way to take advantage of this software: WhatsApp CEO Will Cathcart
WhatsApp CEO Will Cathcart said, “Apple has long needed to do more to fight CSAM, but the approach they are taking introduces something very concerning into the world. Instead of focusing on making it easy for people to report content that’s shared with them, Apple has built software that can scan all the private photos on your phone, even photos you haven’t shared with anyone. That’s not privacy.
Can this scanning software running on your phone be error proof? Researchers have not been allowed to find out. Why not? How will we know how often mistakes are violating people’s privacy? What will happen when spyware companies find a way to exploit this software? Recent reporting has shown the cost of vulnerabilities in iOS software. What happens if someone figures out how to exploit this new system?”
