Policy groups ask Apple to drop plans to scan iMessage and photos for child abuse images

More than 90 policy and rights groups around the world published an open letter Thursday urging Apple to drop plans to scan children’s messages for nudity and adults’ phones for images of child sexual abuse. “Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material, we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children,” the groups wrote in the letter, first reported by Reuters.

The campaign, the largest to date over an encryption issue at a single company, was organized by the US-based non-profit Center for Democracy and Technology (CDT). Some foreign signatories are particularly worried about the impact of the changes in nations with different legal systems, some of which are already host to heated fights over encryption and privacy.

“It’s so disappointing and upsetting that Apple is doing this, because they have been a staunch ally in defending encryption in the past,” said Sharon Bradford Franklin, co-director of CDT’s Security and Surveillance Project. An Apple spokesperson said the company had addressed privacy and security concerns in a document released Friday outlining why the complex architecture of the scanning software should resist attempts to subvert it.

Signatories included several groups in Brazil, where courts have repeatedly blocked Facebook’s WhatsApp for failing to decrypt messages in criminal investigations, and where the Senate has passed a bill that would require traceability of messages, which would entail somehow marking their content. A similar law was passed in India this year. “Our main concern is the consequence of this mechanism, how this could be extended to other situations and other companies,” said Flavio Wagner, president of the independent Brazil chapter of the Internet Society, which signed the letter. “This represents a serious weakening of encryption.”

Other signatories were based in India, Mexico, Germany, Argentina, Ghana and Tanzania. Undeterred by the outcry that followed its announcement two weeks ago, Apple has offered a series of explanations and documents to argue that the risk of false detections is low.

Apple has said it will refuse demands in any jurisdiction to expand the image-detection system beyond pictures of children flagged by clearinghouses, though it has not said it would pull out of a market rather than obey a court order. Although most of the objections so far have concerned the on-device scanning, the coalition’s letter also faults a change to iMessage in family accounts, which would try to identify and blur nudity in children’s messages, letting them view the images only after parents are notified.

The signatories said the move could endanger children in intolerant homes or those seeking educational material. More broadly, they said the change would break end-to-end encryption for iMessage, which Apple has staunchly defended in other contexts. “Once this backdoor feature is built in, governments could compel Apple to extend notification to other accounts, and to detect images that are objectionable for reasons other than being sexually explicit,” the letter said.

Other groups that have signed include the American Civil Liberties Union, the Electronic Frontier Foundation, Access Now, Privacy International, and the Tor Project.
