
Apple’s plan to search for child abuse images ‘tears at the heart of privacy’


A technology like the one proposed by Apple to search iPhones for images of child sexual abuse would open the door to mass surveillance and be vulnerable to exploitation, world-leading security and crypto experts have said.

Client-side scanning (CSS) provides access to data on users’ devices, including stored data, “taking surveillance to a new level”, according to an analysis by academics at the Harvard Kennedy School, the Massachusetts Institute of Technology (MIT) and the University of Cambridge, among others.

They write that the technology, which runs software in the background on users’ devices, “tears at the heart of individual citizens’ privacy”, but is also fallible and could be both evaded by those it is intended to target and misused.

In Bugs in Our Pockets: The Risks of Client-Side Scanning, a 46-page analysis of CSS posted to the open-access website arXiv on Friday, the authors say: “In reality, CSS is mass interception, albeit automated and distributed… CSS makes law-abiding citizens more vulnerable, with their personal devices searchable on an industrial scale.

“Simply put, it is a dangerous technology. Even if it were initially deployed to scan for child sexual abuse material, content that is clearly illegal, there would be enormous pressure to expand its scope. We would then struggle to find any way to resist its expansion or to control abuse of the system.”

Apple’s plans, unveiled this year, involve a technique called “perceptual hashing” to match photos to known images of child abuse when users upload them to the cloud. If the company finds enough matches, it would manually review the images before flagging the user account to the police.
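As a rough illustration of the idea, and not a reconstruction of Apple’s actual NeuralHash algorithm, a simple perceptual hash such as an “average hash” reduces an image to a small grayscale grid and encodes which pixels sit above the mean brightness; the resulting fingerprint can then be compared against a curated list of known hashes by counting differing bits. The function names, threshold and hash values below are illustrative assumptions, not details of Apple’s system.

```python
# Minimal sketch of perceptual hashing (an "average hash"), assuming the
# Pillow imaging library. Illustrative only -- not Apple's NeuralHash:
# visually similar images yield identical or nearby fingerprints, which are
# compared against a curated list of known hashes.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Downscale to a tiny grayscale grid and bit-encode pixels above the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits  # a 64-bit fingerprint when size=8

def hamming_distance(a: int, b: int) -> int:
    """Count of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# Hypothetical matching step: flag an upload if its fingerprint lies within a
# small Hamming distance of any entry on the curated list of known images.
KNOWN_HASHES = {0x8F3A5C7E91B2D4F0}  # placeholder values, not real data
THRESHOLD = 5

def matches_known_image(path: str) -> bool:
    h = average_hash(path)
    return any(hamming_distance(h, known) <= THRESHOLD for known in KNOWN_HASHES)
```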

Apple paused the rollout after a backlash from privacy campaigners last month, but not before researchers managed to construct very different images that produced the same fingerprint, and so would appear identical to Apple’s scanning system, creating false positives.

Others managed to do the opposite: change the mathematical output of an image without changing its appearance at all, thus creating false negatives.
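Continuing the illustrative average-hash sketch above (and again a simplified stand-in, not the actual attacks demonstrated against NeuralHash), the toy example below shows why collisions are possible at all: only the relative brightness pattern survives hashing, so images with quite different pixel values can collapse to the same fingerprint.

```python
# Toy collision demo for the illustrative average hash: two images with very
# different pixel values share a fingerprint because only the pattern of
# "brighter than the mean" is encoded -- the ingredient of a false positive.
from PIL import Image

def average_hash(img: Image.Image, size: int = 8) -> int:
    small = img.convert("L").resize((size, size))
    pixels = list(small.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

# One high-contrast image and one washed-out grey image...
img_a = Image.new("L", (8, 8))
img_b = Image.new("L", (8, 8))
for x in range(8):
    for y in range(8):
        bright = (x + y) % 2 == 0
        img_a.putpixel((x, y), 255 if bright else 0)
        img_b.putpixel((x, y), 140 if bright else 100)

# ...yet both map to the identical fingerprint.
assert average_hash(img_a) == average_hash(img_b)
print(hex(average_hash(img_a)))
```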

The report’s authors say people may also try to disable the scanners or avoid using devices, such as iPhones, that carry CSS. They added: “You have to trust the software vendor, the infrastructure operator and the targeting curator. If any of them, or their key employees, misbehave, or are corrupted, hacked or coerced, the security of the system may fail.”

While CSS may be proposed with the intention of targeting specific content, the report warns: “Come the next terror scare, a little push will be all that is needed to reduce or eliminate current protections.”

The report notes that Apple appears to have bowed to state pressure before, for example by moving the iCloud data of its Chinese users to data centres under the control of a Chinese state-owned company, and by removing the tactical voting app of the jailed Russian opposition leader Alexei Navalny from its Russian app store.

Ross Anderson, one of the report’s co-authors and professor of security engineering at the University of Cambridge, said: “It’s a very small step from there [targeting child sexual abuse material] to various governments saying: ‘Here is a list of other images that we would like to put on the naughty-picture list for iPhones in our country.’”

When asked for comment, Apple referred the Guardian to a statement that said: “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take more time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
