More than a dozen prominent cybersecurity experts Thursday criticized plans by Apple and the European Union to check people’s phones for illegal material, calling the efforts ineffective and dangerous strategies that would encourage government surveillance.
In a 46-page study, the researchers wrote that Apple’s proposal, aimed at detecting child sexual abuse images on iPhones, as well as an idea floated by European Union officials to detect similar abuse and terrorist imagery on encrypted devices in Europe, relied on “dangerous technology.”
“It should be a national security priority to resist attempts to spy on and influence law-abiding citizens,” the researchers wrote.
The technology, known as client-side scanning, would allow Apple — or, in Europe, possibly law enforcement — to detect child sexual abuse images on someone’s phone by scanning images uploaded to Apple’s iCloud storage service.
When Apple announced the planned tool in August, it said a so-called fingerprint of the image would be compared with a database of known child sexual abuse material to look for possible matches.
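The matching step described above can be illustrated with a minimal sketch. This is not Apple’s actual algorithm (which uses a neural perceptual hash called NeuralHash and on-device cryptographic matching); it is a toy “average hash,” where each bit records whether a pixel is brighter than the image’s mean, compared against a database of known fingerprints by Hamming distance so that near-duplicates still match.

```python
# Toy sketch of fingerprint matching, NOT Apple's actual system:
# hash an image to a bit string, then flag it if the hash is within
# a few bits of any fingerprint in a database of known material.

def average_hash(pixels):
    """Hash a grayscale image (rows of 0-255 ints) to a bit string.

    Each bit records whether a pixel is brighter than the image's
    mean, so small brightness shifts leave most bits unchanged.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming(a, b):
    """Count differing bits between two equal-length hashes."""
    return sum(x != y for x, y in zip(a, b))

def matches_database(pixels, known_hashes, threshold=3):
    """Flag the image if its hash is near any known fingerprint."""
    h = average_hash(pixels)
    return any(hamming(h, k) <= threshold for k in known_hashes)

# Toy 4x4 "image" and a database holding its fingerprint.
image = [
    [200, 200, 10, 10],
    [200, 200, 10, 10],
    [10, 10, 200, 200],
    [10, 10, 200, 200],
]
database = {average_hash(image)}

# A slightly brightened copy still matches: most bits survive.
edited = [[min(p + 5, 255) for p in row] for row in image]
print(matches_database(edited, database))  # True
```

The design goal of such fuzzy matching is exactly why the approach is contested: the hash must tolerate small edits to catch re-encoded copies, yet an adversary who edits an image enough to flip bits past the threshold evades detection, a weakness the researchers point to below.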
But the plan caused an uproar among privacy advocates and raised fears that the technology could compromise digital privacy and eventually be used by authoritarian governments to hunt down political dissidents and other enemies.
Apple said it would reject such requests from foreign governments, but the outcry led it to pause the scan tool’s release in September. The company declined to comment on the report released Thursday.
The cybersecurity researchers said they started their investigation before Apple’s announcement. Documents released by the European Union and a meeting with EU officials last year led them to believe that the bloc’s governing body wanted a similar program that would scan not only for images of child sexual abuse, but also for signs of organized crime and indications of terrorist ties.
A proposal to enable photo scanning in the European Union could come as early as this year, according to the researchers.
They said they were now publishing their findings to inform the European Union of the dangers of its plan, and because the “expansion of the surveillance powers of the state really does cross a red line,” said Ross Anderson, a professor of security engineering at the University of Cambridge and a member of the group.
Aside from surveillance concerns, the researchers said, their findings indicated the technology was ineffective at identifying images of child sexual abuse. Within days of Apple’s announcement, they said, people had pointed to ways to avoid detection by slightly editing the images.
“It allows scanning of a personal private device without any probable cause that anything illegal is being done,” added another member of the group, Susan Landau, a professor of cybersecurity and policy at Tufts University. “It is extraordinarily dangerous. It is dangerous to business, national security, public safety and privacy.”