Apple’s announcement that it’s going to start scanning photos for child abuse material is a big deal. (Here are five news stories.) I have been following the details and discussing them on several different email lists. I don’t have time right now to delve into the details, but I wanted to post something.
EFF writes:
There are two main features that the company is planning to install in every Apple device. One is a scanning feature that will scan all photos as they get uploaded into iCloud Photos to see if they match a photo in the database of known child sexual abuse material (CSAM) maintained by the National Center for Missing & Exploited Children (NCMEC). The other feature scans all iMessage images sent or received by child accounts — that is, accounts designated as owned by a minor — for sexually explicit material, and if the child is young enough, notifies the parent when these images are sent or received. This feature can be turned on or off by parents.
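To make the first feature concrete, here is a minimal, purely illustrative sketch of matching uploads against a database of known-image hashes. It is not Apple’s actual design (Apple describes a perceptual hash, NeuralHash, combined with a private set intersection protocol so the device never learns the database contents or the match outcome); the hash function, database, and example inputs below are stand-ins.

```python
# Conceptual sketch of matching uploads against a known-image hash database.
# NOT Apple's NeuralHash or its private-set-intersection protocol; SHA-256
# stands in for a perceptual hash, and the "database" is a placeholder.

import hashlib

# Placeholder database: in the real system these would be perceptual hashes
# of known CSAM supplied by NCMEC, blinded so the device cannot read them.
known_hashes = {
    hashlib.sha256(b"example known image bytes").hexdigest(),
}

def image_hash(image_bytes: bytes) -> str:
    # Stand-in hash. A real perceptual hash also matches re-encoded or
    # slightly altered copies of an image, which SHA-256 does not.
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_database(image_bytes: bytes) -> bool:
    # Flag an upload if its hash appears in the known-image database.
    return image_hash(image_bytes) in known_hashes

print(matches_known_database(b"example known image bytes"))   # True
print(matches_known_database(b"some unrelated photo bytes"))  # False
```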
This is pretty shocking coming from Apple, which is generally really good about privacy. It opens the door to all sorts of other surveillance: once the system is built, it can be repurposed to scan for all sorts of other content. And it breaks end-to-end encryption, despite Apple’s denials:
Does this break end-to-end encryption in Messages?
No. This doesn’t change the privacy assurances of Messages, and Apple never gains access to communications as a result of this feature. Any user of Messages, including those with communication safety enabled, retains control over what is sent and to whom. If the feature is enabled for the child account, the device will evaluate images in Messages and present an intervention if the image is determined to be sexually explicit. For accounts of children age 12 and under, parents can set up parental notifications which will be sent if the child confirms and sends or views an image that has been determined to be sexually explicit. None of the communications, image evaluation, interventions, or notifications are available to Apple.
Notice Apple changing the definition of “end-to-end encryption.” No longer is the message a private communication between sender and receiver. A third party is alerted if the message meets certain criteria.
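To make that third-party-alert point concrete, here is a rough sketch of the decision flow Apple’s FAQ describes: an on-device classifier, an intervention shown to the child, and a parental notification only for accounts aged 12 and under whose parents opted in, and only if the child proceeds. The class, function names, and classifier stub are my own placeholders, not Apple’s implementation.

```python
# Illustrative sketch of the Messages "communication safety" flow described
# in the FAQ above. All names and logic here are placeholders.

from dataclasses import dataclass

@dataclass
class ChildAccount:
    age: int
    parental_notifications_enabled: bool

def classifier_says_explicit(image_bytes: bytes) -> bool:
    # Stand-in for the on-device ML classifier; always returns False here.
    return False

def handle_image(account: ChildAccount, image_bytes: bytes) -> list:
    actions = []
    if classifier_says_explicit(image_bytes):
        # The image is blurred and the child is warned before viewing.
        actions.append("blur image and warn the child")
        # Only for accounts 12 and under, only if the parent opted in,
        # and only if the child chooses to view or send anyway.
        if account.age <= 12 and account.parental_notifications_enabled:
            actions.append("notify parent if the child proceeds")
    return actions

# Example: a 10-year-old's account with parental notifications enabled.
print(handle_image(ChildAccount(age=10, parental_notifications_enabled=True),
                   b"photo bytes"))
```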
This is a security disaster. Read tweets by Matthew Green and Edward Snowden. Also this. I’ll post more when I see it.
Beware the Four Horsemen of the Information Apocalypse. They’ll scare you into accepting all sorts of insecure systems.
EDITED TO ADD: This is a really good write-up of the problems.
EDITED TO ADD: Alex Stamos comments.
An open letter to Apple criticizing the project.
A leaked Apple memo responding to the criticisms. (What are the odds that Apple did not intend this to leak?)
EDITED TO ADD: John Gruber’s excellent analysis.
EDITED TO ADD (8/11): Paul Rosenzweig wrote an excellent policy discussion.
EDITED TO ADD (8/13): Really good essay by EFF’s Kurt Opsahl. Ross Anderson did an interview with Glenn Beck. And this news article talks about dissent within Apple about this feature.
The Economist has a good take. Apple responds to criticisms. (It’s worth watching the Wall Street Journal video interview as well.)
EDITED TO ADD (8/14): Apple released a threat model.