I think it is all about end-to-end encryption of iCloud. Apple can’t do this and protect the privacy of everyone without law enforcement going nuts. iCloud remains a backdoor for our data, allowing, say, China to look at the iCloud accounts of Chinese citizens. Or any government. End-to-end encryption would close this backdoor. And this new process is the first step.
Jeez. Lucky for Apple some people are easily led. Apple will be able to answer every one of these questions. This is actually a well-thought-out way to catch child pornographers while maintaining the privacy of everyone else.
I believe Apple is doing this because it wants to close the one big privacy loophole in iCloud – there is no end-to-end encryption, so any government can get (and many already have gotten) access to any of your photos. Any of them. Governments want to make Apple give up people’s photos and invade their privacy, usually by saying a crime has been committed, but we have seen how that can be corrupted by different governments. Any attempt at encrypting all of iCloud would result in governments whining “What about child pornography?” “Apple is protecting pedophiles.”
They have done this before over iPhone passcodes, something Apple has since secured so that even Apple cannot unlock your phone, because it never holds the key. That closed that backdoor.
Now I think it is getting ready to do the same thing for iCloud.
So, how to protect the privacy of billions while also answering the questions of child pornography? Find the pornography in ways that prevent governments from forcing Apple to access people’s photos.
And forestall any complaints by coming up with a way to secure things while still finding CSAM.
So, IMHO, what Apple is doing here is preparing the road for end-to-end encryption. But if it encrypted iCloud first and then instituted these policies, it would have two things to argue at once – encrypting iCloud and the CSAM protections. I think Apple decided to do the latter first and get that argument out of the way, even if it causes some problems PR-wise.
Then adding encryption becomes easier. Apple will have already shown it cares about catching child porn and has come up with ways to find it – all without actually allowing governments to get access to any photos they want.
With end-to-end encryption where Apple holds no keys, governments cannot go on fishing expeditions through anyone’s photos hoping to find something useful. They cannot force Apple to help, because Apple has no way to access your files.
Facebook already scans every picture you upload and compares it to a CSAM database. That means they look at EVERY photo. They know EVERY photo. That is what server-side checking allows (something many other companies use). And it does not easily allow third-party vetting of the process. And it prevents end-to-end encryption, meaning any government anywhere can probably try to get access to all your photos.
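Roughly, server-side checking looks like the little sketch below (a generic illustration in Python, not Facebook’s actual code or database): the service has to hold the readable photo in order to hash and compare it at all, which is exactly why this model cannot coexist with end-to-end encryption.

```python
import hashlib

# Generic sketch of server-side checking, not any company's real implementation.
# The entry below is a placeholder hex string, not a real CSAM fingerprint.
KNOWN_CSAM_HASHES = {"9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"}

def scan_on_server(decrypted_photo: bytes) -> bool:
    """The server holds and can read the full photo, then hashes and compares it."""
    photo_hash = hashlib.sha256(decrypted_photo).hexdigest()
    return photo_hash in KNOWN_CSAM_HASHES
```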
In Apple’s approach, everything important happens on the iPhone. Apple never looks at your photo. In fact, no one will ever see any of your photos unless they are examples of known photos in vetted databases. No one can. A known algorithm converts all the data in a photo into a short hash fingerprint, and it does this in a way that makes essentially every photo produce a unique fingerprint. There is no scanning of the photo in a readable form, and there is no way to reconstruct the photo from the hash fingerprint. That fingerprint is then compared to a database of hash fingerprints of CSAM images, created with the same algorithm.
A database that is on your phone, not on a server somewhere. A server that a government might be able to get to and insert its own hash fingerprints.
Again, there is no way to know what your photos are. Only the hash is compared to the database – just two numbers. Apple never sees the photos themselves.
And it only looks for hashes of known pictures. The process never looks at the picture itself, so a photo of your son in the bathtub is not going to get flagged.
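To make that contrast concrete, here is a minimal sketch of the on-device idea. It is not Apple’s actual algorithm – the real system uses a perceptual “NeuralHash” that survives resizing and recompression, plus cryptographic protections – but it shows the core point: only fingerprints are compared, and only against a database that already sits on the phone.

```python
import hashlib

# Hypothetical on-device database of fingerprints of known CSAM images,
# shipped with the OS. The entry is a placeholder hex string, not a real hash.
KNOWN_FINGERPRINTS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(photo_bytes: bytes) -> str:
    """Reduce a photo of any size to a short, fixed-length fingerprint.

    Apple's real NeuralHash is a perceptual hash that tolerates resizing and
    recompression; a cryptographic hash is used here only to illustrate that
    the photo itself is never inspected in readable form or transmitted.
    """
    return hashlib.sha256(photo_bytes).hexdigest()

def matches_known_image(photo_bytes: bytes) -> bool:
    """Compare only the fingerprint against the database already on the phone."""
    return fingerprint(photo_bytes) in KNOWN_FINGERPRINTS

# A family photo produces a fingerprint that matches nothing in the database,
# so it is never flagged and never seen by anyone.
print(matches_known_image(b"...bathtub photo bytes..."))  # False
```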
And only if a photo is uploaded to iCloud does it get flagged. If the hash from an uploaded picture matches a hash from a known CSAM photo in the database on the phone, then an encrypted flag is sent to Apple. Still not the photo. No one has seen the photo yet but you.
Only if about 30 flags for different photos accumulate at Apple will an Apple review team examine the flagged uploads. Even then, they will not look at the photos themselves but at blurred versions, to compare with the known images in the database.
As I understand it, then, Apple never sees or needs access to your original photos. All it will ever see, and only if about 30 examples of known child pornography are matched, are blurred images. And only if they are uploaded to iCloud.
Only then will authorities be notified.
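A rough sketch of that threshold step, assuming the roughly-30-match threshold described above: in Apple’s published design the flags (“safety vouchers”) use threshold secret sharing, so they cannot even be decrypted until enough of them exist; this toy version just counts them to show when human review of the blurred derivatives would kick in.

```python
# Assumed threshold of ~30 matches; the vouchers here are opaque placeholders.
REVIEW_THRESHOLD = 30

class AccountVouchers:
    def __init__(self) -> None:
        self.vouchers: list[bytes] = []  # one encrypted flag per matched upload

    def add_voucher(self, encrypted_flag: bytes) -> None:
        self.vouchers.append(encrypted_flag)

    def ready_for_human_review(self) -> bool:
        # Below the threshold, Apple learns nothing about any individual match.
        return len(self.vouchers) >= REVIEW_THRESHOLD

account = AccountVouchers()
account.add_voucher(b"<encrypted match #1>")
print(account.ready_for_human_review())  # False: a single match reveals nothing
```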
This also deals with false positives. The chance of two genuinely different photos producing matching hashes is extremely small. And this would have to happen some 30 times, so the probability that all 30 matches are wrong is exponentially smaller.
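Some back-of-the-envelope arithmetic shows why the threshold matters. With an assumed (not Apple’s published) per-image false-match rate p, the chance of 30 independent false matches is p to the 30th power:

```python
# Illustrative numbers only: the 1-in-a-million rate is an assumption, not
# Apple's figure (Apple's stated target is about one false account flag per
# trillion accounts per year).
per_image_false_match = 1e-6   # assumed chance an innocent photo collides with a known hash
threshold = 30                 # approximate number of matches before human review

all_false = per_image_false_match ** threshold
print(f"Chance all {threshold} matches are coincidences: {all_false:.0e}")  # ~1e-180
```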
And this same process would likely work even with end-to-end encryption of iCloud. Apple never sees any photos, so encryption will have no effect; it only ever sees a blurred image derived from an encrypted flag.
Only if the Apple team validates all 30 photos as child porn are the authorities notified. This only puts humans in the mix if a large number of matches show up, and even then no one has seen the original photos.
So, the flags are only generated if the database on the phone finds a matching hash. And that database ships with the OS, not as a separate install, and it can be vetted by third parties. So the ability of a government to insert non-CSAM hashes into the database is very limited, if it is possible at all, since it would require a different OS in different countries – and Apple only releases a single global OS. Apple would have to know about the insertion, knowing also that anyone could examine the database on their phone.
And since the flagged photos have to be vetted by Apple employees, sneaking in non-CSAM images will not easily work. Too many people would have to be in on it for it to stay secret.
This is actually a pretty elegant way to find child porn while maintaining the privacy of the large majority. Apple does not see anyone’s images. The authorities are only alerted if the process turns up a lot of images, not just one match. And it will only happen when the images are uploaded to iCloud.
Now, I had wondered why this last step was included. As Apple states, anyone can opt out of the process if they do not upload photos to iCloud. Seems to me this is a big loophole if one is after child porn. (Although previous cases indicate that a lot of people with child porn on their phones do some pretty stupid things.)
I figure even pedophiles use the cloud like everyone else – they want to access their photos from other devices.
But if one’s goal is end-to-end encryption of everything uploaded to iCloud, this makes sense. It would protect everyone’s privacy. No backdoor into your data for anything you put into the cloud. Apple can never see any of your data, can never use any of it without your permission.
And it can protect the privacy of billions while still having an elegant way to catch child porn. Now, if governments complain about end-to-end encryption of iCloud, Apple can likely say, “We just spent a lot of effort showing how we can catch child pornographers, and you are still going to complain. Seems like you have other motives.”
We will see, but I think Apple is willing to have this fight now on purpose, spending lots of energy showing it is doing the right thing and has processes in place to actually prevent inappropriate access. It may well come out looking like a champion of this balance, all while Facebook and others do nothing comparable.
Then, when this is all done, it can say, “Well, we have a process to catch criminals; now we can do end-to-end encryption.”
[Image: Richard Patterson]