It doesn't necessarily mean that all flagged photos would be explicit content, but even if they're not, is Apple telling us that we should have no expectation of privacy for any photos uploaded to iCloud, after running so many marketing campaigns on privacy? The on-device scanning is framed as a privacy measure as well, so they wouldn't have to decrypt the photos on their iCloud servers with the keys they hold (and maybe save some processing power too).
Devolving the job to the phone is a step toward making things more private, not less. Apple doesn't need to look at the photos on the server (and all cloud companies in the US are required to inspect photos for CSAM) if it can be done on the phone, removing one more roadblock for why end-to-end encryption hasn't happened yet.
This is extremely disingenuous. If their devices uploaded content with end-to-end encryption, there would be no matches for CSAM.
If they were required to search your materials generally, then they would be effectively deputized, acting on behalf of the government, and your Fourth Amendment protection against unlawful search would be extended to their activity. Instead we find that both cloud providers and the government have argued, and the courts have affirmed, the opposite.

In US v. Miller (2017):
> Companies like Google have business reasons to make these efforts to remove child pornography from their systems. As a Google representative noted, “[i]f our product is associated with being a haven for abusive content and conduct, users will stop using our services.” McGoff Decl., R.33-1, PageID#161.
> Did Google act under compulsion? Even if a private party does not perform a public function, the party’s action might qualify as a government act if the government “has exercised coercive power or has provided such significant encouragement, either overt or covert, that the choice must in law be deemed to be that of the” government. [...] Miller has not shown that Google’s hash-value matching falls on the “compulsion” side of this line. He cites no law that compels or encourages Google to operate its “product abuse detection system” to scan for hash-value matches. Federal law disclaims such a mandate. It says that providers need not “monitor the content of any [customer] communication” or “affirmatively search, screen, or scan” files. 18 U.S.C. § 2258A(f). Nor does Miller identify anything like the government “encouragement” that the Court found sufficient to turn a railroad’s drug and alcohol testing into “government” testing. See Skinner, 489 U.S. at 615. [...] Federal law requires “electronic communication service providers” like Google to notify NCMEC when they become aware of child pornography. 18 U.S.C. § 2258A(a). But this mandate compels providers only to report child pornography that they know of; it does not compel them to search for child pornography of which they are unaware.
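The "hash-value matching" the court describes can be sketched in a few lines. This is a toy illustration only, using plain SHA-256 for exact matching; real product-abuse detection systems (Google's, Microsoft's PhotoDNA, Apple's NeuralHash) use perceptual hashes that survive re-encoding and resizing, and the hash database comes from NCMEC, not from anything shown here. All names and sample bytes below are hypothetical.

```python
import hashlib

# Hypothetical database of known-bad hex digests. In reality this would be
# a perceptual-hash list supplied by NCMEC, not SHA-256 of raw bytes.
KNOWN_HASHES = {
    hashlib.sha256(b"example-flagged-bytes").hexdigest(),
}

def file_hash(data: bytes) -> str:
    """Exact cryptographic hash of the file contents (toy stand-in for a
    perceptual hash)."""
    return hashlib.sha256(data).hexdigest()

def should_report(data: bytes) -> bool:
    """A match means the provider now 'knows' of the content and, per the
    reporting mandate the court cites (18 U.S.C. § 2258A(a)), must report it.
    No match, no knowledge, no obligation to search further."""
    return file_hash(data) in KNOWN_HASHES

print(should_report(b"example-flagged-bytes"))  # True: digest is in the database
print(should_report(b"vacation-photo-bytes"))   # False: unknown file, nothing to report
```

The point the court draws out is visible in the structure: the scan itself is voluntary (nothing compels maintaining `KNOWN_HASHES` or calling `should_report`), but once a match occurs, the report is compelled.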
- All cloud providers scan for it. Facebook, Google, Amazon, Apple, Imgur ... There's a list of 144 companies at NCMEC. There must be a damn good reason for that consensus...
- Because they scan for it, they are obliged (coerced, if you will) to report anything they find. By law.
- Facebook (to pull an example out of the air) reported 20.3 million times last year. Google [1] reported 365,319 times for July–December and is coming up on 3 million reports in total. Apple reported 265 cases last year.
- Using e2e doesn't remove the tarnish of CSAM being on your service. All it does is give some hand-wavy deniability: "oh, we didn't know". Yes, but you chose not to know by enforcing e2e. That choice was the act, and kiddy-porn providers flocking to your service was the consequence. Once the wheels of justice turn a few times, and there becomes a trend of ⟨insert your e2e service here⟩ being where all the kiddy-porn is stored, there's no coming back.
The problem here is that there's no easy technical answer to a problem outside the technical sphere. It's not the technology that's the problem, it's the users, and you don't solve that by technological means. You take a stand and you defend it. To some, that will be your solution ("It's all e2e, we don't know or take any ownership, it's all bits to us"). To others, it'll be more like Apple's stance ("we will try our damnedest not to let this shit propagate or get on our service"). Neither side will easily compromise too much towards the other, because both of them have valid points.
You pays your money and you takes your choice. My gut feeling is that the people bemoaning this as if the end-times were here will still all (for reasonable definitions of "all") be using iCloud in a few months time, and having their photos scanned (just like they have been for ages, but this time on upload to iCloud rather than on receipt by iCloud).
[1] https://transparencyreport.google.com/child-sexual-abuse-mat...
This new iCrap is like a toaster that reports you if you put illegally imported bread in it. It will be just like the toaster, which will have no measurable impact on illegal imports. Even if $badguys are so dumb as to continue using the tech (iCloud???) and lots go to jail, lots more will appear and simply avoid the exact specific cause that sent the previous batch to jail. They do not even have to think.
The problem with all this is that everyone is applauding Apple for their bullshit, and so they will applaud the government when they say "oh no, looks like criminals are using non-backdoored data storage methods, what a surprise! we need to make it illegal to have a data storage service without going through a 6 month process to setup a government approved remote auditing service".
Then there's also the fact that this is all a pile of experimental crypto [1] being used to solve nothing. Apple has recreated the exact situation of Cloudflare's Privacy Pass: they pointlessly made $badip solve a captcha to view a read-only page, then provided a bunch of experimental crypto in a browser plugin to let him use one captcha across multiple domains (which would normally each require their own captcha and corresponding session cookie). They later stopped blocking $badip altogether after they realized they were wrong (this took literally 10 years).
1. https://www.apple.com/child-safety/ "CSAM detection" section