More accurately put, their intent is to scan cloud photos for exact matches with known child pornography material (like every other cloud provider, including Google), have the case reviewed by a human only after multiple positives, and only then forward it to law enforcement (based on photos you chose to upload to the cloud).
Corrected: their intent is to scan all photos in your photo library, on your device, including images automatically pulled in from various sources such as Messages, if you have iCloud Photos enabled.
As far as I am aware, this is false and there is no mechanism on iOS by which images are "automatically pulled into" the photo library from anywhere, Messages or otherwise. Do you have a source or an example of how that could happen?
(Edit: people are mentioning WhatsApp, which I guess has an option to auto-save received photos. Fair enough, but that's a third-party app and requires you to grant photos access anyway, so it's pretty clearly not what the parent meant.)
> their intent is to scan all photos in your photo library, on your device ... if you have iCloud photos enabled
Yes, that's what I said. Enabling iCloud photos uploads your photo library to the cloud, so it's scanning your cloud photos.
Per Apple [0]
>Shared with You works across the system to find the (...) photos, and more that are shared in Messages conversations, and conveniently surfaces them in apps like Photos (...) making it easy to quickly access the information in context.
---
>Yes, that's what I said. Enabling iCloud photos uploads your photo library to the cloud, so it's scanning your cloud photos.
You're still being disingenuous about it, though. You stated
> More accurately put, their intent is to scan cloud photos (...) (like every other cloud provider, including Google)
which makes it appear that the photos are only scanned server-side, "like every other cloud provider". Client-side scanning is something no other provider does, contrary to what you stated.
[0]: https://www.apple.com/newsroom/2021/06/ios-15-brings-powerfu....
But yes, I agree with the comment; there's no reason to hide behind details: Apple plans to introduce the capability of scanning photos on your local device and comparing hashes against an opaque (non-reviewable) list of hashes that they (along with governments) control. Details about how they plan to initially employ this capability are irrelevant.
If you want to "correct" the claim to say their intent is to scan every photo, citation needed.
Google, on the other hand, has been scanning the entire contents of your account for the past decade.
>a man [was] arrested on child pornography charges, after Google tipped off authorities about illegal images found in the Houston suspect’s Gmail account.
https://techcrunch.com/2014/08/06/why-the-gmail-scan-that-le...
Apple claims not to scan your pictures, but that's unrelated to whether they scan your pictures.
In reality they probably have a "photoscanner.so / .dylib" that is currently only linked in by the iCloud uploader, but that could at any time be called by any other part of the system (or offer exploits new avenues for data exfiltration); this was actually spelled out in their initial announcement (there will be a system API for accessing it).
So they absolutely have the ability to scan photos on your phone; the fact that they don't intend to currently use it outside of the iCloud uploader is totally immaterial to this debate (the thing I don't want on my phone is photoscanner.so or any such capability).
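To make the "capability, not current caller" argument concrete, here is a minimal Python sketch. Everything in it is invented for illustration (the function and variable names, the hash list); Apple's actual library, symbols, and API are not public.

```python
# Hypothetical sketch: once a scanning capability ships with a public
# API, nothing technically restricts which component calls it.
# All names here (scan_photo, HASH_DB, cloud_uploader) are invented.

HASH_DB = {"deadbeef"}  # stand-in for an opaque, vendor-controlled hash list

def scan_photo(photo_hash: str) -> bool:
    """Return True if the photo's hash is on the list."""
    return photo_hash in HASH_DB

# Today: only the cloud uploader calls it.
def cloud_uploader(photo_hash: str) -> bool:
    return scan_photo(photo_hash)

# Tomorrow: any other component (or exploit payload) can call the
# exact same function. The capability, not the caller, is the issue.
def some_other_component(photo_hash: str) -> bool:
    return scan_photo(photo_hash)

assert cloud_uploader("deadbeef")
assert some_other_component("deadbeef")
```

The point of the sketch is that the restriction to the iCloud uploader is a policy choice in who calls the function, not a property of the function itself.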
Not exact matches. Hashes. Hashes that were quickly shown to have collisions, which the company brushed off.
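Collisions are inherent to perceptual hashing: unlike cryptographic hashes, these functions are designed so that similar images hash identically, which means unrelated images can too. A toy average-hash in Python shows the mechanism (this is not NeuralHash or PhotoDNA, just the simplest possible perceptual hash):

```python
def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if the pixel is
    above the image's mean brightness. Real systems (PhotoDNA,
    NeuralHash) are far more sophisticated, but share the property
    that distinct images can produce identical hashes."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

img_a = [10, 200, 10, 200]   # one 2x2 grayscale image
img_b = [50, 250, 80, 220]   # a clearly different image

assert img_a != img_b
assert average_hash(img_a) == average_hash(img_b)  # collision
```

Researchers demonstrated exactly this kind of collision against NeuralHash within days of the model being extracted.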
With Google you can be absolutely sure that their intent is to eat all your personal information and data for short-term profit. With Apple it was "just" a stupid attempt at legal (over?) compliance.
It's not law enforcement that's the main issue, but the various greedy three-letter agencies that are already well known to harbor the ambition of having a profile on every person in this world (not unlike Facebook, but for different purposes).
This is not privacy anymore, no matter how you bend it; it has been cancelled, and Apple realizes this very well. And it still doesn't care. Privacy was literally the only serious selling point for many new buyers not invested in an ecosystem, and Apple is blowing it away with both barrels of a shotgun.
Good, decent people, waking up screaming, cold shakes, permanently damaged from what they could not unsee.
You couldn't pay me enough to go through images of such sickness.
Outside of all the yes/no, on/off phone stuff, how are they going to hire, and keep staffed, a department of people who have to look at this material?
How are they going to insure it?!
Try getting that behavior from Google, a company whose existence depends on surveillance advertising.
Also, who is reviewing this known child pornography list? Hopefully nobody, because it is child pornography, but also hopefully somebody, because what if someone slips something in there… say an offensive political cartoon, an ethnic group's symbol, or a picture of Tiananmen Square? This list of "offensive images" needs to be auditable.
Also, it is crossing a line in the sand, because it is on your personal device, not on their servers. All you can hope is that they don't alter the deal further.
For some definition. Russia's FSB might have a very different idea of what this is. Anti-Putin memes, for instance. Navalny support materials or brochures. You'll have to watch what you download, because your phone might upload it and incriminate you.
Or China's MSS. Winnie the Pooh, Tiananmen Square, Free HK, etc.
Or even the FBI. Financial or political leaks, Wikileaks, etc.
Once they know who you are and why they don't like you, they can incriminate you in other ways. This helps them find and flag you. They don't even need to monitor and decrypt traffic - they can just upload hashes of things they don't like and let Apple's dragnet do all the work.
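The matcher itself is content-neutral, which is the whole danger: whoever controls the hash list controls what gets flagged. A small sketch, using ordinary SHA-256 over placeholder byte strings standing in for images (real systems use perceptual hashes; all names and byte strings here are invented):

```python
import hashlib

def h(data: bytes) -> str:
    """Stand-in for an image hash (real systems use perceptual hashes)."""
    return hashlib.sha256(data).hexdigest()

def flags(photo: bytes, hash_list: set) -> bool:
    """The matcher flags whatever is on the list; it has no notion
    of WHY a hash was added."""
    return h(photo) in hash_list

csam_list = {h(b"known-abuse-image")}             # the stated purpose
expanded = csam_list | {h(b"anti-regime-meme")}   # one quiet addition

meme = b"anti-regime-meme"
assert not flags(meme, csam_list)
assert flags(meme, expanded)  # same code, new target
```

Nothing in the matching code changes between the two runs; only the opaque list does.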
Don't buy into "CSAM" scare. It's never the intent. The powers that be don't give a damn about children. It's about power.
This is the EXACT SAME database EVERY cloud provider has been using for about a decade. Look up Microsoft PhotoDNA.
The only difference is that the company doing it was Apple, who wanted to do the checks on-device BEFORE upload. And with multiple redundancies and human review.
Not like Microsoft, who have - for example - shut down the MS account of a German man for having photos of his own children on a beach. No human review, no way to complain. Everything gone, from Outlook mail to his Xbox account.
[0] https://en.wikipedia.org/wiki/National_Center_for_Missing_%2...
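For what it's worth, the threshold-before-review design described above can be sketched in a few lines. This is a toy model: `THRESHOLD` and the function name are invented for illustration, not Apple's published parameters.

```python
# Hypothetical sketch of the announced design: matches accumulate per
# account, and nothing is escalated to a human reviewer until multiple
# independent matches pile up. The value of THRESHOLD here is invented.
THRESHOLD = 3

def should_escalate(match_count: int) -> bool:
    """A single (possibly false-positive) match triggers nothing;
    only repeated matches reach human review."""
    return match_count >= THRESHOLD

assert not should_escalate(1)  # one collision alone goes nowhere
assert should_escalate(3)      # repeated matches trigger review
```

The redundancy is the point: a lone hash collision is supposed to be absorbed by the threshold rather than sent anywhere.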
I'll eat my hat if this system doesn't hurt someone innocent.
Our defense of privacy should be paramount, and we shouldn't defend the fruit company for assailing it just because we like the pretty things they make.
Every word of Stallman's warnings about computing freedom was right. He was prescient. And just like his arguments, there are many people that view this move by Apple as a huge erosion of privacy. We all have a very legitimate fear that shouldn't be dismissed.
You can attack and trivialize my arguments, but mark my words, history will show we're making a huge mistake here.
If you turn the iCloud Photos feature off, no more scanning is happening.
This seems pretty simple to me.