It's absolutely ridiculous what Apple has become. The exact opposite of what they used to represent, back when I loved them. God rest Steve Jobs' soul; the 1984 ad is exactly what Apple is now. Screwing devs on the App Store, strongarming them into compliance, cooperating with China, worse and worse UX on phones, and now this...
Really disappointing.
I’ve been on the verge of doing this for years, so this was the final push and motivation.
Yesterday I sold my iPad and Apple Watch. They are being shipped today. I’m just waiting on refunds on my AppleCare for my MacBook and iPhone now and I will sell them.
Yesterday I had a Nokia 215 arrive as a replacement phone. Also a monster pile of PC bits arrived which have been assembled into a Ubuntu running desktop. I am spending today migrating my data over carefully. When the MacBook sells I will buy a Nikon DSLR.
At the end of this I lose perhaps 20% convenience for an immeasurable privacy gain, lose a big chunk of the distractions from my life and end up with some cash left over which I will use to go on holiday.
The only thing I will miss is Apple Music but it’ll give me a chance to curate my music collection without distraction again.
I'm not sure how that is better.
Wouldn't AOSP/Lineage with Signal installed be better?
I sold my DSLR a couple of years after getting my G9x Mark II. The DSLR was always gathering dust compared to the G9x which with a small belt case could easily be taken anywhere.
That said, these cameras are definitely not as flexible as a full SLR, nor will you get the same performance. It's a large sensor compared to a phone camera or other point-and-shoots, but it's still nothing compared to APS-C.
At this point I'm going back to owning a separate dedicated music device that is totally divorced from the computer. There's just something intentional about walking over to a CD player or record player, picking out an album, and putting it on compared to mindlessly browsing Spotify playlists.
Are you switching from a laptop to a desktop machine? Do you have no use for the portability anymore?
Can Apple force people to install this even on devices they already sold?
heres my photo gallery all shot with the z5
Not having to edit the pictures is a huge plus, and Nikon's JPEG files, even with dynamic range enabled, are pretty mediocre compared to a Pixel phone's.
Yes, backdooring E2E encryption in general is a bad idea. However, consider two things:
* iCloud Photos was never E2E encrypted in the first place. They already can scan your photos all they want server-side, and they have been scanning for CSAM since 2019, while Google has been scanning for it since 2009. Yes, if iCloud Photos were to become E2E encrypted leaving in a backdoor like this could be bad, but it's still the lesser of two evils. Would you rather they keep photos non-E2E forever and have even more unfettered access to them than a "backdoor" allows? It does NOT scan photos that are not uploaded to the cloud, despite being on-device. And it's important to note the threshold and manual human review system put in place before the authorities receive any notification at all.
* For iMessage, all this entails is warning children under 18 about explicit content, and optionally notifying parents if the child is under 13 and the parent opted in. (I don't think it even sends the photo itself to the parents, but that's not explicitly clarified anywhere.) At no point do Apple or the authorities learn the contents of E2E encrypted iMessages. (Also worth noting: if you use iCloud Backup, your messages are no longer E2E encrypted in the backup, as Apple holds the keys to that. This was true even before the new system was introduced.)
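The threshold-and-review mechanism described above can be sketched roughly as follows. This is a toy illustration only: Apple's actual system uses NeuralHash perceptual hashes and a private set intersection protocol, not plain SHA-256 over exact bytes, and the hash set and threshold value here are hypothetical.

```python
import hashlib

# Toy stand-in for the on-device matcher. Plain SHA-256 here only
# illustrates the reporting logic, not Apple's actual hashing scheme.
KNOWN_BAD_HASHES = {hashlib.sha256(b"hypothetical-known-image").hexdigest()}
REPORT_THRESHOLD = 3  # hypothetical: matches required before human review

def scan_uploads(photos, threshold=REPORT_THRESHOLD):
    """Return True only if enough uploads match the known-hash set
    to cross the manual-review threshold."""
    matches = sum(
        1 for photo in photos
        if hashlib.sha256(photo).hexdigest() in KNOWN_BAD_HASHES
    )
    return matches >= threshold
```

The point of the threshold is that a single (possibly false-positive) match triggers nothing; only an accumulation of matches surfaces an account for human review.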
Yet. Once it's on the device, it's a MUCH smaller step to use it in other ways. It's certainly easier for governments to argue that they should be able to force it to be used arbitrarily... you know, for the children/terrorists/etc.
> And it's important to note the threshold and manual human review system put in place before the authorities receive any notification at all.
Until it's not. Once again, once it's in place, it's a lot easier for malevolent actors (governments) to force it to be used other ways.
This a back door. Plain and simple. The fact that it's not _currently_ going to be used for evil (depending on your definition of evil) does not mean it won't be in the near future. Back doors are bad. How many times does this need to be said?
Yes I'd rather they do this. The fact that they're implementing on device checks doesn't suggest to me that they will be deploying E2E encryption. It suggests to me that they will be expanding on device scanning to all content in the future.
If they were going to make iCloud E2E encrypted, it would be a clear win to announce this at the same time as deploying on device scanning.
but you are forewarned: you can blow through way more than a weekend de-oppressing your digital life.
Google, Microsoft, Facebook, Twitter, etc. have all been scanning content for those same child porn images for darn near a decade now.
>The system that scans cloud drives for illegal images was created by Microsoft and Dartmouth College and donated to NCMEC. The organization creates signatures of the worst known images of child pornography, approximately 16,000 files at present. These file signatures are given to service providers who then try to match them to user files in order to prevent further distribution of the images themselves, a Microsoft spokesperson told NBC News. (Microsoft implemented image-matching technology in its own services, such as Bing and SkyDrive.)
"There are two opportunities to look at content," when it's going into a cloud-storage account and when it's leaving, she said. "There is technology to do this," Grant added, pointing out that file signatures — unique hashes or fingerprints — could be used to confirm the nature of the files.
https://www.nbcnews.com/technolog/your-cloud-drive-really-pr...
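The file-signature matching the article describes can be sketched like this. Note this is a simplification: real systems such as PhotoDNA use robust perceptual hashes that survive resizing and re-encoding, whereas an exact SHA-256 digest (used here only for shape) matches byte-identical files only.

```python
import hashlib

def file_signature(data: bytes) -> str:
    """Compute a signature (here a plain SHA-256 digest) for a file."""
    return hashlib.sha256(data).hexdigest()

def is_known_image(data: bytes, signature_db: set[str]) -> bool:
    # The matcher itself is database-agnostic: it flags whatever
    # signatures the supplied database happens to contain.
    return file_signature(data) in signature_db
```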
If this is the case, then it is coming to every device (not just Apple), or E2E will be made illegal (or backdoored).
End-to-end encryption is intended to prevent data being read or secretly modified, other than by the true sender and recipient(s). The messages are encrypted by the sender but the third party does not have a means to decrypt them, and stores them encrypted. The recipients retrieve the encrypted data and decrypt it themselves.
Because no third parties can decipher the data being communicated or stored, for example, companies that provide end-to-end encryption are unable to hand over texts of their customers' messages to the authorities.
Would it even be considered end-to-end encryption based on this Wikipedia definition? I don’t think it meets the definition if Apple can determine that certain files exist in a conversation.
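The core property in that definition — the server stores and relays only ciphertext it cannot read — can be shown with a toy sketch. This is NOT real cryptography (a one-time pad stands in for what real E2E systems do with key exchange such as X25519 plus an AEAD cipher); it only illustrates why a provider holding ciphertext alone can hand nothing useful to authorities.

```python
import secrets

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Toy one-time-pad encryption; XOR each byte with the key."""
    assert len(key) >= len(plaintext)
    return bytes(k ^ p for k, p in zip(key, plaintext))

decrypt = encrypt  # XOR is its own inverse

# Sender and recipient share `key`; the server never sees it.
key = secrets.token_bytes(32)
server_storage = encrypt(key, b"hello")   # the server stores only this blob
recovered = decrypt(key, server_storage)  # only a key holder can recover it
```

If the provider can nonetheless determine that specific files exist in the conversation, that guarantee has been weakened, which is exactly the question being raised.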
> It's definitely the last straw for me in terms of apple products.
Uhh, and where else will you go where the grass is so much greener privacy-wise?
Then HN taught me that any company storing images on their infrastructure in the US must report pedophilic images to the US government.
At this point, the approach taken by Apple seems like the best one to me, if you don't want to store pictures in clear on your servers.
What other technical approach are people advocating for?
Another option is to try to change the law, but that is beyond the scope of this conversation.
Server scanning makes it clear that the company running the servers has access to your photos. So you can either find a form of encrypted storage, or be okay with that, depending on your privacy stance. Having device with ability to scan your photos removes that choice. It is a privacy invasion.
iOS has already been doing on-device ML-based photo categorisation for some time; afaik there's no way to turn it off.
Windows already does this via Windows Defender. This is a basic AV functionality and much more privacy preserving.
The CCP have already thoroughly demonstrated that they don’t need manufacturers’ consent to build these systems.
Look at the Uyghur population in China. They already have their phones scanned on device for dissident material, not by coercing manufacturers, but by forcing the population to install a surveillance app. Then making it illegal to use a phone without it.
Being caught at checkpoint without the app installed and working is grounds for immediate arrest and re-education.
IMO this appears to be Apple either a) trying to preempt future criticism or regulation or b) responding to some behind-closed-doors pressure/bargaining with US authorities.
It's certainly been going on for the past decade.
For example:
>a man [was] arrested on child pornography charges, after Google tipped off authorities about illegal images found in the Houston suspect's Gmail account
https://techcrunch.com/2014/08/06/why-the-gmail-scan-that-le...
1. Encrypt everything.
2. Don't store images on your servers at all.
There's nothing to report if all you have is some encrypted blob. Alternatively, just don't consume any user data at all. Data is and should be a massive liability.
My thoughts as well.
If you don't want the very dangerous weapon you have thought up to be abused, don't build a physical version of it, and don't tell anyone who has a habit of abusing powerful weapons.
Apple just erroneously said "it's safe" despite the fact that it clearly can be abused.
[0] https://blog.cryptographyengineering.com/2020/03/06/earn-it-...
That’s exactly what you do if you plan to enable E2E.
That being said, one of two things is true. Either Apple does exactly what they say, in which case they are not able to perform server-side content / fingerprint scanning, or Apple is outright lying about only using their key on behalf of law enforcement. This latter case would open them to all sorts of legal liabilities, like a suit from shareholders for false reports. It would also require the silence of every Apple engineer who has ever been involved in at least their iCloud Photo program, and probably a bunch of server infrastructure as well. Additionally, they'd be legally obligated to report their scan results to the NCMEC but would have to do so in a way that doesn't give away that they're lying about how their systems work.
The functionality to detect CSAM uploaded to Apple’s servers or sent to pre-teens?
> And it basically means we sell out our democratic principles
What democratic principle is being sold out?
Reading https://support.apple.com/en-us/HT202303 , it seems that Apple may encrypt pictures on their servers, but they have the key. The list of what's actually end-to-end encrypted doesn't include photos. So, they may be scanning on your phone, but they can scan on their servers if they wanted to.
In this way they can get rid of the keys on their servers and still find pedo pictures.
This is just farce.
Ones they know of…
> What other technical approach are people advocating for?
Apple already has a technical solution, encryption.
How does encryption help prevent porn being sent to pre-teens?
That's not true though.
Reduce user data stored in cloud data centres as much as possible. This is the approach taken by WhatsApp, so it's not surprising they are the ones most vocal against it.
And at the risk of appearing to be supportive of a Facebook product, I think this is the right way to take computing. We don't need a central place to put stuff or to do compute when we can do it on our own devices. We just need orchestration.
Apple’s approach is the only way to simultaneously act lawfully with regard to the EARN IT Act in the US and provide E2E in iCloud.
I really hate these laws, but Apple is not the problem here. Read up on EARN IT and the EU laws currently in the works. All communication WILL HAVE TO BE SCANNED by the provider. Beating the drum against Apple will just lead to E2E encryption being forbidden. What needs to be forbidden instead is any access to communication.
https://www.apple.com/child-safety/ https://en.m.wikipedia.org/wiki/EARN_IT_Act_of_2020 https://ec.europa.eu/info/law/better-regulation/have-your-sa...
https://www.patrick-breyer.de/en/posts/message-screening/?la...
In one part, the pro-privacy part of me is of course aghast at the whole idea.
However...
If you "read the room", there have been increasing noises from the global political world in recent years, and perhaps especially in the US.
So if you think about it that way, it might be a case of Apple jumping before they were pushed.
I mean, let's face it, if you wait for the politicos to come up with a solution and force it through with legislation, they really would put in actual backdoors and encryption bans given half the chance.
I suspect others, such as WhatsApp, might begrudgingly follow in due course.
There's always GPG and a whole litany of other tools and apps for those who know what they are doing in terms of privacy.
And it's always there also for whoever they claim to want to catch, so this measure is useless.
This is not protecting anyone, Apple might very well be anticipating the regulation, but that does not automatically deserve our praise. We should fight against this implementation and any regulation requiring similar measures.
Right. This will, a) catch the low hanging fruit type of criminal and b) keep honest people honest while forcing them to give up something for nothing.
They've tried to do this for decades and have failed. If they're going to do it then let it be on record. Let's see how voters like it.
ETA: in short, about a month ago they did get the votes, at least in the EU, and it's now "allowed" for providers to scan all content. In a little while, they're going to have a vote to change "allowed" to "required", and we have no reason to think it'll go differently.
If they ban encryption, the tech sector will kick up enough noise that even non-tech people will at least notice.
This way, it essentially opens a backdoor. Changing this from scanning hashes of pictures stored locally to scanning for arbitrary things stored locally probably is not a monumental task (next in line: probably hate speech).
And once you have that capability, it's hard to argue to governments that you can't let them scan the content of either a particular phone, or all of the phones, for whatever they want, which could be: .*
This way, the message to non-techies will be: we are protecting children, but a bunch of online weirdos and maybe pedophiles don't want us to.
https://en.wikipedia.org/wiki/FBI%E2%80%93Apple_encryption_d...
I am pretty sure all of Big Tech is in collusion among themselves and with various governments, after what Snowden showed us.
https://www.theguardian.com/world/2013/aug/23/nsa-prism-cost...
Everything starts off with "won't anyone think of the children?". Next thing you know, Apple is scanning your photos for faces of known "terrorists", etc.
I have children, and hate CP with a passion, but know that this is not the answer.
At this point, I have no doubt that in the future, a more ambiguous excuse such as "hate speech" will be used and under that umbrella, the elites will have a huge margin for pursuing any kind of "dissidence".
Finding and protecting even a few children from becoming victims of pornography is clearly something that is well worth my not having 100% privacy.
Even knowing that there might be false alarms.
Despite what the strident discourse has been, individual privacy is not some sacrosanct idea that cannot ever be trodden upon. There are some things that are far, far more important than that.
I'm surprised he went ahead with this considering how much privacy goodwill they have built up over the years.
What my SO says is that he's just a sales manager interested in selling their products and nothing else; the "else" is done by others, with marketing and PR and the actual work.
This tweet here gives interesting options to learn more about it https://twitter.com/yoyoel/status/1424154582372872192?s=20
I can't begin to imagine the suffering of the kids behind it, and it's definitely good that we fight it. But why not for older kids?
Apparently this is already happening on all major platforms at the moment. Apple's implementation is the most privacy-friendly one, isn't it?
> Not only searches for known pictures and videos are to be legalised, but also error-prone “artificial intelligence”, for example to automatically search text messages for “luring” of minors. If an algorithm reports a suspected message, message content and customer data could be automatically forwarded to law enforcement agencies and non-governmental organizations worldwide without human examination.
> WhatsApp’s owner, Facebook, has reasons to pounce on Apple for privacy concerns.
>> The idea that parents are safe people for teens to have conversations about sex or sexting with is admirable, but in many cases, not true. (And as far as I can tell, this stuff doesn't just apply to kids under the age for 13.) — Kendra Albert (@KendraSerra) August 5, 2021
>> EFF reports that the iMessage nudity notifications will not go to parents if the kid is between 13-17 but that is not anywhere in the Apple documentation that I can find. https://t.co/Ma1BdyqZfW — Kendra Albert (@KendraSerra) August 6, 2021
I'm against Apple forcing a backdoor onto every device, but this argument falls totally flat to me. Yes, there are shitty parents out there, but despite that, parents still need the ability to parent. If Apple's "think of the children" arguments for their backdoor are wrong, then this "think of the children" argument against it is wrong too. There's nothing wrong with notifying parents that their pre-teen is doing something they shouldn't be doing with their phone.
IMHO, I'd support the existence of such a feature, but only as long as it's a user-installable option, not installed on every phone as part of the OS.
Yet the privacy implications will last forever. Once it's implemented, it only takes a rubber-stamp warrant to compel Apple to scan your device for anything the government deems concerning. In fact, no warrant needed in most countries.
Privacy advocates don't seem to get it. There can not and will not be a future where these people can hide in the net. Either somebody figures a way to catch them without hurting the privacy of the innocent, or we will use systems that hurt the privacy of the innocent.
There is no third option, so put your energy into finding a solution that satisfies the first option if you care about this so much.
Given how ubiquitous cameras are, I can fully imagine more pictures being taken, though.
Guess I'm gonna have to buy a semi terrible privacy focused smart device with phone capabilities in the near future.
- Most instances of "child abuse" involve something that matches the legal term, but involves teenagers and is almost certainly not abuse
- Lots of conservatives want to punish said teens and anyone involved for sexuality and go along with the sophistry of calling people abuse victims when they have consensual sex or post their nude photos online
- Naturally, there is no incentive to look at naked 5 year olds, because that's not how the human body works. This is an edge case and is what the media makes out to be the norm
Stop pretending to have a "mature perspective". Companies should literally never touch your data unless there is a search warrant. Now that I read this article I'm concerned about what WhatsApp is doing.
[0]: https://www.reddit.com/r/apple/comments/p0i9vb/bought_my_fir...
1: https://www.hackerfactor.com/blog/index.php?/archives/929-On...