https://support.apple.com/en-us/HT202303
End-to-end encrypted data:
- Apple Card transactions (requires iOS 12.4 or later)
- Home data
- Health data (requires iOS 12 or later)
- iCloud Keychain (includes all of your saved accounts and passwords)
- Maps Favorites, Collections and search history (requires iOS 13 or later)
- Memoji (requires iOS 12.1 or later)
- Payment information
- QuickType Keyboard learned vocabulary (requires iOS 11 or later)
- Safari History and iCloud Tabs (requires iOS 13 or later)
- Screen Time
- Siri information
- Wi-Fi passwords
- W1 and H1 Bluetooth keys (requires iOS 13 or later)
They can claim that the device is secure and always encrypted, that all the messaging is encrypted, and that they don't collect user data. This is all true (I assume, but have not audited).
If you care about security, all you have to do is turn off iCloud backup, and everything is secure. If you don't care, well then you have a great feature.
They upload plain-text versions of messages, etc. to iCloud, so if law enforcement asks, they can still comply and hand over the juicy data. They don't need to backdoor the iPhone for the government, which was a major PR issue a few years ago.
No, each conversation has at least two endpoints, and it's unlikely that the people you iMessage with have disabled iCloud Backup.
It's sort of like switching away from Gmail to keep Google from reading your correspondence: they'll get it from the mailboxes of the people you correspond with who still use Gmail (so, everyone).
https://www.reuters.com/article/us-apple-fbi-icloud-exclusiv...
Apple provided user data on over 30,000 users in 2019 to the US federal government without a warrant or probable cause, per Apple's own transparency report (see FISA orders). All the feds have to do is order the data from Apple, and they get all of it, on anyone they like.
You're going to be waiting a long time; it's a design goal for Apple (and by extension the feds) to be able to read your every stored text, iMessage, and iMessage attachment out of your device backup without your consent/knowledge.
It's not really that different from the situation in China, where Apple provides the same sort of backdoors to the CCP to be able to sell devices there. (There, the CCP requires that it be physically stored on state-owned and state-operated hardware, as I understand it.)
Do you not know a FISA order is a court order?
1) The vast majority of Apple's users care more about getting their data back than they do E2E encryption. Encrypting backups does introduce failure modes that put more burden on the user (to have an emergency key, etc). Apple also cares deeply about things "just working", and so this is a space that was always going to be incredibly difficult to balance.
2) The FBI thing is also true. Given that Apple's former plans for true E2E encryption somewhat gave way to what they have now, with little explanation, it's hard not to speculate that they backed away from the original initiative after some...involvement...from the feds.
Some sort of “checked C” in iBoot: https://support.apple.com/guide/security/memory-safe-iboot-i...
Data is encrypted with your security policy, so if that changes (e.g. you disable SIP) it doesn’t expose it: https://support.apple.com/guide/security/sealed-key-protecti...
Details on what the SRD is and how it works: https://support.apple.com/guide/security/apple-security-rese...
Frankly, I'd like to see them go even further and put in place a policy that all user-created-and-consumable content can only leave the device in end-to-end encrypted format and have those keys managed by my AppleID so not even Apple can decrypt.
They can introduce it at an API level without having to dictate storage providers. If a web version of an app needs to show my photos, it can let the end user's browser decrypt them. This works for private data, and for 1:1 and 1:many shared data.
I should have a choice with who hosts my encrypted data, who manages my keys/identity and who provides a service that uses that data. Let's get back to providing value through services and away from leaching value through hoarding data and controlling protocols.
Yes, this will force companies to change their business models if they rely on access to my data. Will it make for better software? Yes, hands down. More companies can compete, and we'll start to see more creative solutions.
0: https://www.macrumors.com/2021/02/01/iphone-apple-watch-unlo...
- Face ID detects a mask
- Your Apple Watch is nearby
- Your Apple Watch is on your wrist
- Your Apple Watch is unlocked
- Your Apple Watch has a passcode enabled
https://9to5mac.com/2021/02/04/iphone-face-id-unlock-apple-w...
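The conditions above amount to a simple conjunction: every one must hold before the watch-assisted unlock path is taken. A trivial sketch with hypothetical names:

```go
// Sketch: the mask-unlock check is an all-of-these conjunction.
package main

import "fmt"

type unlockContext struct {
	faceIDDetectsMask bool
	watchNearby       bool
	watchOnWrist      bool
	watchUnlocked     bool
	watchHasPasscode  bool
}

// canUnlockWithWatch reports whether every precondition is met.
func canUnlockWithWatch(c unlockContext) bool {
	return c.faceIDDetectsMask &&
		c.watchNearby &&
		c.watchOnWrist &&
		c.watchUnlocked &&
		c.watchHasPasscode
}

func main() {
	c := unlockContext{true, true, true, true, true}
	fmt.Println(canUnlockWithWatch(c)) // prints "true"
	c.watchOnWrist = false             // take the watch off...
	fmt.Println(canUnlockWithWatch(c)) // prints "false"
}
```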
0: https://support.apple.com/guide/security/apple-security-rese...
From what a sales/dev person for a SaaS MDM app for macOS told me, the M1s do not have a lock-device feature. You can only wipe the device.
If an employee was terminated, we could remotely send a lock command with a numeric code. The only way to remove the lock is to get the code from us or have Apple reset it in person. For the in-person visit, you have to prove you're the owner or have authorization from the company for Apple to unlock it.
My only option now is to wipe it. So now I have to find a cloud backup provider to back these devices up in case I need an important file from an employee who decides to go rogue.
As for Alexa, it might take a totally different approach: the ability to find the devices on your network, maybe combined with Bluetooth beacons.
Fortunately, you need to install the full app to read this information, unlike a Facebook, Twitter, or Google Analytics library (framework), which can track you across other apps that embed the same library.
As for the second one, with iOS 14 Apple shows a privacy prompt before an app can connect to other devices on your network, and you can simply deny it.
Detecting the Alexa app on the device used to be possible, but these days it would not go unnoticed by Apple without some coordination between Amazon and Spotify.
I don't really know why anyone would take Apple's hardware security claims at face value after this.
edit: more links, though they're all pretty similar.
https://www.wired.com/story/apple-t2-chip-unfixable-flaw-jai...
https://appleinsider.com/articles/20/10/05/apples-mac-t2-chi...
https://www.zdnet.com/article/hackers-claim-they-can-now-jai...
https://www.theregister.com/2020/10/08/apple_t2_security_chi...
edit 2:
If this is wrong, I'd like to know the truth! Really! Was it a hoax? Is there a patch? What happened?