For Apple, this is a one-line code change that would take them a few minutes. FWIW, there are people in the iOS jailbreaking community who could do this without the source code rather quickly. I'll even go so far as to say that we actually already have all the tools for this lying around for the iPhone 4, and with only minimal changes made by even less qualified engineers they would probably work on the iPhone 5C.
> With a sufficiently complex passphrase, the FBI is still SOL.
Most people use a 4- or 6-digit PIN. One presumes that in this case the user did so (and you can tell, as the UI is different depending on the kind of passphrase used), or the FBI wouldn't be quite so excited to bother here. It takes mere minutes to crack a 4-digit PIN on the iPhone 4.
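To put numbers on "mere minutes", here's my own back-of-the-envelope, assuming the hardware-bound key derivation costs on the order of 80 ms per guess (that figure is an assumption, not a measurement of this device):

```python
# Back-of-the-envelope only: the per-attempt cost is an assumption
# (hardware-bound key derivation on the order of 80 ms per guess).
ATTEMPT_TIME_S = 0.080

pin_space_4 = 10 ** 4            # 0000..9999
pin_space_6 = 10 ** 6            # 000000..999999

worst_case_4_min = pin_space_4 * ATTEMPT_TIME_S / 60
worst_case_6_hr = pin_space_6 * ATTEMPT_TIME_S / 3600

print(f"4-digit worst case: {worst_case_4_min:.1f} minutes")   # ~13.3
print(f"6-digit worst case: {worst_case_6_hr:.1f} hours")      # ~22.2
```

Once the rate-limiting and the wipe counter are out of the way, that tiny keyspace is the only thing standing between the FBI and the data.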
> The fact that only Apple is in a position to sign firmware that could do this is a positive thing in this context. The only alternatives are no firmware signing at all (so everyone could run this attack), no updates at all, or enforcing the rate-limiting in a HSM (which is what they're doing on the latest generation iPhones).
You have conveniently removed "allow the user to lock everyone out from firmware updates except themselves" from the list of possible options :/. While I am perfectly happy with the idea that some people might want to allow Apple to update the firmware on their device, I would much rather no one be able to do that unless they go through me, and as I own the hardware and it is my data that is on the line, I should have the right to make that decision. Apple is selling locks, claiming them to be secure, while not only sitting on a master key but now claiming that it isn't really a master key, which is not just disingenuous but outright dishonest at this point.
Firmware signing and how updates are delivered are one thing. I would argue that having only one possible adversary is preferable to everyone being able to create firmware that runs on your device. If there's a practical and secure approach that would allow users to install only firmware updates they approve of, I'd be all for that[1]. In the end - please correct me if I'm wrong - this would require a user-generated key or passphrase of some sort, and then we're back at a brute-force problem and the question of how secure is that passphrase and how are rate-limits enforced.
The iPhone's disk encryption, however, does not rely on this so-called master key. That's why I think calling this a backdoor isn't a fair assessment. It's entirely reliant on the complexity of your passphrase. The iPhone's security architecture, including the firmware signing and in newer versions the secure enclave, makes attacks against this significantly harder (or next to impossible, if the secure enclave firmware is actually read-only ... something that definitely needs to be clarified). Compare this to your typical desktop full-disk encryption, where you usually have no countermeasures whatsoever against this kind of thing.
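A rough sketch of what I mean by "entirely reliant on the complexity of your passphrase". PBKDF2 here stands in for Apple's actual key derivation, and the UID stand-in and iteration count are my assumptions, not the real design:

```python
import hashlib
import os

# Stand-in for the fused, per-device hardware UID (assumed: not readable
# off-device, which is what forces guessing to happen on the phone).
DEVICE_UID = os.urandom(32)

def derive_key(passcode: str) -> bytes:
    """Derive a data-protection key from passcode + device secret.

    PBKDF2 is a stand-in for Apple's actual KDF; parameters are assumed.
    """
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, 100_000)

# The firmware-signing key never enters this derivation: without the
# passcode, a signed firmware buys you nothing but faster guessing.
print(derive_key("1234") != derive_key("1235"))   # True
```

That's the sense in which the signing key isn't a master key for the data itself; it only controls how fast you can try passcodes.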
[1]: Speaking as a developer. I'm not qualified to answer this for sure, but my gut feeling is that such a feature in the hands of typical end-users might actually be a bad thing for security.
I think users should be allowed to make the security tradeoffs they consider relevant. Many people leave a key to their house somewhere outside but nearby, yet I don't think the people who build locks should play parent, decree that this is never acceptable, and come up with a solution to this "problem": I would prefer people to be informed about the tradeoffs they are making, but they should be allowed to do what they want. Meanwhile, this enables the people who want more security than "I trust Apple, all of Apple's employees, Apple's security from hostile third parties, and the government under which Apple does business" to go "above and beyond".
> That's why I think calling this a backdoor isn't a fair assessment.
I am using this term because Apple is using this term: they said "They [the FBI] have asked us [Apple] to build a backdoor to the iPhone." when the result would still require brute-forcing a passcode to get the data in question. They make it sound extremely hard, but in fact it is really easy for them to do this: it is a single changed line of code; what makes it possible for them to do this is not that they haven't bothered to build it, it is that they are moral enough to not want to do it, and they are the only people with the key... but the key, fundamentally, is equivalent to the power the FBI wants. The FBI could "build" this backdoor for themselves if Apple handed them that key.
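To be concrete about what "a single line" means, here's a toy model of the kind of gate involved. This is my own illustration, not Apple's code, and the delay schedule is made up:

```python
# Assumed escalation schedule: failed attempts -> delay in seconds.
# (Made-up numbers; only the shape of the logic matters here.)
DELAYS = {5: 60, 6: 300, 7: 900, 8: 3600, 9: 3600}

def attempt_delay(failed_attempts: int, limits_enabled: bool = True) -> int:
    """Seconds to wait before the next passcode attempt is accepted."""
    if not limits_enabled:             # <-- the "one line": force this branch
        return 0
    if failed_attempts >= 10:
        raise RuntimeError("device wiped")   # the 10-strikes auto-erase
    return DELAYS.get(failed_attempts, 0)

print(attempt_delay(6))                        # 300
print(attempt_delay(6, limits_enabled=False))  # 0
```

All the FBI is asking for is a build where that one guard is neutered; the cryptography is untouched, which is why the passcode still has to be brute-forced afterwards.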
Then there's even less reason to use All Writs to make Apple do it, unless the point is to set a precedent for forcing device makers to backdoor their products.
Then just do it, for all of us: build that tool for the 5C. But don't support the FBI using this case to set an "All Writs can compel product changes" precedent.
I can build the tool. What I can't do is sign the result. The only thing any of us are missing is the 4096-bit RSA private key used to sign the firmware. The way we load this tool onto the iPhone 4 is via a vulnerability in its bootloader that lets us bypass the signature check. There are only 512 bytes of data in question here, not some insurmountable amount of work.
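For the curious, here's the shape of the problem in miniature: verification is public, signing needs the private exponent, and that exponent is all that separates "anyone" from "only Apple". Textbook RSA with toy numbers and no padding; nothing here resembles the real scheme beyond the math:

```python
import hashlib

# Toy textbook-RSA key. The real signing key is 4096 bits; these tiny
# demo primes exist only to make the arithmetic visible.
p, q = 61, 53
n = p * q                              # public modulus
e = 17                                 # public exponent
d = pow(e, -1, (p - 1) * (q - 1))      # private exponent: the "master key"

firmware = b"boot image without the retry limits"
h = int.from_bytes(hashlib.sha256(firmware).digest(), "big") % n

signature = pow(h, d, n)               # only the private-key holder can do this
verified = pow(signature, e, n) == h   # the bootloader's check: public info only

print(verified)   # True
```

Everything in the verification step is public knowledge; the entire barrier is d. Hand that over and the FBI doesn't need Apple at all, which is exactly the point about the key being equivalent to the power they want.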