I think users should be allowed to make the security tradeoffs they consider relevant. Many people leave a key to their house somewhere outside but nearby, yet I don't think the people who build locks should play parent, declare that this is never acceptable, and engineer a solution to the "problem": I would prefer that people be informed about the tradeoffs they are making, but they should be allowed to do what they want. Meanwhile, this enables the people who want more security than "I trust Apple, all of Apple's employees, Apple's security against hostile third parties, and the government under which Apple does business" to go "above and beyond".
> That's why I think calling this a backdoor isn't a fair assessment.
I am using this term because Apple is using this term: they said "They [the FBI] have asked us [Apple] to build a backdoor to the iPhone", when the result would still require brute forcing a passcode to get at the data in question. They make it sound extremely hard, but in fact it is really easy for them to do: it amounts to a single line of code changed. What makes it possible for them is not that they haven't bothered to build it; it is that they are moral enough not to want to do it, and they are the only people with the signing key... but that key, fundamentally, is equivalent to the power the FBI wants. The FBI could "build" this backdoor for themselves if Apple handed them that key.
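To put "still require brute forcing a passcode" in perspective, here is a rough back-of-envelope sketch. The ~80 ms per guess is an assumed per-attempt cost of the on-device key derivation (a figure in the neighborhood of what Apple has described for hardware-entangled passcode derivation), not a measured number; real devices vary:

```python
# Rough estimate of how long an unthrottled passcode brute force takes
# once software retry limits and escalating delays are removed.
# The ~80 ms per guess is an ASSUMED hardware key-derivation cost.

PER_GUESS_SECONDS = 0.08  # assumed cost of one on-device key derivation


def worst_case_hours(digits: int) -> float:
    """Hours to try every numeric passcode of the given length."""
    return (10 ** digits) * PER_GUESS_SECONDS / 3600


for digits in (4, 6):
    print(f"{digits}-digit passcode: up to {worst_case_hours(digits):.1f} hours")
# 4-digit passcode: up to 0.2 hours
# 6-digit passcode: up to 22.2 hours
```

Under these assumptions a 4-digit numeric passcode falls in minutes and even a 6-digit one in about a day, which is why the retry-limit code being disputed, not the encryption itself, is the whole ballgame.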