The Apple and FBI affair over the San Bernardino killer's iPhone has blown up, and there's a lot of confusing coverage going around. For today, I'm going to try to clarify the bigger picture. My intention is not to write this as an FAQ or a polemic, because a lot of the arguments going on right now come down to semantics. For instance, Apple says the FBI is asking it to install a back door onto the phone. I would describe it differently: the phone has a security flaw (an unintended back door) that the FBI wants to exploit, and it wants Apple to facilitate exploiting it directly on the phone.
The Puzzle Box
iPhones are encrypted with AES encryption, a fairly robust standard that is for all intents and purposes impossible to crack (a brute-force hack attempt would take eons), but only so long as it’s backed by a strong password (a password that is long and difficult to guess).
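To put rough numbers to the "eons" claim, here's a back-of-envelope sketch (the guess rate is my assumption, chosen very generously):

```python
# Back-of-envelope arithmetic: brute-forcing a random 256-bit AES key
# versus guessing a short, lazy password.
GUESSES_PER_SECOND = 1e12  # assumption: a trillion guesses per second

aes_keyspace = 2 ** 256                 # every possible 256-bit key
seconds_per_year = 60 * 60 * 24 * 365

years_for_full_keyspace = aes_keyspace / GUESSES_PER_SECOND / seconds_per_year
print(f"{years_for_full_keyspace:.1e} years")   # astronomically many years

# An 8-character lowercase password, by contrast:
weak_keyspace = 26 ** 8
print(weak_keyspace / GUESSES_PER_SECOND, "seconds")  # a fraction of a second
```

Even granting an absurdly powerful attacker, the full AES keyspace is out of reach; the weak password falls instantly. That asymmetry is the whole story below.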
Apple realized that most iPhone users don’t like to meddle with strong passwords for their phones, so they tried to make a system that would be (more) secure with easy passwords (specifically, a four-to-ten digit PIN). To do this, the PIN is hashed with a unique phone identification code* to generate an AES key that is robust enough to keep the encryption secure.
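A minimal sketch of that kind of scheme, assuming PBKDF2 as the stretching function (Apple's actual derivation runs inside dedicated hardware and differs in its details; the UID value and iteration count here are made up):

```python
import hashlib

# Hypothetical per-device secret ("unique phone identification code").
DEVICE_UID = bytes.fromhex("00112233445566778899aabbccddeeff")

def derive_key(pin: str, device_uid: bytes) -> bytes:
    # Stretch the short PIN together with the device secret into
    # full-strength 256-bit key material. PBKDF2 makes each derivation
    # deliberately expensive; the UID acts as a salt that is supposed
    # to never leave the phone.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), device_uid, 100_000)

key = derive_key("1234", DEVICE_UID)
print(len(key) * 8, "bit AES key from a 4-digit PIN")  # 256
```

The security of the result rests entirely on the attacker not knowing `DEVICE_UID`, which is the hinge of the next paragraph.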
But this makes for another flaw: if you can access the unique phone identification code, cracking the phone becomes a matter of brute-forcing the PIN, which is easy-peasy for a professional cryptanalysis team with a robust mainframe (yes, the FBI cybercrime division has such a team, and the DHS and NSA are each probably happy to lend theirs).
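A sketch of why this becomes easy once the device code leaks, with made-up parameters (a real attack would target the phone's actual derivation function):

```python
import hashlib

# Hypothetical: the attacker has extracted the device's unique code.
DEVICE_UID = bytes.fromhex("00112233445566778899aabbccddeeff")

def derive_key(pin: str) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), DEVICE_UID, 1_000)

target_key = derive_key("0042")  # the key that actually unlocks the phone

# A four-digit PIN leaves only 10,000 candidate keys to try:
recovered = next(
    pin for pin in (f"{n:04d}" for n in range(10_000))
    if derive_key(pin) == target_key
)
print("PIN recovered:", recovered)  # prints "PIN recovered: 0042"
```

Ten thousand derivations is nothing; even a laptop finishes in moments, never mind a mainframe.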
Apple slowed this down by making the phone forget its unique phone identification code after some number of failed PIN guesses (I've heard both ten and twenty). Also, each failed guess is followed by a mandatory waiting period before the next guess is allowed, and these periods increase with subsequent guesses (after guess nine, the phone waits an hour before allowing guess ten).
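Those retry rules can be sketched as a toy state machine (the delay schedule is illustrative, not Apple's exact one, and `sleep_fn` is injectable purely so the model can be exercised without real waiting):

```python
import time

# Illustrative delay, in seconds, applied after each consecutive failure.
RETRY_DELAYS = [0, 0, 0, 60, 300, 900, 3600, 3600, 3600]
MAX_FAILURES = 10  # after this many, the device forgets its unique code

class LockedPhone:
    def __init__(self, pin, sleep_fn=time.sleep):
        self._pin = pin
        self._failures = 0
        self.wiped = False
        self._sleep = sleep_fn

    def try_pin(self, guess):
        if self.wiped:
            raise RuntimeError("unique device code erased; data unrecoverable")
        if self._failures > 0:
            # Mandatory wait, growing with each consecutive failure.
            self._sleep(RETRY_DELAYS[min(self._failures - 1, len(RETRY_DELAYS) - 1)])
        if guess == self._pin:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= MAX_FAILURES:
            self.wiped = True  # forget the unique device code
        return False
```

With these rules intact, the 10,000-candidate brute force above is hopeless: the attacker gets ten slow guesses, then nothing.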
But — yet another flaw — the phone will accept updates if correctly digitally signed by Apple Inc. Which means Apple has the capability of changing these rules. They can make it so that the phone won’t wait at all between PIN guesses, and won’t forget the unique phone identification code after ten guesses, or a hundred, or a million.
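A toy model of that trust relationship (real iPhones verify an asymmetric signature against an Apple public key; an HMAC stands in here so the sketch stays self-contained, and the key is made up):

```python
import hashlib
import hmac

# Hypothetical stand-in for Apple's firmware-signing key.
APPLE_SIGNING_KEY = b"hypothetical-apple-signing-secret"

def apple_sign(firmware: bytes) -> bytes:
    return hmac.new(APPLE_SIGNING_KEY, firmware, hashlib.sha256).digest()

def phone_install(firmware: bytes, signature: bytes) -> str:
    if not hmac.compare_digest(apple_sign(firmware), signature):
        return "rejected: signature invalid"
    # The phone trusts anything carrying a valid signature, including an
    # update that removes the retry delays and the ten-guess wipe.
    return "installed"

update = b"firmware without PIN retry limits"
print(phone_install(update, apple_sign(update)))  # prints "installed"
print(phone_install(update, b"\x00" * 32))        # prints "rejected: signature invalid"
```

The update mechanism checks who signed the code, not what the code does, which is exactly why Apple's signature alone can dismantle the lockout rules.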
And that is what the FBI wants Apple to do. And if Apple won't do it voluntarily, the government wants the court to compel Apple to do it.
By my understanding of the terminology, I'd call this a security flaw. If it is a witting one (per Apple's effort to compromise between security and ease of use), then it qualifies as a back door already in place. One cannot usually install a back door into encrypted data without first decrypting the data and then re-encrypting it under the new, backdoored scheme. But this means Apple wouldn't be writing a back door for the Feds; they would be opening the back door that was already installed. If we really want to give Apple the benefit of the doubt, they would be helping the Feds exploit a security flaw that Apple failed to fix.
And yes, back doors don't necessarily have to provide a perfect way in; they can just present a face that's easy to crack (like a heist team undermining a strongroom).**
The Court Challenge
Apple's Tim Cook, in my version of the universe, would simply admit, "Look, we know that people are often lazy about passwords, and we wanted to give them a more secure option than a poorly secured phone, hence our fancy-yet-flawed key generator." Apple users would nod, accept having to choose between easy passwords and strong security, and Apple would facilitate that choice (and in my universe, Apple would also open its phones to alternative software sources and let iPhone users play The Binding of Isaac). But Cook figures that people won't understand the compromise and will blame Apple Inc. for flawed phone security, so he's trying to direct the blame towards the FBI.
According to Cook: "Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone's physical possession."
From what I understand, this would be all done on Apple grounds, with code that only works on one phone, and the Feds don’t get to take the code with them (for now).
The pretense under which the FBI is asking this is, "Apple, you've helped us out before. What, are you going un-American?" Yes, the FBI and the Department of Justice are willing to say, at the same time, "just this once" and "but you've helped us before," which makes them not exactly trustworthy.
This is the current conversation. I mentioned yesterday that it's a bad thing this can be done in the first place. It means that a) courts, knowing it can be done, will sooner or later demand that it be done, and b) hackers (soldiers of the Chinese hacker army or otherwise) will from this point forth work unceasingly to do it, provided that cracking open iPhones can serve as part of a lucrative business model (illegal or otherwise).
As so many other historical security flaws and backdoors have taught us, they will be exploited until sealed.
The Unspoken Option
Snowden has mentioned at least once that all this hyperbole from law enforcement (the FBI, NSA, and DHS specifically) about "going dark" is a smoke screen, and that they have (or are developing) ways to get into encrypted phones. If getting into this iPhone were critical (say, we expected to uncover more cell elements), the FBI wouldn't be asking Apple to write software; rather, a team of engineers would be pulling the phone apart, extracting the flash memory (including the unique phone identification code), and reverse-engineering the key generator themselves.
I would expect the FBI is already doing this, and that the court fight over getting Apple to cooperate is a) maybe a way to speed things up, b) a pretense to imply the Feds are incapable of pulling apart insecure iPhones (ergo implying iPhones are more secure than they are), and c) an effort to create legal precedent for the convention that tech companies can be forced to facilitate agency exploitation of civilian technology, and to push for weaker end-user security (which may or may not include encryption hobbled by mandate).
And the UFO Conspiracy Theory was likely a ploy to discredit those who have figured out real conspiracies.
* I assume the unique phone identification code is randomly generated when the phone is first booted. If not, this is an additional security flaw, since the code might then be derived externally.
** Early international ("export-grade") versions of the Secure Sockets Layer were hobbled to an effective key strength of 40 bits, so while the traffic was encrypted, it was easy to crack (by professional cryptanalysis teams with big mainframes). Yes, big government has been sabotaging secure communications for a long time, and companies have been letting it. Only post-Snowden have we started taking secure civilian communications seriously, and only against great resistance.