Here is a question that keeps me up at night…
Is the San Bernardino iPhone just locked or is it properly encrypted?
Isn’t full encryption beyond the reach of forensic investigators? So we come to the real question: If critical data on the San Bernardino iPhone is properly encrypted, and if the Islamic terrorist who shot innocent Americans used a good password, then what is it that the FBI thinks that Apple can do to help crack this phone? Doesn’t good encryption thwart forensic analysis, even by the FBI and the maker of the phone?
In the case of Syed Rizwan Farook’s iPhone, the FBI doesn’t know if the shooter used a long and sufficiently unobvious password. They plan to try a rapid-fire dictionary attack and other predictive algorithms to deduce the password. But the content of the iPhone is protected by a closely coupled hardware feature that will disable the phone and even erase memory if it detects multiple attempts with the wrong password. The FBI wants Apple to help them defeat this hardware sentry, so that they can launch a brute force hack, trying thousands of passwords each second. Without Apple’s help, the crack detection hardware could automatically erase incriminating evidence, leaving investigators in the dark.
Mitch Vogel is an Apple expert. As a former police officer who has also worked with Apple, he succinctly explains the current standoff between FBI investigators and Apple.
The iPhone that the FBI has is locked with a passcode and encrypted. If it were just locked with a passcode, like most iPhones, then the passcode could be bypassed and removed to gain entry into the phone. However, the iPhone in question is encrypted, and this makes things somewhat more complicated. It can only be decrypted with the unique code. Not even Apple has that code or can decrypt it. Unlike what you see in the movies, it’s not possible for a really skilled hacker to break through it with enough motivation. Encryption really is that secure, and it’s really impossible to break without the passcode.
What the FBI wants to do is brute force the passcode by trying every possible combination until they guess the right one. However, to prevent malicious people from using this exact technique, there is a security feature that erases the iPhone after 10 attempts or locks it for incrementally increasing time periods with each attempt. There is no way for the FBI (or Apple) to know if the feature that erases the iPhone after 10 tries is enabled or not, so they don’t even want to try and risk it.
So the FBI wants Apple to remove that restriction. That is reasonable. They should, if it is possible to do so without undue burden. The FBI should hand over the iPhone to Apple and Apple should help them to crack it.
However, this isn’t what the court order is asking Apple to do. The FBI wants Apple to create software that disables this security feature on any iPhone and give it to them. Even if it’s possible for this software to exist, it’s not right for the FBI to have it in their possession. They should have to file a court order every single time they use it. The FBI is definitely using this situation as an opportunity to create a precedent that gives it carte blanche to get into any iPhone without due process.
So the answer to your question is that yes it is that secure and yes, it’s a ploy by the FBI. Whether it’s actually possible for Apple to help or not is one question and whether they should is another. Either way, the FBI should not have that software.
The FBI is being lazy. Extract the flash memory. Extract whatever chip contains the unique code for encryption. Make duplicates of the flash. Build an FPGA or similar to replace the unique-code chip. Make test jigs. Try the million passcodes, letting it time out if it in fact does this. Keep trying until you are in.
The iPhone’s encryption cannot be broken, and a brute force attack would be thwarted after 10 failed login attempts, as you stated. The workaround is that the phone’s software security and encryption are components of the operating system. The FBI wants Apple to build a customized iOS update just for this phone that disables or overrides the iOS functionality that either enforces the user’s password altogether or imposes the limitation of 10 failed logins. Basically, Apple would eliminate the security functionality from the customized iOS build, just as it adds and removes operating system features all the time. The customized iOS update could be delivered via WiFi in the normal way, but to this phone only and from an update server on a closed internal network at an Apple facility. This would be a hyper-sophisticated rootkit attack.
I wonder why the FBI cannot extract the flash memory from the circuit board and then duplicate the data into as many high-speed cracker rigs as they wish. Each rig can run tens of thousands of password tests each second—without triggering any peripheral security processes.
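The arithmetic behind this idea is striking. Assuming, hypothetically, a rig that tests 50,000 passcodes per second against a duplicated flash image (the rate and rig are assumptions, not known FBI capabilities):

```python
# Back-of-the-envelope: time to exhaust the passcode space offline,
# assuming tens of thousands of guesses per second per rig and no
# lockout or erase logic in the way.
GUESSES_PER_SECOND = 50_000  # assumed rig speed

for digits in (4, 6):
    keyspace = 10 ** digits
    seconds = keyspace / GUESSES_PER_SECOND
    print(f"{digits}-digit passcode: {keyspace:,} guesses, "
          f"worst case {seconds:.1f} s on one rig")

# Caveat: this only works if the encryption key is derivable from the
# passcode and extracted data alone. If the key is entangled with a
# non-extractable hardware secret, offline guessing gets you nothing.
```

Even a 6-digit passcode falls in seconds once the guessing can happen off the phone, which is exactly why the on-device throttling hardware matters so much.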
If the phone contained an encryption key protected in hardware by TPM/TXT (Trusted Platform Module / Trusted Execution Technology), then the circuit may be protected in ways that defy extraction. But I am fairly certain that this is not the case. Some notebook PCs contain a TPM, but I think that only Samsung has experimented with one in a mobile phone…
And so, my best guess is that Apple creates a very long password from the short pass phrase selected by a user. Perhaps the reason that the FBI needs Apple’s help (and the reason that they wish to crack the password within the phone) is because they do not know the algorithm for generating a password from a pass phrase.
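The standard technique for stretching a short passphrase into a long key is a key-derivation function such as PBKDF2. Whether Apple uses PBKDF2, and what salt or iteration count it would use, are assumptions here; the sketch only illustrates the kind of derivation the paragraph above is guessing at.

```python
import hashlib

# Hypothetical illustration of passphrase-to-key stretching. Apple's
# actual derivation (which reportedly also mixes in device-unique
# hardware data) is not public; PBKDF2 just shows the general idea.
passcode = b"1234"            # short user secret
salt = b"per-device-salt"     # stand-in for device-unique data
key = hashlib.pbkdf2_hmac("sha256", passcode, salt, iterations=100_000)
print(key.hex())              # 256-bit derived key
```

The iteration count is the point: each guess costs the attacker 100,000 hash operations, so even knowing the algorithm, brute force slows by that factor. Not knowing the algorithm or salt at all, as speculated above, would stop an outside attacker entirely.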
Currently, the media is filled with speculation, journalists who don’t understand encryption, and a profound lack of informed pundits. If I can glean some authoritative data on the Apple password architecture and the government demands for assistance, I will add an update to the original article.
FBI Director James Comey said: “There are a significant number of criminals and terrorists that use WhatsApp, and that’s a problem.”
[This statement was made several weeks after the above WildDuck article. The FBI had contracted with a 3rd party to crack the Apple security layer that prevented extracting the password].
I’ll admit I’m not an IT professional, but since cloud-based messaging services like WhatsApp aren’t tied to the operating system on an individual device, but instead store all messages on the host company’s servers, isn’t that a different, & substantially simpler, issue, both technologically & legally? I’d think the host company would have the ability to get through the encryption on its servers upon presentation of a properly executed search warrant, & would be able to deliver the specific information required by the warrant without compromising the security of other users’ information, right?
The answer is “No”. If WhatsApp deploys end-to-end encryption, they cannot offer a plain text copy in response to a legal writ, summons, judicial order or national security letter. That’s because they don’t have a decryption key.
The term “end-to-end encryption” means that the only decryption keys are in the hands of the customer. The data is encrypted from the keyboard to the remote recipient. In the case of a storage device, data remains encrypted until it is pulled back by the content originator.
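A toy cipher makes the point concrete (this is an illustrative construction, not a real cipher; production messengers use vetted primitives such as AES or ChaCha20 via the Signal protocol). What matters is that the relay server only ever handles ciphertext it cannot invert:

```python
import hashlib

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher: XOR with a SHA-256 counter-mode keystream.
    Illustrative only; do not use for real secrecy."""
    out = bytearray()
    for offset in range(0, len(data), 32):
        pad = hashlib.sha256(key + offset.to_bytes(8, "big")).digest()
        chunk = data[offset:offset + 32]
        out.extend(b ^ p for b, p in zip(chunk, pad))
    return bytes(out)

shared_key = b"known only to the two endpoints"
plaintext = b"meet at noon"
ciphertext = keystream_xor(shared_key, plaintext)  # all the server sees

# A warrant served on the provider can only yield this gibberish:
assert ciphertext != plaintext
# Only an endpoint holding the key can recover the message:
assert keystream_xor(shared_key, ciphertext) == plaintext
```

Because the key lives only on the two endpoints, there is nothing on the server for the provider to decrypt, no matter what the court order says.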
And, after all, isn’t that the way it should be? Of course, it is.
OK, as I said, not an IT professional, clearly I didn’t understand the definition of “end-to-end encryption.” That answers my question: it’s just as complicated, technologically &, I suppose, legally (not a lawyer either).
Regarding your closing question, though you’ve been gracious enough to provide an answer, I nevertheless offer my own, but I admit that unlike you I lack the certainty that my answer is absolutely correct in all cases:
In the case of communications such as those on a messaging service like WhatsApp, I think they should be designed with some way to comply with a search warrant to provide for disclosure of limited, specific information of the sort that might have previously been kept in a locked file cabinet or similar physical location, without damaging the security of all the rest of the information stored in that manner. Information on a messaging service is generally not as broad as what people are likely to store on a phone.
That’s a great sentiment, Leonard. But how would you resolve these two niggling issues:
1. How can you leave a device or system open to legal process and yet lock it sufficiently from hackers, competitors, an ex-spouse, the tax man, a stalker, etc? Any one of these interlopers might have just as much incentive, tools and training as the government agent that is tasked with hacking data from your phone or eavesdropping on a private conversation.
2. Perhaps more to the point: How can you protect a back-door system when the interests of you and your government are no longer in alignment? That is, imagine a day when having a Jewish ancestor is a crime, or loving someone of the same gender, or writing a poem that is critical of a favored official, or failing to memorize the Qur’an. One of these was illegal in Nazi Germany, another is illegal today in ISIS-controlled Iraq, and another was illegal in southern US states just a few decades ago.
Morals, norms and governments are transient. Yet, intolerance is eternal and it is everywhere. Many of your neighbors expect you to be just like them — or more accurately, they want you to be just like they pretend to be. Encryption restores order and keeps a government on its toes. It is far more likely to protect civilians than enable terrorism. There are still plenty of effective, old fashioned ways to discover and track criminal activity. We needn’t lose our freedoms and turn everyone into a suspect to recover our safety and security.