Encryption, at its core, is math. This means that encryption, at its core, is boring. Math beyond basic addition and subtraction isn’t something I have to deal with every day, and while I took calculus in high school, that class occurred longer ago than I care to admit.
Like many others, I didn’t think much about security beyond getting frustrated whenever I forgot my passwords. A few years ago, my email was hacked by people who wanted access to my Blizzard account, so I configured two-factor authentication on my email, but that’s as far as it went. In fact, until very recently my social media and financial accounts all shared the same password because I really didn’t think it through.
But encryption is important. Encryption is security, and if you do anything online, it’s not something you can afford to ignore. This is why the Apple case against the FBI is so important. It’s not about just one terrorist’s phone; it’s the question of whether or not you have the right to privacy, because encryption is binary. Either we all have it, or no one does.
What We Know About the Case
The FBI got an order from a judge stating that Apple had to help them in the San Bernardino case. Apple refused, releasing their dissent as a letter to their customers (followed up by a FAQ). The Justice Department responded by calling Apple’s letter a public relations stunt.
Pew Research released a study that found that 53% of Americans supported the FBI in this case, and that support did not split neatly down party lines: 55% of Democrats and 56% of Republicans sided with the FBI.
I am not an expert, but I know this much, and that’s why I spent a lot of time reading the writings of people who are.
The debate about privacy is one we can no longer afford to be ignorant about. We can’t all be experts, but the issue of privacy is far too important to just trust to our gut.
The San Bernardino Attacks
On December 2nd, 2015, Syed Rizwan Farook and his wife, Tashfeen Malik, attended Syed’s work holiday event. At the event, they opened fire with several guns, killing 14 people and seriously injuring 22 others.
The government investigated the killers’ motives and discovered hints that they were “radicalized” and sympathized with Da’esh (ISIS), but as of right now there is no formal link between the couple and the group, and it appears that Syed and Tashfeen acted alone.
Before capture, the terrorists attempted to destroy their home computers, personal phones, and nearly every other device that had digital memory. In the hours after the attack, Syed’s employer came into possession of the phone they issued him for work and, unlike every other device, this iPhone was intact.
Since this phone was the property of Syed’s employer, they gave it voluntarily to the FBI, giving their consent to have Apple turn over all of the data they had stored in “iCloud.” In addition to this, the FBI obtained a warrant for any information not covered by this consent, and Apple complied, giving the government everything they had for Syed’s account.
To prevent information being lost due to theft or a damaged device, iPhones automatically back up information to a user’s iCloud. After going through this data, however, Apple and the FBI discovered that the last backup occurred several weeks before the attacks, which means that any recent messages that could be useful were only available on the device, which was secured by a password.
The FBI Reset Syed’s iCloud Password
One feature of iOS is that it backs up your phone wirelessly. This prevents you from losing important information if you break your phone, or lose it, since you can restore this data to a new device. Apple does this by syncing the data to your iCloud account. When you get a new phone, you link it to the iCloud account and your information is downloaded effortlessly.
One Device, Two Passwords
Your iCloud is something that exists outside of your device. You can access your data from a computer, purchase something through iTunes, or set up an iPad or secondary phone to sync content back and forth. If you have an Android phone, this is similar to what your Google account does.
Since the iPhone launched, however, Apple has allowed users to lock their devices using a password. For convenience, as well as security, they don’t require you to lock the device using your iCloud account, instead allowing you to use something else, such as a four-digit PIN. Since iOS 8, this PIN is also what encrypts the data on your device.
A lot of information is shared between your phone and iCloud, but not everything. This is why someone who has access to your iCloud won’t automatically have everything on your phone. Most importantly, while Apple can access your iCloud data and reset its password, they cannot reset your phone’s PIN because they have no way of knowing what it is.
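To see why a short PIN can still protect a device, here is a minimal sketch of PIN-based key derivation. This is illustrative only: Apple’s actual scheme entangles the passcode with a hardware UID key inside the device, so the real derivation can’t be run anywhere else. The `device_uid` value below is a made-up stand-in for that hardware secret.

```python
import hashlib

def derive_key(pin: str, device_uid: bytes) -> bytes:
    """Stretch a short PIN into a 256-bit encryption key via PBKDF2.
    device_uid plays the role of a per-device secret salt (hypothetical)."""
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), device_uid, 100_000)

key = derive_key("1234", b"example-device-uid")
print(len(key))  # 32 bytes = 256 bits
```

Because the key only exists when the correct PIN is combined with the device’s own secret, knowing everything in iCloud still tells you nothing about the key locking the phone itself.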
The FBI Had Access To Syed’s iCloud
iPhones typically back up to the cloud whenever they’re connected to a trusted wifi network (such as the owner’s home network) and plugged into power. Apple suggested that the FBI take the phone to a network Syed’s phone was likely to connect to, and try to trigger a backup. If they did, Apple could turn that backup, something they can access, over to the FBI.
The government, however, said that this was impossible because prior to contacting Apple, they reset Syed’s iCloud account. When an iPhone connects to a trusted network, it verifies that it has access to an iCloud account and then backs the data up. If you change the iCloud password, it will not automatically update on the phone. If the two passwords don’t match, the backups won’t work.
For its part, the FBI claims that they believe Syed disabled the iCloud backup before committing the terrorist attacks, though they have no direct evidence for it. Even if he did disable the backup, however, it does not change the fact that the FBI reset the iCloud account without consulting Apple.
This shows that either the FBI was not aware of how security on the most popular device in the world worked, or they were and reset the password anyway. Either answer is troubling.
Apple’s Anti-Theft Encryption
Since the FBI cannot access new data from the iCloud, their only alternative is to view it on the phone directly. This is where the court case comes in.
Starting with iOS 8, Apple encrypts all information on a device, requiring that the user sign in with their passcode before loading their data. While someone can choose whatever they’d like as a password, the default is a four-digit numeric PIN.
This encryption makes it harder for criminals or thieves to get at someone’s private information, even if they manage to steal the phone. To make this protection as secure as possible, Apple made sure that the only person who could access this content was someone with the PIN; there is no master key.
Choosing a password of only four numbers might not sound that secure, but even this basic scheme allows 10,000 possible combinations. Some PINs, like 1234, are far more common than others, but the chance of randomly guessing the correct one is still 1 in 10,000. With iPhones, the only person who knows the PIN is the user; not even Apple has this data.
Without that passcode, the only option the FBI says it has is to attempt to “brute force” their way into the device. This means entering one password guess after another, repeatedly, until they find whatever combination Syed used. The easiest way to do this is by hooking the phone up to a computer and entering passwords electronically, which is much faster than going through 10,000 combinations by hand.
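A brute force attack really is this simple in principle. Here is a toy sketch, where `unlock` stands in for the device’s passcode check; it is not how any real tool talks to an iPhone, just an illustration of why a small PIN space falls quickly to automated guessing.

```python
def brute_force_pin(unlock):
    """Try every 4-digit PIN, '0000' through '9999', until one works."""
    for guess in range(10_000):
        pin = f"{guess:04d}"
        if unlock(pin):
            return pin
    return None  # exhausted the whole space without a match

secret = "4821"  # made-up example PIN
found = brute_force_pin(lambda pin: pin == secret)
print(found)  # 4821
```

With no rate limiting, a computer walks through all 10,000 guesses in well under a second, which is exactly why Apple adds the countermeasures described next.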
Apple Limits Brute Force Attempts
But the FBI has a problem: brute force attacks are also a common tactic of thieves, hackers, and governments, so security systems try to prevent them. For example, financial websites will lock an account after a certain number of unsuccessful login attempts, disabling your password until you go through a few steps to re-verify that you’re eligible to log in before you can see your data again.
On the iPhone, Apple attempts to limit brute force attacks in a number of ways:
1) Require Manual Entry: They designed iOS so that you cannot submit passcode guesses electronically. You must enter the passcode on the phone’s screen.
2) Introduce a delay after login attempts: If you enter a password incorrectly, there is a small delay before the system accepts another entry, and the delay grows longer after repeated failed attempts. At first it’s too short for a person to notice, but it makes automated, rapid-fire guessing far more difficult.
3) The Kill Switch: Finally, there is an optional security feature that will completely erase the phone if someone enters the wrong password too many times in a row. This limit on tries makes any password stronger, but it could result in a lot of lost data, which is why users must choose to enable it.
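The three protections above can be sketched together as a toy model. The delay schedule and the ten-attempt wipe limit below are illustrative numbers, not Apple’s exact values, and a real device would enforce the delay in hardware rather than in code like this.

```python
class PasscodeGuard:
    """Toy model of iOS-style brute-force protections (values illustrative)."""
    WIPE_LIMIT = 10
    DELAYS = {5: 60, 6: 300, 7: 900, 8: 900, 9: 3600}  # seconds after Nth failure

    def __init__(self, pin: str, wipe_enabled: bool = True):
        self.pin = pin
        self.failures = 0
        self.wipe_enabled = wipe_enabled  # the optional "kill switch"
        self.wiped = False

    def attempt(self, guess: str) -> bool:
        if self.wiped:
            raise RuntimeError("device erased")
        if guess == self.pin:
            self.failures = 0
            return True
        self.failures += 1
        delay = self.DELAYS.get(self.failures, 0)  # a real device sleeps here
        if self.wipe_enabled and self.failures >= self.WIPE_LIMIT:
            self.wiped = True  # too many misses: erase everything
        return False

guard = PasscodeGuard("4821")
for _ in range(10):
    guard.attempt("0000")
print(guard.wiped)  # True
```

Taken together, the escalating delays and the wipe limit turn a sub-second brute force into one that either takes years or destroys the very data the attacker wants.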
What The FBI Wants
On February 16, the FBI got a judge to order Apple (PDF) to assist them in bypassing the security on the phone. Since the FBI has a well-documented history of opposing Apple’s move toward stronger encryption, they made this order as narrowly focused as possible, increasing the likelihood that they could effectively argue their case.
The government went out of their way in this court order to state that they’re not asking Apple to break the encryption for them, just to make it easier to do. This is important because it shows the argument they’re trying to make.
The idea of a surveillance state isn’t popular, and revelations like those made by Edward Snowden showed just how little privacy citizens had. When Apple, Google, and other companies updated their security, it was partially to protect their customers from criminals, but also to make it harder for government, including our own, to spy on people without having a proper warrant.
If the FBI made their request too broad for this case, like asking Apple to give them software that they could install on any iPhone, or a request for Apple to build something to crack the software itself, there would likely be a public backlash, since the idea of a “master key” is something that is a lot easier for tech companies to argue against.
Instead, they focused on a specific flaw in iOS, and structured their argument around “one phone” needed to fight terrorism. As Americans, we’ve given away a lot of our rights to privacy in the name of combatting terrorism, and the FBI hoped that they could get us to give away more rights because we wouldn’t think of the case as precedent, only in terms of the terrorist acts in California.
iOS Has A Flaw
One flaw in iOS security is that you can install a new version of the operating system on a device without unlocking it, as long as that update comes from Apple. This means that, if you create a new version of the operating system, you can install it without knowing what the password to the device is.
In most cases, this isn’t a problem because iPhones will only recognize software that is “signed” by Apple. The phone has one piece of a very strong encryption key, while Apple has the second piece. The chances of anyone, even the government, figuring out what that key is are remote.
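The accept/reject decision the phone makes can be sketched as follows. This is a conceptual stand-in only: real iOS updates are checked with asymmetric signatures (Apple signs with a private key and every device verifies with a matching public key), whereas the sketch below uses an HMAC with a made-up shared secret purely to show the logic.

```python
import hashlib
import hmac

SIGNING_KEY = b"hypothetical-apple-signing-key"  # stand-in for Apple's key

def sign_update(firmware: bytes) -> bytes:
    """What only Apple can do: produce a valid signature over an update."""
    return hmac.new(SIGNING_KEY, firmware, hashlib.sha256).digest()

def install_allowed(firmware: bytes, signature: bytes) -> bool:
    """What the device does: refuse any image whose signature doesn't verify."""
    expected = hmac.new(SIGNING_KEY, firmware, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

fw = b"ios-update-image"
print(install_allowed(fw, sign_update(fw)))           # True
print(install_allowed(b"tampered", sign_update(fw)))  # False
```

The key point is that the check happens before the user’s passcode is ever involved, which is exactly the flaw the FBI wants to exploit: anything Apple signs, a locked phone will install.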
The FBI wants Apple to create a specialized operating system, sign it with their part of the key, and install it on Syed’s phone.
According to court documents, the FBI made three specific requests for the software:
- Disable The Kill Switch: This will give the FBI unlimited password attempts, preventing the phone from wiping Syed’s personal information if they can’t guess his password in ten tries.
- Remove Delay Between Attempts: Right now, there’s a small delay that keeps you from entering one password after another.
- Allow For Electronic Entry: This would let the FBI enter passwords much faster by having a computer input them electronically.
“Just This One Phone”
The FBI carefully crafted the court order to try to make it seem as small as possible. They said that Apple could design their GovtOS to only work on Syed’s phone by verifying the IMEI, which is a serial number that is unique to each device, almost like a fingerprint.
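Such a one-device restriction amounts to a single comparison at boot, sketched below. The IMEI shown is a made-up placeholder, not the real device’s number, and this is my illustration of the idea, not code from the court order.

```python
# Toy sketch of an IMEI-gated OS: the custom build checks the hardware
# serial number at boot and refuses to run on any other device.
TARGET_IMEI = "000000000000000"  # hypothetical placeholder

def boot_allowed(device_imei: str) -> bool:
    """Return True only on the one device the build was made for."""
    return device_imei == TARGET_IMEI

print(boot_allowed("000000000000000"))  # True
print(boot_allowed("111111111111111"))  # False
```

Apple’s objection follows directly from how thin this gate is: once the weakened OS exists, retargeting it is a matter of changing one value and signing again.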
The government also said that they wouldn’t even need access to the OS. They’d turn the phone over to Apple, who would install the software at their facility. The FBI could then either access the phone remotely, or come on campus to perform the hack, extract the user data, and allow Apple to destroy the operating system.
This apparent magnanimity makes their request seem trivial and small in focus. The FBI is trying to convince the court that they just need access to a single terrorist’s phone, nothing more, nothing less. They’re depending on the public’s ignorance of how the law works to win this case because they know that once they win, they can leverage this case toward their real goal: backdoor access into every device.
In his open letter, Tim Cook said that, while this is technically possible, it sets a dangerous precedent, and that once they perform this task, they’ll have no way to prevent the FBI and other government organizations, or other countries, from forcing them to do the same. The FBI responded with an open letter of their own, saying that Apple’s fears were unfounded. They made the appeal that they couldn’t “look the survivors in the eye” if they didn’t follow every lead.
In the days that followed, it came out that the FBI already had at least 12 other phones (WSJ paywall, summary here) lined up to force Apple to decrypt once they comply with this “just one phone” order.
A Dangerous Precedent
According to the government, this is a case of terrorism. They say that Apple’s unwillingness to help them is a violation of a more than 200-year-old law, known as the All Writs Act, which compels businesses to help in a legal investigation, provided it doesn’t present an “undue burden.”
Apple responded to that charge by filing a motion to dismiss, saying that creating GovtOS would take a team of 10 highly trained employees a minimum of 4 weeks, with possibly more time afterward spent dismantling the project to purge the hacked code from their systems. Even then, it still leaves ten people who know how to crack their security.
Apple is taking the long view, arguing that creating this hack once will make it permissible for other governments and government organizations to force them to create it again in the future. Furthermore, they say that backdoors like this are always abused, and always make it into the hands of individuals or organizations that do not have the best interests of their customers at heart.
In my next post, A Question Of Precedent, I’ll try to break down their precedent argument, along with the history of encryption vs. security in our country over the past few decades. This case is not about fighting terrorism, and it is not about cracking just a single phone. It’s about whether or not you as a citizen have the right to privacy. Encryption is binary. You either have it or you don’t, and there is NO way your data can be “secure” if someone other than you has a key.