I’m Mike Danseglio, and I’m an instructor here at Interface Technical Training. I teach IT Security courses, so I’m plugged into a lot of what goes on in the security field and keep abreast of the situation. I tend to write a lot of blog posts about what’s going on in security and how to do different things.
This is a bit of a tangent for me. I don’t usually comment on current events; I usually consider that taboo territory. But I wanted to talk a little bit about what’s going on at the moment with Apple, the US Government, and data privacy.
This week, Tim Cook released an open letter on Apple.com regarding a court’s request that Apple assist in compromising the data security of an iPhone. I’ve been getting a lot of questions about it, a lot of comments, a lot of people asking for opinions. To be very clear, my take separates into three different spaces: Technology, People, and Process.
From a technology perspective, it’s a nonissue. Technology-wise, yes, absolutely, Apple could do what they’re being asked to do. It’s not a question of “if.” It’s more a question of “should.”
In the way Apple implements security on iPhones and other iOS devices, they’re doing a fairly good job of adhering to what we call best practices. They do not have escrow keys. They do not have backup keys on their servers. The keys for the encryption of these devices reside entirely on the device.
That key, optionally, will self‑destruct after a set number of failed attempts. In this case, 10 failed attempts to access that key mean that the device itself will destroy the key, rendering the data unusable. It doesn’t render the device unusable. It renders the key unusable.
That’s good practice. Generally speaking, in cryptography, we’d rather destroy the keys than have unauthorized access to those keys. That’s a great thing. However, that’s not very great when you don’t have access to the keys and you want to have access to that data.
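To make the mechanism concrete, here’s a toy sketch of a self‑destructing key store. This is an illustration of the general idea, not Apple’s implementation; the class name, the 10‑attempt constant, and the use of a random byte string as a stand‑in for the device key are all assumptions for the example.

```python
import os

MAX_ATTEMPTS = 10  # hypothetical limit, mirroring the 10-try policy described above

class KeyStore:
    """Toy model of a key store that destroys its key after too many failures."""

    def __init__(self, pin: str):
        self._pin = pin
        self._key = os.urandom(32)   # stand-in for the device encryption key
        self._failures = 0

    def unlock(self, guess: str):
        """Return the key on a correct PIN; wipe the key after repeated failures."""
        if self._key is None:
            raise RuntimeError("key destroyed; data is unrecoverable")
        if guess == self._pin:
            self._failures = 0
            return self._key
        self._failures += 1
        if self._failures >= MAX_ATTEMPTS:
            self._key = None  # wipe the key, not the data or the device
        return None
```

Note that only the key is erased: the encrypted data is still physically present on the device, but without the key it’s unreadable, which is exactly the trade‑off described above.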
What Apple’s being asked to do is to help circumvent part of the security scheme. They’re not being asked to rewrite cryptography. They’re not being asked to change the bit length, the strength, or the cipher. They’re not being asked for key escrow. What they’re being asked for is unlimited attempts at brute-forcing a key, which is trying every possible key until one works.
On a typical iPhone, you either have a four‑digit PIN, which allows up to 10,000 combinations, or a six‑digit PIN, which allows up to a million combinations.
On a modern computer, how fast could we type those?
If we have an automated device that types these in ‑‑ think of the mechanical arm you see at IKEA pressing down on a mattress, a sort of Rube Goldberg machine pressing on an iPhone: 1‑2‑3‑4, that didn’t work; 1‑2‑3‑5, that didn’t work; and so on ‑‑ that’s a fair attempt. Can it be automated? Absolutely.
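The back‑of‑the‑envelope math for that automated rig is simple. The per‑attempt rate below is an assumption for illustration; a real iPhone also imposes escalating delays after repeated failures, which this sketch deliberately ignores.

```python
# Worst-case brute-force time for a numeric PIN, ignoring lockouts and delays.

def worst_case_hours(pin_digits: int, seconds_per_attempt: float) -> float:
    """Hours needed to try every PIN of the given length."""
    keyspace = 10 ** pin_digits          # 4 digits -> 10,000; 6 digits -> 1,000,000
    return keyspace * seconds_per_attempt / 3600

# At one guess every 80 ms (an assumed rate for an automated rig):
print(f"4-digit: {worst_case_hours(4, 0.08):.2f} h")   # ~0.22 h
print(f"6-digit: {worst_case_hours(6, 0.08):.2f} h")   # ~22.22 h
```

The point is that without the 10‑attempt limit, even a million combinations fall in about a day of automated guessing; the self‑destruct rule is what makes the small keyspace survivable.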
If Tim Cook allows the engineering that makes it possible to brute-force keys ‑‑ trying an unlimited number of combinations on a device until one works, without the device forgetting the keys and locking itself ‑‑ then eventually, anyone who wants to attack that device will succeed.
Is that a good thing or a bad thing? From a technology perspective, it’s just another thing. It doesn’t matter. The tech really doesn’t matter here. What matters is the request and to some degree, the insight behind the request.
To give you a bit of perspective, I go to a lot of security conferences. At some of them, I watch presenters who are defenders, protectors, and security designers. At others, I see hackers and attackers ‑‑ people who literally teach or present sessions while wearing a ski mask or an Anonymous mask, because they don’t want their identity known.
The folks at those conferences who are the bad people ‑‑ the attackers ‑‑ once they know that it’s possible to circumvent a security element like this, they’ll absolutely find a way.
I can assure you that if Apple engineers a way around the device locking ‑‑ the forgetting of the keys ‑‑ what’s going to happen is attackers will find a way to make that happen on any device they want. It won’t be a matter of “If,” it will be a matter of “When.” Will it take a month? Will it take three months? Will it take six months?
That’s not based on conjecture and hearsay. That’s based on years and years of me seeing technology show up and attackers spend 24 hours a day, seven days a week, hip‑deep in assembly code, and debugging code, and devices attached to oscilloscopes, until they figure out, “This is how I can make it happen.”
Today the iPhone doesn’t have that capability. An attacker simply cannot engineer that kind of attack. There’s no capability.
I don’t say this lightly or often. The engineers at Apple did a great job on security by making sure attackers cannot circumvent that 10-attempt limit before the key gets forgotten. They did a fantastic job.
If they intentionally introduce a flaw into that, whether it’s a standalone tool, part of core iOS, or anything like that, attackers will find that vulnerability and be able to exploit it to decrypt and access any iOS device. It’ll be a question of time, and to some degree, money.
You might think, “Attackers? That’s Matthew Broderick in ‘WarGames.’ That’s some kid in the basement. That’s Sandra Bullock clicking on a pie symbol at the bottom of a Web page. That’s not real. That’s not really a big threat.”
First of all, that’s the wrong perception. Today, attackers are usually part of large conglomerates ‑‑ worldwide crime organizations and syndicates ‑‑ that are well‑funded, that send their attackers to training and give them practice so they can do what they do, because there’s profit to be made.
Let’s say I happen to work at Yahoo, I’m competing with Google, I happen to be at a Starbucks between the two campuses, and I find a Google executive’s phone. Wouldn’t I want to be able to give it to some “contractor” who’s also an attacker and say, “Here’s $10,000. Give me the contents of that phone”?
Right now, if it’s an iOS device like an iPhone and the person says, “Here’s some money. Crack that phone,” the answer is, “I can only try it 10 times and then I’m locked out. I don’t really have access to it.” However, if this order is complied with, the answer will be more like, “Yeah. It’ll take a little extra time and a little extra money, but I can do it.”
Industrial espionage is very real. State espionage is very real. Foreign countries routinely conduct information warfare ‑‑ cyber warfare. It happens constantly; we just don’t hear about it very much in the mainstream. It’s not that big of a deal until it is ‑‑ until this kind of thing happens in the real world, and a real device with real, important secrets gets compromised.
As a security practitioner, I try desperately hard over, and over, and over again, to make sure there are no design flaws introduced in security systems.
However, this federal order to Apple indicates to me that an organization that’s done a really good job with security is being asked to design a flaw into its system ‑‑ and not temporarily, because it’s very clear they can’t do anything temporarily.
Once they design this kind of vulnerability, even if they’re assured that this vulnerability will only be used one time, that’s NOT TRUE. That’s complete and utter nonsense. It will never only be done once.
Once a prosecutor says, “Look! This other prosecutor got this data off an iPhone because they went to Apple and said, ‘You need to give it to us,’ and Apple said, ‘OK, here you go.'”
Do you think other prosecutors aren’t going to do that? Do you think other governments aren’t going to go to Apple and say, “You did it for the US Government. You want to sell product here, you treat us equally.”
The European Union, I assure you, will say ‑‑ and rightfully so ‑‑ “If you’re going to do it for one, you’ve got to do it for the other. You sell your products here and there, so you’ve got to treat us equally.” It’s very much a Pandora’s box: once you open it, there’s no closing it.
The other aspect of this is the Process aspect. I usually talk about People, Process, and Technology. The technology, as I’ve already described, is moot. It doesn’t matter; the technology is there. People: I think we’ve talked about that a little bit already ‑‑ Tim Cook versus others.
The Process side of how this gets handled is another important bit.
Apple very clearly says ‑‑ and I believe it, having worked at Microsoft long enough and seen it on that side ‑‑ that they comply with court orders. They comply with warrants. They comply with subpoenas. They do, I’m quite certain that they do. I’m quite certain they worked with law enforcement extensively before this letter showed up ‑‑ before this warrant got served.
I have no doubt of that at all because no one wants terrorists to go free. No one wants pedophiles to be free. We want these people in jail. I’m sure Apple did everything they could. The problem is, once this happens, once Apple is compelled to produce this tool, or this technology, and they do, what’s the process for making sure it doesn’t show up again?
What’s the process for making sure if this Federal Government Agency requested it and it happened, that a state agency can’t request it, or a local agency, or a mom and pop, or a private investigator?
Anyone ‑‑ say a spouse who opens a civil lawsuit against their spouse ‑‑ could say, “I got a judge to sign off on Apple decrypting my spouse’s iPhone. I’m going to steal it from my spouse, put it in a box, and send it to Apple with the judge’s warrant,” and there you go.
Now it escalates, and escalates, and escalates, and we don’t have a real, clear process because we don’t have any process at all for this. We’re making this up as we go along.
I know this is a bit of a rant; however, I think it’s an important rant. With this particular scenario, at this time, whatever decision is made ‑‑ whatever happens here ‑‑ is almost certainly going to go, at least in this country, up to a Supreme Court decision.
The Supreme Court’s going to have to figure out, “What do we do with data on phones? Do we tell companies that are making products to intentionally design flaws such that terrorists, and pedophiles, and other criminals, can be caught at the risk of other folks’ data being at risk from these hackers, or these crime syndicates?”
It’s going to be an interesting year or two while this gets sorted out: while folks figure out what to do, while Apple fights this warrant and this legal proceeding as they’ve clearly said they will, and while law enforcement pushes very hard for this to happen because they want evidence.
To be very clear, I believe in law enforcement. I believe in the process. I believe that law enforcement wants to do the right thing here.
In fact, I believe Tim Cook very clearly conveys in his letter that he knows they’re just asking for data that they believe is important to protect us ‑‑ to put bad people in jail. I get that. Of course that’s noble.
I want to support that, but the potential privacy of all data is too high a price to pay. I think we should have absolute data security, and that’s just how it is, unfortunately.
On the Microsoft side, we have BitLocker and the Encrypting File System. If you lose the keys to those and don’t have a backup, you’re out of luck, because that’s the way we designed it at Microsoft. We have other technologies too: you lose the key, too bad ‑‑ out of luck, because that’s how it’s designed.
The newer iPhones ‑‑ the iPhone 5s and later, with the Secure Enclave ‑‑ use a processor with a dedicated security component that keeps the keys. When it believes the phone itself is being tampered with, it erases the keys and you are out of luck. That’s a good thing in data security. If it means we lose data, then we lose data. We have to make a call somewhere.
I unfortunately, having seen this too many times, know that you can’t do this once. The Pandora’s Box is too dangerous to open. We need to leave it closed. We need to not ever design a security flaw into an already secure product.
Mike Danseglio – CISSP, MCSE, and CEH
Mike Danseglio teaches IT Security Training, Windows, System Center and Windows Server 2012 classes at Interface Technical Training. His classes are available in Phoenix, AZ and online with RemoteLive™.