Point/Counterpoint: Apple vs. FBI

The stage is set for a showdown between Apple Inc. and the federal government arising out of the San Bernardino shooting case.  The arguments follow the well-worn tracks of the national security/law enforcement vs. digital data security & privacy debate.  In this article, we will provide an overview of the claims and issues in the case, and attempt to take a clear-headed look at the merits of the claims on both sides, dispensing with (or at least identifying) the distortions and hype.

Here’s a recap:  On Feb. 16, 2016, a federal magistrate judge ordered Apple to help unlock the iPhone 5C of Syed Rizwan Farook, one of the two San Bernardino shooters of Dec. 2, 2015 (see the order of Magistrate Judge Sheri Pym of the U.S. District Court for the Central District of California (pdf); and the motion of the government requesting the San Bernardino order, which includes a lengthy technical and legal memorandum and a technical affidavit in support).   Apple is opposing this order (NYTimes); this followed a failed month-long, behind-the-scenes negotiation between Apple and the government seeking a deal to unlock the phone.  Apple has not filed a brief yet, but it has certainly gone to the court of public opinion.

POINT (by Aaron Krowne)

Apple is claiming that the government is “asking it to create a backdoor” into the iPhone’s security; to wit (from the NYTimes article):

Mr. Cook, the chief executive at Apple, responded Wednesday morning with a blistering, 1,100-word letter to Apple customers, warning of the “chilling” breach of privacy posed by the government’s demands. He maintained that the order would effectively require it to create a “backdoor” to get around its own safeguards, and Apple vowed to appeal the ruling by next week.

“The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe,” Mr. Cook said.

Apple argues that the software the F.B.I. wants it to create does not exist. But technologists say the company can do it.

I have to hand it to Cook, Apple (and organizational defenders of digital consumer privacy, such as the EFF): they have taken this opportunity to make a strong stand for the principles of popular data privacy.  The rhetoric is quite soaring–perhaps a bit into thin air.  And I say this as a privacy buff and advocate of strong individual data privacy rights.

To summarize, Apple claims that (1) following the court’s order would require “creating a backdoor”; (2) Apple cannot technically comply (i.e. that there’s force to their statement that “the software does not exist”); and (3) there would be a “chilling effect” if Apple complied.

Taking the last point first, I think it’s important to note that there is a valid warrant.  That’s not in dispute.  Further, the phone actually belonged to Farook’s employer, San Bernardino County, and he had signed a release granting them the right to search it.  The County, of course, has given such consent; so this is not even a case of the owner or custodian of a device opposing its forcible unlocking.  Thus, Apple’s “chilling effects” claim should be properly placed in this limited context (see more in the COUNTERPOINT, below).  That doesn’t mean it’s insubstantial–but we’re not talking about unlocking Edward Snowden’s iPhone here.

The broader issue and the putative “chilling effect” come down to the argument that if Apple helps the government in this case, it will open a Pandora’s box of repeated applications of the same procedure–not only to “Edward Snowden’s iPhone,” but to yours, mine, and those of all other individuals who don’t face any cognizable criminal charges.

That concern, of course, isn’t baseless.   But it does seem like the rhetoric around the “chilling” point is very overblown.

Here is where scrutiny of Apple’s first claim — that complying consists of “creating a backdoor” — comes into play.

The technical roadblock in the case is that the security system of the iPhone 5C (iOS 8) lets the user set a device passcode, from which an encryption key is derived; that key is combined with device-specific keys to doubly protect all data on the device (thus, removing the storage memory and attempting to access it elsewhere won’t work even with the passcode; nor will leaving the memory in place and attempting to access it without the passcode–you need both.  See this report for a great technical overview).  Further, the user can enable a security option that completely wipes the device after 10 unsuccessful login attempts, and it appears this was enabled in this case (even the possibility that he enabled it is enough to dissuade guessing).  Even a 4-digit passcode could take up to 10,000 attempts to guess.
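
To make that entanglement concrete, here is a minimal, purely illustrative sketch (not Apple’s actual implementation; the key-derivation function, iteration count, and handling of the device secret are all assumptions) of why the passcode must be brute-forced on the device itself:

```python
# Purely illustrative sketch -- not Apple's implementation.  The KDF, iteration
# count, and handling of the device secret are all assumptions; the structural
# point is that the derived key depends on BOTH the passcode and a secret fused
# into the device, so guessing must happen on the device itself.
import hashlib
import itertools

DEVICE_UID = b"hypothetical-secret-fused-into-this-one-device"  # never leaves the hardware

def derive_key(passcode: str) -> bytes:
    # Entangle the passcode with the device-unique secret.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, 100_000)

def brute_force_4_digit(target_key: bytes) -> str | None:
    # At most 10,000 guesses for a 4-digit passcode -- but only if this loop can
    # run on the device, and only if the device doesn't wipe its keys after 10
    # failed attempts.
    for digits in itertools.product("0123456789", repeat=4):
        guess = "".join(digits)
        if derive_key(guess) == target_key:
            return guess
    return None
```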

This is what the FBI would like bypassed–they want unlimited attempts, and/or elimination of the data-wiping function.  Apple says that this would constitute a “backdoor.”

However, looking at the technical background of iOS 8’s encryption system, it turns out that there is a glaring gap in the security scheme: Apple can, at any time, replace or bypass the core iOS “firmware” on these devices.  All it needs to do is provide a valid cryptographic signature with the new code (which means even third parties can do the same–provided they acquire, or crack, Apple’s signing key).  Such updated firmware could conceivably loosen or eliminate the restrictions on brute-force passcode guessing, or eliminate the key-destruction function.  Further, replacing the firmware in this way does not wipe out the on-device cryptographic keys protecting the device’s data–meaning such bypass firmware can always be installed by Apple, leaving the data intact.
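
A toy sketch of that gap, under assumed names (this is not Apple’s boot code, and a symmetric HMAC stands in for Apple’s real asymmetric code signing): the only gate on loading new firmware is the vendor’s signature, and the data-protection keys survive the swap.

```python
# Toy model of the gap described above (assumed names; a symmetric HMAC stands
# in for the vendor's real asymmetric code signing).  The only gate on loading
# new firmware is the vendor signature; nothing requires the user's passcode,
# and the on-device data keys survive the swap.
import hashlib
import hmac
from dataclasses import dataclass

VENDOR_SIGNING_KEY = b"stand-in for Apple's private signing key"  # hypothetical

@dataclass
class Firmware:
    code: bytes
    signature: bytes
    max_passcode_attempts: int | None  # None = unlimited (a "bypass" build)

def sign(code: bytes) -> bytes:
    # Stand-in for the vendor's code-signing step.
    return hmac.new(VENDOR_SIGNING_KEY, code, hashlib.sha256).digest()

def boot(fw: Firmware, device_keys: bytes) -> bytes:
    if not hmac.compare_digest(sign(fw.code), fw.signature):
        raise RuntimeError("unsigned firmware refused")
    # Note what is NOT checked: the user's passcode.  And device_keys are not
    # erased -- so a signed build with max_passcode_attempts=None boots just as
    # happily as the stock one, with the encrypted data still recoverable.
    return device_keys
```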

It’s pretty clear why Apple left this facility in: it wants to be able to “rescue” users’ devices (via firmware updates) without effectively destroying the data held within.  That seems pretty helpful, and after all, Apple trusts itself with its own signing key.  But, technically speaking, this ability to replace the firmware with arbitrary code while still leaving the data on the phone intact is a backdoor to the encryption scheme (it’s hard to call it anything else, which is probably why Apple isn’t mentioning it).  The obvious flaw here: someone gets their hands on Apple’s signing keys and can then flash a device with modified firmware, or Apple itself is compelled to do the same (hence this case).  Apple probably didn’t intend it that way; but… oops.

The lesson here is don’t build backdoors into your encryption schemes–ironically, the very thing Apple itself is arguing against.  Yes, the government is asking Apple to modify its own firmware to allow the security options to be bypassed, but it is only because of the design of Apple’s own iOS 8 security scheme that this is even possible.   It’s like arguing that the backdoor is not the second entrance to the room but the key that opens it–if you are against backdoors, don’t build a second entrance with a second lock (even if you yourself have not yet cut the first copy of the key that you know will fit that lock).

If Apple had, for instance, required the passcode before the device could even be flashed (or its firmware bypassed)–complete with the delay and self-destruct routines–this situation would have been prevented.   Another method would be to destroy the on-device keys whenever the firmware is flashed (or bypassed) without the passcode being entered.
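
Here is a minimal, self-contained sketch of the second option (an assumed design, not anything Apple ships): replacing the firmware without the passcode zeroes the on-device data keys, so the re-flashed device still boots but the old data is unrecoverable.

```python
# Self-contained sketch of the second option above (an assumed design, not
# anything Apple ships): replacing firmware without the passcode zeroes the
# on-device data keys, so the re-flashed device boots but the old data is gone.
import hmac

def flash_firmware(new_code: bytes, device_keys: bytearray,
                   entered_passcode: str, stored_passcode: str) -> None:
    if not hmac.compare_digest(entered_passcode.encode(), stored_passcode.encode()):
        # No passcode, no data: destroy the keys before accepting the new code.
        for i in range(len(device_keys)):
            device_keys[i] = 0
    install(new_code)

def install(code: bytes) -> None:
    pass  # placeholder for the actual flash step; the key handling above is the point
```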

But Apple did not do that, at least not in iOS 8 (our understanding is that on iOS 9, and on any device with an A7 or later processor, brute-forcing gets harder because the hardware enforces the delay between passcode attempts–but it is still within a few months’ effort for an attacker.  See this paper again, especially the mentions of “Secure Enclave” on pp. 9, 11).  So, the “backdoor” is there.  If there is some technical reason Apple cannot alter the firmware to allow the brute-force attempts the FBI wants in the present case, it is not clear to us what that reason would be, and more technically expert commentators have not identified one either.
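
As rough, back-of-the-envelope support for that timing claim (the ~80 ms per-guess cost below is an assumed, commonly cited figure, not something we have measured):

```python
# Back-of-the-envelope arithmetic for the timing claim above.  The ~80 ms
# per-guess cost is an assumed, commonly cited figure, not a measurement, and
# it ignores any escalating delays or auto-wipe.
PER_GUESS_SECONDS = 0.08

for digits in (4, 6, 8):
    guesses = 10 ** digits  # worst case for an all-numeric passcode
    seconds = guesses * PER_GUESS_SECONDS
    print(f"{digits}-digit passcode: up to {seconds / 3600:.1f} hours "
          f"(~{seconds / 86400:.1f} days)")
# Roughly: 4 digits ~ 13 minutes, 6 digits ~ 22 hours, 8 digits ~ 3 months.
```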

So now we have:

(1) Apple technically provided the backdoor; the government just wants a “key” to open it;

(2) It appears that Apple can create this key, and without much burden; and

(3) There’s nothing unjust about the situation under which the access would take place, which weighs against its being “chilling.”

Still, many perceive it as “chilling” for Apple, in these circumstances, to hand the U.S. government the key to a backdoor it itself created–and that’s not an unreasonable view.  But what I find at least as “chilling” is that Apple created a backdoor in the first place; indeed, when security pros criticize backdoors on technical grounds, it is the sheer presence of the backdoor that they identify as the threat–not the incidental mechanisms for opening it.  They are most assuredly not saying “create all the backdoors you want; just be really, really careful with the keys!”  (Here’s an attempt to paint a distinction; but I think it’s a distinction without a difference.)

There’s still a whole debate here about whether Apple should be legally compelled to provide that key, but it should be recognized for what it is: a legal debate.  The circumstances here just don’t look anything like the classic ones in which direct, individual rights are at stake to the point where a court would rule against the government’s warrant- and writ-issuing powers (and against national security interests) (see more in the COUNTERPOINT below).

There are technical wrinkles in the case that make it hard to call, as well.

For instance, reviewing the FBI’s affidavit in support of the order (see the government’s motion above, pdf pp. 26-27), the government actually wants Apple to create a version of the firmware that will work only on Farook’s iPhone, “to mitigate any perceived risk to Apple iOS software.”  In other words, the government attempted to take into account Apple’s concern about the “chilling effects” inherent in creating the “key” that will open Apple’s backdoor (presumably, for the first time) by limiting that key’s usefulness to just Farook’s device.

While that is very thoughtful and considerate of the government, it basically asks Apple to add a new feature to its software, not just change one line, one byte, or one bit (or whatever the isolated change would actually be) to raise the number of permitted passcode attempts or turn off the device auto-erase.
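
The filings do not specify a mechanism, but one can guess what “works only on one device” would look like in practice; here is a hedged sketch in which the relaxed limit is gated on a hard-coded, hypothetical device identifier:

```python
# Hedged guess at a "works only on one device" mechanism; the filings don't
# specify one.  The relaxed limit is gated on a hard-coded, hypothetical
# device identifier.
TARGET_DEVICE_ID = "placeholder-for-the-seized-phone's-identifier"  # hypothetical

def passcode_attempt_limit(device_id: str) -> int | None:
    # None = unlimited attempts; every other device keeps the stock limit.
    return None if device_id == TARGET_DEVICE_ID else 10
```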

One can read this nuance either way: pro-government, because the request seeks merely a “one-time key” to Apple’s de facto backdoor; or pro-Apple, because such a “key” would be a distinctly different feature from anything Apple’s software already supports.

My best guess as to the legal outcome? Despite differences in our views, I actually think Charles’ prediction at the end of his COUNTERPOINT is the most likely, so read on to see it…

COUNTERPOINT (by Charles Borrero)

We shouldn’t be too harsh on Apple; they did make strong device encryption the default, which is why they are in this situation–even the FBI can’t crack it.  I think that means they did a pretty good job.

For my part, I think everyone should calm down a bit.  This is a good case for this kind of request.  There’s a warrant and a lawfully seized device.  There’s a possible tie to ISIS, so the phone may have some intelligence value, but it’s not Jack Bauer with a ticking time bomb.  And there’s no need to convict the Farooks, since both shooters are dead.  So there is theoretically some importance, but not enough time pressure to short-circuit a judge’s full consideration.  That’s a good thing–both sides have valid points to make.

Everyone–the prosecutors, Tim Cook, and the judge–is doing their job.  The Assistant U.S. Attorneys (AUSAs) and the FBI want to effectuate a warrant to search a murdering terrorist’s computer.  Tim Cook made Apple double down on privacy and doesn’t want the company to be forced into hypocrisy.  Nor does he want the government–any government–to be able to force Apple to do its bidding on a whim, which could seriously hurt Apple’s image and therefore its bottom line.

The Magistrate Judge is a former AUSA acting on her authority and on the only affidavit before her at the moment.  Since the motion was ex parte, Apple hasn’t made its undue-burden argument yet; the Order gives it until Monday to do so.  The judge may schedule a hearing at that point.  Since there seems to be no specific urgency, she might stay the Order in the meantime.

If I were arguing for Apple, I’d say this is like asking for a skeleton key to every Master Lock–customers would stop buying them once the key’s existence got out, hence the “chilling effect.”  That’s an exaggeration, per the POINT above, but it’s basically what Cook is saying.  Since the order asks that Apple use its signature(s) to create a work-around limited to this one device, it’s a bit more like asking them to open a back window–just enough to squeeze through.  Nonetheless, the Master Lock analogy works, because perception is reality in the market.

So Apple has a strong undue-burden argument not because it’s too hard or expensive to help out, but because it shouldn’t be forced to hurt its business to unlock something of uncertain value.  In my view, it boils down to whether the Court gives weight to the likely aggregate effects in the secure-device market.  (It’s also odd to commandeer a private party to create something new, but I don’t think the Court should create a bright-line rule on that.)  If I were the judge, I might not order Apple to do it in this case.  But I would say it is within the Court’s authority to do so.  And if I were working for the Dep’t of Justice, I would have filed the same request–they should be testing the bounds of their ability to compel assistance here.

What’s sad is that we got to this inevitable point without any legislative action.  It’s not necessarily bad that a law from 1789 (the “All Writs Act”) is the authority here (the Constitution dates to roughly the same era, and it comes in pretty handy, albeit with Amendments).  But there should have been some Congressional action on this in the past decade.  I remember discussing the creation of special warrants for this sort of thing years ago (if you think that’s outrageous, I remind you that Alan Dershowitz proposed the creation of torture warrants after 9/11).

More specifically, I’d like to see a law allowing a limited number of special warrants, along with an appropriation of funds, to compel assistance in this kind of circumstance.  For example, a DOJ official could authorize a warrant application to the Foreign Intelligence Surveillance Court or a District Court.  The statute would sunset and be subject to reauthorization after, say, 100 requests.  That would help avoid abuse.  Everything could be classified, the assisting third party could decide what to do with any proprietary materials involved, and there could be an indemnity provision.  That would provide something of interest (in the CYA sense) to companies like Apple: secrecy and plausible deniability.  And if the assistance were publicly disclosed, they would retain the ability to say they had no choice because they were under an exceptional court order.

I thought that would be the logical conclusion here.  But now it seems the prevailing technology is near the point where there is no effective bypass measure, so the whole thing may become moot–which is not to say it won’t be a recurring issue for a while.  (At some point, conspiracy theorists and game theorists will agree that it’s sensible for the intelligence community to quietly encourage the proliferation of encryption it has secretly cracked–but that’s a post for another time.)

To hazard a guess at the outcome, I’d say that the Magistrate Judge will reconsider the Order but uphold it, with instructions that the government pay Apple a high dollar amount for its assistance.  Then a district judge or appellate panel will overturn it, and it will end up in the hands of an en banc Ninth Circuit, which will find that the uncertain benefits don’t outweigh the burden here.
