The New York Times had a story on July 7 that made it clear an age-old battle (at least, as old as the information age) rages on: the one between government investigators and anyone with an interest in keeping data private and secure.
(by Aaron Krowne)
In this episode, a dream team of security researchers — not convened since 1997, when they defeated the loathed “Clipper Chip” proposal — has come together again to remind the government (and the world) that built-in backdoors are a bad idea. Here is the setup:
Technology companies including Apple, Microsoft and Google, which are grappling with revelations that the National Security Agency and its counterparts were siphoning off digital communications and hacking into corporate data centers, have been moving to encrypt more of their corporate and customer data. Yet law enforcement and intelligence agency leaders argue that such efforts thwart their ability to monitor kidnappers, terrorists and other adversaries.
Noble objectives, no doubt (though we’re surprised they left out everyone’s favorite home-grown bogeyman, child pornographers; someone must have had their morning coffee switched to decaf…). There are a few fallacies and methodological problems with this argument, however. One is that being able to access “all data” will mostly just increase the “false positives”: red flags that lead nowhere. Think of the ratio of people pulled aside for extra screening by the TSA versus actual terrorists caught this way — a false positive ratio probably in the neighborhood of one million-to-one, at best. In other words, having access to this much information is probably not even “economically” worth it for the government (or, put more honestly, not economical for you, the law-abiding taxpayer).
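To see why mass access to data drowns investigators in false positives, a back-of-the-envelope base-rate calculation helps. Every number below is hypothetical, chosen only to illustrate the effect, and if anything is generous to the surveillance side:

```python
# Hypothetical base-rate illustration: even a very accurate flagging
# system produces mostly false positives when real threats are rare.
population = 300_000_000   # people whose data is accessible (assumed)
actual_threats = 3_000     # genuine bad actors among them (assumed)
tpr = 0.99                 # chance a real threat gets flagged (assumed)
fpr = 0.001                # chance an innocent person gets flagged (assumed)

true_flags = actual_threats * tpr                  # real threats caught
false_flags = (population - actual_threats) * fpr  # innocents flagged
precision = true_flags / (true_flags + false_flags)

print(f"Total flags raised: {true_flags + false_flags:,.0f}")
print(f"Chance a flagged person is an actual threat: {precision:.2%}")
```

With these assumptions, over 100 innocent people are flagged for every real threat, so a flagged person has roughly a 1% chance of actually being one — and that is with an implausibly accurate detector.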
Another issue is that these arguments invite the listener to draw the improper inference that, without being able to snoop on everyone’s data, the government will have no tools with which to catch the baddies. But it is only through “traditional” investigative tools (undercover work, observation, informants, gathering and analyzing public information, etc.) that the government even knows which protected data to focus on in the first place, so these tools will always be necessary. Cracking encryption, in other words, is nowhere close to the full investigative picture.
Sure, there will always be cases where the ability to bypass encryption through a backdoor would allow for catching marginally more criminals; the same would go for giving the government a skeleton key to every front door in the country. But does the government have a right to demand that? Or, to approach it from the flip side: do we trust government agents (both well- and ill-meaning, procedure-following and procedure-flouting ones) to have that kind of power? The same issue of trust exists in the encryption/backdoor context, and it is discussed in the article.
The security researchers, of course, go further: they point out from a more technical perspective why backdoors “don’t work”. Here is the high-level discussion from the article:
In the report, the group said any effort to give the government “exceptional access” to encrypted communications is technically unfeasible and would leave confidential data and critical infrastructure like banks and the power grid at risk…
With government agency breaches now the norm — most recently at the United States Office of Personnel Management, the State Department and the White House — the security specialists said authorities cannot be trusted to keep such keys safe from hackers and criminals. They added that if the United States and Britain mandated backdoor keys to communications, it would spur China and other governments in foreign markets to do the same.
“Such access will open doors through which criminals and malicious nation-states can attack the very individuals law enforcement seeks to defend,” the report said…
Today, the government’s plans could affect the technology used to lock financial institutions and medical data, and poke a hole in mobile devices and the countless other critical systems — including pipelines, nuclear facilities, the power grid — that are moving online rapidly.
The researchers make excellent arguments (well above our pay grade) that focus on the technical-weakening effect of backdoors. No doubt the already-commonplace large-scale breach incidents would utterly explode if backdoors were mandated to be built into all encryption — if only because backdoors represent an additional “failure point” for technical attacks (though the researchers go well beyond that).
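The “additional failure point” intuition can be put in simple probability terms. If a system can be breached either through its ordinary attack surface or through a mandated backdoor, and the two avenues fail independently, the combined compromise probability only goes up. The figures below are purely illustrative assumptions, not estimates:

```python
# Illustrative only: adding an independent attack path never lowers,
# and almost always raises, the overall probability of compromise.
p_front = 0.01     # assumed annual chance of breach via normal attacks
p_backdoor = 0.02  # assumed annual chance the escrowed backdoor key leaks

# P(compromised) = 1 - P(neither path succeeds)
p_total = 1 - (1 - p_front) * (1 - p_backdoor)
print(f"{p_total:.4f}")  # higher than either path alone
```

Under these assumptions the system goes from a 1% chance of compromise to nearly 3% — and that charitably treats the backdoor as no easier to attack than the rest of the system, when in practice a centrally escrowed key is an unusually attractive target.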
But also note our totally separate points above. And if that is not enough, there are additional good reasons that mandated backdoors are pointless for achieving their (purported) societal objectives:
- The bad guys will always be able to use “real” (unregulated, backdoor-free) encryption, especially when it comes to pure software, which they will do in droves when they realize that “standard” encryption is “crippled.” So, “crippled” encryption will largely be used by innocent, well-meaning average folks, and the result will be a further explosion of the false positive rate.
- Because law-abiding people will be aware of the backdoors, there will probably be a chilling effect on free speech generally (think of how we all now write email with one eye on what the NSA employees also reading it might think…)
- While the security researchers discuss vulnerability with “outside” hackers in mind, think also of “inside” compromises. As recent surveys have underscored, at least 20% of data breaches originate from the intentional actions of insiders. That is, “trust” that the government will properly handle the keys to backdoors is not just a matter of whether it makes a mistake, but of whether particular government agents (that is, every single one of them, out of millions) can be trusted not to abuse their access. After the NSA’s own Inspector General admitted that NSA officers abused their power to spy on love interests, after we learned that Sheriff Joe Arpaio abused the USPS’s “mail covers” program to spy on and target political opponents, and after the IRS targeting controversy, do you feel so comfortable with this assumption?
We would do well to remember in this debate that a government is made up of regular people who, like the “unwashed masses” they investigate and regulate, are themselves flawed. This is why we have due process rights and administrative procedure rules. Compulsory technical backdoors simply do away with those protections (incidentally, this is why “backdoor putsch” is a fitting phrase: here, the government itself is attempting to overthrow a legitimate, longstanding legal regime of rights protection).
So, it would likely be better for all of us if the “backdoor putsch” were abandoned — and sooner rather than later. But that will probably never happen, as we all know government’s tendency is to perpetually seek to acquire and aggregate more power to achieve its objectives (for better or worse). That’s why citizens need to be eternally vigilant in pushing for their rights to be respected and upheld, particularly as rapid technical change blurs the boundaries of what is a valid exercise of power and what is not.
Update (7/9/2015): Mere hours after writing the above, we find out that the impact of the massive government OPM hack has been “upgraded” from affecting some 4 million persons’ records to 25 million. This, sadly, underscores the points made above about trusting the government to “keep things secure.”