The Myth Of A Secure Back Door For Encryption
It seems like an appealing move: give the FBI and other law enforcement agencies, as well as our spy organizations, a back door—a "golden key"—to unlock encrypted communications to help catch criminals and terrorists and to protect Americans from harm. This notion of heightened protection is particularly compelling in the wake of the recent terrorist attacks in Paris and suspected Islamic State complicity in the mass killings in San Bernardino, CA, the worst homeland terrorist episode since 9/11.
When Islamic State commanders find a recruit willing to die for the cause, they move their communications over to encrypted platforms—"going dark," as FBI Director James Comey recently put it. He has also pointed out that Islamic State militants and other terrorist groups could use encryption to "recruit troubled Americans to kill people" in the homeland. These are scary points, but the unvarnished truth is that the golden key is a fictitious panacea. It is fool's gold: iron pyrite, not the real thing.
In the domain of cybersecurity and encryption, the bad guys are just as smart as the good guys. Their tradecraft is focused on identifying and exploiting vulnerabilities. If there is a back door, they will find and exploit it. End of story, full stop. At the same time, it is hard to imagine that government agencies, which are regularly breached, could be trusted to keep such a golden key safe from hackers and criminals. Exhibit A: the Office of Personnel Management breach. As for industry as a guardian, Exhibit B: the breach of cybersecurity company RSA Security. In short, every vulnerability will eventually be found and exploited by one side or the other in the ongoing cybersecurity battle.
And lest we forget, 99 percent of the users of encryption technology do so for legitimate reasons—to protect sensitive information that in the wrong hands can and will cause irreparable damage and harm. Our government charges corporations, institutions, and individuals with taking responsibility for their own cybersecurity. In many cases, failure to do so on the part of corporations can result in heavy penalties and litigation. Encryption is not a panacea in this regard, but it can be a particularly effective tool for keeping private that which would otherwise be misappropriated and misused.
Because encryption is one of the most effective tools we have to protect our most sensitive information, talk of creating a golden key will undermine essential innovation and venture capital investment in encryption solutions—an invaluable element of our cybersecurity tool box. Companies like CrowdStrike, which is known for outing Chinese and Russian hackers; Vera, which locks down transferred documents; and Keybase, which aims to make encryption easier to use, all rely upon market demand for their solutions. Venture capitalists will think twice about investing in these and similar startups if demand for a back door to encrypted systems undermines the effectiveness of, and demand for, encryption.
Further, given that encryption innovation is not the exclusive domain of U.S. innovators, we can expect that technologists operating outside of the U.S. will be more than happy to fill a void created by a U.S. retreat from encryption innovation.
Only two months ago, it looked like efforts to demand a back door or produce a golden key would be derailed because the Obama administration backed down in a dispute with Silicon Valley over the encryption of data on iPhones and other digital devices. The administration reached the conclusion that it wasn’t possible to give American law enforcement and intelligence agencies access to that information without also creating an opening that state actors, cyber criminals and terrorists could exploit.
Unfortunately, the White House and congressional staffers have subsequently asked Silicon Valley executives to re-open talks on the matter in the wake of the Paris terrorist attacks. This is at least partly a public relations dance; Washington doesn't want to create the impression that it's brushing off the implications of a tragedy. There is no evidence that the Islamic State attackers in Paris relied on scrambled communications. But the U.S. Senate Intelligence Committee has theorized that the terrorists likely used "end-to-end" encryption because no direct communications among the terrorists were detected. And, too, Islamic State has created tutorials on how to evade electronic surveillance on the cheap.
Admittedly, debating the pros and cons of a back door to encrypted systems may seem academic. If there is no such thing as a "secure" back door or golden key, what is the point?
Nonetheless, it is worth noting that two larger issues are at play here, and they favor the "no back door" viewpoint. One is …