Paris Rampage Rekindles Conflict Over Government-Proof Encryption
The recent terrorist attacks in Paris were executed with guns and bombs in the physical terrain of city streets and cafe terraces. But the horrific incident is reviving government calls for greater control of the cyber environment.
A host of top U.S. officials including the directors of the CIA and FBI, Senator Dianne Feinstein, New York Mayor Bill de Blasio, and Manhattan District Attorney Cyrus Vance, Jr., have been citing the Paris atrocities to bolster their arguments that tech companies need to give law enforcement ways to break through the impenetrable encryption installed in the latest mobile devices and apps such as WhatsApp. Their claim is that governments need these powers to prevent terrorist operations, although substantial evidence has yet to emerge that encrypted communication played an important role in mounting the Paris attacks.
Silicon Valley has answered the government salvos with a war of words this week, through publications including Wired, Re/code, and Gizmodo. Privacy advocates, civil rights leaders, and some tech companies maintain that governments will only do harm if they force device and app makers to create backdoors in their encryption code for crime investigators. They argue that hackers will readily find and exploit those entry points to more easily steal sensitive personal, corporate—and government—information.
How should businesses react to these counterclaims as they try to protect their data, as well as their physical locations, in an era of increasing cross-border threats?
To get some answers, I touched base this week with a cybersecurity expert who is well versed in the interests of both government agencies and private companies. Mark Weatherford just joined Mountain View, CA-based information security company vArmour as chief cybersecurity strategist. He’s a former cybersecurity deputy with the Department of Homeland Security, and former top security official for two states—California and Colorado.
Weatherford (pictured above) says he understands why law enforcement agencies are frustrated that they can’t decipher the electronic messages of people suspected of criminal and terrorist plots. “That’s a legitimate concern,” he says. But he also sees the downside of creating vulnerable spots in encryption software so investigators can crack the code. “Creating a back door or path for law enforcement is going to open back doors for bad guys, too,” Weatherford says.
In addition to calls for law enforcement access to encrypted messages, Congress is mulling a bill to encourage private companies, including device and app makers, to share cyber threat information with the government. The Cybersecurity Information Sharing Act (CISA), passed by the Senate in late October, would grant companies that cooperate with the government immunity from lawsuits for failing to protect their customers’ privacy. The Electronic Frontier Foundation charges that the law creates “aggressive spying authorities,” includes too few privacy safeguards, and won’t solve the cybersecurity defects that led to data breaches at Target and the U.S. Office of Personnel Management (OPM) this year.
Weatherford says some companies favor the law because of its liability shield, but others don’t want to share anything with the government. “Some companies don’t trust the government at all,” he says.
Although information sharing is critically important, government officials may have a mistaken view that if they get the access they want, “all their problems are going to go away,” Weatherford says. That couldn’t be further from the truth, he says.
The “bigger security pie” must be composed of the right tools, training, and people, Weatherford says.
Another essential element is trust—and the U.S. government has lost some ground on that score.
The American public was stunned when Edward Snowden’s leaks revealed sweeping government data-gathering on ordinary citizens’ communications. Some tech companies have been increasingly resistant to disclosing their customers’ data at government request. Now they maintain that with their current encryption features, even they can’t unlock the meaning of a customer’s messages.
Large American companies will contact the government if they themselves come under a cyberattack, Weatherford says. But there's still a concern that when a company shares technical information about a malware invasion, for example, the government may also acquire sensitive information about executives, employees, customers, or corporate activities. Suspicion persists that government investigators may share that information more broadly with other U.S. agencies such as the IRS or fraud prosecutors, he says. Companies worry that an appeal for government cybersecurity help might end with an employee being investigated for tax evasion or drug dealing, he says.
“Unfortunately, the government hasn’t done a good job in proving people wrong when they say it will do what it’s not supposed to do,” Weatherford says. “They don’t have a good track record.”
Even leaked evidence of conduct that’s legal, but distasteful, on the part of top executives could damage a company’s reputation or sink its stock value, Weatherford says. He points to the flood of Sony executives’ e-mails that were spilled online in 2014 after hackers got hold of them.
Another worry for businesses is that government agencies themselves have been repeatedly hacked, and any of their proprietary data stored on government servers could be plundered by …