Over the past several months, the tech industry has been experiencing a terrible bout of déjà vu. In a campaign led by FBI Director James Comey, law enforcement and intelligence community voices have argued against the proliferation of ubiquitous strong encryption in consumer devices and communication platforms. By ubiquitous strong encryption, I mean both the expanded availability of device encryption for smartphones, and end-to-end encryption of communications protocols with only the sender and recipient (but no third parties) holding keys.
This sort of strong encryption has proliferated for a variety of reasons. Hacks and data breaches are larger, more consequential, and more prevalent than ever. The last two years have also been marked by controversy over widespread government surveillance in the U.S. and elsewhere. These circumstances have caused consumers to demand, and technology companies to develop, the tools to protect sensitive personal and financial information from parties to whom consumers would prefer not to grant access.
Naturally, the renewed focus on product and platform security on behalf of consumers has led to conflict between the tech industry and law enforcement. Governments fear “going dark”—the idea that there will be some set of encrypted communications and content that they will not be able to access even after having obtained the necessary legal process to seek that information, which in the U.S. usually means a search warrant.
If “going dark” sounds somewhat familiar, that’s because these are largely the same fears that led to the first “Crypto Wars” in the United States.
By the early 1990s, researchers had begun developing the first widely available strong encryption tools, including protocols like PGP. In response, the Clinton Administration pushed to limit the export of higher-grade encryption protocols. As a result, a class of “export-grade” encryption was developed for use in countries where the export of strong encryption was prohibited. Some “key escrow” solutions were also developed for commercial use, like the Clipper Chip, which would allow the government or a trusted third party to hold the master keys to decrypt communications sent via devices using the protocols on that chip. Ultimately, once privacy advocates, industry, and technologists joined together in opposition (and after technical flaws were found in the Clipper Chip specification), the government largely backed down from its opposition to products containing strong encryption.
Winning that fight and enabling the use of secure protocols in electronic communication and transaction systems helped build public trust in the Internet, something that users continue to value. Without the underlying confidence in the integrity and security of the Internet and associated applications, it would not have developed into the successful platform for digital speech and commerce that it is today.
What the FBI and others are looking for now is essentially the same as what the government sought in the 1990s. All the same arguments, both in law enforcement’s favor and against it, apply today. Whether consumers are allowed to use widely available strong encryption on their personal devices and take advantage of communications services that employ it end-to-end remains a cost-benefit problem informed by technical and Constitutional limits.
> The deployment of key-recovery-based encryption infrastructures to meet law enforcement’s stated specifications will result in substantial sacrifices in security and greatly increased costs to the end user. Building the secure computer-communication infrastructures necessary to provide adequate technological underpinnings demanded by these requirements would be enormously complex and is far beyond the experience and current competency of the field.
The above quote is from a technical report produced by eleven preeminent computer scientists. In it they analyzed some of the key escrow encryption requirements suggested by government agencies. The report is from 1997.
The government contends that given technical progress in the intervening years and a little good ol’ fashioned American ingenuity, technology companies should be able to develop and implement a key escrow or split-key or “golden” key protocol that makes all parties happy without sacrificing security. Optimism aside, the technical limitations of a key escrow solution have not changed since ’97. A report prepared this year by another group of computer security experts makes largely the same arguments as presented in the 1997 report:
- Digital devices and communications tools are extremely complex systems, and complexity is the enemy of security. Devising an additional layer of complexity at scale to permit third-party access to encrypted consumer devices and communications will likely create new vulnerabilities or exacerbate existing ones.
- Current encryption protocols are also designed to improve security through forward secrecy—”where decryption keys are deleted immediately after use, so that stealing the encryption key used by a communications server would not compromise earlier or later communications.”
- Key escrow solutions require some third party—be it the provider, law enforcement, or some other entity—to retain security credentials for later use by the government. That third party would immediately become a rich target for hackers of all stripes.
Split-key solutions, where multiple parties each hold part of a decryption key, can reduce the risk associated with a single third party holding credentials and becoming a target. However, they further increase the complexity of technical systems, and do nothing to reduce the substantial aggregate economic and societal costs detailed below.
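A minimal sketch of how a split-key scheme distributes that risk (purely illustrative; no actual escrow proposal is specified here): a key can be XOR-split into shares so that every shareholder must cooperate to reconstruct it, and no proper subset of shares reveals anything about the key.

```python
import secrets
from functools import reduce


def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))


def split_key(key: bytes, n: int) -> list[bytes]:
    """Split `key` into n XOR shares; all n are required to reconstruct."""
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    # The final share is the key XORed with all the random shares.
    shares.append(reduce(xor_bytes, shares, key))
    return shares


def combine(shares: list[bytes]) -> bytes:
    """XOR all shares back together to recover the key."""
    return reduce(xor_bytes, shares)


key = secrets.token_bytes(16)
shares = split_key(key, 3)
assert combine(shares) == key
# Any proper subset yields an unrelated value (with overwhelming probability),
# so no single escrow agent learns the key on its own.
assert combine(shares[:2]) != key
```

Note what this does and does not buy: no individual shareholder can decrypt anything alone, but the shareholders collectively still constitute the high-value target described above, and the reconstruction machinery itself is new attack surface.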
The Fourth Amendment
While technical arguments against weakening strong encryption seemingly hold no water for the starry-eyed dreamers at the FBI, the Constitution certainly should. The Fourth Amendment preserves the “right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures,” and requires probable cause before the issuance of any warrants. It explicitly protects “a right of the people.” However, the law enforcement interests working against strong encryption would have the public believe that the Fourth Amendment should be read in the inverse. It is true that the FBI and other U.S. law enforcement agencies argue that they only wish to decrypt encrypted communications after obtaining a warrant based on probable cause. But having a search warrant doesn’t mean that the government then has a positive right of its own to exercise—its agents merely have authorization from a court to intrude upon the privacy of a specific person and search for a particular thing they wish to seize. Yet the FBI seemingly believes that in the digital world, a search warrant is in fact a right to whatever means or assistance that they deem necessary to access the personal effects they seek.
There’s a reason the legal process the FBI seeks from federal judges is called a search warrant, and not a “find warrant.” In the physical world, when law enforcement officers execute a search warrant, the person whose home is being searched does not have to unlock every safe or locked room at the government’s instruction, nor does he have to lead the officers to every particular piece of evidence that they seek. And if the government’s agents encounter a safe that they cannot crack, they do not require that the manufacturer ensure that future products be designed in a manner accessible to law enforcement when they have warrants.
The rules should be no different with respect to digital communications and electronic devices. When law enforcement obtains a probable cause-based search warrant for particular content in an individual’s smartphone or electronic messages, they have authorization from a court to acquire that smartphone if they can find it, and access those messages in whatever form they are stored. In the event the smartphone or messages sought are encrypted, the government can use legal tools to compel the owner to give them access. But, like the safe maker who does not design his products to the specifications of law enforcement needs, third party device manufacturers and communications providers should not have to design their technologies according to the requirements of law enforcement agents who may at some point have warrants.
Building or retrofitting encryption systems to enable easy access for the government through a key escrow or split key mechanism flies in the face of the motivations underlying the Fourth Amendment. The general principle of the Fourth Amendment is that it serves as a check on government activities, not a license, and the limited ability to search afforded to the government by the issuance of a search warrant should not swallow that rule in either the physical or digital context.
Weighing Costs and Benefits
The FBI’s position is not without some merit. The fear of “going dark” is a result of a few cases where federal and state law enforcement have encountered difficulty in investigating crimes because devices have been encrypted in a manner that agents were not able to successfully circumvent. For the FBI, it follows that as strong encryption grows more readily available, the number of serious crimes it cannot solve will increase correspondingly, as will the number of potential terrorists who go undetected.
Of course, the government can only count a handful of cases that have been recently stymied in some way by the presence of encryption, and none that went unsolved as a result. As for truly committed bad actors like potential terrorists, limiting the availability or effectiveness of encryption available through American services and devices will only drive them to less law-abiding providers in less hospitable reaches of the Internet and globe.
The few benefits of limiting strong encryption are likely outweighed by the aggregate costs. As reports produced by computer security experts detail, developing and implementing encryption systems that incorporate mechanisms through which third parties can access and decrypt secured communications and content is complicated and will introduce vulnerabilities. Some of those vulnerabilities are obvious, like the existence of some entity that must hold cryptographic keys in escrow. These sorts of entities are rich targets for hackers, and prominent firms have already been breached and had their keys compromised.
Other vulnerabilities are less evident, and may only become apparent years down the line, as a result of the complexity of supporting and implementing encryption protocols that satisfy government needs. For example, just this spring, an attack known as Logjam exploited residual application support of weak export-grade cryptography. That was the same export-grade cryptography developed in the 1990s to meet then-existing government restrictions on the distribution of strong encryption. The Logjam vulnerability took two decades to discover. Today, encryption is relied upon and embedded in systems to a much greater degree for commerce and communication, and as a result, vulnerabilities will propagate at a much higher rate, with a much lower likelihood of discovery.
Law enforcement and the intelligence community are not the only government agencies with an interest in the adoption of encryption. The FTC has long cited the use of encryption as a best practice to protect consumers and avoid violation of Section 5 of the FTC Act. The current Chief Technology Officer of the FTC recently advocated for consumers’ use of full disk encryption on their personal devices. And even the FBI, until its current about-face, encouraged users of smartphones to use OS encryption to protect their devices from thieves. The FTC and divisions of the FBI tasked with dealing with the consumer protection and criminal implications of increased fraud and identity theft recognize the steadily growing economic and personal costs of data breaches. They also know through experience that without widespread use of encryption, the already significant aggregate impacts of hacks would be exacerbated.
Requiring a split key or key escrow system for U.S. providers and manufacturers to allow access to the U.S. government would have far reaching outcomes internationally as well. The adoption of a government-access regime for encryption in the U.S. would likely also be used as a license for other countries, particularly those less enamored of the rule of law, to do the same. Not being able to communicate securely would have a chilling effect on the speech of dissidents and journalists worldwide. In addition, the international competitiveness of the U.S. tech industry has already been harmed by disclosures of mass surveillance by the U.S. intelligence community. This trust deficit would only be deepened by the news that U.S. companies have designed systems to allow the U.S. government to better access the contents of encrypted devices and messages, which would simply lead to more customers lost to international competitors.
Talking in Code
The FBI has repeatedly stated that it seeks, first and foremost, a public conversation on what law enforcement views as a significant looming hindrance to its investigatory capabilities. The goal of this public conversation is for American society to decide (yet again) whether it prefers the costs and benefits of ubiquitous strong encryption or those associated with weakened or less-available encryption tools. It’s worth noting that a former NSA Director, Secretary of Homeland Security, and Deputy Secretary of Defense recently contributed their thoughts to the conversation, and have come down on the side of strong encryption, largely for the reasons listed above.
The government asks whether tech companies are willing to abide by the public’s will if it decides that the risks of going dark are too great. That’s the wrong question, because the public made its decision loud and clear in the first Crypto Wars and again in demanding the development of better security measures for their devices and communications. So perhaps the better question to ask is when the government is going to listen.