Nadiya Kostyuk and Susan Landau [*]
Introduction
The Internet presents a serious conundrum. Though well known to have security problems, the network is globally relied upon for commerce and used to control many critical systems and infrastructure. This inconsistency is partially explained by the fact that when someone says, “The Internet is insecure,” they are often referring not to the communications network but rather to the applications that run on it. But it is also true that a network itself can be insecure and nonetheless be widely used—because the network provides value and risk can be managed.
The Internet’s communications protocols—TCP/IP—do not authenticate communication senders; this allows various attacks, including Distributed Denial of Service (DDoS), and simplifies attacker intrusions into endpoint networks. The decentralized nature of Internet routing—routers on the network share routing information with their neighbors—allows packets to be routed incorrectly,[1] sometimes leading to eavesdropping on communications, theft,[2] and even shutting down important and well-known websites due to traffic diversion.[3]
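The absence of sender authentication is easy to demonstrate. The following sketch, which uses the third-party scapy packet-manipulation library, constructs a TCP packet whose source-address field simply claims an address the sender does not control. The addresses shown are reserved documentation values, and the snippet is illustrative rather than a working attack; sending raw packets requires elevated privileges, and many networks filter spoofed traffic at their edges.

```python
# Illustrative sketch: nothing in TCP/IP authenticates the source field.
# Requires the third-party scapy library and raw-socket privileges.
from scapy.all import IP, TCP, send

# 198.51.100.7 and 203.0.113.5 are reserved documentation addresses.
spoofed = IP(src="198.51.100.7", dst="203.0.113.5") / TCP(dport=80, flags="S")

# Absent egress filtering, the network forwards the packet as addressed;
# the receiver has no protocol-level way to verify who really sent it.
send(spoofed)
```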
A solution to many of these problems exists: cryptography. The technology, which can provide confidentiality (ensuring no one but the intended recipients can read a message), integrity (ensuring that the communication has not been altered), and authenticity (proof that the message came from an authorized source), is essential to providing security and privacy to communications protocols (e.g., HTTPS).
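These three properties can be seen concretely in a modern authenticated-encryption primitive. The sketch below is a minimal example using the AES-GCM mode from the widely used Python cryptography package; a single operation provides both confidentiality and integrity, while authenticity here rests on possession of the shared key. The message and associated data are hypothetical.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # shared secret key
aesgcm = AESGCM(key)
nonce = os.urandom(12)                     # must never repeat for a given key

# Encryption provides confidentiality; the tag appended to the ciphertext
# provides integrity and, to holders of the key, authenticity.
ciphertext = aesgcm.encrypt(nonce, b"transfer $100 to account 42", b"header")

# Decryption raises InvalidTag if even a single bit was altered in transit.
plaintext = aesgcm.decrypt(nonce, ciphertext, b"header")
assert plaintext == b"transfer $100 to account 42"
```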
Cryptography nonetheless presents a peculiar problem for communications security. Cryptographic algorithms have been used to provide confidentiality, integrity, and authentication to many Internet communications protocols. These include, for example, the cipher suites[4] in TLS,[5] the protocol that provides end-to-end security for web browsing, email, and other Internet applications. For a protocol such as TLS to transit international borders and securely transmit IP-based communications, there must be international acceptance and use of the cryptography employed within the protocols.[6] Protocols transit the borders of nations that do not necessarily trust each other—and, in particular, do not necessarily trust the developer of the cryptographic standard. In spite of this lack of trust, the use of these cryptographic standards has prevailed—an impressive success.
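This negotiation can be observed directly. The following sketch, which uses only the Python standard library, opens a TLS connection and reports the protocol version and cipher suite that the two endpoints, possibly running software from different vendors in different countries, agreed upon; example.com stands in for any public HTTPS server.

```python
import socket
import ssl

hostname = "example.com"  # any public HTTPS server will do
context = ssl.create_default_context()

with socket.create_connection((hostname, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        # The endpoints settled on these from their advertised cipher suites.
        print(tls.version())  # e.g., 'TLSv1.3'
        print(tls.cipher())   # e.g., ('TLS_AES_256_GCM_SHA384', 'TLSv1.3', 256)
```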
There have been occasional breaks in communications protocols themselves and more frequent ones in their implementations. But breaks of approved cryptographic standards are quite uncommon, in large part due to public vetting of the proposed standards prior to adoption. One situation where such a break occurred deserves particular attention. In September 2013, The New York Times, ProPublica, and The Guardian reported that the NSA “[had] been deliberately weakening international encryption standards.”[7] The Times pointed to an algorithm, Dual_EC_DRBG,[8] about whose security cryptographers had previously expressed doubts. The algorithm, which supplied “random bits” for determining an encryption key, apparently had a cryptographic “backdoor.”[9] Such a backdoor functions much like a key under a doormat, providing a way for those who know it to bypass the encryption and access the encrypted content.
In this case, the key under the doormat was the relationship between two parameters of the algorithm. Dual_EC_DRBG relies on two points on an elliptic curve, P and Q. If the curve is a secure curve suitable for use in cryptography,[10] then deriving the mathematical relationship between P and Q from the points alone is computationally infeasible. But there was strong reason to believe that the NSA knew the mathematical relationship between the two parameters in Dual_EC_DRBG.[11] Knowing this would provide the agency with a backdoor that allowed it to quickly decrypt communications ostensibly secured through Dual_EC_DRBG.[12] In addition, there appeared to be statistical bias in Dual_EC_DRBG’s output,[13] a major flaw in any random-bit generator. This bias simplified the use of the backdoor for anyone with knowledge of the mathematical relationship between P and Q.
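The structure of such a backdoor can be illustrated with a toy model. The sketch below substitutes modular exponentiation for elliptic-curve point multiplication, which is structurally analogous for this purpose, and omits the output truncation of the real standard; every parameter is invented for illustration, and none of this is the actual Dual_EC_DRBG. The point is only that whoever chose P as a known multiple of Q can recover the generator’s internal state, and hence all future output, from a single observed output.

```python
# Toy analog of the Dual_EC_DRBG backdoor. Modular exponentiation stands in
# for elliptic-curve point multiplication; all values are illustrative.
p = 2**61 - 1      # a small prime modulus (the real standard uses curve P-256)
d = 123456789      # the secret relationship: P = Q^d (mod p)
Q = 5
P = pow(Q, d, p)   # whoever publishes P and Q this way can retain d

def drbg_step(state):
    output = pow(Q, state, p)      # the "random" value handed to the user
    next_state = pow(P, state, p)  # the generator's internal state update
    return output, next_state

seed = 42                          # the user's secret seed
out1, state = drbg_step(seed)

# An observer who knows d recovers the new internal state from out1 alone,
# because (Q^s)^d = (Q^d)^s = P^s, which is exactly the next state.
recovered = pow(out1, d, p)
assert recovered == state

# With the state in hand, every future "random" output is predictable.
predicted, _ = drbg_step(recovered)
actual, _ = drbg_step(state)
assert predicted == actual
```

In the real algorithm the outputs are truncated x-coordinates of curve points, so an attacker must also guess a small number of missing bits; the statistical bias noted above made exploiting the backdoor that much simpler.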
Even worse, the system was an approved cryptographic standard. When Dual_EC_DRBG had initially been proposed as a standard, a number of cryptographers raised concerns about the potential for a cryptographic backdoor.[14] Despite the cryptographers’ misgivings, the National Institute of Standards and Technology (NIST) approved[15] Dual_EC_DRBG as a Federal Information Processing Standard (FIPS),[16] a standard for non-national security agencies of the federal government. Although FIPSs apply only to such agencies, the impact of the designation is far broader. FIPSs are often adopted by industries, including those outside the United States, for private sector use. That NIST had designated a corrupted cryptosystem as a FIPS thus had reverberations well beyond U.S. borders.
The problem with the algorithm itself was quickly handled. NIST, which had approved[17] Dual_EC_DRBG as a FIPS, immediately responded by recommending that the algorithm not be used and opened a public comment period on the standard.[18] Seven months later, NIST permanently removed Dual_EC_DRBG from its list of recommended random-number generators.[19] The problem of reestablishing NIST as a purveyor of widely used cryptographic standards—a role it had been successfully filling—was more complex.
This situation presents a conundrum. The Snowden disclosure of a cryptographic backdoor in Dual_EC_DRBG revealed, at best, a weakness in the NIST process of developing cryptographic standards. At worst, it showed nefarious intent on the part of the U.S. government. Yet within a short period it became clear that NIST’s Computer Security Division (CSD) could continue as a largely trusted developer of internationally accepted cryptographic standards, most recently through its efforts to develop post-quantum cryptographic standards[20] and lightweight cryptography for the Internet of Things and cyber-physical systems.[21]
In this paper we explain this conundrum. We begin, in Part I, by providing context through a short primer on cryptography and on NIST’s role in developing cryptographic standards. In Part II, we discuss what NIST did to address the problems that allowed a problematic algorithm to be approved as a FIPS and the reaction of the international cryptography community. We next examine in Part III why the situation resolved in this way.
Yet the story does not end with NIST regaining the trust of the international cryptographic research community. The Dual_EC_DRBG situation played out against a nearly forty-year continuing conflict between the U.S. government on one side and cryptographers and computer security experts on the other, as well as a fifteen-year history of cooperation between NIST and the cryptographic research community. In Part IV, we briefly discuss the stability of this resolution against two sets of changes: the role of the U.S. in Europe and elsewhere and the increasing fragmentation of the Internet.
[*] Nadiya Kostyuk, Assistant Professor at the School of Public Policy and the School of Cybersecurity and Privacy at Georgia Institute of Technology, and Susan Landau, Bridge Professor in Cyber Security and Policy, Fletcher School of Law & Diplomacy and School of Engineering, Department of Computer Science, Tufts University. The work was done, in part, while Kostyuk was a Cybersecurity Policy Predoctoral Research Fellow, Fletcher School of Law and Diplomacy, Tufts University. This research was supported in part by funding from the William and Flora Hewlett Foundation under grant 2018-7277.
We greatly appreciate the help provided by Jon Lindsay, Michael Poznansky, and other members of the Digital Issues Discussion Group, as well as from Steven M. Bellovin, Dan Bernstein, Lily Chen, Donna Dodson, Bart Jacobs, John Kelsey, Brian LaMacchia, Tanja Lange, Adam Langley, Steve Lipner, Kerry McKay, Abraham Newman, Kenny Paterson, Bart Preneel, Steve Purser, Andrew Regenscheid, and additional researchers who preferred to stay anonymous.
[1] This is known as Border Gateway Protocol (BGP) hijacking.
[2] If a router directs a user to an incorrect site purporting to be the real one, the user may be tricked into authenticating herself, thereby giving away her user credentials.
[3] Declan McCullagh, How Pakistan Knocked YouTube Offline (and How to Make Sure It Never Happens Again), CNET (Feb. 25, 2008), https://www.cnet.com/news/how-pakistan-knocked-youtube-offline-and-how-to-make-sure-it-never-happens-again [https://perma.cc/PKL4-DYPV].
[4] In TLS 1.3, these include Diffie-Hellman for key exchange, the Advanced Encryption Standard in two key lengths (128 and 256 bits), ChaCha20 (a stream cipher) and its related authenticator Poly1305, and the SHA-256 cryptographic hash algorithm. Eric Rescorla, Request for Comments No. 8446: The Transport Layer Security (TLS) Protocol Version 1.3, INTERNET ENG’G TASK FORCE (Aug. 2018), https://datatracker.ietf.org/doc/html/rfc8446 [https://perma.cc/4HY7-78GN]; Y. Nir & A. Langley, Request for Comments No. 8439: ChaCha20 and Poly1305 for IETF Protocols, INTERNET ENG’G TASK FORCE (June 2018), https://datatracker.ietf.org/doc/html/rfc8439 [https://perma.cc/YFX2-CLAD]; A. Langley, M. Hamburg & S. Turner, Request for Comments No. 7748: Elliptic Curves for Security, INTERNET ENG’G TASK FORCE (Jan. 2016), https://datatracker.ietf.org/doc/html/rfc7748 [https://perma.cc/E4QL-TS5W].
[5] From the point of view of security, TLS 1.3 is a vast improvement over TLS 1.2. It introduces forward secrecy, in which new encryption keys are used for each communication session; sharply reduces the number of implementation choices (which public-key algorithm with which private-key algorithm), thus reducing complexity (the bane of security); eliminates problematic ciphers; speeds up and secures the “handshake” in which two systems negotiate a key; and allows encryption in “resumed” sessions between two entities. See Nick Sullivan, A Detailed Look at RFC 8446 (a.k.a. TLS 1.3), CLOUDFLARE BLOG (Aug. 10, 2018), https://blog.cloudflare.com/rfc-8446-aka-tls-1-3 [https://perma.cc/3EFQ-A7CA]; Rescorla, supra note 4. The former provides a simplified explanation, and the latter gives more technical details.
[6] In fact, in 2019 Russia explored banning the use of TLS 1.3, apparently because the protocol could be used to hide the name of a destination website. This would thwart government censorship and surveillance efforts. See Федеральный Закон о внесении изменений в статьи 2 и 10 Федерального закона «Об информации, информационных технологиях и о защите информации» [Federal Law on Amendments to Articles 2 and 10 of the Federal Law “On Information, Information Technologies, and Protection of Information”], July 27, 2006, https://www.documentcloud.org/documents/7215232-Proposed-Russia-law-to-ban-secureencryption.html [https://perma.cc/R4UG-6R5X]; Пояснительная Записка к проекту Федерального закона «О внесении изменений в статьи 2 и 10 Федерального закона «Об информации, информационных технологиях и о защите информации» [Explanatory Note to the Draft Federal Law “On Amendments to Articles 2 and 10 of the Federal Law ‘On Information, Information Technologies, and Protection of Information’”], Dec. 4, 2019, https://www.documentcloud.org/documents/7215233-149-%D0%9F%D0%97.html [https://perma.cc/F5QM-GJNG]. This draft legislation had not yet entered into force when this article was written.
[7] Nicole Perlroth, Jeff Larson & Scott Shane, N.S.A. Able to Foil Basic Safeguards of Privacy on Web, N.Y. TIMES (Sept. 5, 2013), https://www.nytimes.com/2013/09/06/us/nsa-foils-much-internet-encryption.html [https://perma.cc/29GU-C6BQ]. See also Jeff Larson, Revealed: The NSA’s Secret Campaign to Crack, Undermine Internet Security, PROPUBLICA (Sept. 5, 2013), https://www.propublica.org/article/the-nsas-secret-campaign-to-crack-undermine-internet-encryption [https://perma.cc/K87Z-24JE]; James Ball, Julian Borger & Glenn Greenwald, Revealed: How US and UK Spy Agencies Defeat Internet Privacy and Security, GUARDIAN (Sept. 6, 2013), https://www.theguardian.com/world/2013/sep/05/nsa-gchq-encryption-codes-security [https://perma.cc/9ZVH-CP4D].
[8] The original news article by Perlroth et al., supra note 7, did not explicitly name the standard but made clear which it was. The accompanying NSA documents included such statements as “influence policies, standards and specification for commercial public key technologies” and “shape the worldwide commercial cryptography marketplace to make it more tractable to advanced cryptanalytic capabilities being developed by NSA/CSS.” See Secret Documents Reveal N.S.A. Campaign Against Encryption, N.Y. TIMES (Sept. 5, 2013), https://archive.nytimes.com/www.nytimes.com/interactive/2013/09/05/us/documents-reveal-nsa-campaign-against-encryption.html [https://perma.cc/M5ND-RACD] [hereinafter Secret Documents]; Larson, supra note 7; Ball, Borger & Greenwald, supra note 7. Dual_EC_DRBG was identified several days later. See Nicole Perlroth, Government Announces Steps to Restore Confidence in Encryption Standards, N.Y. TIMES (Sept. 10, 2013).
[9] Perlroth et al., supra note 7.
[10] Essentially this means avoiding “supersingular” curves and “prime-field anomalous” elliptic curves. See Alfred J. Menezes, Tatsuaki Okamoto & Scott A. Vanstone, Reducing Elliptic Curve Logarithms to Logarithms in a Finite Field, 39 IEEE TRANSACTIONS ON INFORMATION THEORY 1639, 1639 (1993); Takakazu Satoh & Kiyomichi Araki, Fermat Quotients and the Polynomial Time Discrete Log Algorithm for Anomalous Elliptic Curves, 47 COMMENTARII MATHEMATICI UNIVERSITATIS SANCTI PAULI 81, 81–82 (1998); I. A. Semaev, Evaluation of Discrete Logarithms in a Group of p-Torsion Points of an Elliptic Curve in Characteristic p, 67 MATHEMATICS OF COMPUTATION 353, 353–55 (1998); N. P. Smart, The Discrete Logarithm Problem on Elliptic Curves of Trace One, 12 J. CRYPTOLOGY 193, 193 (1999). These characteristics of elliptic curves are easy to test for and thus do not present problems in practice. Neal Koblitz & Alfred Menezes, A Riddle Wrapped in an Enigma, 14 IEEE SEC. & PRIV. 34, 37–38 (2016).
[11] In 2007, after the standard’s approval, Dan Shumow and Niels Ferguson raised the possibility of a backdoor in the algorithm. Dan Shumow & Niels Ferguson, On the Possibility of a Back Door in the NIST SP800-90 Dual EC Prng 8, Presentation at the 27th Annual International Cryptology Conference (Aug. 21, 2007), https://rump2007.cr.yp.to/15-shumow.pdf [https://perma.cc/ZF26-4HQS]. In fact, the Canadian company Certicom had filed a patent application on the backdoor in Dual_EC_DRBG in 2005, but Certicom never publicized this information. Certicom’s Patent Applications Regarding Dual EC Key Escrow, PROJECT BULLRUN (July 29, 2005), https://projectbullrun.org/dual-ec/patent.html [https://perma.cc/WR7C-V8MP].
[12] Shumow & Ferguson, supra note 11, at 7.
[13] In March 2006, Kristian Gjøsteen sent a paper to the National Institute of Standards and Technology (NIST) showing that the bit strings output by Dual_EC_DRBG were biased. Kristian Gjøsteen, Comments on Dual-EC-DRBG/NIST SP 800-90 (Dec. 2005) (unpublished manuscript). This result was later improved by Berry Schoenmakers and Andrey Sidorenko. Berry Schoenmakers & Andrey Sidorenko, Cryptanalysis of the Dual Elliptic Curve Pseudorandom Generator 1 (2006) (manuscript), 2006 CRYPTOLOGY EPRINT ARCHIVE, https://eprint.iacr.org/2006/190.pdf [https://perma.cc/6QUQ-MCB4]. Both of these papers appeared after the NIST deadline for comments but before NIST had completed editing the standard. Daniel Bernstein, Tanja Lange & Ruben Niederhagen, Dual EC: A Standardized Backdoor, in THE NEW CODEBREAKERS: ESSAYS DEDICATED TO DAVID KAHN ON THE OCCASION OF HIS 85TH BIRTHDAY 256, 260 (Peter Y.A. Ryan, David Naccache & Jean-Jacques Quisquater eds., 2016).
[14] Beginning in 2006, there were several technical papers showing the potential for backdoors and other problems with the system. See Shumow & Ferguson, supra note 11, at 8; SCHOENMAKERS & SIDORENKO, supra note 13, at 1; infra Part I.A.
[15] ELAINE BARKER & JOHN KELSEY, NAT’L INST. OF STANDARDS & TECH., NIST SPECIAL PUBLICATION NO. 800-90A REV. 1, RECOMMENDATION FOR RANDOM NUMBER GENERATION USING DETERMINISTIC RANDOM BIT GENERATORS 58 (2006), https://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-90Ar1.pdf [https://perma.cc/Q66G-SVHC].
[16] FIPSs are developed by NIST for use in computer systems of non-national security agencies of the U.S. government. For further information on FIPS, see Compliance FAQs: Federal Information Processing Standards (FIPS), NAT’L INST. OF STANDARDS & TECH. (Nov. 15, 2019), https://www.nist.gov/standardsgov/compliance-faqs-federal-information-processing-standards-fips [https://perma.cc/4W6U-AP5P].
[17] BARKER & KELSEY, supra note 15, at 100.
[18] NIST Removes Cryptography Algorithm from Random Number Generator Recommendation, NAT’L INST. OF STANDARDS & TECH. (Apr. 21, 2014), https://www.nist.gov/news-events/news/2014/04/nist-removes-cryptography-algorithm-random-number-generator-recommendations [https://perma.cc/HA3P-NW5B].
[19] Id.
[20] Post-Quantum Cryptography (PQC), NAT’L INST. OF STANDARDS & TECH. (Jan. 3, 2017), https://csrc.nist.gov/projects/post-quantum-cryptography [https://perma.cc/AQ96-GZU9]. Public-key cryptographic algorithms are believed to be strong and effective against attacks by classical computers, but in 1994 Peter Shor showed that a quantum algorithm can factor integers into prime factors much faster than any known classical algorithm. With a major scientific advancement in quantum computing, it might be possible to crack a communication encrypted through current public-key means in a matter of hours. See Peter W. Shor, Algorithms for Quantum Computation: Discrete Logarithms and Factoring, 1994 PROC. 35TH ANN. SYMP. FOUND. COMPUTER SCI. 124 (1994).
[21] Lightweight Cryptography, NAT’L INST. OF STANDARDS & TECH. (Jan. 3, 2017), https://csrc.nist.gov/Projects/lightweight-cryptography [https://perma.cc/D8YE-ZJU4].