The anticipated advent of quantum computing will have a devastating impact on existing modes of asymmetric data encryption. It’s likely that within the next few years, quantum-capable entities will gain the ability to decrypt virtually every secret possessed by individuals, governments and private industry where asymmetric encryption algorithms such as RSA, Finite Field Diffie-Hellman, and Elliptic Curve Diffie-Hellman have been used for protection.

The looming failure of today’s encryption is an alarming prospect, and yet the government and various standards bodies lack the sense of urgency that an event of this magnitude demands. With the steal-now-decrypt-later (SNDL) threat from quantum computing, there is a compelling need for solutions that can be deployed today. If history is any indicator, the critical problem we currently face is that the cycle time for migrating to new quantum-resistant encryption algorithms and related standards will be too long to mitigate the danger posed by the oncoming quantum threat. Quantum computers, which are expected to become viable in the next few years, use quantum-mechanical properties of subatomic particles to perform certain calculations far faster than today’s fastest conventional supercomputers. With this computing power comes the ability to crack encryption methods whose security rests on the difficulty of factoring numbers that are the product of two large primes. An algorithm introduced by Peter Shor back in 1994 provides a method for factoring these large numbers in polynomial time instead of exponential time, given a sufficiently powerful quantum computer. What this means is that while a conventional computer might take trillions of years to break a 2,048-bit asymmetric encryption key, a quantum computer powered by 4,099 quantum bits, or “qubits,” running Shor’s algorithm would need approximately 10 seconds to accomplish the task. As we have seen from previous standardization efforts, we don’t have a decade for 30 revisions of the standard to get this right.
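To make the factoring claim concrete: Shor’s contribution reduces factoring to order-finding, and only the order-finding step needs a quantum computer. The reduction itself can be walked through classically for a toy modulus. The sketch below is illustrative only (the function names are invented, and the brute-force `order` loop is exactly the part a quantum computer replaces):

```python
from math import gcd

def order(a, n):
    """Multiplicative order r of a mod n: smallest r with a^r = 1 (mod n).
    Found by brute force here; this is the step Shor's algorithm speeds up."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    """Classical walk-through of Shor's reduction: given a base a,
    return a nontrivial factor of n, or None if this base fails."""
    g = gcd(a, n)
    if g != 1:
        return g                  # lucky guess: a shares a factor with n
    r = order(a, n)
    if r % 2 != 0:
        return None               # odd order: retry with another base
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None               # trivial square root: retry
    return gcd(y - 1, n)

print(shor_classical(15, 7))  # → 3 (order of 7 mod 15 is 4; gcd(7^2 - 1, 15) = 3)
```

For n = 15 this runs instantly; for a 2,048-bit modulus the order-finding loop is what takes a classical machine trillions of years, and what Shor’s quantum period-finding collapses to polynomial time.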

It may be comforting to think that because quantum computers of a cryptographically significant scale don’t exist yet, there is nothing to worry about today. However, this idea is mistaken for two reasons. First, quantum computing is advancing at a faster pace than anyone previously contemplated. Second, malicious actors can steal encrypted data today and decrypt it once quantum computers become available. This is the SNDL threat highlighted above. Banks use quantum-vulnerable public-key exchange to validate your account access, as do health providers transmitting digital health records and the IRS when you e-file your taxes. Even VPNs and core network infrastructure (routers and switches) implement quantum-vulnerable key exchanges when using the IPsec and MACsec protocols. Once quantum computing comes online, a bad actor can recover the private keys associated with these public keys, and the contents of wallets, records, and accounts will become available to the attacker.

Users need a simple control plane that lets them select whichever crypto library they want in order to defend against these evolving quantum threats. Additionally, many nations are developing their own quantum-resistant algorithms and may not want to wait for NIST to standardize an algorithm or certify an implementation; they need a solution that gives them the agility to employ the post-quantum cryptographic algorithms of their choice – in effect, a bring-your-own-algorithms (BYOA) approach.

Agility allows us to future-proof systems against both novel cryptanalysis and implementation errors. It shortens the time between the demonstration of a vulnerability in an algorithm, implementation, or protocol and the patching or upgrading of every application and service affected by it. Agility also enables the transition to more efficient algorithms or implementations. Quickly eliminating vulnerable algorithm implementations requires the ability to access different implementation libraries for the same algorithm, to “fall back,” and to switch to other algorithms entirely. For example, a software library may implement an algorithm in a way that is vulnerable to attack: KyberSlash1 and KyberSlash2 affected the Kyber implementations in all but six of 22 popular crypto libraries, and it took more than 90 days to patch the vulnerable implementations in most of them. A crypto-agile solution should enable an organization to move easily and rapidly between implementations – otherwise the entire security posture and data of the enterprise is compromised.
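One way to picture such a crypto-agile layer is as a provider registry with an ordered preference list and operator-controlled fallback. The sketch below is a minimal illustration, not a real library API: the names `KemProvider` and `AgileRegistry` are invented, and a production system would wrap actual implementations (liboqs, OpenSSL providers, vendor libraries) behind the same interface.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class KemProvider:
    """One concrete implementation of a key-encapsulation algorithm."""
    name: str                                       # e.g. "ml-kem-768/libA"
    encapsulate: Callable[[bytes], Tuple[bytes, bytes]]  # pubkey -> (ct, shared)
    enabled: bool = True

class AgileRegistry:
    """Ordered preference list of providers with runtime fallback."""
    def __init__(self) -> None:
        self._providers: List[KemProvider] = []

    def register(self, provider: KemProvider) -> None:
        self._providers.append(provider)

    def disable(self, name: str) -> None:
        # Operator response to a disclosed implementation flaw
        # (e.g. a KyberSlash-style bug): switch one library off
        # without touching any calling code.
        for p in self._providers:
            if p.name == name:
                p.enabled = False

    def encapsulate(self, public_key: bytes) -> Tuple[str, Tuple[bytes, bytes]]:
        # Use the first enabled provider; fall back past disabled ones.
        for p in self._providers:
            if p.enabled:
                return p.name, p.encapsulate(public_key)
        raise RuntimeError("no enabled KEM provider")

# Usage: two dummy providers standing in for two libraries.
reg = AgileRegistry()
reg.register(KemProvider("ml-kem-768/libA", lambda pk: (b"ctA", b"ssA")))
reg.register(KemProvider("ml-kem-768/libB", lambda pk: (b"ctB", b"ssB")))
reg.disable("ml-kem-768/libA")        # libA found vulnerable
used, _ = reg.encapsulate(b"pk")
print(used)  # → ml-kem-768/libB
```

The point of the design is that the preference list, not the application, decides which implementation runs, so disabling a broken library is a one-line operational action rather than a redeployment of every service.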

New quantum secure encryption methods with crypto-agility functionality have been developed and can be deployed immediately. The challenge is to make them work with existing encryption algorithms and protocols while enabling crypto-agility to stay ahead of the pacing threat without having to rip-and-replace the existing infrastructure. After all, it is impossible for every system to upgrade its encryption algorithms all at once.

The post A Bring Your Own Algorithms (BYOA) Approach to Crypto-Agility Addressing Quantum Threats appeared first on Cybersecurity Insiders.

NIST has released a draft of Special Publication 1800-38A: “Migration to Post-Quantum Cryptography: Preparation for Considering the Implementation and Adoption of Quantum Safe Cryptography.” It’s only four pages long, and it doesn’t have a lot of detail—more “volumes” are coming, with more information—but it’s well worth reading.

We are going to need to migrate to quantum-resistant public-key algorithms, and the sooner we implement key agility the easier it will be to do so.

News article.

Quantum computing is a completely new paradigm for computers. A quantum computer uses quantum properties such as superposition, which allows a qubit (a quantum bit) to be neither 0 nor 1, but something much more complicated. In theory, such a computer can solve problems too complex for conventional computers.
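That amplitude picture can be made concrete in a few lines. The sketch below is a toy model, not a real quantum SDK: a qubit is represented as a pair of complex amplitudes, and a Hadamard gate turns a definite 0 into an equal superposition.

```python
import math

# A qubit's state is a pair of complex amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. Measuring it yields 0 with probability
# |alpha|^2 and 1 with probability |beta|^2.
def measure_probs(state):
    alpha, beta = state
    return abs(alpha) ** 2, abs(beta) ** 2

def hadamard(state):
    # H maps |0> to (|0> + |1>)/sqrt(2): an equal superposition.
    alpha, beta = state
    s = 1 / math.sqrt(2)
    return (s * (alpha + beta), s * (alpha - beta))

zero = (1 + 0j, 0 + 0j)        # the definite state |0>
plus = hadamard(zero)          # neither 0 nor 1 until measured
p0, p1 = measure_probs(plus)
print(round(p0, 3), round(p1, 3))  # → 0.5 0.5
```

Simulating one qubit like this is trivial; the power (and the engineering difficulty) comes from the fact that n entangled qubits require tracking 2^n complex amplitudes, which no classical machine can do at scale.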

Current quantum computers are still toy prototypes, and the engineering advances required to build a functionally useful quantum computer are somewhere between a few years away and impossible. Even so, we already know that such a computer could potentially factor large numbers and compute discrete logs, and break the RSA and Diffie-Hellman public-key algorithms at all of the useful key sizes.

Cryptographers hate being rushed into things, which is why NIST began a competition to create a post-quantum cryptographic standard in 2016. The idea is to standardize on both a public-key encryption and digital signature algorithm that is resistant to quantum computing, well before anyone builds a useful quantum computer.

NIST is an old hand at this competitive process, having previously done this with symmetric algorithms (AES in 2001) and hash functions (SHA-3 in 2015). I participated in both of those competitions, and have likened them to demolition derbies. The idea is that participants put their algorithms into the ring, and then we all spend a few years beating on each other’s submissions. Then, with input from the cryptographic community, NIST crowns a winner. It’s a good process, mostly because NIST is both trusted and trustworthy.

In 2017, NIST received eighty-two post-quantum algorithm submissions from all over the world. Sixty-nine were considered complete enough to be Round 1 candidates. Twenty-six advanced to Round 2 in 2019, and seven (plus another eight alternates) were announced as Round 3 finalists in 2020. NIST was poised to make final algorithm selections in 2022, with a plan to have a draft standard available for public comment in 2023.

Cryptanalysis during the competition was brutal. Twenty-five of the Round 1 algorithms were attacked badly enough to remove them from the competition. Another eight were similarly attacked in Round 2. But here’s the real surprise: there were newly published cryptanalysis results against at least four of the Round 3 finalists just months ago—moments before NIST was to make its final decision.

One of the most popular algorithms, Rainbow, was found to be completely broken. Not that it could theoretically be broken with a quantum computer, but that it can be broken today—with an off-the-shelf laptop in just over two days. Three other finalists, Kyber, Saber, and Dilithium, were weakened with new techniques that will probably work against some of the other algorithms as well. (Fun fact: Those three algorithms were broken by the Center of Encryption and Information Security, part of the Israeli Defense Force. This represents the first time a national intelligence organization has published a cryptanalysis result in the open literature. And they had a lot of trouble publishing, as the authors wanted to remain anonymous.)

That was a close call, but it demonstrated that the process is working properly. Remember, this is a demolition derby. The goal is to surface these cryptanalytic results before standardization, which is exactly what happened. At this writing, NIST has chosen a single algorithm for general encryption and three digital-signature algorithms. It has not chosen a public-key encryption algorithm, and there are still four finalists. Check NIST’s webpage on the project for the latest information.

Ian Cassels, British mathematician and World War II cryptanalyst, once said that “cryptography is a mixture of mathematics and muddle, and without the muddle the mathematics can be used against you.” This mixture is particularly difficult to achieve with public-key algorithms, which rely on the mathematics for their security in a way that symmetric algorithms do not. We got lucky with RSA and related algorithms: their mathematics hinge on the problem of factoring, which turned out to be robustly difficult. Post-quantum algorithms rely on other mathematical disciplines and problems—code-based cryptography, hash-based cryptography, lattice-based cryptography, multivariate cryptography, and so on—whose mathematics are both more complicated and less well-understood. We’re seeing these breaks because those core mathematical problems aren’t nearly as well-studied as factoring is.

The moral is the need for cryptographic agility. It’s not enough to implement a single standard; it’s vital that our systems be able to easily swap in new algorithms when required. We’ve learned the hard way how algorithms can get so entrenched in systems that it can take many years to update them: in the transition from DES to AES, and the transition from MD4 and MD5 to SHA, SHA-1, and then SHA-3.

We need to do better. In the coming years we’ll be facing a double uncertainty. The first is quantum computing. When and if quantum computing becomes a practical reality, we will learn a lot about its strengths and limitations. It took a couple of decades to fully understand von Neumann computer architecture; expect the same learning curve with quantum computing. Our current understanding of quantum computing architecture will change, and that could easily result in new cryptanalytic techniques.

The second uncertainty is in the algorithms themselves. As the new cryptanalytic results demonstrate, we’re still learning a lot about how to turn hard mathematical problems into public-key cryptosystems. We have too much math and an inability to add more muddle, and that results in algorithms that are vulnerable to advances in mathematics. More cryptanalytic results are coming, and more algorithms are going to be broken.

We can’t stop the development of quantum computing. Maybe the engineering challenges will turn out to be impossible, but it’s not the way to bet. In the face of all that uncertainty, agility is the only way to maintain security.

This essay originally appeared in IEEE Security & Privacy.

EDITED TO ADD: One of the four public-key encryption algorithms selected for further research, SIKE, was just broken.

SIKE is one of the new algorithms that NIST recently added to the post-quantum cryptography competition.

It was just broken, really badly.

We present an efficient key recovery attack on the Supersingular Isogeny Diffie-Hellman protocol (SIDH), based on a “glue-and-split” theorem due to Kani. Our attack exploits the existence of a small non-scalar endomorphism on the starting curve, and it also relies on the auxiliary torsion point information that Alice and Bob share during the protocol. Our Magma implementation breaks the instantiation SIKEp434, which aims at security level 1 of the Post-Quantum Cryptography standardization process currently run by NIST, in about one hour on a single core.

News article.

Nadiya Kostyuk and Susan Landau wrote an interesting paper: “Dueling Over DUAL_EC_DRBG: The Consequences of Corrupting a Cryptographic Standardization Process”:

Abstract: In recent decades, the U.S. National Institute of Standards and Technology (NIST), which develops cryptographic standards for non-national security agencies of the U.S. government, has emerged as the de facto international source for cryptographic standards. But in 2013, Edward Snowden disclosed that the National Security Agency had subverted the integrity of a NIST cryptographic standard—the Dual_EC_DRBG—enabling easy decryption of supposedly secured communications. This discovery reinforced the desire of some public and private entities to develop their own cryptographic standards instead of relying on a U.S. government process. Yet, a decade later, no credible alternative to NIST has emerged. NIST remains the only viable candidate for effectively developing internationally trusted cryptography standards.

Cryptographic algorithms are essential to security yet are hard to understand and evaluate. These technologies provide crucial security for communications protocols. Yet the protocols transit international borders; they are used by countries that do not necessarily trust each other. In particular, these nations do not necessarily trust the developer of the cryptographic standard.

Seeking to understand how NIST, a U.S. government agency, was able to remain a purveyor of cryptographic algorithms despite the Dual_EC_DRBG problem, we examine the Dual_EC_DRBG situation, NIST’s response, and why a non-regulatory, non-national security U.S. agency remains a successful international supplier of strong cryptographic solutions.