
Security and quantum computing: Planning next generation cryptography

Current encryption algorithms rely on public key cryptography to keep data secure. Find out why quantum computing could disrupt the current system and how to plan ahead.

We often build assumptions into our technology, taking for granted the reliability of the technologies it rests on. One of the biggest assumptions we make is the reliability of public key cryptography. We build our most critical systems to rely on it. And yet a threat to that reliability is in the offing.

Also called asymmetric cryptography, public key cryptography involves two different keys—one kept secret and one made public—that are used to encrypt and decrypt data and validate identities.

Public key cryptography relies on certain mathematical problems that are very hard to solve, such as factoring large numbers that are the product of large prime numbers and finding the discrete logarithm of a random elliptic curve element with respect to a publicly known base point. If you can solve the problem, you can decrypt the data. But we assume that you can’t solve it or that the amount of effort to solve it is so great that it is not worthwhile.
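To make that asymmetry concrete, here is a toy RSA sketch in Python; it's my own illustration, not anything from NIST, and the numbers are absurdly small so you can follow them by hand (real moduli run to 2048 bits or more). The easy direction is multiplying the primes and doing modular arithmetic; the hard direction is recovering the primes from the public modulus.

    # Toy RSA with tiny numbers, for illustration only.
    p, q = 61, 53                # the secret primes
    n = p * q                    # public modulus (3233): easy to compute
    e = 17                       # public exponent
    phi = (p - 1) * (q - 1)      # computable only if you know p and q
    d = pow(e, -1, phi)          # private exponent (modular inverse; Python 3.8+)

    message = 65
    ciphertext = pow(message, e, n)          # anyone can encrypt with (n, e)
    assert pow(ciphertext, d, n) == message  # only the holder of d can decrypt

    # An attacker who could factor n back into p and q could rebuild phi and d
    # exactly as above. The security assumption is that, for large n, no one can.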

The emergence of quantum computers and algorithms designed to take advantage of their special characteristics undermines these assumptions. If a sufficiently large and reliable quantum computer could be built, it would be able to solve the mathematical problems and decrypt what today qualifies as strongly encrypted data.

Other important cryptographic algorithms, specifically symmetric cryptography and hash functions, are not compromised in the same way as asymmetric cryptography. I discuss the distinction below.

Whether a quantum computer large enough to cause this trouble can be built is not yet certain. Small-scale quantum computers have been built, most famously by IBM, and have been used to find the prime factors of relatively small numbers.

The potential for the problem has been known for many years. In recent years, however, experts have begun a coordinated effort, led by the U.S. National Institute of Standards and Technology (NIST), to create cryptographic standards that are resistant to quantum computing techniques. We call this the quest for post-quantum cryptography (PQC).

For help with this article, I spoke to Dustin Moody, a mathematician in the cryptographic technology group at NIST. In a presentation at the PQCrypto 2016 conference, Moody briefly explained NIST’s plans. NIST released its short "Report on Post-Quantum Cryptography" at the same time. The report explains the problem in greater detail and contains numerous references to research on the problem.

NIST, if you’re not aware, has long been a major force for advancement of security standards, as it has been in many other fields, from firefighting equipment and techniques to monoclonal antibodies. It has a mandate specifically mentioned in the U.S. Constitution: “Congress shall have the power to … fix the standard of weights and measures.” Many of our most important security standards were promulgated by NIST, including AES (Advanced Encryption Standard) and FIPS (Federal Information Processing Standards) 140.


What is a quantum computer?

There is no sufficiently useful explanation of what quantum computers are that will fit in this article. A meaningful understanding requires a strong background in mathematics and physics. The website Quantum Made Simple includes many explanations of the science, including some clarifying animations.

That said, a quantum computer is based on quantum physics rather than on classical physics. In conventional computing, a bit is stored either as a 0 or 1 in a semiconductor. Not so in a quantum computer.

What gives quantum computers the potential to disrupt our methods of cryptography is the principle of superposition. A bit in quantum computing, called a qubit, can be in a superposition of 0 and 1 at the same time. This means that a string of n qubits can be in 2^n states at the same time. Because of this, a quantum computer can do computations comparable to those of many classical computers, all at the same time on the same machine. Einstein would have called it “God playing dice with your data.”
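A rough way to get a feel for what 2^n simultaneous states means: a classical simulation of n qubits has to track one complex amplitude for every possible bit string. This small sketch (assuming NumPy is available) shows how quickly that bookkeeping blows up.

    import numpy as np

    def n_qubit_state(n: int) -> np.ndarray:
        """State vector of n qubits: one complex amplitude per classical bit string."""
        state = np.zeros(2 ** n, dtype=complex)
        state[0] = 1.0              # start in the |00...0> basis state
        return state

    print(len(n_qubit_state(3)))    # 8 amplitudes for 3 qubits, |000> through |111>

    # Storage alone grows exponentially (16 bytes per complex amplitude):
    for n in (10, 30, 50):
        print(f"{n} qubits -> {2 ** n:,} amplitudes, {2 ** n * 16:,} bytes")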

In fairness to quantum computers and the people developing them, they would have tremendous benefits as well. Most of the proposed applications are in modeling and simulation, where huge amounts of data must be processed simultaneously in response to changing inputs. Think of weather forecasting; econometric modeling; molecular, atomic, and subatomic simulation (yes, quantum computers to model quantum mechanics); and consumers of financial modeling, who always want results faster. The inherent parallelization of quantum computing should speed up these compute-intensive applications considerably.

Why is quantum computing a problem?

But it’s that inherent parallelism that presents the problem for cryptographers. One would program a quantum computer differently from a classical computer, and algorithms that would have poor performance on classical computers would have excellent performance on a quantum computer. One such algorithm is Shor’s algorithm, which can both find prime factors of a number in a reasonable time and break the discrete log problem on which elliptic curve and finite field cryptography rely.

But, as far as we know now, the performance advantages of quantum computing do not present fatal problems for other important cryptographic algorithms. The expert consensus is that hash functions and symmetric encryption will still be secure in a post-quantum world but would need larger keys, as illustrated in this table from the NIST report:

[Table: Impact of quantum computing on common cryptographic algorithms]

Source: NIST's "Report on Post-Quantum Cryptography"
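The usual explanation, commonly attributed to Grover's search algorithm, is that quantum computers roughly halve the effective strength of symmetric keys and hash outputs rather than breaking them outright, so the remedy is simply to choose larger parameters. A minimal illustration of that choice, using only Python's standard library (the specific sizes are common recommendations, not figures quoted from the NIST report):

    import hashlib

    record = b"example record"

    # Considered strong against classical attackers today:
    classical_digest = hashlib.sha256(record).hexdigest()    # 256-bit output

    # The post-quantum adjustment is simply "go bigger": e.g., SHA-384 or SHA-512
    # for hashing, and 256-bit rather than 128-bit keys for symmetric ciphers.
    pq_digest = hashlib.sha512(record).hexdigest()            # 512-bit output

    print(len(classical_digest) * 4, "->", len(pq_digest) * 4, "bits of digest")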

So, if RSA, elliptic curve cryptography (ECC), and finite field cryptography (FFC) are insecure in a post-quantum computing world, what are the proposals for replacing them?

There are several approaches to “quantum-resistant cryptography.” The idea is to maintain the basic approach to public key cryptography of relying on a mathematical operation that is easy in one direction and very hard in the other. Thus, researchers look for algorithms that appear to have this property of intractability both on quantum and classical computers. The main families of algorithms being studied are:

  • Lattice-based cryptography
  • Code-based cryptography
  • Multivariate polynomial cryptography
  • Hash-based signatures

There is no useful definition of these terms that would fit in the amount of space I have here. The NIST report has a long list of references; Wolfram MathWorld is a good place for mathematical definitions.
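To give a flavor of just one of these families, here is a toy hash-based signature in the style of Lamport's one-time scheme, the simplest ancestor of the modern hash-based proposals. It is not one of the candidate algorithms, and a real scheme adds considerable machinery, but it shows why the family is attractive: security rests on nothing more than the hash function.

    import hashlib
    import os

    def H(data: bytes) -> bytes:
        return hashlib.sha256(data).digest()

    def keygen():
        # One pair of random 32-byte secrets per bit of the 256-bit message hash.
        sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
        pk = [(H(a), H(b)) for a, b in sk]
        return sk, pk

    def hash_bits(msg: bytes):
        digest = int.from_bytes(H(msg), "big")
        return [(digest >> (255 - i)) & 1 for i in range(256)]

    def sign(sk, msg: bytes):
        # Reveal one secret from each pair, chosen by the corresponding hash bit.
        return [sk[i][bit] for i, bit in enumerate(hash_bits(msg))]

    def verify(pk, msg: bytes, sig) -> bool:
        return all(H(sig[i]) == pk[i][bit] for i, bit in enumerate(hash_bits(msg)))

    sk, pk = keygen()
    sig = sign(sk, b"post-quantum hello")
    assert verify(pk, b"post-quantum hello", sig)
    assert not verify(pk, b"a different message", sig)

Note that each key pair here can safely sign only a single message; practical hash-based schemes build trees of such one-time keys to allow many signatures.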

Testing

Google has begun experimenting with the NIST proposals in the Chrome browser, as described by Google crypto engineer Adam Langley. Don’t make too much of the experiment; it’s a very early effort, largely geared toward testing the performance impact of larger PQC key sizes on real-world TLS implementations. Quite a few of those implementations failed outright.

Google didn’t test all of the PQC proposals. Some of those it did test increased latency significantly, even under optimistic assumptions. As is often the case, implementations that had greater latency performed better on computational aspects of the tests. Langley’s extremely preliminary conclusion is that structured lattices would have the best protocol performance characteristics, but there are so many unknowns at this point that it’s not worth taking any serious actions based on the results.
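For a very rough sense of what "performance impact on TLS" means, here is a minimal sketch, entirely my own and not Google's methodology, that times a TCP connect plus TLS handshake using only the Python standard library. The handshake is where larger post-quantum key exchange messages would add bytes; the host name is just a placeholder.

    import socket
    import ssl
    import time

    def handshake_time(host: str, port: int = 443) -> float:
        """Wall-clock seconds for one TCP connect plus TLS handshake."""
        context = ssl.create_default_context()
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=10) as sock:
            # wrap_socket performs the TLS handshake before returning.
            with context.wrap_socket(sock, server_hostname=host):
                pass
        return time.perf_counter() - start

    if __name__ == "__main__":
        samples = sorted(handshake_time("example.com") for _ in range(5))
        print(f"median connect + handshake: {samples[2] * 1000:.1f} ms")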

The direction of PQC toward larger keys must be frustrating for the engineers who just debuted the TLS 1.3 specification, a major objective of which is to improve performance of secure communications.

As I hinted, it’s not definite that quantum computers will develop successfully to the point that they threaten our public key algorithms, but the potential is there. There have been many experiments, and small-scale quantum computers have already demonstrated the problem in miniature. We don’t know that it will be possible to scale quantum computers up to the needed size, but NIST’s Dustin Moody tells me that those in the field believe it could be possible in 10 to 15 years. Prudence would lead us to continue such research as long as the possibility is there, because if we are unprepared when the problem manifests itself, the results will be catastrophic.

The dormant data problem

It’s going to be a big enough deal getting systems updated to use new protocols and key sizes on a current basis, but there’s another problem you need to think about.

You undoubtedly have a large amount of data encrypted and at rest. You think of this data as safe in part because it is encrypted. But in the post-quantum world, that protection is of less or possibly no value.

The maximal response would be to plan to re-encrypt all the data with quantum-resistant cryptographic techniques. This won’t be easy. You need to have a good inventory of the data and the ability to decrypt and re-encrypt it. If it was encrypted at the application level, the application may not be able to encrypt with new methods. It may be encrypted with drive-based full-disk encryption, which may not be upgradable to post-quantum cryptography.
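As a minimal sketch of what one step of such a migration could look like, assume the records are protected with AES-GCM and that both the legacy key and its replacement are retrievable from your key management system; the record layout (a nonce prepended to the ciphertext) and the key names are hypothetical, and the jump from a 128-bit to a 256-bit key follows the "larger keys" guidance mentioned below. This uses the widely deployed Python cryptography package.

    # pip install cryptography
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def reencrypt_record(old_key: bytes, new_key: bytes, blob: bytes) -> bytes:
        """Decrypt one stored record with the legacy key, re-encrypt under the new one.

        Assumes each record is stored as a 12-byte nonce followed by the ciphertext.
        """
        plaintext = AESGCM(old_key).decrypt(blob[:12], blob[12:], None)
        nonce = os.urandom(12)
        return nonce + AESGCM(new_key).encrypt(nonce, plaintext, None)

    # Hypothetical keys; in practice both come from your key-management system.
    legacy_key = AESGCM.generate_key(bit_length=128)    # what the data sits under today
    stronger_key = AESGCM.generate_key(bit_length=256)  # the "larger key" replacement

    # A sample record as it might sit on disk today, then migrated.
    nonce = os.urandom(12)
    record = nonce + AESGCM(legacy_key).encrypt(nonce, b"dormant customer data", None)
    migrated = reencrypt_record(legacy_key, stronger_key, record)
    assert AESGCM(stronger_key).decrypt(migrated[:12], migrated[12:], None) == b"dormant customer data"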

Note that data at rest is not encrypted with public key encryption but with symmetric key algorithms like AES. The NIST report says these algorithms aren’t inherently compromised like the public key algorithms, but it does call for larger keys, which means data you have already encrypted at rest may be protected by keys that will come to be considered too weak.

Even though a crypto expert might tell you that re-encrypting is the right thing to do (and it is), it’s not hard to see it as an exercise in pure overhead to protect against a remote possibility. Protecting dormant data is important, but it’s definitely a lower priority than protecting your current data. It would be fair to consider the other protections for the data, such as physical security methods, in deciding whether or when to re-encrypt.

So, what now?

There is likely not much you can or should do now, specifically because of the prospect of quantum computing, that you shouldn’t already be doing as a matter of best practice. A good example is keeping a detailed inventory of your data and the cryptographic methods used to protect it.
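What such an inventory records is up to you, but a sketch of the kinds of fields that matter for post-quantum planning, with purely hypothetical names and values, might look like this:

    import csv
    from dataclasses import dataclass, asdict, fields

    @dataclass
    class CryptoAsset:
        """One entry in a hypothetical cryptographic inventory."""
        name: str               # e.g. "customer-archive-2019"
        location: str           # where the data lives
        at_rest_cipher: str     # e.g. "AES-128-GCM"
        key_length_bits: int
        public_key_usage: str   # e.g. "RSA-2048 key wrapping, ECDSA P-256 signatures"
        owner: str

    inventory = [
        CryptoAsset("customer-archive-2019", "backup-array-07",
                    "AES-128-GCM", 128, "RSA-2048 key wrapping", "storage-team"),
    ]

    with open("crypto_inventory.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=[fld.name for fld in fields(CryptoAsset)])
        writer.writeheader()
        writer.writerows(asdict(asset) for asset in inventory)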

In technology terms, 10 to 15 years seems like a very long time. Even so, it’s important for an organization that intends to exist in that time frame to have some level of planning. Putting PQC on the long-term agenda now is the best way to make sure you don’t get blindsided by quantum effects.

For now, it’s largely a problem for mathematicians, but it’s time for others in the industry to speak up. Are you a programmer, system architect, or network engineer with ideas about what approaches would be acceptable or unacceptable? Send them to the PQC people at NIST at pqc-comments@nist.gov.

 Quantum security: Lessons for leaders

  • Planning a decade ahead for keeping your data secure is not unreasonable.
  • You don't need to fully understand quantum computing to understand the potential security problems.
  • Dormant data security may suddenly become relevant.
