Earlier this year, quantum computing took another big step out of the laboratory and toward commercial viability with the release of the IBM Q System One. Last year, Google announced its ‘Bristlecone’ quantum computing chip.
Ready or not, here comes quantum computing. And it's going to have a profound impact on specific areas of security eventually. Does that mean encryption built on traditional computers doesn’t stand a chance? Not necessarily.
Before we dive into the future viability of traditional encryption, let’s describe what we are talking about when it comes to quantum computing.
While silicon computers rely on bits that always represent either a zero or a one, quantum computing doesn’t work that way. It relies on what are called qubits, which can exist in a superposition of zero and one rather than being limited to one state or the other. Here is a good primer on quantum computing.
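To make the distinction concrete, here is a toy sketch in plain Python — not a real quantum computing library — of how a single qubit’s state is described and measured. The amplitudes and the `measure` helper are illustrative assumptions, not an actual quantum API.

```python
import random
from math import sqrt

# Toy illustration only -- not a real quantum computing library. A single
# qubit's state is a pair of amplitudes (a, b) with |a|^2 + |b|^2 = 1;
# measurement yields 0 with probability |a|^2 and 1 with probability |b|^2.
def measure(state):
    a, b = state
    return 0 if random.random() < abs(a) ** 2 else 1

# Equal superposition of |0> and |1> -- the qubit is not sitting "between"
# 0 and 1; rather, measurement is equally likely to produce either outcome.
plus = (1 / sqrt(2), 1 / sqrt(2))
samples = [measure(plus) for _ in range(10_000)]
print(sum(samples) / len(samples))  # close to 0.5
```

Measuring many identically prepared qubits is how probabilistic answers emerge: each individual measurement gives a definite 0 or 1, but the distribution of outcomes carries the information.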
Quantum computers typically use algorithms that are probabilistic. That is, they don’t provide a single exact answer but an answer within a certain probability, so they will excel at certain types of problems, such as risk management, financial management, and other areas where a range of probabilities is an appropriate answer.
Here’s a good write-up on the use cases for quantum computing. Finally, big quantum computers, those with long strings of qubits, will be able to solve certain complex problems much more rapidly than traditional computers, to the point that the best-known and most widely used encryption techniques would be swiftly defeated. Not good.
That means current cryptographic algorithms used in public key cryptography, such as Finite Field Cryptography, Elliptic Curve Cryptography, and RSA, will readily be cracked. Symmetric key algorithms such as AES, by contrast, will merely need larger keys to remain defensible against quantum attacks. What does this mean practically? Things that rely on public key cryptography, such as key establishment and digital signatures, will need to be rethought.
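The reason symmetric algorithms get off comparatively lightly is Grover’s quantum search algorithm, which speeds up brute-force key search quadratically and so roughly halves a symmetric key’s effective security level. A minimal back-of-the-envelope sketch (the function names are mine, and the bit counts are idealized estimates that ignore the large overheads of real quantum hardware):

```python
# Idealized security estimates for symmetric keys under brute-force search.
# These ignore the substantial constant-factor costs of real quantum hardware.
def classical_security_bits(key_bits: int) -> int:
    # Classical brute force must try on the order of 2^k keys.
    return key_bits

def quantum_security_bits(key_bits: int) -> int:
    # Grover's search needs only about 2^(k/2) evaluations,
    # so the effective security level is roughly halved.
    return key_bits // 2

for k in (128, 192, 256):
    print(f"AES-{k}: ~{quantum_security_bits(k)}-bit effective quantum security")
```

On these rough numbers, AES-256 still offers about 128 bits of security against a quantum adversary, which is why “use larger symmetric keys” is considered an adequate response for AES.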
While such attacks are not known to have happened yet (at least not publicly), in 1994 the mathematician Peter Shor devised an algorithm showing how a quantum computer could factor a number in polynomial time and thereby break common encryption. The field of post-quantum cryptography was born soon after, with the goal of developing algorithms resilient to such attacks.
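The quantum speedup in Shor’s algorithm comes entirely from finding the multiplicative order (the period) of a number modulo N; the rest is classical number theory. The sketch below brute-forces the order classically — the exponentially slow step that a quantum computer performs in polynomial time — to show how a known period yields a factor:

```python
from math import gcd

def order(a, n):
    # Brute-force the multiplicative order of a modulo n. This is the
    # exponentially slow step that Shor's algorithm performs in
    # polynomial time on a quantum computer via period finding.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n, a):
    # Classical post-processing from Shor's algorithm: given the order r
    # of a modulo n, try to derive a nontrivial factor of n.
    g = gcd(a, n)
    if g != 1:
        return g                   # lucky guess: a already shares a factor
    r = order(a, n)
    if r % 2 == 1:
        return None                # odd order: retry with a different a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None                # trivial square root: retry
    f = gcd(y - 1, n)
    return f if 1 < f < n else None

# shor_factor(15, 7) returns 3, a nontrivial factor of 15
```

With `order` replaced by the quantum period-finding subroutine, this procedure factors the large moduli underpinning RSA, which is exactly why public key cryptography is the part at risk.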
Experts believe the time to act is now: develop algorithms that run on traditional computers yet remain resilient to Shor’s algorithm running on a quantum computer powerful enough to crack the algorithms in common use today.
These commercial developments have fueled considerable research into developing those types of algorithms. Recently the U.S. National Institute of Standards and Technology (NIST) published a report detailing the progress toward building algorithms that can run on traditional computers and still be quantum resistant.
From the NIST report, here is their overview of the primary concepts behind cryptographic methods that researchers hope will prove resilient to quantum computing:
Lattice-based cryptography: Cryptosystems based on lattice problems have received renewed interest, for a few reasons. Exciting new applications (such as fully homomorphic encryption, code obfuscation, and attribute-based encryption) have been made possible using lattice-based cryptography. Most lattice-based key establishment algorithms are relatively simple, efficient, and highly parallelizable. Also, some lattice-based systems are provably secure under a worst-case hardness assumption, rather than merely on the average case. On the other hand, it has proven difficult to give precise estimates of the security of lattice schemes against even known cryptanalysis techniques.
Code-based cryptography: In 1978, the McEliece cryptosystem was first proposed, and has not been broken since. Since that time, other systems based on error-correcting codes have been proposed. While quite fast, most code-based primitives suffer from having huge key sizes. Newer variants have introduced more structure into the codes in an attempt to reduce the key sizes; however, the added structure has also led to successful attacks on some proposals. While there have been some proposals for code-based signatures, code-based cryptography has seen more success with encryption schemes.
Multivariate polynomial cryptography: These schemes are based on the difficulty of solving systems of multivariate polynomials over finite fields. Several multivariate cryptosystems have been proposed over the past few decades, with many having been broken. While there have been some proposals for multivariate encryption schemes, multivariate cryptography has historically been more successful as an approach to signatures.
Hash-based signatures: Hash-based signatures are digital signatures constructed using hash functions. Their security, even against quantum attacks, is well understood. Many of the more efficient hash-based signature schemes have the drawback that the signer must keep a record of the exact number of previously signed messages, and any error in this record will result in insecurity. Another of their drawbacks is that they can produce only a limited number of signatures. The number of signatures can be increased, even to the point of being effectively unlimited, but this also increases the signature size.
Other: A variety of systems have been proposed which do not fall into the above families. One such proposal is based on evaluating isogenies on supersingular elliptic curves. While Shor's algorithm can efficiently solve the discrete log problem on elliptic curves on a quantum computer, the isogeny problem on supersingular curves has no similar quantum attack known. Like some other proposals, for example, those based on the conjugacy search problem and related problems in braid groups, there has not been enough analysis to have much confidence in their security.
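To give a flavor of the hash-based family, here is a toy sketch of a Lamport one-time signature, one of the simplest hash-based schemes. This is illustrative only — the schemes NIST is evaluating use far more sophisticated constructions — but it shows both why the security reduces cleanly to the hash function and where the “limited number of signatures” drawback comes from: each key pair can safely sign exactly one message.

```python
import os
import hashlib

BITS = 256  # we sign the SHA-256 digest of the message, one bit at a time

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def digest_bits(msg: bytes):
    d = H(msg)
    return [(d[i // 8] >> (7 - i % 8)) & 1 for i in range(BITS)]

def keygen():
    # Secret key: two lists of 256 random values (one per possible bit value).
    # Public key: the hash of every secret value.
    sk = [[os.urandom(32) for _ in range(BITS)] for _ in range(2)]
    pk = [[H(x) for x in row] for row in sk]
    return sk, pk

def sign(sk, msg: bytes):
    # Reveal one secret preimage per digest bit. Because half the secret
    # key is exposed, the key must never be used to sign a second message.
    return [sk[b][i] for i, b in enumerate(digest_bits(msg))]

def verify(pk, msg: bytes, sig) -> bool:
    return all(H(sig[i]) == pk[b][i] for i, b in enumerate(digest_bits(msg)))
```

Forging a signature requires inverting the hash function, a problem that quantum computers are believed to speed up only modestly — which is why the security of hash-based signatures is considered well understood even against quantum attacks.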
Currently, critical uses of public key cryptography are for key establishment and digital signatures — such as when you rely on a digitally signed application or digitally sign a document.
Whatever cryptographic technique is ultimately chosen, implementing it will require substantial changes to existing systems and networks, according to NIST’s Report on Post-Quantum Cryptography. “It seems improbable that any of the currently known algorithms can serve as a drop-in replacement for what is in use today. One challenge that will likely need to be overcome is that most of the quantum-resistant algorithms have larger key sizes than the algorithms they will replace,” the report stated. “This may result in needing to change various Internet protocols, such as the Transport Layer Security protocol, or the Internet Key Exchange. How this should be done must be carefully considered,” it added.
Of course, none of these techniques has been proven effective against all possible quantum attacks, and a new algorithm could render these schemes obsolete overnight. In a sense, all encryption has an expiry date; no one knows when it falls. Even with classical computing, new techniques can arise and crack existing schemes, and everyone needs to be ready for such eventualities.
NIST readily acknowledges this. As the report states, when standards for quantum-resistant public key cryptography become available, “NIST will reassess the imminence of the threat of quantum computers to existing standards and may decide to deprecate or withdraw the affected standards thereafter as a result. Agencies should, therefore, be prepared to transition away from these algorithms as early as ten years from now. As the replacements for currently standardized public-key algorithms are not yet ready, a focus on maintaining crypto-agility is imperative,” the report states.