Lattice Cryptography Will Secure Data in the Post-Quantum Era
Dr. Vadim Lyubashevsky describes how IBM Research is developing cryptography that will stand up to quantum computing
By Jim Utsler | 08/01/2019
Quantum computers hold great potential for the future, including new discoveries in materials science and chemistry and advances in financial-risk and supply-chain optimization, but these systems are also likely to have adverse effects on encryption.
Critically, this last example, encryption, is likely to cut both ways. Quantum computing may enable theoretically unbreakable encryption techniques, but it could also be used to easily crack classical, non-quantum encryption schemes. This is no small matter, according to Cybersecurity Ventures, which estimates that cybercrime damages will hit $6 trillion annually.
From Years to Days
Currently, it might take thousands of years on today’s most powerful supercomputers to break the most widely used asymmetric encryption algorithms, while a large-scale quantum computer, running the factoring algorithm published by Peter Shor (now at MIT) more than 20 years ago, could theoretically break them in days or even hours.
That means data protected by current security protocols—safeguarded as it may be, whether on the move or at rest—simply won’t be protected once bad actors get their hands on quantum computers. As a result, organizations need to consider the security implications of quantum computing—both as an adversary and a hack-proof ally.
Today’s encryption techniques, including public-key schemes such as the RSA, Diffie-Hellman or Elliptic-Curve cryptosystems, could eventually be cracked by large quantum computers with millions of qubits—which are likely to be available in the next 10 to 30 years. This is because cryptographic protocols such as SSL, Transport Layer Security and HTTPS are built upon cryptographic primitives (well-established low-level cryptographic algorithms) such as authentication schemes, block ciphers, digital signatures and encryption schemes. These protocols inherit the security properties of their primitives, but they become useless if the primitives are compromised.
Primitives and Protocols
To counter attacks on primitives, IBM Research, anticipating that quantum-powered decryption will someday go mainstream, is working on underlying security architectures designed to withstand quantum-computing cracks. One approach, known as lattice cryptography, is based on problems from an area of mathematics called the “geometry of numbers” and hides data inside complex algebraic structures called lattices.
Taking a time-traveling trip back to algebra class, suppose someone is given a square, full-rank matrix A and a value b = Ax mod p, where x is a vector with 0/1 coefficients and p is a small (13-bit, for example) prime, and is then tasked with finding x. This problem has a unique solution x, which is actually quite easy to find using Gaussian elimination.
However, if someone is given a slightly “noisy” version of Ax—that is, b = Ax + e mod p—where e is some random vector with 0/1 coefficients, then for matrices of large-enough dimension (say, around 512), this problem becomes surprisingly difficult to solve. The complexity of the problem allows cryptographers to develop quantum-safe primitives that can then be used to build similarly safe protocols. This type of problem has been widely studied since the 1980s, and it has yet to fall to any algorithmic attacks, either classical or quantum.
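The contrast between the exact and noisy problems can be sketched in a few lines of Python. This is a toy illustration with hypothetical parameters (a dimension of 8 rather than the 512 or more used in real lattice schemes), not a secure construction:

```python
import random

def solve_mod_p(A, b, p):
    """Gaussian elimination over Z_p: solve A x = b (mod p)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]  # augmented matrix [A | b]
    for col in range(n):
        # Find a row with a nonzero pivot in this column and swap it up.
        pivot = next(r for r in range(col, n) if M[r][col])
        M[col], M[pivot] = M[pivot], M[col]
        inv = pow(M[col][col], -1, p)                 # modular inverse (Python 3.8+)
        M[col] = [v * inv % p for v in M[col]]
        # Eliminate this column from every other row.
        for r in range(n):
            if r != col and M[r][col]:
                f = M[r][col]
                M[r] = [(M[r][c] - f * M[col][c]) % p for c in range(n + 1)]
    return [M[r][n] for r in range(n)]

random.seed(0)
p = 8191                                              # a 13-bit prime
n = 8                                                 # toy dimension; real schemes use ~512+
x = [random.randint(0, 1) for _ in range(n)]          # secret 0/1 vector
A = [[random.randrange(p) for _ in range(n)] for _ in range(n)]
b_exact = [sum(a * xi for a, xi in zip(row, x)) % p for row in A]

# Noise vector with 0/1 coefficients (forced nonzero so it actually perturbs b).
e = [1] + [random.randint(0, 1) for _ in range(n - 1)]
b_noisy = [(bi + ei) % p for bi, ei in zip(b_exact, e)]

print(solve_mod_p(A, b_exact, p) == x)   # exact system: Gaussian elimination recovers x
guess = solve_mod_p(A, b_noisy, p)
print(all(v in (0, 1) for v in guess))   # noisy system: the answer is no longer a 0/1 vector
```

The noisy solve returns x + A⁻¹e mod p, and multiplying the small noise e by A⁻¹ scrambles every coordinate, which is an intuition for why the noise makes the search so much harder.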
So what does this mean? Well, whether it’s cars, planes or even power plants, if a company is manufacturing a product that will be operational 15 to 30 years from now, it should already begin planning a migration toward lattice cryptography, because once those products are in the field, they’ll be harder to upgrade.
“An organization looking to future-proof its data now should use lattice cryptography in tandem with traditional primitives in order to remove all risk that comes with introducing new schemes. This approach should protect the organization’s data as long as at least one of these constructions is secure. If implemented correctly, the protocols will become quantum-safe, and all it takes is a couple of extra kilobytes of data per communication session,” IBM cryptographer Dr. Vadim Lyubashevsky says.
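The tandem approach Lyubashevsky describes can be sketched as a simple key combiner: both a classical and a lattice-based key exchange run side by side, and their shared secrets are hashed together into one session key. This is a minimal illustration assuming the two secrets are already established; the function name, label string and placeholder values are hypothetical, and real deployments use a standardized key-derivation function:

```python
import hashlib

def combined_session_key(classical_secret: bytes, lattice_secret: bytes) -> bytes:
    """Derive one session key from two independently established shared secrets.

    An attacker must recover BOTH inputs to learn the output, so the session
    key stays safe as long as at least one of the two key exchanges is unbroken.
    """
    return hashlib.sha256(b"hybrid-kdf" + classical_secret + lattice_secret).digest()

# Placeholder secrets standing in for, e.g., an elliptic-curve output and a
# lattice-based output; in practice each comes from its own key exchange.
k_classical = b"\x01" * 32
k_lattice = b"\x02" * 32
session_key = combined_session_key(k_classical, k_lattice)
print(len(session_key))  # a 32-byte session key
```

The extra cost is exactly what the quote suggests: one additional key exchange’s worth of data per session, a few kilobytes for current lattice schemes.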
The National Institute of Standards and Technology is currently working on the development of a quantum-safe standard and will be hosting an event in August to review the current proposals, including the CRYSTALS-Kyber, CRYSTALS-Dilithium and Falcon algorithms, which are being co-developed by Lyubashevsky.
Even though quantum computers haven’t yet become everyday systems, people should be aware of the security implications they’ll bring with them, from the points of view of both classical and quantum computer users. Thankfully, IBM Research is already taking these issues seriously, knowing that data, on whatever computing platform it resides, is every organization’s most valuable—and sensitive—asset.
Jim Utsler, IBM Systems magazine senior writer, has been writing for IBM since the mid-1990s.