NIST finally settles on quantum-safe crypto standards

After years of review, the National Institute of Standards and Technology officially picked the world’s first three finalized post-quantum cryptography standards as the basis for its post-quantum security strategy: ML-KEM, ML-DSA, and SLH-DSA.

NIST first asked cryptographers to develop these new standards in 2016, when the threat of quantum computers started becoming a reality. Quantum computers are expected to be able to break common encryption algorithms used today, such as RSA.

Cryptographers eventually submitted 69 candidate algorithms, and in 2022 the agency selected four of them for standardization. The fourth algorithm, Falcon, was not included in this first set of finalized standards, but work on it continues. NIST is also continuing to identify and evaluate other algorithms.

Lattice-based cryptography

The three new algorithms are all based on asymmetric cryptography, in which the key used to encode a message is different from the key used to decode it. You keep the decoding key to yourself and publish the encoding key; anyone can then send you a secret message that only you can read.

This is called public key encryption and serves as the basis for basically all online communications, for securing websites, for financial transactions, and for key management systems and other specialized applications.

At the heart of today’s public key systems, such as RSA, is the idea that multiplying two large prime numbers together is easy, but factoring the resulting very large number back into those primes is extremely difficult.
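As a rough sketch of that asymmetry (using the third-party sympy library purely for illustration; this is not part of any NIST standard):

```python
from sympy import randprime, factorint

# Multiplying two large primes is fast...
p = randprime(10**100, 10**101)
q = randprime(10**100, 10**101)
n = p * q  # computed almost instantly

# ...but recovering p and q from n alone is the hard direction.
# factorint() works quickly on small numbers, yet at this size it would
# not finish in any practical amount of time on a classical computer.
print(factorint(15))  # {3: 1, 5: 1} -- trivial only because 15 is tiny
```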

The new lattice-based encryption methods rely on a different mathematical mechanism, one that isn’t just difficult for traditional computers, but for quantum computers as well.

It’s based on something called the knapsack problem, says Gregor Seiler, a cryptography researcher at IBM. You have a collection of very large numbers. Then you take some of these numbers and add them up. The total is another large number. Adding up numbers is very easy. But figuring out which numbers were used to add up to this total is very difficult.

“This is a very hard problem when the set is really big and the integers are really long,” says Seiler.
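A toy subset-sum sketch of that idea, with deliberately small parameters (a real scheme uses far larger sets and far longer numbers):

```python
import itertools
import random

# The public "knapsack": a set of large numbers.
numbers = random.sample(range(10**15, 10**16), 24)

# The secret: which numbers were chosen. Adding them up is easy.
chosen = random.sample(numbers, 10)
total = sum(chosen)

# Going backwards means searching subsets: 2^24 possibilities here,
# and astronomically many when the set is big and the integers are long.
def find_subset(nums, target):
    for r in range(len(nums) + 1):
        for subset in itertools.combinations(nums, r):
            if sum(subset) == target:
                return subset
    return None
```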

Lattice-based cryptography takes this idea and ramps up the difficulty. Instead of the knapsack being full of numbers, it’s now full of vectors. If you think of a single number as being a dot on a line, a vector is an arrow pointing to a dot floating in space. And instead of simply adding a selection of vectors together, you add up multiples of them, which makes working backwards even harder.
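A minimal sketch of the vector version, again purely illustrative rather than any actual standard:

```python
import random

DIM, COUNT = 4, 8

# The public set of vectors (arrows in 4-dimensional space here).
vectors = [[random.randrange(-999, 1000) for _ in range(DIM)] for _ in range(COUNT)]

# The secret: a small multiplier for each vector.
multipliers = [random.randrange(0, 6) for _ in range(COUNT)]

# Combining them (scale each vector, add everything up) is easy.
combination = [
    sum(m * vec[i] for m, vec in zip(multipliers, vectors))
    for i in range(DIM)
]

# Recovering the multipliers from `combination` and `vectors` alone is the
# hard lattice problem that the new standards build on.
print(combination)
```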

ML-KEM

This algorithm, originally known as CRYSTALS-Kyber, is a module-lattice-based key encapsulation mechanism developed by IBM researchers. It is intended for general encryption, such as accessing websites securely, because it is fast to use.
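For a feel of how key encapsulation is used, here is a sketch using the open-source liboqs-python bindings (the `oqs` module). The mechanism name shown ("ML-KEM-768") is an assumption that depends on the installed library version (older builds expose "Kyber768"), and this is not NIST reference code:

```python
import oqs

ALG = "ML-KEM-768"  # assumed mechanism name; varies by liboqs version

# The receiver generates a key pair and publishes the public key.
with oqs.KeyEncapsulation(ALG) as receiver:
    public_key = receiver.generate_keypair()

    # The sender encapsulates: it derives a shared secret plus a ciphertext
    # that only the holder of the matching private key can open.
    with oqs.KeyEncapsulation(ALG) as sender:
        ciphertext, secret_sender = sender.encap_secret(public_key)

    # The receiver decapsulates the ciphertext and obtains the same secret,
    # which both sides can then use as a symmetric key for the actual traffic.
    secret_receiver = receiver.decap_secret(ciphertext)

    assert secret_sender == secret_receiver
```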

ML-DSA

This algorithm, originally known as CRYSTALS-Dilithium, was also developed by IBM. It is the second-fastest of the three and is designed to be used for digital signatures.

According to Seiler, the trick to this algorithm is that decoding the message requires knowing all the multipliers of the vectors that had been added up.

Digital signatures are used to authenticate documents or software, “helping make sure that those aren’t modified or tampered with,” says Seiler. “Since they are used in sensitive industries such as healthcare, finance, and manufacturing but also by government agencies, there is a palpable urgency to migrate to quantum-safe digital signature methods.”
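A similarly hedged sketch of signing and verification with ML-DSA, again via liboqs-python; the mechanism string ("ML-DSA-65" here, "Dilithium3" in older builds) is an assumption about the installed version:

```python
import oqs

ALG = "ML-DSA-65"  # assumed mechanism name; check oqs.get_enabled_sig_mechanisms()
message = b"firmware-update-1.2.3"

# The signer keeps the private key and publishes the public key.
with oqs.Signature(ALG) as signer:
    public_key = signer.generate_keypair()
    signature = signer.sign(message)

# Anyone holding the public key can check that the message was not altered.
with oqs.Signature(ALG) as verifier:
    assert verifier.verify(message, signature, public_key)
```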

SLH-DSA

This is another digital signature standard, but it rests on more conservative security assumptions (hash functions rather than lattices) than the other two – at a cost. According to Seiler, depending on which variant is implemented, it either has a larger signature or requires more time to create the signature.
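One way to see that trade-off is to enumerate whichever SLH-DSA/SPHINCS+ variants the installed liboqs-python build exposes and compare signature sizes and signing times. The name filter below is an assumption, since mechanism naming differs across library versions:

```python
import time
import oqs

message = b"example document"

# Pick out the hash-based signature variants the local build supports.
variants = [
    name for name in oqs.get_enabled_sig_mechanisms()
    if "SLH-DSA" in name or "SPHINCS+" in name
]

for name in variants:
    with oqs.Signature(name) as signer:
        signer.generate_keypair()
        start = time.perf_counter()
        signature = signer.sign(message)
        elapsed = time.perf_counter() - start
        # "s" (small) variants produce shorter signatures but sign slowly;
        # "f" (fast) variants sign quickly but produce larger signatures.
        print(f"{name}: {len(signature)} bytes, {elapsed:.2f}s to sign")
```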

More standards are yet to come

These three standards aren’t the last we’re going to see when it comes to quantum-safe encryption, says Tom Patterson, emerging technology security lead at Accenture. “There’s going to be, for the next few years, a series of different algorithms that will be available and standardized,” he says.

There have been evolutions in cryptography standards before, but those typically involved simpler changes, such as switching to a longer key length.

The change to quantum-safe cryptography is going to be more complicated because the algorithms are very different, because there are multiple different algorithms that will be used for different use cases, and because the software supply chain is more complicated than ever before.

“This is the opening bell for most CISOs around the world,” says Patterson. “Now they know what algorithms they’re going to work with.”


Source: Network World