Last week, NIST announced some algorithms selected for standardization as part of their Post-Quantum Cryptography project. This is a good opportunity to recall the history of this process, observe its current state, and comment on the selected algorithms. It is important to remember that the process is *not* finished: round 4 has started, and should ultimately produce at least one more selected algorithm.

The PQC project started in late 2016 with a call for submissions. The ostensible motivation was the possible emergence of quantum computers, since such machines would be able to break existing asymmetric cryptographic algorithms based on number theory and related algebraic objects (RSA, elliptic curves…). Nobody really knows whether quantum computers will exist in the future; they combine impeccable theory with atrociously difficult technology, and are currently devouring huge research budgets while still being quite far from endangering even toy versions of common cryptographic algorithms. There are strong believers and strong disbelievers in practical quantum computing, but belief is not knowledge; the mere possibility is enough to warrant taking some precautions, in particular since the design and specification of a good cryptographic algorithm is known to be a lengthy process. Another good reason to investigate new classes of asymmetric algorithms, unrelated to quantum computing, is that we are currently relying on a relatively small set of mathematical “hard problems” that could potentially be weakened through some new insight by a mathematician, and that’s even less predictable than technological advances in trapping single atoms at ultra-low temperatures. Some variety in our algorithms would therefore be highly desirable.

NIST is adamant that the standardization project is not a competition, though it sure has some competitive flavour, with candidates, rounds and finalists. The call was for two algorithm categories: key encapsulation mechanisms (KEMs) and signatures, to be used in situations where we currently use, typically, Diffie-Hellman key exchange over some elliptic curve, and ECDSA or EdDSA, respectively. They received no fewer than 69 complete submissions! There followed the usual winnowing process, in which some of the weakest candidates were quickly broken or withdrawn; other candidates found that they were so similar to each other that they could be merged. NIST organized several “rounds”, each time selecting some algorithms for the next round and rejecting others. Their choice was informed by all the comments and research papers that flourished around the candidates, though there cannot ultimately be a completely rational and unimpeachably logical “best candidate”, since security relies on predictions about future discoveries in mathematics. We are, at best, in the “educated guess” area in these matters. NIST had to perform a delicate balancing act between the known results, an informal estimation of how well we understand the underlying mathematical objects, performance and secure implementation issues, and their own goal of achieving some extra diversity in the kind of problems upon which the algorithms rely. NIST wrote an extensive status report that details the retained and not retained algorithms, and their rationale.
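For readers less familiar with the KEM interface mentioned above, here is a toy sketch of its shape (key generation, encapsulation, decapsulation). To keep it self-contained it uses a classical "hashed ElGamal" construction over a small prime group; this is *not* post-quantum, the parameters are deliberately tiny, and it must not be used for anything real. It only illustrates the API contract that Kyber and the other KEM candidates implement: the sender derives a ciphertext and a shared secret from the public key, and the receiver recovers the same secret from the ciphertext.

```python
import hashlib
import secrets

# Toy hashed-ElGamal KEM. Illustrative only: classical security,
# far-too-small parameters. It shows the KEM interface shape, not
# any of the standardized algorithms.

P = 2**64 - 59  # small prime modulus, demo only
G = 5           # generator, demo only

def keygen():
    sk = secrets.randbelow(P - 2) + 1
    pk = pow(G, sk, P)
    return pk, sk

def encapsulate(pk):
    # Sender: pick an ephemeral exponent, output a ciphertext and
    # a hashed shared secret derived from the recipient's public key.
    r = secrets.randbelow(P - 2) + 1
    ct = pow(G, r, P)
    shared = hashlib.sha256(pow(pk, r, P).to_bytes(8, "big")).digest()
    return ct, shared

def decapsulate(ct, sk):
    # Receiver: recover the same shared secret from the ciphertext.
    return hashlib.sha256(pow(ct, sk, P).to_bytes(8, "big")).digest()

pk, sk = keygen()
ct, k1 = encapsulate(pk)
k2 = decapsulate(ct, sk)
assert k1 == k2  # both sides now hold the same 32-byte key
```

In a protocol such as TLS, the shared secret produced by the KEM plays the role that the Diffie-Hellman output plays today: it seeds the derivation of the symmetric session keys.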

Round 3 is now finished, and some algorithms were selected for standardization:

- The KEM algorithm CRYSTALS-Kyber
- The signature algorithms CRYSTALS-Dilithium, Falcon, and SPHINCS+

Having a single KEM algorithm does not fulfill the diversity goal of NIST; indeed, a “round 4” has started with four remaining KEM candidates: BIKE, Classic McEliece, HQC and SIKE. The declared intent is to select at least one of these at the end of round 4. On the signature side, by contrast, no other algorithm was kept for round 4, so we have to assume that NIST feels content with the three selected algorithms, or, more accurately, that they did not find the remaining candidates to offer a sufficient mix of security and performance. A footnote in the NIST status report (note 7, page 19) states that NIST intends to issue a new call for post-quantum signatures before the end of 2022.

**CRYSTALS-Kyber and CRYSTALS-Dilithium** are two facets of a common mathematical problem, which is the difficulty of finding small vectors in a given lattice. The algorithms use module lattices and can share some parts of their implementations. The CRYSTALS Web site offers a summary and pointers to the specification and to some implementations. NIST, very correctly, noticed that the two algorithms are based on strong science and a reasonably simple design, and allow easy implementation with good performance. A slight issue might be about intellectual property: footnote 6 in the report (page 18) ominously states that some agreements are currently being discussed with some owners of patents that may apply to Kyber, and if these agreements cannot reach a satisfying conclusion by the end of 2022 then NIST might replace Kyber with NTRU, another former candidate and also one of the first proposed lattice-based algorithms. NIST strongly intends that any standardized algorithm may be used and implemented freely.

**Falcon** is also a lattice-based algorithm, though with a slightly different kind of lattice. Disclaimer: I am part of the Falcon team (thus, I am technically one of the “winners” of the not-a-competition). Falcon uses an NTRU lattice, though in a somewhat convoluted way (see the Falcon Web site for details). Since it is lattice-based, it does not bring much diversity beyond Dilithium; NIST selected it for performance reasons: Falcon public keys and signatures are substantially shorter than Dilithium keys and signatures. For instance, Falcon offers public keys of size 897 bytes, and signatures of size 666 bytes, while Dilithium starts at 1312-byte keys and 2420-byte signatures. In the common situation of a TLS connection, the server sends its public key as part of a chain of X.509 certificates, and each certificate includes both a public key and a signature value; thus, the larger size of both values in Dilithium translates to more IP packets to send, which noticeably increases connection latency in experiments. This makes Falcon quite desirable in that kind of context. Unfortunately, while Falcon signature *verification* is relatively easy to implement, and fast, signature *generation* is a lot more complicated and very hard to implement securely. To my knowledge, apart from the Python demo implementation by Thomas Prest (who led the Falcon submission team), all existing implementations of Falcon are derivatives of the reference code, which I wrote with some considerable effort. Falcon was, by far, the most complicated cryptographic algorithm I have ever implemented; this was at least one order of magnitude harder than, say, anything related to elliptic curves. I also got it wrong the first time. NIST recommends Dilithium by default, reserving Falcon for situations where the shorter keys and signatures yield important benefits; I fully agree with NIST here.
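The size argument above is easy to make concrete. This back-of-the-envelope script uses the Falcon-512 and Dilithium2 sizes quoted in the text; the three-certificate chain (leaf, intermediate, root) is an illustrative assumption, and real certificates of course carry additional encoding overhead beyond the key and signature themselves.

```python
# Per-certificate cryptographic payload: one public key + one signature.
# Sizes in bytes, as quoted above for the base security level.
SIZES = {
    "Falcon-512": {"pubkey": 897,  "sig": 666},
    "Dilithium2": {"pubkey": 1312, "sig": 2420},
}

CHAIN_LENGTH = 3  # hypothetical leaf + intermediate + root chain

for name, s in SIZES.items():
    per_cert = s["pubkey"] + s["sig"]
    total = per_cert * CHAIN_LENGTH
    print(f"{name}: {per_cert} bytes per certificate, "
          f"{total} bytes for a {CHAIN_LENGTH}-certificate chain")
```

With these figures, each Dilithium certificate carries more than twice the cryptographic payload of a Falcon one, which is why the difference becomes visible in packet counts and handshake latency.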

**SPHINCS+** is a hash-based signature scheme. This is the conservative choice, whose security is completely unrelated to lattices, but instead relies on fairly basic properties of hash functions, so that we *feel* that we understand quite well why they work, and why they are not at risk of being broken in the near future (though, to be fair, we do not really know, mathematically speaking, whether secure hash functions can exist at all!). Like the other algorithms, SPHINCS+ has its own Web site. SPHINCS+ performance is not so good, as is usual with hash-based signature schemes: public keys are very small (32 bytes at the base security level), but signatures are quite large (at least 7856 bytes). It must be noted that SPHINCS+ is a stateless scheme; there are other standardized stateful hash-based schemes (e.g. XMSS and LMS) which offer somewhat smaller signatures, but require the signer to maintain some state that changes for each produced signature. In general, such hash-based schemes are adequate in situations such as an embedded system verifying a cryptographic signature on its firmware image whenever it boots up.
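To give a feel for why hash-based signatures rest on nothing more than hash-function properties, here is a minimal Lamport one-time signature, the classic building block of this family. SPHINCS+ is far more elaborate (it composes many such one-time keys into a stateless many-time scheme with much smaller keys), but the core idea is the same: the public key commits to hash preimages, and signing reveals one preimage per message bit. This sketch is illustrative only; each Lamport key must sign exactly one message.

```python
import hashlib
import secrets

H = lambda b: hashlib.sha256(b).digest()

def keygen():
    # 256 pairs of random secret preimages; the public key is their hashes.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32))
          for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def message_bits(msg):
    digest = H(msg)
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(sk, msg):
    # Reveal one preimage per bit of the message digest.
    # WARNING: a Lamport key is one-time; reusing it leaks secrets.
    return [sk[i][bit] for i, bit in enumerate(message_bits(msg))]

def verify(pk, msg, sig):
    return all(H(sig[i]) == pk[i][bit]
               for i, bit in enumerate(message_bits(msg)))

sk, pk = keygen()
sig = sign(sk, b"firmware image v1.2")
assert verify(pk, b"firmware image v1.2", sig)
assert not verify(pk, b"tampered image", sig)
```

Note the costs already visible here: the signature is 256 × 32 = 8192 bytes, in the same ballpark as a SPHINCS+ signature. This is the trade-off NIST accepted in exchange for security assumptions that are as conservative as they get.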

**What next?** The standardization process will continue. NIST will proceed to draft standards for the CRYSTALS algorithms, then for Falcon and SPHINCS+; there may be some cosmetic adjustments to the algorithms at that point. The standard-writing and approval steps are not faster than anything else in the whole process, so we should not expect formally published standards before at least a year from now. Non-lattice KEMs are still being investigated (three code-based schemes, and one using isogenies between supersingular elliptic curves). Outside of the PQC process, science still works and new proposals are made; e.g. the recently proposed BAT is a lattice-based KEM using a Falcon-like lattice, but without requiring the cumbersome floating-point computations, and offering smaller keys and ciphertexts than CRYSTALS-Kyber.