Workgroup:
PQUIP
Internet-Draft:
draft-ar-pquip-pqc-engineers-03
Published:
August 2023
Intended Status:
Informational
Expires:
11 February 2024
Authors:
A. Banerjee
Nokia
T. Reddy
Nokia
D. Schoinianakis
Nokia
T. Hollebeek
DigiCert

Post-Quantum Cryptography for Engineers

Abstract

The presence of a Cryptographically Relevant Quantum Computer (CRQC) would render state-of-the-art, public-key cryptography deployed today obsolete, since the assumptions about the intractability of the mathematical problems that offer confident levels of security today would no longer apply in the presence of a CRQC. This means there is a requirement to update protocols and infrastructure to use post-quantum algorithms, which are public-key algorithms designed to be secure against CRQCs as well as classical computers. These algorithms serve the same purposes as previous public-key algorithms, but their underlying intractable mathematical problems have been carefully chosen to be hard for CRQCs as well as classical computers. This document explains why engineers need to be aware of and understand post-quantum cryptography. It emphasizes the potential impact of CRQCs on current cryptographic systems and the need to transition to post-quantum algorithms to ensure long-term security. The most important thing to understand is that this transition is not like previous transitions from DES to AES or from SHA-1 to SHA-2: the properties of post-quantum algorithms differ significantly from those of classical algorithms, so a drop-in replacement is not possible.

About This Document

This note is to be removed before publishing as an RFC.

Status information for this document may be found at https://datatracker.ietf.org/doc/draft-ar-pquip-pqc/.

Discussion of this document takes place on the pquip Working Group mailing list (mailto:pqc@ietf.org), which is archived at https://mailarchive.ietf.org/arch/browse/pqc/. Subscribe at https://www.ietf.org/mailman/listinfo/pqc/.

Status of This Memo

This Internet-Draft is submitted in full conformance with the provisions of BCP 78 and BCP 79.

Internet-Drafts are working documents of the Internet Engineering Task Force (IETF). Note that other groups may also distribute working documents as Internet-Drafts. The list of current Internet-Drafts is at https://datatracker.ietf.org/drafts/current/.

Internet-Drafts are draft documents valid for a maximum of six months and may be updated, replaced, or obsoleted by other documents at any time. It is inappropriate to use Internet-Drafts as reference material or to cite them other than as "work in progress."

This Internet-Draft will expire on 11 February 2024.


1. Introduction

Quantum computing is no longer perceived as a conjecture of computational sciences and theoretical physics. Considerable research effort and enormous corporate and government funding are currently being invested in the development of practical quantum computing systems. For instance, Google's announcement of achieving quantum supremacy [Google], IBM's latest 433-qubit processor Osprey [IBM], and Nokia Bell Labs' work on topological qubits [Nokia] signify, among other outcomes, the accelerating efforts towards large-scale quantum computers. At the time of writing, Cryptographically Relevant Quantum Computers (CRQCs) that can break widely used public-key cryptographic algorithms are not yet available. However, research and development in quantum computing is ongoing, with the goal of building more powerful and scalable quantum computers. As quantum technology advances, future quantum computers could have a significant impact on current cryptographic systems. Forecasting the future is difficult, but the general consensus is that such computers might arrive some time in the 2030s, or might not arrive until 2050 or later.

Extensive research has produced several post-quantum cryptographic algorithms that offer the potential to ensure cryptography's survival in the quantum computing era. However, transitioning to a post-quantum infrastructure is not a straightforward task, and there are numerous challenges to overcome. It requires a combination of engineering efforts, proactive assessment and evaluation of available technologies, and a careful approach to product development. This document aims to provide general guidance to engineers who utilize public-key cryptography in their software. It covers topics such as selecting appropriate post-quantum cryptographic (PQC) algorithms, understanding the differences between PQC Key Encapsulation Mechanisms (KEMs) and traditional Diffie-Hellman style key exchange, and the expected key size and processing time differences between PQC algorithms and traditional ones. Additionally, it discusses the potential threat to symmetric cryptography from Cryptographically Relevant Quantum Computers (CRQCs). It is important to remember that asymmetric algorithms are largely used for secure communications between organizations that may not have previously interacted, so a significant amount of coordination between organizations, and within and between ecosystems, needs to be taken into account. Such transitions are some of the most complicated in the tech industry. It is also worth mentioning that the NSA recently released an article on Future Quantum-Resistant (QR) Algorithm Requirements for National Security Systems [CNSA2-0], based on the need to protect against deployments of CRQCs in the future.

It is crucial for the reader to understand that the term "PQC" in this document refers to asymmetric (public-key) cryptography, not to symmetric algorithms such as stream or block ciphers. This document does not cover topics such as when traditional algorithms might become vulnerable (for that, see documents such as [QC-DNS] and others). It also does not cover unrelated technologies like Quantum Key Distribution or Quantum Key Generation, which use quantum hardware to exploit quantum effects to protect communications and generate keys, respectively. Post-quantum cryptography is based on standard mathematics and software and can be run on any general-purpose computer.

Please note: This document does not go into the deep mathematics of the PQC algorithms, but rather provides an overview to engineers on the current threat landscape and the relevant algorithms designed to help prevent those threats.

2. Conventions and Definitions

The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD", "SHOULD NOT", "RECOMMENDED", "NOT RECOMMENDED", "MAY", and "OPTIONAL" in this document are to be interpreted as described in BCP 14 [RFC2119] [RFC8174] when, and only when, they appear in all capitals, as shown here.

3. Contributing to This Document

The guide was inspired by a thread in September 2022 on the pqc@ietf.org mailing list. The document is being collaborated on through a GitHub repository.

The editors actively encourage contributions to this document. Please consider writing a section on a topic that you think is missing. Short of that, writing a paragraph or two on an issue you found when writing code that uses PQC would make this document more useful to other coders. Opening issues that suggest new material is fine too, but relying on others to write the first draft of such material is much less likely to happen than if you take a stab at it yourself.

4. Traditional Cryptographic Primitives that Could Be Replaced by PQC

Any asymmetric cryptographic algorithm based on integer factorization, finite-field discrete logarithms, or elliptic-curve discrete logarithms will be vulnerable to attacks using Shor's algorithm on a sufficiently large general-purpose quantum computer, known as a CRQC. This document focuses on the principal functions such algorithms provide and that must be replaced: key establishment (key exchange or key encapsulation) and digital signatures.

5. Invariants of Post-Quantum Cryptography

In the context of PQC, symmetric-key cryptographic algorithms are generally not directly impacted by quantum computing advancements. Symmetric-key cryptography, such as block ciphers (e.g., AES) and message authentication mechanisms (e.g., HMAC-SHA2), relies on secret keys shared between the sender and receiver. HMAC is a specific construction that utilizes a cryptographic hash function (such as SHA-2) and a shared secret key to produce a message authentication code. CRQCs, in theory, do not offer substantial advantages in breaking symmetric-key algorithms compared to classical computers (see Section 7.1 for more details).
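For illustration, the following minimal Python sketch computes an HMAC-SHA-256 tag using only the standard library; the key and message are placeholder values chosen purely for the example.

   import hashlib
   import hmac

   shared_key = b"a 32-byte secret shared in advance"  # placeholder value
   message = b"attack at dawn"

   # HMAC mixes the shared secret key and the message through SHA-256.
   tag = hmac.new(shared_key, message, hashlib.sha256).hexdigest()
   print(tag)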

6. NIST PQC Algorithms

In 2016, the National Institute of Standards and Technology (NIST) started a process to solicit, evaluate, and standardize one or more quantum-resistant public-key cryptographic algorithms [NIST]. The first set of algorithms for standardization (https://csrc.nist.gov/publications/detail/nistir/8413/final) was selected in July 2022.

NIST also announced that it would open a fourth round to standardize an alternative KEM, together with a call for new candidate post-quantum signature algorithms.

These algorithms are not a drop-in replacement for classical asymmetric cryptographic algorithms. RSA [RSA] and ECC [RFC6090] can be used for both key encapsulation and signatures, while for post-quantum algorithms, a different algorithm is needed for each. When upgrading protocols, it is important to replace the existing use of classical algorithms with either a PQC key encapsulation method or a PQC signature method, depending on how RSA and/or ECC was previously being used.

6.1. NIST candidates selected for standardization

6.1.1. PQC Key Encapsulation Mechanisms (KEMs)

  • CRYSTALS-Kyber: a lattice-based key encapsulation mechanism (see Section 9.1).

6.1.2. PQC Signatures

  • CRYSTALS-Dilithium: a lattice-based digital signature scheme (see Section 9.1).
  • Falcon: a lattice-based digital signature scheme (see Section 9.1).
  • SPHINCS+: a stateless hash-based digital signature scheme (see Section 9.2).

6.2. Candidates advancing to the fourth-round for standardization at NIST

The fourth round of the NIST process focuses only on KEMs. The goal of that round is to select an alternative algorithm based on a different hard problem than Kyber's. The candidates still advancing for standardization are:

  • Classic McEliece: Based on the hardness of syndrome decoding of Goppa codes. Goppa codes are a class of error-correcting codes that can correct a certain number of errors in a transmitted message. The decoding problem involves recovering the original message from the received noisy codeword.
  • BIKE: Based on the hardness of syndrome decoding of QC-MDPC codes. Quasi-Cyclic Moderate Density Parity Check (QC-MDPC) codes are a class of error-correcting codes that leverage a bit-flipping technique to efficiently correct errors.
  • HQC: Based on the hardness of syndrome decoding of quasi-cyclic concatenated Reed-Muller Reed-Solomon (RMRS) codes in the Hamming metric. Reed-Muller (RM) codes are a class of block error-correcting codes used especially in wireless and deep-space communications. Reed-Solomon (RS) codes are a class of block error-correcting codes used to detect and correct multiple bit errors.
  • SIKE (Broken): Supersingular Isogeny Key Encapsulation (SIKE) is a specific realization of the SIDH (Supersingular Isogeny Diffie-Hellman) protocol. Recently, a mathematical attack based on a 1997 "glue-and-split" theorem of Ernst Kani was found against the underlying chosen starting curve and torsion information. In practical terms, this attack allows for the efficient recovery of the private key. NIST announced that SIKE was no longer under consideration, but the authors of SIKE asked for it to remain on the list so that people are aware that it is broken.

7. Threat of CRQCs on Cryptography

Post-quantum cryptography, or quantum-safe cryptography, refers to cryptographic algorithms that are secure against cryptographic attacks from both CRQCs and classical computers.

When considering the security risks associated with the ability of a quantum computer to attack traditional cryptography, it is important to distinguish between the impact on symmetric algorithms and on public-key ones. Dr. Peter Shor and Dr. Lov Grover developed two algorithms that changed the way the world thinks about security in the presence of a CRQC.

7.1. Symmetric cryptography

Grover's algorithm is a quantum search algorithm that provides a theoretical quadratic speedup for searching an unstructured database compared to classical algorithms. In theory, this means doubling the key sizes of the symmetric algorithms deployed today is enough to achieve quantum resistance. Grover's algorithm reduces the number of operations needed to break 128-bit symmetric cryptography to 2^{64} quantum operations, which might sound computationally feasible. However, while 2^{64} operations performed in parallel are feasible for modern classical computers, 2^{64} quantum operations performed serially in a quantum computer are not. Grover's algorithm is highly non-parallelizable: even if one deploys 2^c computational units in parallel to brute-force a key using Grover's algorithm, it will complete in time proportional to 2^{(128-c)/2}. Put simply, using 256 quantum computers only reduces the runtime to 1/16, 1024 quantum computers to 1/32, and so forth (see [NIST] and [Cloudflare]).
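The arithmetic behind these figures can be checked directly. The following Python snippet evaluates only the 2^{(128-c)/2} cost model quoted above; it is not a simulation of Grover's algorithm:

   # Time exponent for Grover's search over a 128-bit keyspace
   # when 2**c machines run in parallel, per the model above.
   def grover_exponent(c, key_bits=128):
       return (key_bits - c) / 2

   serial = grover_exponent(0)  # a single machine: ~2**64 serial steps
   for machines, c in ((256, 8), (1024, 10)):
       speedup = 2 ** (serial - grover_exponent(c))
       print(f"{machines} machines: runtime reduced to 1/{speedup:.0f}")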

For unstructured data such as symmetrically encrypted data or cryptographic hashes, although a CRQC can search for specific solutions across all possible input combinations (e.g., via Grover's algorithm), no quantum algorithm is known to break the security properties of these classes of algorithms.

How can someone be sure that an improved algorithm won’t outperform Grover's algorithm at some point in time? Christof Zalka has shown that Grover's algorithm (and in particular its non-parallel nature) achieves the best possible complexity for unstructured search [Grover-search].

Finally, in their evaluation criteria for PQC, NIST is considering a security level equivalent to that of AES-128, meaning that NIST has confidence in standardizing parameters for PQC that offer similar levels of security as AES-128 does [NIST]​. As a result, 128-bit algorithms should be considered quantum-safe for many years to come.

7.2. Asymmetric cryptography

Shor's algorithm, on the other hand, efficiently solves the integer factorization problem (and the related discrete logarithm problem), which are the foundations of the public-key cryptography the world uses today. This implies that, if a CRQC is developed, today's public-key cryptographic algorithms (e.g., RSA, Diffie-Hellman, and Elliptic Curve Cryptography) and the protocols that rely on them would need to be replaced by algorithms and protocols that offer cryptanalytic resistance against CRQCs. Note that Shor's algorithm cannot run on a classical computer; it needs a CRQC.

For example, to provide some context, one would need 20 million noisy qubits to break RSA-2048 in 8 hours [RSA8HRS] or 4099 stable qubits to break it in 10 seconds [RSA10SC].

For structured data such as public keys and signatures, by contrast, a CRQC can fully solve the underlying hard problems used in classical cryptography (see Shor's algorithm above). Because increasing the size of the key pair would not provide a secure solution in this case, a complete replacement of the algorithm is needed. Therefore, post-quantum public-key cryptography must rely on problems that are different from the ones used in classical public-key cryptography (i.e., the integer factorization problem, the finite-field discrete logarithm problem, and the elliptic-curve discrete logarithm problem).

8. Timeline for transition

A malicious actor with adequate resources can launch an attack today that stores sensitive encrypted data, to be decrypted once a CRQC is available. This implies that, every day that quantum-safe strategies are not implemented, sensitive encrypted data remains susceptible to being harvested now and deciphered in the future.

+------------------------+----------------------------+
|                        |                            |
|         y              |           x                |
+------------------------+----------+-----------------+
|                                   | <--------------->
|               z                   |   Security gap
+-----------------------------------+

Figure 1: Mosca model

These challenges are illustrated nicely by the so-called Mosca model discussed in [Threat-Report]. In Figure 1, "x" denotes the time that our systems and data need to remain secure, "y" the number of years needed to migrate to a PQC infrastructure, and "z" the time until a CRQC that can break current cryptography is available. The model assumes that encrypted data can be intercepted and stored before the migration is completed in "y" years. This data remains vulnerable for the complete "x" years of its lifetime, thus the sum "x+y" gives us an estimate of the full timeframe for which data remains insecure. Whenever "x+y" exceeds "z", harvested data will still require protection after a CRQC exists: for example, if data must stay confidential for x = 10 years, migration takes y = 5 years, and a CRQC arrives in z = 12 years, there is a 3-year security gap. The model essentially asks how we are preparing our IT systems during those "y" years (or, in other words, how we can minimize those "y" years) to minimize the transition phase to a PQC infrastructure and hence minimize the risks of data being exposed in the future.

Finally, other factors that could accelerate the arrival of a CRQC should not be underestimated, such as faster-than-expected advances in quantum computing or more efficient versions of Shor's algorithm that require fewer qubits. As an example, IBM, one of the leading actors in the development of a large-scale quantum computer, has published a roadmap committing to new quantum processors supporting more than 1000 qubits by 2025 and networked systems with 10,000 to 100,000 qubits beyond 2026 [IBMRoadmap]. Innovation often comes in waves, so it is to the industry's benefit to remain vigilant and prepare as early as possible.

9. Post-quantum cryptography categories

The problems currently used in post-quantum cryptography can be grouped into three categories: lattice-based, hash-based, and code-based.

9.1. Lattice-Based Public-Key Cryptography

Lattice-based public-key cryptography leverages the simple construction of lattices (i.e., regularly spaced collections of points in a Euclidean space) to build problems that are hard to solve, such as the Shortest Vector Problem, the Closest Vector Problem, Learning with Errors, and Learning with Rounding. All of these problems have good proofs of worst-case to average-case reduction, equating the hardness of the average case to that of the worst case.

The possibility of implementing public-key schemes on lattices is tied to the characteristics of the basis used for the lattice. In particular, solving any of the mentioned problems can be easy when using a reduced or "good" basis (i.e., vectors as short and as orthogonal as possible), while it becomes computationally infeasible when using a "bad" basis (i.e., long, non-orthogonal vectors). Although the problem might seem trivial, it is computationally hard in many dimensions. Therefore, a typical approach is to use a "bad" basis for the public key and a "good" basis for the private key. The public key (the "bad" basis) lets you easily verify signatures by checking, for example, that a vector is the closest or smallest, but does not let you solve the problem (i.e., find the vector). Conversely, the private key (the "good" basis) can be used to generate signatures (e.g., to find the specific vector). Signing is equivalent to solving the lattice problem.
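The good-basis/bad-basis asymmetry can be observed even in two dimensions. The following sketch (assuming numpy is available; all values are made-up toy parameters, nowhere near cryptographic sizes) runs Babai's round-off algorithm against the same lattice described once by a short, nearly orthogonal basis and once by a long, skewed one:

   import numpy as np

   good = np.array([[2.0, 1.0], [1.0, 2.0]])  # short, nearly orthogonal rows
   U = np.array([[7.0, 4.0], [5.0, 3.0]])     # unimodular (det = 1)
   bad = U @ good                             # long, skewed basis, same lattice

   def babai_round(basis, t):
       # Babai's round-off: write t in basis coordinates, round, map back.
       z = np.round(t @ np.linalg.inv(basis))
       return z @ basis

   target = np.array([13.3, 7.8])
   for name, basis in (("good", good), ("bad", bad)):
       v = babai_round(basis, target)
       print(name, v, round(float(np.linalg.norm(target - v)), 2))

The good basis lands on a lattice point within distance 0.36 of the target; the bad basis, describing the exact same lattice, misses by more than 4.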

Lattice-based schemes usually offer good performance and moderately sized public keys and signatures, making them good candidates for general-purpose use such as replacing the use of RSA in PKIX certificates.

Examples of such class of algorithms include Kyber, Falcon and Dilithium.

It is noteworthy that lattice-based encryption schemes are prone to decryption failures, meaning that valid encryptions can be decrypted incorrectly; an attacker could significantly reduce the security of lattice-based schemes that have a relatively high failure rate. However, for most of the NIST post-quantum proposals, the number of required oracle queries is above practical limits, as shown in [LattFail1]. More recent works have improved upon the results in [LattFail1], showing that the cost of searching for additional failing ciphertexts, after one or more have already been found, can be sped up dramatically [LattFail2]. Nevertheless, at the time of writing (July 2023), the NIST PQC candidates are considered secure against these attacks, and we suggest constant monitoring as cryptanalysis research is ongoing.

9.2. Hash-Based Public-Key Cryptography

Hash-based public-key cryptography dates back to the 1970s, when Lamport and Merkle developed digital signature schemes whose security is mathematically based only on the security of the selected cryptographic hash function. Many variants of hash-based signatures have been developed since, including the recent XMSS [RFC8391], HSS/LMS [RFC8554], and BPQS schemes. Unlike most other digital signature techniques, most hash-based signature schemes are stateful: signing necessitates updating the secret key.
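To make the statefulness concrete, here is a minimal sketch of a Lamport one-time signature, the 1970s building block underlying this family (it is not XMSS or LMS themselves). Signing a second message with the same key would reveal additional secret-key halves and destroy security, which is exactly why these schemes must track key state:

   import hashlib
   import os

   def keygen():
       # Two random 32-byte secrets per digest bit; the public key is their hashes.
       sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
       pk = [(hashlib.sha256(a).digest(), hashlib.sha256(b).digest()) for a, b in sk]
       return pk, sk

   def digest_bits(message):
       d = hashlib.sha256(message).digest()
       return [(d[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

   def sign(sk, message):
       # Reveals one secret per bit: the key must never sign a second message.
       return [sk[i][b] for i, b in enumerate(digest_bits(message))]

   def verify(pk, message, sig):
       return all(hashlib.sha256(s).digest() == pk[i][b]
                  for i, (s, b) in enumerate(zip(sig, digest_bits(message))))

   pk, sk = keygen()
   sig = sign(sk, b"hello")
   assert verify(pk, b"hello", sig) and not verify(pk, b"goodbye", sig)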

SPHINCS, on the other hand, leverages the HORS (Hash to Obtain Random Subset) technique and is a stateless hash-based signature scheme.

SPHINCS+ is an advancement on SPHINCS that reduces signature sizes, making it more compact. SPHINCS+ was recently selected for standardization by NIST.

9.3. Code-Based Public-Key Cryptography

This area of cryptography emerged in the 1970s and 1980s from the seminal work of McEliece and Niederreiter and focuses on cryptosystems based on error-correcting codes. Popular error-correcting codes include Goppa codes (used in the McEliece cryptosystem) and the quasi-cyclic codes whose syndrome encoding and decoding underlie Hamming Quasi-Cyclic (HQC) and Quasi-Cyclic Moderate Density Parity Check (QC-MDPC) schemes.

Examples include all the unbroken NIST Round 4 candidates: Classic McEliece, HQC, and BIKE.

10. KEMs

10.1. What is a KEM

A Key Encapsulation Mechanism (KEM) is a cryptographic technique used for securely establishing a symmetric key between two parties over an insecure channel. It is commonly used in hybrid encryption schemes, where a combination of asymmetric (public-key) and symmetric encryption is employed. The KEM encapsulation results in a fixed-length symmetric secret that can be used in one of two ways: (1) to derive a Data Encryption Key (DEK) that encrypts the data, or (2) to derive a Key Encryption Key (KEK) used to wrap the DEK.
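As a minimal sketch of uses (1) and (2), the following Python derives a DEK and a KEK from a KEM shared secret with HKDF (RFC 5869, implemented inline using only the standard library); the shared secret and the info labels are placeholder values:

   import hashlib
   import hmac

   def hkdf(ikm, info, length=32, salt=b"\x00" * 32):
       # HKDF (RFC 5869): extract, then expand, with SHA-256.
       prk = hmac.new(salt, ikm, hashlib.sha256).digest()
       okm, block, counter = b"", b"", 1
       while len(okm) < length:
           block = hmac.new(prk, block + info + bytes([counter]),
                            hashlib.sha256).digest()
           okm += block
           counter += 1
       return okm[:length]

   ss = b"\x01" * 32               # stand-in for the KEM shared secret
   dek = hkdf(ss, b"example DEK")  # (1) key that encrypts the data
   kek = hkdf(ss, b"example KEK")  # (2) key that wraps (encrypts) the DEK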

KEM relies on the following primitives [PQCAPI]:

  • def kemKeyGen() -> (pk, sk)
  • def kemEncaps(pk) -> (ct, ss)
  • def kemDecaps(ct, sk) -> ss

where pk is the public key, sk is the secret key, ct is the ciphertext representing an encapsulated key, and ss is the shared secret. The following figure illustrates a sample KEM flow:

Client                                Server

sk, pk = kemKeyGen()
                  pk
        ---------------------->
                                      ct, ss = kemEncaps(pk)
                  ct
        <----------------------
ss = kemDecaps(ct, sk)
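To make the three primitives concrete, here is a toy, deliberately insecure KEM in Python built from classical finite-field Diffie-Hellman; it is NOT post-quantum, and the parameters are made-up toy values. Only the call shapes and the flow above are the point; a real deployment would use a standardized KEM such as Kyber:

   import hashlib
   import secrets

   P = 2**127 - 1  # toy prime modulus (far too small for real use)
   G = 5           # toy group element

   def kemKeyGen():
       sk = secrets.randbelow(P - 2) + 1
       pk = pow(G, sk, P)
       return pk, sk

   def kemEncaps(pk):
       r = secrets.randbelow(P - 2) + 1
       ct = pow(G, r, P)  # the encapsulated key sent on the wire
       ss = hashlib.sha256(pow(pk, r, P).to_bytes(16, "big")).digest()
       return ct, ss

   def kemDecaps(ct, sk):
       return hashlib.sha256(pow(ct, sk, P).to_bytes(16, "big")).digest()

   pk, sk = kemKeyGen()           # Client
   ct, ss_server = kemEncaps(pk)  # Server
   ss_client = kemDecaps(ct, sk)  # Client
   assert ss_client == ss_server  # both ends now hold the same shared secret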

10.1.1. Interactivity in PQC KEM and Diffie-Hellman (DH) Key Exchange

PQC KEMs are interactive in nature: establishing the shared secret involves a back-and-forth exchange, unlike Diffie-Hellman (DH) key exchange (KEX), which provides the non-interactive key exchange (NIKE) property. NIKE is a cryptographic primitive that enables two parties who know each other's public keys to agree on a symmetric shared key without requiring any interaction. The following figure illustrates a sample DH flow:

Client                                Server

sk1, pk1 = KeyGen()
                  pk1
        ---------------------->
                                      sk2, pk2 = KeyGen()
                                      ss = Combine(pk1, sk2)
                  pk2
        <----------------------
ss = Combine(pk2, sk1)
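The NIKE property is easy to see in code: once both public keys are published (for example, in a directory), each side computes the same secret locally, with no messages exchanged. A sketch reusing the toy, insecure finite-field parameters from the KEM example above:

   import secrets

   P, G = 2**127 - 1, 5  # same toy, insecure parameters as above

   sk1 = secrets.randbelow(P - 2) + 1
   pk1 = pow(G, sk1, P)  # published once, e.g., in a directory
   sk2 = secrets.randbelow(P - 2) + 1
   pk2 = pow(G, sk2, P)

   # Combine(pk_other, sk_own): no round trip is needed once keys are known.
   assert pow(pk2, sk1, P) == pow(pk1, sk2, P)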

10.2. HPKE

HPKE (Hybrid Public Key Encryption) [RFC9180] is a KEM-based scheme that provides public-key encryption of arbitrary-sized plaintexts for a recipient public key. It works as a combination of KEMs, KDFs, and AEAD (Authenticated Encryption with Additional Data) schemes. HPKE includes three authenticated variants: one that authenticates possession of a pre-shared key and two optional ones that authenticate possession of a KEM private key. Kyber, being a KEM, does not support the static-ephemeral key exchange that gives HPKE, when instantiated with DH-based KEMs, its (optional) authenticated modes, as discussed in Section 1.2 of [I-D.westerbaan-cfrg-hpke-xyber768d00-02].

10.3. Security property

  • IND-CCA2: IND-CCA2 (INDistinguishability under adaptive Chosen-Ciphertext Attack) is an advanced security notion for encryption schemes. It ensures the confidentiality of the plaintext, resistance against chosen-ciphertext attacks, and prevents the adversary from forging new ciphertexts. An appropriate definition of IND-CCA2 security for KEMs can be found in [CS01] and [BHK09]. Kyber, Classic McEliece, and Saber provide IND-CCA2 security.
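The notion is easiest to grasp as a game. Below is a schematic Python sketch of the IND-CCA2 experiment; the keygen, encrypt, decrypt, and adversary callables are hypothetical stand-ins rather than a real scheme. A scheme is IND-CCA2 secure if no efficient adversary wins noticeably more than half the time:

   import secrets

   def ind_cca2_experiment(keygen, encrypt, decrypt, adversary):
       pk, sk = keygen()
       b = secrets.randbits(1)  # the challenger's hidden bit
       state = {"challenge": None}

       def dec_oracle(ct):
           if ct == state["challenge"]:
               return None  # decrypting the challenge itself is disallowed
           return decrypt(sk, ct)

       def challenge(m0, m1):
           state["challenge"] = encrypt(pk, (m0, m1)[b])
           return state["challenge"]

       guess = adversary(pk, dec_oracle, challenge)
       return guess == b  # adversary "wins" the game if it guesses b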

Understanding IND-CCA2 security is essential for individuals involved in designing or implementing cryptographic systems to evaluate the strength of the algorithm, assess its suitability for specific use cases, and ensure that data confidentiality and security requirements are met.

11. PQC Signatures

11.1. What is a Post-quantum Signature

Any digital signature scheme whose construction is designed to remain secure in the post-quantum setting falls under this category of PQ signatures.

11.2. Security property

  • EUF-CMA: EUF-CMA (Existential Unforgeability under Chosen Message Attack) [GMR88] is a security notion for digital signature schemes. It guarantees that an adversary, even with access to a signing oracle, cannot forge a valid signature for an arbitrary message. EUF-CMA provides strong protection against forgery attacks, ensuring the integrity and authenticity of digital signatures by preventing unauthorized modifications or fraudulent signatures. Dilithium, Falcon, and SPHINCS+ provide EUF-CMA security.
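As with IND-CCA2 above, the notion can be sketched as a game, again with hypothetical stand-in callables: the adversary may ask a signing oracle to sign any messages it chooses, and wins only by producing a valid signature on a message it never queried:

   def euf_cma_experiment(keygen, sign, verify, adversary):
       pk, sk = keygen()
       queried = set()

       def sign_oracle(msg):
           queried.add(msg)
           return sign(sk, msg)

       msg, sig = adversary(pk, sign_oracle)
       # A forgery counts only for a message never submitted to the oracle.
       return verify(pk, msg, sig) and msg not in queried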

Understanding EUF-CMA security is essential for individuals involved in designing or implementing cryptographic systems to ensure the security, reliability, and trustworthiness of digital signature schemes. It allows for informed decision-making, vulnerability analysis, compliance with standards, and designing systems that provide strong protection against forgery attacks.

11.3. Details of FALCON, Dilithium, and SPHINCS+

Dilithium [Dilithium] is a digital signature algorithm (part of the CRYSTALS suite) based on the hardness of lattice problems over module lattices (i.e., the Module Learning with Errors (MLWE) problem). The design of the algorithm is based on the "Fiat-Shamir with aborts" method, which leverages rejection sampling to render lattice-based Fiat-Shamir schemes compact and secure. Additionally, Dilithium offers both deterministic and randomized signing. The security properties of Dilithium are discussed in Section 9 of [I-D.ietf-lamps-dilithium-certificates].

Falcon [Falcon] is based on the GPV hash-and-sign lattice-based signature framework introduced by Gentry, Peikert, and Vaikuntanathan, which requires a class of lattices and a trapdoor sampler technique.

The main design principle of Falcon is compactness, i.e., it was designed to minimize the total memory bandwidth requirement (the sum of the signature size and the public key size). This is possible due to the compactness of NTRU lattices. Falcon also offers very efficient signing and verification procedures. The main potential downsides of Falcon are the non-triviality of its algorithms and its need for floating-point arithmetic support.

Access to a robust floating-point stack is essential in Falcon for the accurate, efficient, and secure execution of the mathematical computations involved in the scheme. It helps maintain precision, supports error-correction techniques, and contributes to the overall reliability and performance of Falcon's cryptographic operations, as well as making it more resistant to side-channel attacks.

Falcon's signing operations require constant-time, 64-bit floating-point operations to avoid catastrophic side-channel vulnerabilities. Doing this correctly (which is also platform-dependent to an extreme degree) is very difficult, as NIST's report noted. Providing a masked implementation of Falcon also seems impossible, per the authors at the RWPQC 2023 symposium.

The performance characteristics of Dilithium and Falcon may differ based on the specific implementation and hardware platform. Generally, Dilithium is known for its relatively fast signature generation, while Falcon can provide more efficient signature verification. The choice may depend on whether the application requires more frequent signature generation or signature verification. For further clarity, please refer to the tables in sections Section 12 and Section 13.

SPHINCS+ [SPHINCS] utilizes the concept of stateless hash-based signatures, where each signature is unique and unrelated to any previous signature (as discussed in Section 9.2). This property eliminates the need to maintain state information during the signing process. SPHINCS+ was designed to sign up to 2^64 messages and offers three security levels, with parameters chosen to provide 128, 192, and 256 bits of security. SPHINCS+ offers smaller key sizes, larger signature sizes, slower signature generation, and slower verification when compared to Dilithium and Falcon. SPHINCS+ does not introduce a new intractability assumption: it builds upon established foundations in cryptography, making it a reliable and robust digital signature scheme for a post-quantum world. The advantages and disadvantages of SPHINCS+ over other signature algorithms are discussed in Section 3.1 of [I-D.draft-ietf-cose-sphincs-plus].

11.4. Details of XMSS and LMS

The eXtended Merkle Signature Scheme (XMSS) [RFC8391] and Leighton-Micali Signature (LMS) [RFC8554] are stateful hash-based signature schemes, where the secret key changes over time. In both schemes, reusing a secret key state compromises cryptographic security guarantees.

Multi-tree XMSS and LMS can be used to sign a potentially large but fixed number of messages; the number of signing operations depends upon the size of the tree. XMSS and LMS provide cryptographic digital signatures without relying on the conjectured hardness of mathematical problems, instead leveraging the properties of cryptographic hash functions. XMSS and the Hierarchical Signature System (HSS) use a hierarchical approach with a Merkle tree at each level of the hierarchy. [RFC8391] describes both single-tree and multi-tree variants of XMSS, while [RFC8554] describes the Leighton-Micali One-Time Signature (LM-OTS) system as well as the LMS and HSS N-time signature systems. A comparison of XMSS and LMS is given in Section 10 of [RFC8554].

The number of tree layers in XMSS^MT provides a trade-off between signature size on the one hand and key generation and signing speed on the other. Increasing the number of layers reduces key generation time exponentially and signing time linearly, at the cost of increasing the signature size linearly.
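A rough feel for this trade-off can be obtained from a toy cost model reflecting the scaling just described (assumed model: key generation costs about d * 2^(h/d) one-time-signature leaf computations for total tree height h and d layers, and the signature grows by a fixed per-layer overhead; the constants are made up):

   h = 60                      # total tree height: up to 2**60 signatures
   per_layer_sig_bytes = 1000  # made-up per-layer signature overhead

   for d in (1, 2, 3, 4, 6):   # d = number of tree layers (must divide h)
       keygen_leaves = d * 2 ** (h // d)
       sig_size = d * per_layer_sig_bytes
       print(f"d={d}: keygen ~ {keygen_leaves:.2e} leaves, "
             f"signature ~ {sig_size} bytes")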

XMSS and LMS can be applied in various scenarios where digital signatures are required, such as software updates.

11.5. Hash-then-Sign Versus Sign-then-Hash

Within the hash-then-sign paradigm, the message is hashed before signing it. This provides an additional layer of security by ensuring that only a fixed-size digest of the message is signed, rather than the entire message itself. With pre-hashing, resistance to existential forgeries relies heavily on the collision resistance of the hash function in use. Besides this security goal, the hash-then-sign paradigm can also improve performance by reducing the size of signed messages. As a corollary, hashing remains mandatory even for short messages, which places an additional computational requirement on the verifier; this makes the performance of hash-then-sign schemes more consistent, but not necessarily more efficient. Using a hash function to produce a fixed-size digest of a message also ensures that the signature is compatible with a wide range of systems and protocols, regardless of the specific message size or format. Hash-then-sign also greatly reduces the amount of data that needs to be processed by a hardware security module (HSM), which sometimes has somewhat limited data-processing capabilities.
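A minimal sketch of the paradigm, with the signing primitive left abstract (sign_fixed is a hypothetical stand-in for, e.g., an HSM call that only accepts fixed-size inputs):

   import hashlib

   def hash_then_sign(message, sign_fixed):
       # Only the 32-byte digest crosses the signer boundary,
       # no matter how large the message is.
       digest = hashlib.sha256(message).digest()
       return sign_fixed(digest)

   # Toy usage: a fake signer that merely tags the digest (NOT a real signature).
   sig = hash_then_sign(b"x" * 10_000_000, lambda d: b"SIG" + d)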

Protocols like TLS 1.3 and DNSSEC use the hash-then-sign paradigm. TLS 1.3 [RFC8446] uses it in the CertificateVerify message to prove that the endpoint possesses the private key corresponding to its certificate, while DNSSEC [RFC4033] uses it to provide origin authentication and integrity assurance for DNS data.

In the case of Dilithium, the necessary hash operations are incorporated internally as part of its signing algorithm: Dilithium takes the original message, applies a hash function internally, and uses the resulting hash value in the signature generation process. SPHINCS+ internally performs randomized message compression using a keyed hash function that can process arbitrary-length messages. Falcon likewise uses a hash function as part of the signature process: it uses the SHAKE-256 extendable-output function to derive a digest of the message being signed. Therefore, an external hash-then-sign step is not needed for Dilithium, SPHINCS+, or Falcon.

12. Recommendations for Security / Performance Tradeoffs

The table below shows the five security levels defined by NIST for PQC algorithms; users can select an algorithm at the security level their use case requires. Security is defined as a function of the resources required to break AES and SHA-2/SHA-3, i.e., exhaustive key recovery for AES and optimal collision search for SHA-2/SHA-3.

Table 1
PQ Security Level AES/SHA(2/3) hardness PQC Algorithm
1 At least as hard to break as AES-128 (exhaustive key recovery) Kyber512, Falcon512, Sphincs+SHA-256 128f/s
2 At least as hard to break as SHA-256/SHA3-256 (collision search) Dilithium2
3 At least as hard to break as AES-192 (exhaustive key recovery) Kyber768, Dilithium3, Sphincs+SHA-256 192f/s
4 At least as hard to break as SHA-384/SHA3-384 (collision search) No algorithm tested at this level
5 At least as hard to break as AES-256 (exhaustive key recovery) Kyber1024, Falcon1024, Dilithium5, Sphincs+SHA-256 256f/s

Please note that Sphincs+SHA-256 "x"f/s in the above table denotes whether it is the SPHINCS+ fast (f) version or the small (s) version at the "x"-bit AES security level. Refer to [I-D.ietf-lamps-cms-sphincs-plus-02] for further details on SPHINCS+ algorithms.

The following table shows the signature size differences for the "simple" version of SPHINCS+ at similar security levels, for the two categories: (f) optimized for fast signing and (s) optimized for compactness/smaller signatures. Both the SHA-256 and SHAKE-256 parametrisations output the same signature sizes, so both have been included.

Table 2
PQ Security Level Algorithm Public key size (in bytes) Private key size (in bytes) Signature size (in bytes)
1 SPHINCS+-{SHA2,SHAKE}-128f 32 64 17088
1 SPHINCS+-{SHA2,SHAKE}-128s 32 64 7856
3 SPHINCS+-{SHA2,SHAKE}-192f 48 96 35664
3 SPHINCS+-{SHA2,SHAKE}-192s 48 96 16224
5 SPHINCS+-{SHA2,SHAKE}-256f 64 128 49856
5 SPHINCS+-{SHA2,SHAKE}-256s 64 128 29792

The following table shows the impact of security level on performance-relevant sizes: private key sizes, public key sizes, and ciphertext/signature sizes.

Table 3
PQ Security Level Algorithm Public key size (in bytes) Private key size (in bytes) Ciphertext/Signature size (in bytes)
1 Kyber512 800 1632 768
1 Falcon512 897 1281 666
2 Dilithium2 1312 2528 2420
3 Kyber768 1184 2400 1088
5 Falcon1024 1793 2305 1280
5 Kyber1024 1568 3168 1568

13. Comparing PQC KEMs/Signatures vs Traditional KEMs (KEXs)/Signatures

In this section, we provide two tables comparing different KEMs and signatures, respectively, in the traditional and post-quantum settings. These tables focus on the secret key sizes, public key sizes, and ciphertext/signature sizes of the PQC algorithms and their traditional counterparts at similar security levels.

The first table compares traditional vs. PQC KEMs in terms of security level, public and private key sizes, and ciphertext sizes.

Table 4
PQ Security Level Algorithm Public key size (in bytes) Private key size (in bytes) Ciphertext size (in bytes)
Traditional P256_HKDF_SHA-256 65 32 65
Traditional P521_HKDF_SHA-512 133 66 133
Traditional X25519_HKDF_SHA-256 32 32 32
1 Kyber512 800 1632 768
3 Kyber768 1184 2400 1088
5 Kyber1024 1568 3168 1568

The next table compares traditional vs. PQC signature schemes in terms of security level, public and private key sizes, and signature sizes.

Table 5
PQ Security Level Algorithm Public key size (in bytes) Private key size (in bytes) Signature size (in bytes)
Traditional RSA2048 256 256 256
Traditional P256 64 32 64
1 Falcon512 897 1281 666
2 Dilithium2 1312 2528 2420
3 Dilithium3 1952 4000 3293
5 Falcon1024 1793 2305 1280

As one can clearly observe from the above tables, a PQC KEM or signature significantly increases the key sizes and the ciphertext/signature sizes compared to traditional KEM (KEX) and signature schemes. But the PQC algorithms do provide security in the case of an attack from a CRQC, whereas schemes based on integer factorization or discrete logarithm problems (finite-field or elliptic-curve) would provide no security at all against such attacks.

14. Post-Quantum and Traditional Hybrid Schemes

The migration to PQC is unique in the history of modern digital cryptography in that neither the traditional algorithms nor the post-quantum algorithms are fully trusted to protect data for the required lifetimes. The traditional algorithms, such as RSA and elliptic curve, will fall to quantum cryptanalysis, while the post-quantum algorithms face uncertainty about the underlying mathematics, compliance issues, unknown vulnerabilities, and hardware and software implementations that have not had sufficient maturing time to rule out classical cryptanalytic attacks and implementation bugs.

During the transition from traditional to post-quantum algorithms, there may be a desire or a requirement for protocols that use both algorithm types. [I-D.ietf-pquip-pqt-hybrid-terminology] defines the terminology for the Post-Quantum and Traditional Hybrid Schemes.

14.1. PQ/T Hybrid Confidentiality

The PQ/T Hybrid Confidentiality property can be used to protect from a "Harvest Now, Decrypt Later" attack, which refers to an attacker collecting encrypted data now and waiting for quantum computers to become powerful enough to break the encryption later. Two types of hybrid key agreement schemes are discussed below:

  1. Concatenate hybrid key agreement scheme: The final shared secret, which will be used as an input to the key derivation function, is the concatenation of the secrets established by each key agreement scheme. For example, in [I-D.ietf-tls-hybrid-design], the client uses the TLS supported groups extension to advertise support for a PQ/T hybrid scheme, and the server can select this group if it supports the scheme. The hybrid-aware client and server establish a hybrid secret by concatenating the two shared secrets, which is used as the shared secret in the existing TLS 1.3 key schedule (see the sketch after this list).
  2. Cascade hybrid key agreement scheme: The final shared secret is computed by applying as many iterations of the key derivation function as there are key agreement schemes composing the hybrid key agreement scheme. For example, [RFC9370] extends the Internet Key Exchange Protocol Version 2 (IKEv2) to allow one or more PQC algorithms in addition to the traditional algorithm to derive the final IKE SA keys, using the cascade method as explained in Section 2.2.2 of [RFC9370] (also sketched below).
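A minimal sketch of the two combiners follows, assuming a traditional shared secret ss_trad and a post-quantum shared secret ss_pq have already been established; the inlined HKDF-Extract stands in for whatever KDF the actual protocol mandates:

   import hashlib
   import hmac

   def hkdf_extract(salt, ikm):
       return hmac.new(salt, ikm, hashlib.sha256).digest()

   def concatenate_combiner(ss_trad, ss_pq, salt=b"\x00" * 32):
       # One KDF call over the concatenated secrets (TLS 1.3 hybrid style).
       return hkdf_extract(salt, ss_trad + ss_pq)

   def cascade_combiner(ss_trad, ss_pq, salt=b"\x00" * 32):
       # One KDF call per key agreement, chained (RFC 9370 style).
       step1 = hkdf_extract(salt, ss_trad)
       return hkdf_extract(step1, ss_pq)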

14.2. PQ/T Hybrid Authentication

The PQ/T Hybrid Authentication property can be utilized in scenarios where an on-path attacker possesses network devices equipped with CRQCs, capable of breaking traditional authentication protocols. This property ensures authentication through a PQ/T hybrid scheme or a PQ/T hybrid protocol, as long as at least one component algorithm remains secure to provide the intended security level. For instance, a PQ/T hybrid certificate can be employed to facilitate a PQ/T hybrid authentication protocol. However, a PQ/T hybrid authentication protocol does not need to use a PQ/T hybrid certificate [I-D.ounsworth-pq-composite-keys]; separate certificates could be used for individual component algorithms [I-D.ietf-lamps-cert-binding-for-multi-auth].

The frequency and duration of system upgrades and the time when CRQCs will become widely available need to be weighed in to determine whether and when to support the PQ/T Hybrid Authentication property.

14.3. Additional Considerations

It is also possible to use more than two algorithms together in a hybrid scheme, and there are multiple possible ways those algorithms can be combined. For the purposes of a post-quantum transition, the simple combination of a post-quantum algorithm with a single classical algorithm is the most straightforward, but the use of multiple post-quantum algorithms based on different hard mathematical problems has also been considered. When combining algorithms, it is possible to require that both algorithms validate (the so-called "and" mode), that only one does (the "or" mode), or some more complicated scheme. Schemes that do not require both algorithms to validate have only the strength of the weakest algorithm, and therefore offer little or no security benefit. Since such schemes generally also require both keys to be distributed (e.g., https://datatracker.ietf.org/doc/html/draft-truskovsky-lamps-pq-hybrid-x509-01), there are substantial performance costs in some scenarios. This combination of properties makes optionally including post-quantum keys, without requiring their use, generally unattractive in most use cases.

When combining keys in an "and" mode, it may make more sense to consider them to be a single composite key, instead of two keys. This generally requires fewer changes to various components of PKI ecosystems, many of which are not prepared to deal with two keys or dual signatures. To them, a "composite" algorithm composed of two other algorithms is simply a new algorithm, and support for adding new algorithms generally already exists. All that needs to be done is to standardize the formats of how the two keys from the two algorithms are combined into a single data structure, and how the two resulting signatures are combined into a single signature. The answer can be as simple as concatenation, if the lengths are fixed or easily determined.
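To illustrate the concatenation approach just described, here is a toy length-prefixed encoding in Python that combines two component signatures into a single blob and splits it apart again; real proposals typically define ASN.1 structures rather than this ad hoc framing, and an "and"-mode verifier must check both components:

   import struct

   def compose(sig1, sig2):
       # Length-prefix each component so a verifier can split them again.
       return (struct.pack(">I", len(sig1)) + sig1 +
               struct.pack(">I", len(sig2)) + sig2)

   def split(composite):
       (n1,) = struct.unpack_from(">I", composite, 0)
       sig1 = composite[4:4 + n1]
       (n2,) = struct.unpack_from(">I", composite, 4 + n1)
       sig2 = composite[8 + n1:8 + n1 + n2]
       return sig1, sig2

   def verify_and(verify1, verify2, msg, composite):
       # "and" mode: the composite validates only if BOTH components validate.
       s1, s2 = split(composite)
       return verify1(msg, s1) and verify2(msg, s2)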

One last consideration is the pairs of algorithms that can be combined. A recent trend in protocols is to allow only a small number of "known good" configurations that make sense, instead of allowing arbitrary combinations of individual configuration choices that may interact in dangerous ways. The current consensus is that the same approach should be followed for combining cryptographic algorithms, and that "known good" pairs should be explicitly listed ("explicit composite"), instead of just allowing arbitrary combinations of any two crypto algorithms ("generic composite").

The same considerations apply when using multiple certificates to transport a pair of related keys for the same subject. Exactly how two certificates should be managed in order to avoid some of the pitfalls mentioned above is still an active area of investigation. Using two certificates keeps the certificate tooling simple and straightforward, but in the end it simply moves the problems, requiring that both certificates be used as a pair and that both validate, to the certificate management layer, where they still need to be addressed.

At least one scheme has been proposed that allows the pair of certificates to exist as a single certificate when being issued and managed, but dynamically split into individual certificates when needed (https://datatracker.ietf.org/doc/draft-bonnell-lamps-chameleon-certs/).

Many of these points are still being actively explored and discussed, and the consensus may change over time.

15. Security Considerations

15.1. Cryptanalysis

Classical cryptanalysis exploits weaknesses in algorithm design, mathematical vulnerabilities, or implementation flaws, whereas quantum cryptanalysis harnesses the power of CRQCs to solve specific mathematical problems more efficiently. Both pose threats to the security of cryptographic algorithms, including those used in PQC. Developing and adopting new cryptographic algorithms resilient against these threats is crucial for ensuring long-term security in the face of advancing cryptanalysis. Recent attacks on side-channel implementations using deep-learning-based power analysis have also shown that one needs to be cautious when implementing PQC algorithms in hardware. Two of the most recent works are an attack on Kyber [KyberSide] and an attack on Saber [SaberSide]. The evolving threat landscape indicates that lattice-based cryptography is particularly vulnerable to side-channel attacks, as shown in [SideCh] and [LatticeSide]. Consequently, mitigation techniques for such side-channel attacks have been proposed in [Mitigate1], [Mitigate2], and [Mitigate3].

15.2. Cryptographic Agility

Cryptographic agility is relevant for both classical and quantum cryptanalysis as it enables organizations to adapt to emerging threats, adopt stronger algorithms, comply with standards, and plan for long-term security in the face of evolving cryptanalytic techniques and the advent of CRQCs. Several PQC schemes are available that need to be tested; cryptography experts around the world are pushing for the best possible solutions, and the first standards that will ease the introduction of PQC are being prepared. It is of paramount importance and a call for imminent action for organizations, bodies, and enterprises to start evaluating their cryptographic agility, assess the complexity of implementing PQC into their products, processes, and systems, and develop a migration plan that achieves their security goals to the best possible extent.

15.3. Hybrid Key Exchange : Bridging the Gap Between Post-Quantum and Traditional Cryptography

Post-quantum algorithms selected for standardization are relatively new and have not been subject to the same depth of study as traditional algorithms. In addition, certain deployments may need to retain traditional algorithms due to regulatory constraints, for example, FIPS compliance. Hybrid key exchange enables potential security against "Harvest Now, Decrypt Later" attacks while not fully abandoning traditional cryptosystems.

16. Further Reading & Resources

16.1. Reading List

(A reading list. Serious Cryptography. Pointers to PQC sites with good explanations. List of reasonable Wikipedia pages.)

17. Contributors

The following individuals have contributed to this document:

Kris Kwiatkowski

PQShield, LTD

United Kingdom.

kris@amongbytes.com

Acknowledgements

This document leverages text from https://github.com/paulehoffman/post-quantum-for-engineers/blob/main/pqc-for-engineers.md. Thanks to Dan Wing, Florence D, Thom Wiggers, Sophia Grundner-Culemann, Sofia Celi, Melchior Aelmans, and Falko Strenzke for the discussions, reviews, and comments.

References

Normative References

[RFC2119]
Bradner, S., "Key words for use in RFCs to Indicate Requirement Levels", BCP 14, RFC 2119, DOI 10.17487/RFC2119, March 1997, <https://www.rfc-editor.org/rfc/rfc2119>.
[RFC8174]
Leiba, B., "Ambiguity of Uppercase vs Lowercase in RFC 2119 Key Words", BCP 14, RFC 8174, DOI 10.17487/RFC8174, May 2017, <https://www.rfc-editor.org/rfc/rfc8174>.
[RFC8391]
Huelsing, A., Butin, D., Gazdag, S., Rijneveld, J., and A. Mohaisen, "XMSS: eXtended Merkle Signature Scheme", RFC 8391, DOI 10.17487/RFC8391, May 2018, <https://www.rfc-editor.org/rfc/rfc8391>.
[RFC8554]
McGrew, D., Curcio, M., and S. Fluhrer, "Leighton-Micali Hash-Based Signatures", RFC 8554, DOI 10.17487/RFC8554, April 2019, <https://www.rfc-editor.org/rfc/rfc8554>.

Informative References

[BHK09]
"Subtleties in the Definition of IND-CCA: When and How Should Challenge-Decryption be Disallowed?", <https://eprint.iacr.org/2009/418>.
[Cloudflare]
"NIST’s pleasant post-quantum surprise", <https://blog.cloudflare.com/nist-post-quantum-surprise/>.
[CNSA2-0]
"Announcing the Commercial National Security Algorithm Suite 2.0", <https://media.defense.gov/2022/Sep/07/2003071834/-1/-1/0/CSA_CNSA_2.0_ALGORITHMS_.PDF>.
[CS01]
"Design and Analysis of Practical Public-Key Encryption Schemes Secure against Adaptive Chosen Ciphertext Attack", <https://eprint.iacr.org/2001/108>.
[Dilithium]
"Cryptographic Suite for Algebraic Lattices (CRYSTALS) - Dilithium", <https://pq-crystals.org/dilithium/index.shtml>.
[Falcon]
"Fast Fourier lattice-based compact signatures over NTRU", <https://falcon-sign.info/>.
[GMR88]
"A digital signature scheme secure against adaptive chosen-message attacks.", <https://people.csail.mit.edu/silvio/Selected%20Scientific%20Papers/Digital%20Signatures/A_Digital_Signature_Scheme_Secure_Against_Adaptive_Chosen-Message_Attack.pdf>.
[Google]
"Quantum Supremacy Using a Programmable Superconducting Processor", <https://ai.googleblog.com/2019/10/quantum-supremacy-using-programmable.html>.
"C. Zalka, “Grover’s quantum searching algorithm is optimal,” Physical Review A, vol. 60, pp. 2746-2751, 1999.".
[I-D.draft-ietf-cose-sphincs-plus]
Prorock, M., Steele, O., Misoczki, R., Osborne, M., and C. Cloostermans, "JOSE and COSE Encoding for SPHINCS+", Work in Progress, Internet-Draft, draft-ietf-cose-sphincs-plus-01, <https://datatracker.ietf.org/doc/html/draft-ietf-cose-sphincs-plus-01>.
[I-D.ietf-lamps-cert-binding-for-multi-auth]
Becker, A., Guthrie, R., and M. J. Jenkins, "Related Certificates for Use in Multiple Authentications within a Protocol", Work in Progress, Internet-Draft, draft-ietf-lamps-cert-binding-for-multi-auth-01, <https://datatracker.ietf.org/doc/html/draft-ietf-lamps-cert-binding-for-multi-auth-01>.
[I-D.ietf-lamps-cms-sphincs-plus-02]
Housley, R., Fluhrer, S., Kampanakis, P., and B. Westerbaan, "Use of the SPHINCS+ Signature Algorithm in the Cryptographic Message Syntax (CMS)", Work in Progress, Internet-Draft, draft-ietf-lamps-cms-sphincs-plus-02, <https://datatracker.ietf.org/doc/html/draft-ietf-lamps-cms-sphincs-plus-02>.
[I-D.ietf-lamps-dilithium-certificates]
Massimo, J., Kampanakis, P., Turner, S., and B. Westerbaan, "Internet X.509 Public Key Infrastructure: Algorithm Identifiers for Dilithium", Work in Progress, Internet-Draft, draft-ietf-lamps-dilithium-certificates-02, <https://datatracker.ietf.org/doc/html/draft-ietf-lamps-dilithium-certificates-02>.
[I-D.ietf-pquip-pqt-hybrid-terminology]
D, F., "Terminology for Post-Quantum Traditional Hybrid Schemes", Work in Progress, Internet-Draft, draft-ietf-pquip-pqt-hybrid-terminology-00, <https://datatracker.ietf.org/doc/html/draft-ietf-pquip-pqt-hybrid-terminology-00>.
[I-D.ietf-tls-hybrid-design]
Stebila, D., Fluhrer, S., and S. Gueron, "Hybrid key exchange in TLS 1.3", Work in Progress, Internet-Draft, draft-ietf-tls-hybrid-design-06, <https://datatracker.ietf.org/doc/html/draft-ietf-tls-hybrid-design-06>.
[I-D.ounsworth-pq-composite-keys]
Ounsworth, M., Gray, J., Pala, M., and J. Klaußner, "Composite Public and Private Keys For Use In Internet PKI", Work in Progress, Internet-Draft, draft-ounsworth-pq-composite-keys-05, <https://datatracker.ietf.org/doc/html/draft-ounsworth-pq-composite-keys-05>.
[I-D.westerbaan-cfrg-hpke-xyber768d00-02]
Westerbaan, B. and C. A. Wood, "X25519Kyber768Draft00 hybrid post-quantum KEM for HPKE", Work in Progress, Internet-Draft, draft-westerbaan-cfrg-hpke-xyber768d00-02, <https://datatracker.ietf.org/doc/html/draft-westerbaan-cfrg-hpke-xyber768d00-02>.
[IBM]
"IBM Unveils 400 Qubit-Plus Quantum Processor and Next-Generation IBM Quantum System Two", <https://newsroom.ibm.com/2022-11-09-IBM-Unveils-400-Qubit-Plus-Quantum-Processor-and-Next-Generation-IBM-Quantum-System-Two>.
[IBMRoadmap]
"The IBM Quantum Development Roadmap", <https://www.ibm.com/quantum/roadmap>.
[KyberSide]
"A Side-Channel Attack on a Hardware Implementation of CRYSTALS-Kyber", <https://eprint.iacr.org/2022/1452>.
[LattFail1]
"Decryption Failure Attacks on IND-CCA Secure Lattice-Based Schemes", <https://link.springer.com/chapter/10.1007/978-3-030-17259-6_19#chapter-info>.
[LattFail2]
"(One) Failure Is Not an Option: Bootstrapping the Search for Failures in Lattice-Based Encryption Schemes.", <https://link.springer.com/chapter/10.1007/978-3-030-45727-3_1>.
[LatticeSide]
"Generic Side-channel attacks on CCA-secure lattice-based PKE and KEM schemes", <https://eprint.iacr.org/2019/948>.
[Mitigate1]
"POLKA: Towards Leakage-Resistant Post-Quantum CCA-Secure Public Key Encryption", <https://eprint.iacr.org/2022/873>.
[Mitigate2]
"Leakage-Resilient Certificate-Based Authenticated Key Exchange Protocol", <https://ieeexplore.ieee.org/document/9855226>.
[Mitigate3]
"Post-Quantum Authenticated Encryption against Chosen-Ciphertext Side-Channel Attacks", <https://eprint.iacr.org/2022/916>.
[NIST]
"Post-Quantum Cryptography Standardization", <https://csrc.nist.gov/projects/post-quantum-cryptography/post-quantum-cryptography-standardization>.
[Nokia]
"Interference Measurements of Non-Abelian e/4 & Abelian e/2 Quasiparticle Braiding", <https://journals.aps.org/prx/pdf/10.1103/PhysRevX.13.011028>.
[PQCAPI]
"PQC - API notes", <https://csrc.nist.gov/CSRC/media/Projects/Post-Quantum-Cryptography/documents/example-files/api-notes.pdf>.
[QC-DNS]
"Quantum Computing and the DNS", <https://www.icann.org/octo-031-en.pdf>.
[RFC4033]
Arends, R., Austein, R., Larson, M., Massey, D., and S. Rose, "DNS Security Introduction and Requirements", RFC 4033, DOI 10.17487/RFC4033, March 2005, <https://www.rfc-editor.org/rfc/rfc4033>.
[RFC6090]
McGrew, D., Igoe, K., and M. Salter, "Fundamental Elliptic Curve Cryptography Algorithms", RFC 6090, DOI 10.17487/RFC6090, February 2011, <https://www.rfc-editor.org/rfc/rfc6090>.
[RFC8446]
Rescorla, E., "The Transport Layer Security (TLS) Protocol Version 1.3", RFC 8446, DOI 10.17487/RFC8446, August 2018, <https://www.rfc-editor.org/rfc/rfc8446>.
[RFC9180]
Barnes, R., Bhargavan, K., Lipp, B., and C. Wood, "Hybrid Public Key Encryption", RFC 9180, DOI 10.17487/RFC9180, February 2022, <https://www.rfc-editor.org/rfc/rfc9180>.
[RFC9370]
Tjhai, CJ., Tomlinson, M., Bartlett, G., Fluhrer, S., Van Geest, D., Garcia-Morchon, O., and V. Smyslov, "Multiple Key Exchanges in the Internet Key Exchange Protocol Version 2 (IKEv2)", RFC 9370, DOI 10.17487/RFC9370, May 2023, <https://www.rfc-editor.org/rfc/rfc9370>.
[RSA]
"A Method for Obtaining Digital Signatures and Public-Key Cryptosystems+", <https://dl.acm.org/doi/pdf/10.1145/359340.359342>.
[RSA10SC]
"Breaking RSA Encryption - an Update on the State-of-the-Art", <https://www.quintessencelabs.com/blog/breaking-rsa-encryption-update-state-art>.
[RSA8HRS]
"How to factor 2048 bit RSA integers in 8 hours using 20 million noisy qubits", <https://arxiv.org/abs/1905.09749>.
[SaberSide]
"A side-channel attack on a masked and shuffled software implementation of Saber", <https://link.springer.com/article/10.1007/s13389-023-00315-3>.
[SideCh]
"Side-Channel Attacks on Lattice-Based KEMs Are Not Prevented by Higher-Order Masking", <https://eprint.iacr.org/2022/919>.
[SPHINCS]
"SPHINCS+", <https://sphincs.org/index.html>.
[Threat-Report]
"Quantum Threat Timeline Report 2020", <https://globalriskinstitute.org/publications/quantum-threat-timeline-report-2020/>.

Authors' Addresses

Aritra Banerjee
Nokia
Munich
Germany
Tirumaleswar Reddy
Nokia
Bangalore
Karnataka
India
Dimitrios Schoinianakis
Nokia
Athens
Greece
Timothy Hollebeek
DigiCert
Pittsburgh
United States of America