What Is Post-Quantum Cryptography (PQC)? A Complete Guide


Post-quantum security is the protection of digital information and communications against attacks from quantum computers. It focuses on replacing public-key cryptography methods, such as RSA and elliptic-curve systems, that could be broken by quantum algorithms.

Post-quantum security depends on new cryptographic standards and algorithms designed to remain secure against both classical and quantum computational threats.

[Infographic: Post-quantum cryptography explained. Part 1, the problem: RSA and ECC are secure against classical computers via factoring and discrete logarithms, but Shor's algorithm on a quantum computer could break them. Part 2, the solution: post-quantum cryptography (PQC), new encryption methods based on math problems that remain hard for both classical and quantum computers. Part 3, algorithm families: lattice-based (foundation of ML-KEM and ML-DSA, built on high-dimensional algebraic structures), hash-based (relies on secure one-way hash functions, basis of SPHINCS+), and multivariate (uses polynomial equations, still in research stages). Part 4, NIST standards: ML-KEM (FIPS 203) for key establishment, ML-DSA (FIPS 204) for digital signatures, and SLH-DSA (FIPS 205), a stateless hash-based signature scheme. Takeaway: PQC is the standards-led path forward, practical and deployable today.]

 

Why do we need new cryptography for the quantum era?

"To stave off attacks by a quantum computer — if and when a cryptographically relevant one is built — the worldwide community must retire current encryption algorithms. Post-quantum encryption algorithms must be based on math problems that would be difficult for both conventional and quantum computers to solve."

Quantum computers will eventually change what's required to keep data secure because they can solve certain mathematical problems much faster than classical computers. That matters because modern encryption depends on problems that are deliberately hard to solve.

In other words:

The math that protects today's internet traffic, digital identities, and financial systems won't hold up once quantum computing reaches scale.

Public-key cryptography—like RSA, elliptic-curve, and Diffie-Hellman—depends on factoring or discrete logarithms. Those problems are exactly what quantum algorithms, such as Shor's, can solve efficiently. And when that happens, attackers could decrypt private communications or forge digital signatures.

Symmetric encryption is different. The best known quantum attack against it, Grover's algorithm, offers only a quadratic speedup, so it can be countered by using larger keys. AES-256 is expected to remain strong under quantum attack. But public-key systems can't simply scale up. Their structure becomes fundamentally weak in the face of quantum computation.
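
A quick back-of-envelope sketch of why key size is enough on the symmetric side:

```python
# Grover's search needs roughly 2**(k/2) quantum operations to brute-force
# a k-bit key, versus roughly 2**k operations classically.
for bits in (128, 256):
    print(f"AES-{bits}: ~2^{bits // 2} quantum operations to brute-force")
# AES-128 drops to ~2^64, which is uncomfortable long-term; AES-256 keeps a
# ~2^128 margin, which is why it's expected to remain strong.
```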

That's why the shift to new cryptography is already underway:

  • The U.S. National Institute of Standards and Technology (NIST) has finalized its first set of post-quantum cryptographic standards.
  • The National Security Agency's CNSA 2.0 suite sets migration timelines for national security systems.
  • And the Cybersecurity and Infrastructure Security Agency (CISA) has issued quantum-readiness guidance urging organizations to start inventorying cryptographic systems now.

The reason is simple. Sensitive data stolen today could be decrypted tomorrow.

[Diagram: Harvest now, decrypt later (HNDL). An attacker 1) exfiltrates encrypted traffic or files, 2) keeps the ciphertext in cold storage for years, 3) waits for advances in quantum computing, 4) decrypts later once Shor's algorithm breaks RSA/ECC, and 5) uses the plaintext to read, sell, or forge identities. Years can pass between stages.]

The sooner organizations adopt quantum-resistant algorithms, the longer their information will stay secure. That's why we need new cryptography for the quantum era, and why quantum security has become such a major focus area for so many businesses.


What makes post-quantum algorithms different from what we use today?

Today's encryption depends on math that classical computers can't solve efficiently.

RSA and elliptic-curve cryptography rely on factoring and discrete logarithms. These problems take so long for traditional computers to compute that they're considered secure.
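To make that asymmetry concrete, here's a toy sketch. The primes below are tiny for demonstration and nothing like real RSA sizes:

```python
# Toy illustration (not real RSA): multiplying primes is instant, but
# recovering them from the product gets infeasible as the numbers grow.
import time

p, q = 1_000_003, 1_000_033          # small primes for demonstration
n = p * q                            # multiplication: microseconds

def trial_division(n):
    """Factor n by checking every odd candidate in turn."""
    if n % 2 == 0:
        return 2
    f = 3
    while f * f <= n:
        if n % f == 0:
            return f
        f += 2
    return n

start = time.perf_counter()
factor = trial_division(n)
print(f"n = {n}: found factor {factor} in {time.perf_counter() - start:.3f}s")
# Works here, but a 2048-bit RSA modulus has ~617 decimal digits. Classical
# factoring is hopeless at that size; Shor's algorithm would not be.
```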

Quantum computers change that.

They can solve both of those problems exponentially faster using algorithms like Shor's. Which means: once large-scale quantum systems exist, they could break the foundations of most public-key encryption and digital signatures.

Comparison: Classical vs. post-quantum cryptography

| Aspect | Classical public-key cryptography | Post-quantum cryptography (PQC) |
|---|---|---|
| Mathematical foundation | Based on factoring (RSA) and discrete logarithms (ECC, Diffie–Hellman) | Based on problems like lattices, hash chains, error-correcting codes, and multivariate equations |
| Vulnerability to quantum attacks | Breakable by Shor's algorithm on large-scale quantum computers | Designed to resist both classical and quantum attacks |
| Hardware requirements | Runs on classical computers | Runs on classical computers; no quantum hardware needed |
| Security assumption | Computational difficulty for classical systems | Hardness believed to hold against classical and quantum attacks |
| Examples | RSA, ECDSA, Diffie–Hellman | ML-KEM (KEM), ML-DSA (signatures), SLH-DSA (signatures), Classic McEliece (KEM) |
| Readiness for deployment | Already in use globally but becoming obsolete under quantum threat | NIST standards finalized and ready for phased deployment |

Post-quantum algorithms avoid those weaknesses altogether.

Important: post-quantum algorithms are not quantum themselves. They run on the same classical hardware we already use. That's why they can be implemented now—without waiting for quantum computers to arrive.

The goal is dual resistance.

Post-quantum cryptography is designed to stay secure against both classical and quantum adversaries. In other words, it keeps today's data protected now and after quantum computing becomes practical.

 

What types of algorithms make up post-quantum cryptography?

[Chart: Families of post-quantum cryptographic algorithms, an ecosystem built on new mathematical problems that resist both classical and quantum attacks.]

| Attribute | Lattice | Hash | Code | Multivariate | Isogeny |
|---|---|---|---|---|---|
| Mathematical basis | High-dimensional lattices (LWE, SIS) | One-way hash functions | Error-correcting codes | Nonlinear polynomial equations over finite fields | Relationships between elliptic curves (isogenies) |
| Primary use case | Key establishment and signatures | Signatures only | Key establishment (KEM) | Signatures | Encryption and signatures |
| Example algorithms | ML-KEM, ML-DSA | SLH-DSA, XMSS, LMS | Classic McEliece | Rainbow (retired), GeMSS | SIDH (deprecated) |
| Status | NIST standardized: ML-KEM (FIPS 203), ML-DSA (FIPS 204) | NIST standardized: SLH-DSA (FIPS 205) | NIST Round 4 candidate | Under evaluation | Research focus only |
| Key point | Strong balance of performance and security | Simple, mature, no new mathematical assumptions | Long-standing security record but very large public keys | Efficient but produces large signatures | Lightweight design but current schemes are broken |

Post-quantum cryptography isn't a single algorithm or standard. It's an ecosystem of cryptographic methods built on new mathematical problems that resist both classical and quantum attacks.

These approaches differ in how they secure data, how mature they are, and how well they balance performance with security.

Each family relies on a problem that's hard to solve for both classical and quantum computers.

Lattice-based cryptography

This family builds security on geometric problems in high-dimensional lattices. The best-known examples—ML-KEM (FIPS 203) and ML-DSA (FIPS 204)—form the foundation of NIST’s post-quantum standards.

Lattice methods are favored because they balance strong security with efficient performance.

They rely on mathematical assumptions such as Learning With Errors (LWE) and Short Integer Solution (SIS), which remain difficult for both classical and quantum computers to solve.
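For intuition, here's a minimal toy LWE sketch. The parameters are deliberately insecure and it has none of the module structure real schemes like ML-KEM use:

```python
# Toy Learning With Errors (LWE) bit encryption -- illustrative only.
import random

q, n, m = 3329, 16, 64            # modulus, secret length, number of samples

s = [random.randrange(q) for _ in range(n)]                  # secret vector

# Public key: random vectors a_i and noisy inner products b_i = <a_i, s> + e_i.
A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
B = [(sum(a * si for a, si in zip(row, s)) + random.randint(-2, 2)) % q
     for row in A]

def encrypt(bit):
    """Sum a random subset of samples; the accumulated noise hides the secret."""
    subset = random.sample(range(m), m // 2)
    u = [sum(A[i][j] for i in subset) % q for j in range(n)]
    v = (sum(B[i] for i in subset) + bit * (q // 2)) % q
    return u, v

def decrypt(u, v):
    """Remove <u, s>; what's left is small noise (bit 0) or noise + q/2 (bit 1)."""
    d = (v - sum(uj * sj for uj, sj in zip(u, s))) % q
    return int(q // 4 < d < 3 * q // 4)

for bit in (0, 1, 1, 0):
    assert decrypt(*encrypt(bit)) == bit
print("toy LWE round-trips correctly")
```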

Note:
Lattice-based schemes also support fully homomorphic encryption (FHE)—a capability that allows computation on encrypted data without decrypting it. This makes them not only quantum-resistant but also foundational to emerging privacy-preserving technologies.

Hash-based cryptography

Hash-based systems use one-way hash functions to create secure digital signatures.

Stateless algorithms such as SLH-DSA (FIPS 205, based on SPHINCS+) are now standardized, while stateful schemes like XMSS and LMS are used in specific applications. These methods are simple, mature, and backed by decades of cryptographic study.

They’re designed exclusively for digital signatures, not encryption, and derive their strength from the proven hardness of reversing hash functions.
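To see the idea in miniature, here's a sketch of a Lamport one-time signature, the classic building block behind hash-based designs. It uses nothing but a hash function:

```python
# Minimal Lamport one-time signature. SLH-DSA (SPHINCS+) builds a full
# many-time scheme from related one-time components.
import hashlib, secrets

H = lambda data: hashlib.sha256(data).digest()

def keygen():
    # Private key: 256 pairs of random values; public key: their hashes.
    sk = [[secrets.token_bytes(32), secrets.token_bytes(32)] for _ in range(256)]
    pk = [[H(pair[0]), H(pair[1])] for pair in sk]
    return sk, pk

def sign(message, sk):
    # Reveal one preimage per message-digest bit. Sign ONLY ONCE per key.
    digest = H(message)
    bits = [(digest[i // 8] >> (i % 8)) & 1 for i in range(256)]
    return [sk[i][b] for i, b in enumerate(bits)]

def verify(message, signature, pk):
    digest = H(message)
    bits = [(digest[i // 8] >> (i % 8)) & 1 for i in range(256)]
    return all(H(sig) == pk[i][b] for i, (sig, b) in enumerate(zip(signature, bits)))

sk, pk = keygen()
sig = sign(b"firmware v2.1", sk)
print(verify(b"firmware v2.1", sig, pk))   # True
print(verify(b"tampered", sig, pk))        # False
```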

Note:
Hash-based signatures are considered quantum-safe by design because their security doesn't depend on new algebraic assumptions. They rely only on the proven hardness of breaking modern hash functions, making them the most conservative choice for long-term assurance.

Code-based cryptography

Code-based algorithms rely on error-correcting codes to secure information.

The McEliece system is the classic example. It has withstood extensive cryptanalysis for more than forty years, though its large public keys make it harder to deploy widely.

Its security depends on the difficulty of decoding random linear codes—a problem believed to remain hard even for quantum computers.
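A minimal sketch of that idea with toy parameters. It shows only the encode-and-add-errors step, not a full McEliece construction:

```python
# Toy code-based sketch: encoding adds structured redundancy, then deliberate
# random errors. Recovering the message from a *random* linear code without
# its secret structure is the hard problem McEliece relies on.
import numpy as np

rng = np.random.default_rng()
k, n_bits = 4, 8                          # message bits, codeword bits

A = rng.integers(0, 2, size=(k, n_bits - k))
G = np.hstack([np.eye(k, dtype=int), A])  # systematic generator matrix [I | A]

message = rng.integers(0, 2, size=k)
codeword = message @ G % 2                # encoding: fast and easy

error = np.zeros(n_bits, dtype=int)
error[rng.integers(n_bits)] = 1           # flip one random bit
received = (codeword + error) % 2

# The legitimate receiver knows the code's hidden structure and corrects the
# error quickly; an attacker faces generic decoding, believed hard even for
# quantum computers.
print("message :", message)
print("received:", received)
```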

Note:
In 2025, NIST selected HQC as a second key-encapsulation mechanism (KEM) for standardization, complementing lattice-based ML-KEM. Although McEliece keys are large, their decryption performance is fast and stable, even under quantum attack models. This makes code-based cryptography appealing for hardware implementations and specialized high-security use cases where key storage is less constrained.

Multivariate cryptography

Multivariate systems are built around solving nonlinear polynomial equations over finite fields.

They can be efficient and flexible but often require large signatures and keys. Because of that, they're still being evaluated for broader use.

These schemes are primarily designed for digital signatures and rely on mathematical problems that are easy to verify but extremely difficult to invert.
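A toy sketch of that verify-easy, invert-hard asymmetry, using a deliberately tiny field and variable count:

```python
# Evaluating random quadratic polynomials over a finite field is fast (that's
# verification); inverting the map requires search. Real schemes use far
# larger systems -- this is illustrative only.
import itertools, random

p, n = 7, 4                                # field size, number of variables
random.seed(1)

# Random quadratic map F: each output is sum of c[i][j] * x_i * x_j mod p.
coeffs = [[[random.randrange(p) for _ in range(n)] for _ in range(n)]
          for _ in range(n)]

def F(x):
    return tuple(sum(c[i][j] * x[i] * x[j] for i in range(n) for j in range(n)) % p
                 for c in coeffs)

target = F((1, 2, 3, 4))                   # evaluating: instant

# Inverting: brute force over all p**n inputs -- 7**4 here, but astronomically
# many at real parameter sizes.
solutions = [x for x in itertools.product(range(p), repeat=n) if F(x) == target]
print(f"searched {p**n} candidates, found {len(solutions)} preimage(s)")
```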

Isogeny-based cryptography

Isogeny approaches use relationships between elliptic curves to create key exchange and signature schemes.

They were once seen as a promising lightweight option. However, after recent cryptanalysis exposed vulnerabilities in the SIDH protocol, interest has shifted toward other families with stronger assurance.

Isogeny-based systems aimed to offer small key sizes and efficient performance. Current research focuses on shoring up their underlying mathematical assumptions before the approach can be re-evaluated.

 

Which algorithms did NIST standardize—and what do they do?

The National Institute of Standards and Technology (NIST) has finalized the first post-quantum cryptography standards:

  • ML-KEM (FIPS 203) for key establishment
  • ML-DSA (FIPS 204) for digital signatures
  • SLH-DSA (FIPS 205) for stateless hash-based digital signatures

These algorithms mark the start of a global transition toward quantum-resistant encryption. They’re production-ready standards. Not experimental research.

In 2025, NIST also selected HQC (Hamming Quasi-Cyclic) for standardization as a second key-encapsulation mechanism (KEM) to complement ML-KEM. A draft FIPS for FN-DSA (FALCON) is under development to provide an additional digital signature option.

Classic McEliece and BIKE remain alternate code-based candidates under further evaluation but were not chosen for standardization at this stage.

ML-KEM and the other key-establishment candidates are key encapsulation mechanisms (KEMs), which secure the exchange of encryption keys rather than encrypting user data directly.

The table below summarizes the finalized standards and Round 4 candidates currently under NIST review or draft preparation.

NIST post-quantum standards and candidates

| Algorithm | NIST designation | Cryptographic type | Primary purpose | Notable characteristics |
|---|---|---|---|---|
| ML-KEM | FIPS 203 | Lattice-based | Key establishment | Compact keys, efficient performance, strong resistance against classical and quantum attacks |
| ML-DSA | FIPS 204 | Lattice-based | Digital signatures | Simple and side-channel resistant; provides authentication and integrity for software, firmware, and communications |
| SLH-DSA | FIPS 205 | Hash-based | Digital signatures | Stateless and conservative; slower but based on well-understood hash functions for long-term assurance |
| BIKE | Round 4 candidate | Code-based | Key establishment | Designed for diversity in security assumptions and efficient key exchange |
| Classic McEliece | Round 4 candidate | Code-based | Key establishment | Proven security record over 40 years; large public keys but strong cryptanalytic resilience |
| HQC | Selected in 2025 for standardization | Code-based | Key establishment | Combines performance with established code-based assumptions for reliability |

Module-Lattice-Based Key-Encapsulation Mechanism (ML-KEM)

ML-KEM is a lattice-based algorithm used for key establishment. It’s designed to replace public-key systems such as RSA and elliptic-curve Diffie-Hellman.

ML-KEM is based on the Kyber family. It provides compact keys, high performance, and strong resistance against both classical and quantum attacks.
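Here's a sketch of the KEM flow, assuming the open-source liboqs Python bindings (imported as `oqs`) with ML-KEM-768 enabled in the underlying build:

```python
import oqs

ALG = "ML-KEM-768"

with oqs.KeyEncapsulation(ALG) as receiver:
    public_key = receiver.generate_keypair()        # receiver publishes this

    with oqs.KeyEncapsulation(ALG) as sender:
        # Encapsulation yields a ciphertext plus a fresh shared secret.
        ciphertext, sender_secret = sender.encap_secret(public_key)

    # Decapsulation recovers the same secret from the ciphertext.
    receiver_secret = receiver.decap_secret(ciphertext)

assert sender_secret == receiver_secret
# The shared secret then keys a symmetric cipher such as AES-256: the KEM
# protects the key exchange rather than encrypting user data directly.
```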

Module-Lattice-Based Digital Signature Algorithm (ML-DSA)

ML-DSA is a lattice-based digital signature scheme. It offers authentication and data integrity for software, firmware, and communications.

Based on the Dilithium family, ML-DSA emphasizes simplicity and side-channel resistance, making it well-suited for real-world deployment.
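A sign-and-verify sketch under the same assumptions as the ML-KEM example (liboqs Python bindings, with ML-DSA-65 available in the build):

```python
import oqs

ALG = "ML-DSA-65"
message = b"firmware image v3.2"

with oqs.Signature(ALG) as signer:
    public_key = signer.generate_keypair()
    signature = signer.sign(message)

with oqs.Signature(ALG) as verifier:
    assert verifier.verify(message, signature, public_key)         # authentic
    assert not verifier.verify(b"altered", signature, public_key)  # tampered
```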

Stateless Hash-Based Digital Signature (SLH-DSA)

SLH-DSA is a stateless hash-based digital signature algorithm. It’s slower than lattice-based schemes but relies only on well-understood hash functions instead of new mathematical assumptions.

The SPHINCS+ design forms the foundation of SLH-DSA, offering a conservative and reliable option for long-term security and redundancy.

Other NIST candidates (Round 4 and beyond)

Beyond the finalized and selected standards, NIST continues to evaluate additional code-based schemes such as BIKE and Classic McEliece to maintain diversity in security assumptions.

These candidates remain under evaluation for potential inclusion or further study, helping ensure alternatives are available if new vulnerabilities are discovered.

Globally, other authorities are aligning migration guidance with NIST's framework, including the NSA through its CNSA 2.0 suite, ENISA, and ETSI, creating a consistent path toward quantum-resistant cryptography.

Together, these efforts define the global roadmap for post-quantum cryptography.

 

How does PQC fit into real-world systems today?

"This marks the beginning of one of the most complex transitions in the history of the Internet, as these post-quantum cryptosystems are deployed in a myriad of different communications protocols, software applications, and hardware devices."

Post-quantum cryptography is already moving from research to deployment.

Organizations are beginning to test it in production networks—but replacing existing encryption isn't simple. Most digital systems depend on public-key infrastructures (PKI) that link certificates, trust chains, and authentication processes. Hybrid operation is expected to run for several years during migration.

Which is why the first step is hybrid cryptography.

It allows classical and post-quantum algorithms to run together, so systems stay compatible while gaining quantum resistance. For instance, draft extensions to TLS 1.3 and IKEv2 enable hybrid key exchanges that pair RSA or ECDH with lattice-based algorithms such as ML-KEM. This approach lets organizations test PQC performance and interoperability before making full replacements.
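Here's a minimal sketch of the hybrid pattern, assuming the `cryptography` package for X25519 and the liboqs bindings for ML-KEM; the TLS 1.3 hybrid drafts specify the combiner more precisely:

```python
import oqs
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Classical half: an ordinary X25519 Diffie-Hellman exchange.
client_priv, server_priv = X25519PrivateKey.generate(), X25519PrivateKey.generate()
classical_secret = client_priv.exchange(server_priv.public_key())

# Post-quantum half: an ML-KEM encapsulation.
with oqs.KeyEncapsulation("ML-KEM-768") as client_kem:
    pq_public = client_kem.generate_keypair()
    with oqs.KeyEncapsulation("ML-KEM-768") as server_kem:
        ciphertext, _ = server_kem.encap_secret(pq_public)
    pq_secret = client_kem.decap_secret(ciphertext)

# Combine both secrets through a KDF: an attacker must break X25519 AND
# ML-KEM to recover the session key.
session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"hybrid-handshake-demo").derive(classical_secret + pq_secret)
print(session_key.hex())
```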

The next challenge is scale.

Updating PKI means regenerating root certificates, rebuilding trust chains, and validating dependencies across browsers, devices, and firmware. Even minor changes can disrupt critical authentication flows.

To handle this, many organizations are adopting crypto-agile architectures—systems designed to swap algorithms without major rework. Crypto-agility supports phased migrations and makes it easier to incorporate future NIST standards as they evolve.

"Organizations that practice crypto agility should be able to turn off the use of weak cryptographic algorithms quickly when a vulnerability is discovered and adopt new cryptographic algorithms without making significant changes to infrastructures or suffering from unnecessary disruptions."

Major vendors are already experimenting with PQC in practice.

Google and Cloudflare have tested hybrid TLS connections across global infrastructure. Telecom providers and government agencies are piloting post-quantum authentication in 5G, VPN, and secure messaging systems.

Bottom line: PQC is no longer theoretical. It's being implemented step by step—through hybrid testing, PKI modernization, and crypto-agile design—to prepare for the quantum transition ahead.

 

How to get quantum-ready in five steps

[Flow diagram: Quantum readiness roadmap. Five sequential steps: 1) inventory current cryptographic assets (identify where encryption is used across systems and dependencies); 2) prioritize long-lived and high-value data (rank systems by data lifespan and exposure surface); 3) plan hybrid and phased rollouts (run classical and post-quantum algorithms in parallel during transition); 4) build crypto-agility into new architectures (design systems that can switch algorithms without disruption); 5) coordinate across vendors and ecosystems (collaborate with partners to ensure interoperability and consistency). Quantum readiness is ongoing.]

Preparing for the post-quantum era is more than a cryptography upgrade. It's a multi-year transformation that affects architectures, vendors, and risk models across entire ecosystems.

Government and industry roadmaps alike indicate that quantum migrations will take years—potentially stretching through 2030–2035—making early coordination essential.

Which means: organizations that start now can plan deliberately instead of reacting later.

1. Inventory current cryptographic assets

The first step is visibility.

Organizations need to identify where and how encryption is used—across TLS, VPNs, PKI, firmware, and third-party APIs. Many discover thousands of certificates and hardcoded cryptographic calls once they begin.

Automated scanning and discovery tools can help catalog these assets and expose dependencies before migration begins.

Document ownership and upgrade paths for each dependency.

Tip:
Use discovery tools that integrate with existing PKI, DevOps pipelines, and cloud APIs to detect hardcoded or legacy algorithms. Cryptography often hides in compiled code, firmware, or third-party SDKs that standard scanners miss.
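As a starting point, here's a minimal sketch of one such discovery task: pulling a server's TLS certificate and flagging quantum-vulnerable key types. The hostnames are placeholders:

```python
# Real discovery tools go much further (code scanning, firmware, internal
# PKI); this only audits public TLS endpoints.
import ssl
from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import rsa, ec

VULNERABLE = {rsa.RSAPublicKey: "RSA", ec.EllipticCurvePublicKey: "ECC"}

def audit(host, port=443):
    pem = ssl.get_server_certificate((host, port))
    cert = x509.load_pem_x509_certificate(pem.encode())
    key = cert.public_key()
    for cls, name in VULNERABLE.items():
        if isinstance(key, cls):
            return f"{host}: {name} key -- quantum-vulnerable, needs a migration plan"
    return f"{host}: {type(key).__name__} -- review manually"

for host in ["example.com", "internal-api.example.net"]:   # example inventory
    try:
        print(audit(host))
    except OSError as err:
        print(f"{host}: unreachable ({err})")
```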

2. Prioritize long-lived and high-value data

Some data loses sensitivity quickly. Other information—like intellectual property, biometric identifiers, or government records—must remain secure for decades.

Prioritize systems based on both data lifespan and exposure surface to focus early quantum protection efforts where they matter most.
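One widely used heuristic for this ranking (not from this article, but common in quantum-readiness planning) is Mosca's inequality: if data shelf life plus migration time exceeds the estimated years until a cryptographically relevant quantum computer, the system is already exposed to harvest-now-decrypt-later. A sketch with illustrative numbers:

```python
def at_risk(shelf_life_years, migration_years, years_to_quantum):
    """True if data could still need secrecy after quantum decryption arrives."""
    return shelf_life_years + migration_years > years_to_quantum

systems = {                        # (shelf life, migration time) -- examples
    "Marketing site TLS":    (1, 2),
    "Customer PII store":    (10, 4),
    "Genomic research data": (25, 5),
}
for name, (shelf, migrate) in systems.items():
    flag = "PRIORITIZE" if at_risk(shelf, migrate, years_to_quantum=10) else "monitor"
    print(f"{name:24s} -> {flag}")
```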

3. Plan hybrid and phased rollouts

Hybrid cryptography allows classical and post-quantum algorithms to run together during transition. It helps systems maintain compatibility while gaining quantum resistance.

Phased rollouts let organizations validate performance, test interoperability, and reduce compatibility risks before fully switching to post-quantum encryption.

Tip:
Pilot hybrid implementations in lower-risk environments first—like internal VPNs or non-production TLS endpoints—to benchmark latency and certificate performance before rolling out enterprise-wide.

4. Build crypto-agility into new architectures

Crypto-agility is the ability to replace cryptographic algorithms without redesigning systems. It should extend beyond software libraries to APIs, firmware, and key-management interfaces. This flexibility is essential for adapting to evolving standards and addressing newly discovered vulnerabilities.

Tip:
Design configuration files and APIs to reference algorithm identifiers rather than hardcoded implementations. This makes future replacements or PQC upgrades a policy change instead of a code rewrite.
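A minimal sketch of that pattern; the registry layout and policy names are illustrative assumptions:

```python
# Code asks for a *policy name*; a registry maps it to a concrete algorithm,
# so a PQC upgrade becomes a one-line configuration change.
import hashlib

ALGORITHM_REGISTRY = {
    "hash.default": hashlib.sha256,
    "hash.legacy":  hashlib.sha1,   # kept only until dependents migrate
    # "sig.default": ml_dsa_sign,   # swapped in when the PQC rollout lands
}

POLICY = {"document-hashing": "hash.default"}   # e.g., loaded from config

def hash_document(data: bytes, purpose: str = "document-hashing") -> str:
    algorithm = ALGORITHM_REGISTRY[POLICY[purpose]]   # resolved at call time
    return algorithm(data).hexdigest()

print(hash_document(b"contract.pdf contents"))
# Migrating means editing POLICY (or its config file); callers never change.
```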

5. Coordinate across vendors and ecosystems

Quantum readiness isn't a single-organization project. It requires coordination across suppliers, cloud providers, and hardware vendors to ensure consistent algorithm support and timely updates. Large-scale migration will unfold gradually over the next decade, demanding sustained collaboration across the entire technology ecosystem.

When it comes down to it, quantum readiness isn't a one-time upgrade. It's an evolving process that combines inventory, hybrid deployment, and long-term collaboration. Standards bodies, vendors, and governments are still finalizing guidance, but the organizations preparing now will adapt fastest when the transition becomes mandatory.

 

How is PQC different from quantum cryptography?

Post-quantum cryptography and quantum cryptography solve the same problem—securing data against quantum attacks—but in completely different ways.

PQC uses classical mathematics to build algorithms that run on existing hardware. Quantum cryptography, on the other hand, relies on the laws of quantum physics.

Comparison: Post-quantum cryptography vs. quantum cryptography

| Aspect | Post-quantum cryptography (PQC) | Quantum cryptography (QC) |
|---|---|---|
| Foundational principle | Based on classical mathematics (e.g., lattices, hashes, error-correcting codes) | Based on quantum physics (e.g., photon behavior, quantum states) |
| Primary examples | ML-KEM, ML-DSA, SLH-DSA | Quantum key distribution (QKD), quantum random number generators (QRNGs) |
| Hardware requirements | Runs on existing classical hardware | Requires specialized optical or quantum hardware |
| Deployment scope | Software-deployable across enterprise and government systems | Limited to specialized research, telecom, or defense environments |
| Scalability | Highly scalable and interoperable with current internet infrastructure | Constrained by distance, cost, and environmental sensitivity |
| Purpose | Replaces vulnerable public-key encryption and signature systems | Secures key exchange or randomness generation |
| Maturity and standardization | Standardized by NIST (FIPS 203–205); ready for adoption | Limited standardization; deployment-constrained |
| Goal | Quantum-resistant algorithms for long-term security | Physics-based security for specific communication links |

Quantum key distribution (QKD), for instance, uses photons to exchange encryption keys. If an eavesdropper tries to intercept the photons, their quantum state changes and the intrusion is detected. It's a powerful concept—but it depends on specialized optical hardware and works only over limited distances.
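A toy simulation of that detection property (a classical simulation only; real QKD requires optical hardware):

```python
# BB84 in miniature: measuring a photon in the wrong basis randomizes it, so
# an interceptor leaves ~25% errors in the sifted bits Alice and Bob compare.
import random

def bb84_error_rate(n=4000, eavesdrop=False):
    errors, kept = 0, 0
    for _ in range(n):
        bit = random.randint(0, 1)            # Alice's raw key bit
        alice_basis = random.choice("+x")     # basis she encodes in
        photon = bit

        # Eve measures in a random basis; a wrong guess disturbs the photon.
        if eavesdrop and random.choice("+x") != alice_basis:
            photon = random.randint(0, 1)

        bob_basis = random.choice("+x")
        measured = photon if bob_basis == alice_basis else random.randint(0, 1)

        if bob_basis == alice_basis:          # sifting: keep matching bases
            kept += 1
            errors += measured != bit
    return errors / kept

print(f"no eavesdropper:   {bb84_error_rate():.1%} sifted-bit errors")
print(f"with eavesdropper: {bb84_error_rate(eavesdrop=True):.1%} sifted-bit errors")
```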

The same applies to quantum random number generators (QRNGs), which use quantum effects to improve randomness but don't replace core encryption functions.

To sum up: quantum cryptography remains a niche solution suited to specific research and telecom use cases. PQC, by contrast, is universal. It's software-based, scalable, and ready to deploy in enterprise and government systems using today's infrastructure.

 

What's next for post-quantum cryptography?

Post-quantum cryptography is now shifting from standardization to large-scale implementation. With the first FIPS standards complete and HQC selected for standardization, the next focus is integration: embedding PQC across products, networks, and cloud ecosystems.

Over the next few years, expect NIST and global partners to refine interoperability, performance benchmarks, and migration tooling. Governments and vendors will expand hybrid deployments while preparing for fully post-quantum environments.

The long-term goal extends beyond replacing algorithms. The aim is a quantum-resilient cryptographic ecosystem. One that’s agile enough to adapt as new threats, standards, and hardware advances emerge. Achieving that resilience will require ongoing coordination among standards bodies, hardware manufacturers, and service providers throughout the next decade.


 

PQC FAQs

What is post-quantum cryptography (PQC)?
Post-quantum cryptography protects digital communications from attacks by quantum computers. It replaces public-key algorithms like RSA and ECC with new mathematical methods that remain secure against both classical and quantum attacks, ensuring long-term confidentiality, authentication, and data integrity across modern cryptographic systems.

How is PQC different from quantum cryptography?
Post-quantum cryptography uses classical mathematics and runs on existing hardware. Quantum cryptography, such as quantum key distribution (QKD), relies on quantum physics and requires specialized optical equipment. PQC is software-deployable and scalable; quantum cryptography is limited to niche use cases and short-range, hardware-dependent applications.

Does adopting PQC involve tradeoffs?
Yes. PQC algorithms often use larger keys and signatures, which can increase bandwidth and storage needs. Some are computationally intensive, affecting constrained devices. Migration also requires updating cryptographic infrastructures like PKI, but these tradeoffs are considered manageable given the long-term security benefits.

Who develops post-quantum cryptography standards?
The U.S. National Institute of Standards and Technology (NIST) leads global PQC standardization, selecting algorithms from academic and industry submissions. Contributors include research teams from universities, government agencies, and companies worldwide. International organizations like ENISA and ETSI support alignment and interoperability efforts.

How should organizations start preparing for PQC?
Start with a cryptographic inventory to identify where vulnerable algorithms are used. Prioritize systems protecting long-lived or sensitive data. Plan hybrid rollouts combining classical and PQC algorithms, build crypto-agility into new architectures, and follow NIST, CNSA 2.0, and CISA quantum-readiness guidance.

What is FIPS 203?
FIPS 203 standardizes the Module-Lattice-Based Key-Encapsulation Mechanism (ML-KEM), a lattice-based algorithm for key establishment. It replaces RSA and elliptic-curve Diffie-Hellman with quantum-resistant key exchange, offering compact keys, high performance, and strong resistance against both classical and quantum attacks.

What is FIPS 204?
FIPS 204 standardizes the Module-Lattice-Based Digital Signature Algorithm (ML-DSA), a lattice-based digital signature scheme. It provides authentication and integrity for communications, software, and firmware, emphasizing simplicity and side-channel resistance for secure deployment in real-world systems.

What is FIPS 205?
FIPS 205 standardizes the Stateless Hash-Based Digital Signature (SLH-DSA) algorithm. Built on the SPHINCS+ design, it offers conservative, hash-only security foundations and serves as a complementary alternative to lattice-based methods for long-term cryptographic assurance.