What Is Post-Quantum Cryptography (PQC)? A Complete Guide
- Why do we need new cryptography for the quantum era?
- What makes post-quantum algorithms different from what we use today?
- What types of algorithms make up post-quantum cryptography?
- Which algorithms did NIST standardize—and what do they do?
- How does PQC fit into real-world systems today?
- How to get quantum-ready in five steps
- How is PQC different from quantum cryptography?
- What's next for post-quantum cryptography?
- PQC FAQs
Post-quantum security is the protection of digital information and communications against attacks from quantum computers. It focuses on replacing public-key cryptography methods, such as RSA and elliptic-curve systems, that could be broken by quantum algorithms.
Post-quantum security depends on new cryptographic standards and algorithms designed to remain secure against both classical and quantum computational threats.

Why do we need new cryptography for the quantum era?
Quantum computers will eventually change what's required to keep data secure because they can solve certain mathematical problems much faster than classical computers. That matters because modern encryption depends on problems that are deliberately hard to solve.
In other words:
The math that protects today's internet traffic, digital identities, and financial systems won't hold up once quantum computing reaches scale.
Public-key cryptography—like RSA, elliptic-curve, and Diffie-Hellman—depends on factoring or discrete logarithms. Those problems are exactly what quantum algorithms, such as Shor's, can solve efficiently. And when that happens, attackers could decrypt private communications or forge digital signatures.
Symmetric encryption is different. It can be strengthened by larger key sizes. AES-256 is expected to remain strong under quantum attack. But public-key systems can't simply scale up. Their structure becomes fundamentally weak in the face of quantum computation.
That's why the shift to new cryptography is already underway:
- The U.S. National Institute of Standards and Technology (NIST) has finalized its first set of post-quantum cryptographic standards.
- The National Security Agency's CNSA 2.0 suite sets federal migration timelines.
- And the Cybersecurity and Infrastructure Security Agency (CISA) has issued quantum-readiness guidance urging organizations to start inventorying cryptographic systems now.
The reason is simple. Sensitive data stolen today could be decrypted tomorrow.
The sooner organizations adopt quantum-resistant algorithms, the longer their information will stay secure. That's why we need new cryptography for the quantum era, and why quantum security has become a major focus area for so many businesses.
What makes post-quantum algorithms different from what we use today?
Today's encryption depends on math that classical computers can't solve efficiently.
RSA and elliptic-curve cryptography rely on factoring and discrete logarithms. These problems take classical computers so long to solve that the systems built on them are considered secure.
Quantum computers change that.
They can solve both of those problems exponentially faster using algorithms like Shor's. Which means: once large-scale quantum systems exist, they could break the foundations of most public-key encryption and digital signatures.
Comparison: Classical vs. post-quantum cryptography

| Aspect | Classical public-key cryptography | Post-quantum cryptography (PQC) |
|---|---|---|
| Mathematical foundation | Based on factoring (RSA) and discrete logarithms (ECC, Diffie–Hellman) | Based on problems like lattices, hash functions, error-correcting codes, and multivariate equations |
| Vulnerability to quantum attacks | Breakable by Shor's algorithm on large-scale quantum computers | Designed to resist both classical and quantum attacks |
| Hardware requirements | Runs on classical computers | Runs on classical computers; no quantum hardware needed |
| Security assumption | Computational difficulty for classical systems | Hardness believed to hold against classical and quantum attacks |
| Examples | RSA, ECDSA, Diffie–Hellman | ML-KEM (KEM), ML-DSA (signatures), SLH-DSA (signatures), Classic McEliece (KEM) |
| Readiness for deployment | Already in use globally but becoming obsolete under quantum threat | NIST standards finalized and ready for phased deployment |
Post-quantum algorithms avoid those weaknesses altogether.
Important: post-quantum algorithms are not quantum themselves. They run on the same classical hardware we already use. That's why they can be implemented now—without waiting for quantum computers to arrive.
The goal is dual resistance.
Post-quantum cryptography is designed to stay secure against both classical and quantum adversaries. In other words, it keeps today's data protected now and after quantum computing becomes practical.
What types of algorithms make up post-quantum cryptography?

Post-quantum cryptography isn't a single algorithm or standard. It's an ecosystem of cryptographic methods built on new mathematical problems that resist both classical and quantum attacks.
These approaches differ in how they secure data, how mature they are, and how well they balance performance with security.
Each family relies on a problem that's hard to solve for both classical and quantum computers.
Lattice-based cryptography
This family builds security on geometric problems in high-dimensional lattices. The best-known examples—ML-KEM (FIPS 203) and ML-DSA (FIPS 204)—form the foundation of NIST’s post-quantum standards.
Lattice methods are favored because they balance strong security with efficient performance.
They rely on mathematical assumptions such as Learning With Errors (LWE) and Short Integer Solution (SIS), which remain difficult for both classical and quantum computers to solve.
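To make the LWE idea concrete, here is a toy sketch in Python: a secret vector is hidden inside noisy inner products, and only the noisy samples are published. The parameters are deliberately tiny and insecure; real lattice schemes use far larger dimensions and structured module variants.

```python
# A toy illustration of the Learning With Errors (LWE) problem.
# Real lattice schemes use much larger dimensions and structured variants;
# the parameters here are deliberately tiny and NOT secure.
import random

q = 97          # small prime modulus (toy value)
n = 8           # dimension of the secret (real schemes use hundreds)
m = 16          # number of published samples

secret = [random.randrange(q) for _ in range(n)]

def lwe_sample(s):
    """Return one LWE sample (a, b) with b = <a, s> + e mod q."""
    a = [random.randrange(q) for _ in range(n)]
    e = random.choice([-2, -1, 0, 1, 2])          # small noise term
    b = (sum(ai * si for ai, si in zip(a, s)) + e) % q
    return a, b

samples = [lwe_sample(secret) for _ in range(m)]

# With the secret, each sample is easy to check: b - <a, s> is small noise.
for a, b in samples:
    noise = (b - sum(ai * si for ai, si in zip(a, secret))) % q
    assert noise <= 2 or noise >= q - 2   # noise stays in {-2, ..., 2} mod q

# Without the secret, recovering s from the (a, b) pairs is the hard problem
# that lattice-based schemes such as ML-KEM and ML-DSA build on.
print("published", m, "LWE samples; the secret stays hidden in the noise")
```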
Hash-based cryptography
Hash-based systems use one-way hash functions to create secure digital signatures.
Stateless algorithms such as SLH-DSA (FIPS 205, based on SPHINCS+) are now standardized, while stateful schemes like XMSS and LMS are used in specific applications. These methods are simple, mature, and backed by decades of cryptographic study.
They’re designed exclusively for digital signatures, not encryption, and derive their strength from the proven hardness of reversing hash functions.
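To show where hash-based security comes from, here is a minimal Lamport one-time signature in Python, the classic construction this family grows out of. SLH-DSA is far more elaborate (it builds stateless, many-time signatures on top of similar primitives), so treat this as a teaching sketch, not a usable scheme.

```python
# Minimal Lamport one-time signature: security comes only from the
# one-wayness of a hash function. Do not use this sketch in production.
import hashlib, os

H = lambda data: hashlib.sha256(data).digest()

def keygen():
    # 256 pairs of random secrets, one pair per bit of the message digest
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def sign(message: bytes, sk):
    digest = H(message)
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
    # reveal one secret from each pair, chosen by the corresponding digest bit
    return [sk[i][bit] for i, bit in enumerate(bits)]

def verify(message: bytes, signature, pk):
    digest = H(message)
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
    return all(H(sig) == pk[i][bit]
               for i, (sig, bit) in enumerate(zip(signature, bits)))

sk, pk = keygen()
sig = sign(b"firmware v2.1", sk)
assert verify(b"firmware v2.1", sig, pk)
assert not verify(b"firmware v2.2", sig, pk)   # any change breaks verification
```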
Code-based cryptography
Code-based algorithms rely on error-correcting codes to secure information.
The McEliece system is the classic example. It has withstood extensive cryptanalysis for more than forty years, though its large public keys make it harder to deploy widely.
Its security depends on the difficulty of decoding random linear codes—a problem believed to remain hard even for quantum computers.
Multivariate cryptography
Multivariate systems are built around solving nonlinear polynomial equations over finite fields.
They can be efficient and flexible but often require large signatures and keys. Because of that, they're still being evaluated for broader use.
These schemes are primarily designed for digital signatures and rely on mathematical problems that are easy to verify but extremely difficult to invert.
Isogeny-based cryptography
Isogeny approaches use relationships between elliptic curves to create key exchange and signature schemes.
They were once seen as a promising lightweight option. However, after recent cryptanalysis exposed vulnerabilities in the SIDH protocol, interest has shifted toward other families with stronger assurance.
Isogeny-based systems aimed to offer small key sizes and efficient performance, but current research focuses on reinforcing their underlying mathematical assumptions before the approach is re-evaluated.
Which algorithms did NIST standardize—and what do they do?
The National Institute of Standards and Technology (NIST) has finalized the first post-quantum cryptography standards:
- Module-Lattice-Based Key-Encapsulation Mechanism (ML-KEM) — FIPS 203
- Module-Lattice-Based Digital Signature Algorithm (ML-DSA) — FIPS 204
- Stateless Hash-Based Digital Signature (SLH-DSA) — FIPS 205
These algorithms mark the start of a global transition toward quantum-resistant encryption. They’re production-ready standards. Not experimental research.
In 2025, NIST also selected HQC (Hamming Quasi-Cyclic) for standardization as a second key-encapsulation mechanism (KEM) to complement ML-KEM. A draft FIPS for FN-DSA (FALCON) is under development to provide an additional digital signature option.
Classic McEliece and BIKE remain alternate code-based candidates under further evaluation but were not chosen for standardization at this stage.
ML-KEM and the other key-establishment candidates are key encapsulation mechanisms (KEMs), which secure the exchange of encryption keys rather than encrypting user data directly.
The table below summarizes the finalized standards and Round 4 candidates currently under NIST review or draft preparation.
NIST post-quantum standards and candidates

| Algorithm | NIST designation | Cryptographic type | Primary purpose | Notable characteristics |
|---|---|---|---|---|
| ML-KEM | FIPS 203 | Lattice-based | Key establishment | Compact keys, efficient performance, strong resistance against classical and quantum attacks |
| ML-DSA | FIPS 204 | Lattice-based | Digital signatures | Simple and side-channel resistant; provides authentication and integrity for software, firmware, and communications |
| SLH-DSA | FIPS 205 | Hash-based | Digital signatures | Stateless and conservative; slower but based on well-understood hash functions for long-term assurance |
| BIKE | Round 4 candidate | Code-based | Key establishment | Designed for diversity in security assumptions and efficient key exchange |
| Classic McEliece | Round 4 candidate | Code-based | Key establishment | Proven security record over 40 years; large public keys but strong cryptanalytic resilience |
| HQC | Selected in 2025 for standardization | Code-based | Key establishment | Combines performance with established code-based assumptions for reliability |
Module-Lattice-Based Key-Encapsulation Mechanism (ML-KEM)
ML-KEM is a lattice-based algorithm used for key establishment. It’s designed to replace public-key systems such as RSA and elliptic-curve Diffie-Hellman.
ML-KEM is based on the Kyber family. It provides compact keys, high performance, and strong resistance against both classical and quantum attacks.
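For a rough sense of how a KEM is used in code, here is a sketch based on the open-source liboqs-python bindings. The "ML-KEM-768" identifier and the method names reflect recent liboqs releases and may differ on older versions, so treat this as illustrative rather than canonical.

```python
# Sketch of a KEM flow using the liboqs-python bindings
# (https://github.com/open-quantum-safe/liboqs-python). The algorithm
# identifier and exact method names depend on the installed liboqs version.
import oqs

ALG = "ML-KEM-768"   # older liboqs releases expose this scheme as "Kyber768"

with oqs.KeyEncapsulation(ALG) as receiver, oqs.KeyEncapsulation(ALG) as sender:
    # Receiver publishes a public key; the private key stays inside `receiver`.
    public_key = receiver.generate_keypair()

    # Sender encapsulates: produces a ciphertext plus a fresh shared secret.
    ciphertext, shared_secret_sender = sender.encap_secret(public_key)

    # Receiver decapsulates the ciphertext and recovers the same secret.
    shared_secret_receiver = receiver.decap_secret(ciphertext)

    assert shared_secret_sender == shared_secret_receiver
    # The shared secret then keys a symmetric cipher (e.g., AES-256-GCM);
    # the KEM itself never encrypts application data directly.
```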
Module-Lattice-Based Digital Signature Algorithm (ML-DSA)
ML-DSA is a lattice-based digital signature scheme. It offers authentication and data integrity for software, firmware, and communications.
Based on the Dilithium family, ML-DSA emphasizes simplicity and side-channel resistance, making it well-suited for real-world deployment.
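A signing flow looks similar. The sketch below again assumes liboqs-python; the "ML-DSA-65" algorithm string is an assumption about the installed version (older builds expose the scheme as "Dilithium3").

```python
# Sketch of signing and verifying with liboqs-python's Signature wrapper.
import oqs

ALG = "ML-DSA-65"
message = b"software release artifact to authenticate"

with oqs.Signature(ALG) as signer, oqs.Signature(ALG) as verifier:
    public_key = signer.generate_keypair()   # private key stays in `signer`
    signature = signer.sign(message)
    assert verifier.verify(message, signature, public_key)
```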
Stateless Hash-Based Digital Signature (SLH-DSA)
SLH-DSA is a stateless hash-based digital signature algorithm. It’s slower than lattice-based schemes but relies only on well-understood hash functions instead of new mathematical assumptions.
The SPHINCS+ design forms the foundation of SLH-DSA, offering a conservative and reliable option for long-term security and redundancy.
Other NIST candidates (Round 4 and beyond)
Beyond the finalized and selected standards, NIST continues to evaluate additional code-based schemes such as BIKE and Classic McEliece to maintain diversity in security assumptions.
These candidates remain under evaluation for potential inclusion or further study, helping ensure alternatives are available if new vulnerabilities are discovered.
Globally, standards organizations—including the NSA’s CNSA 2.0 suite, ENISA, and ETSI—are aligning migration guidance with NIST’s framework to create a consistent path toward quantum-resistant cryptography.
Together, these efforts define the global roadmap for post-quantum cryptography.
How does PQC fit into real-world systems today?
Post-quantum cryptography is already moving from research to deployment.
Organizations are beginning to test it in production networks—but replacing existing encryption isn't simple. Most digital systems depend on public-key infrastructures (PKI) that link certificates, trust chains, and authentication processes. Hybrid operation is expected to run for several years during migration.
Which is why the first step is hybrid cryptography.
It allows classical and post-quantum algorithms to run together, so systems stay compatible while gaining quantum resistance. For instance, draft extensions to TLS 1.3 and IKEv2 enable hybrid key exchanges that pair RSA or ECDH with lattice-based algorithms such as ML-KEM. This approach lets organizations test PQC performance and interoperability before making full replacements.
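The core idea can be sketched in a few lines: derive the session key from both shared secrets, so an attacker has to break the classical and the post-quantum exchange. The snippet below is a simplified illustration using only Python's standard library; the actual TLS 1.3 hybrid drafts specify their own concatenation order and key schedule.

```python
# Simplified illustration of hybrid key establishment: the session key is
# derived from BOTH a classical shared secret (e.g., from ECDH) and a
# post-quantum one (e.g., from ML-KEM). This is not the TLS wire format.
import hashlib, hmac, os

def derive_session_key(classical_secret: bytes, pq_secret: bytes,
                       transcript: bytes) -> bytes:
    """HKDF-extract-style combination of the two shared secrets."""
    combined = classical_secret + pq_secret          # concatenate both inputs
    return hmac.new(transcript, combined, hashlib.sha256).digest()

# Stand-ins for values a real handshake would negotiate:
ecdh_secret  = os.urandom(32)    # from X25519/ECDH in the classical exchange
mlkem_secret = os.urandom(32)    # from ML-KEM encapsulation
transcript   = hashlib.sha256(b"handshake messages").digest()

session_key = derive_session_key(ecdh_secret, mlkem_secret, transcript)
print("hybrid session key:", session_key.hex())
```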
The next challenge is scale.
Updating PKI means regenerating root certificates, rebuilding trust chains, and validating dependencies across browsers, devices, and firmware. Even minor changes can disrupt critical authentication flows.
To handle this, many organizations are adopting crypto-agile architectures—systems designed to swap algorithms without major rework. Crypto-agility supports phased migrations and makes it easier to incorporate future NIST standards as they evolve.
Major vendors are already experimenting with PQC in practice.
Google and Cloudflare have tested hybrid TLS connections across global infrastructure. Telecom providers and government agencies are piloting post-quantum authentication in 5G, VPN, and secure messaging systems.
Bottom line: PQC is no longer theoretical. It's being implemented step by step—through hybrid testing, PKI modernization, and crypto-agile design—to prepare for the quantum transition ahead.
How to get quantum-ready in five steps

Preparing for the post-quantum era is more than a cryptography upgrade. It's a multi-year transformation that affects architectures, vendors, and risk models across entire ecosystems.
Government and industry roadmaps alike indicate that quantum migrations will take years—potentially stretching through 2030–2035—making early coordination essential.
Which means: organizations that start now can plan deliberately instead of reacting later.
1. Inventory current cryptographic assets
The first step is visibility.
Organizations need to identify where and how encryption is used—across TLS, VPNs, PKI, firmware, and third-party APIs. Many discover thousands of certificates and hardcoded cryptographic calls once they begin.
Automated scanning and discovery tools can help catalog these assets and expose dependencies before migration begins.
Document ownership and upgrade paths for each dependency.
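As a small illustration of what discovery can look like, the sketch below walks a folder of PEM certificates and flags quantum-vulnerable key types. The directory path is a placeholder and the pyca/cryptography package is assumed; a real inventory also covers live TLS endpoints, SSH keys, code signing, and firmware.

```python
# Minimal cryptographic-discovery sketch: report which stored certificates
# still rely on quantum-vulnerable public keys (RSA / elliptic curve).
from pathlib import Path
from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import rsa, ec

CERT_DIR = Path("/etc/ssl/inventory")        # hypothetical location

for path in CERT_DIR.rglob("*.pem"):
    cert = x509.load_pem_x509_certificate(path.read_bytes())
    key = cert.public_key()
    if isinstance(key, rsa.RSAPublicKey):
        key_desc, vulnerable = f"RSA-{key.key_size}", True
    elif isinstance(key, ec.EllipticCurvePublicKey):
        key_desc, vulnerable = f"ECC ({key.curve.name})", True
    else:
        key_desc, vulnerable = type(key).__name__, False
    print(f"{path.name}: {key_desc}, expires {cert.not_valid_after:%Y-%m-%d}, "
          f"{'needs PQC migration plan' if vulnerable else 'review'}")
```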
2. Prioritize long-lived and high-value data
Some data loses sensitivity quickly. Other information—like intellectual property, biometric identifiers, or government records—must remain secure for decades.
Prioritize systems based on both data lifespan and exposure surface to focus early quantum protection efforts where they matter most.
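One simple way to flag exposure: compare how long data must stay confidential, plus the time migration will take, against how soon quantum attacks might arrive. If the first exceeds the second, the data is already at "harvest now, decrypt later" risk. The sketch below uses illustrative placeholder numbers, not predictions.

```python
# Flag data that could still be sensitive when quantum attacks become practical.
def quantum_risk(confidentiality_years: int, migration_years: int,
                 years_until_quantum_attacks: int) -> bool:
    """True if the data outlives the window before quantum attacks arrive."""
    return confidentiality_years + migration_years > years_until_quantum_attacks

assets = {
    "web session logs":          (1, 3),    # (years sensitive, years to migrate)
    "design IP / trade secrets": (25, 3),
    "citizen health records":    (50, 5),
}
for name, (lifespan, migration) in assets.items():
    flag = quantum_risk(lifespan, migration, years_until_quantum_attacks=10)
    print(f"{name}: {'prioritize now' if flag else 'standard timeline'}")
```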
3. Plan hybrid and phased rollouts
Hybrid cryptography allows classical and post-quantum algorithms to run together during transition. It helps systems maintain compatibility while gaining quantum resistance.
Phased rollouts let organizations validate performance, test interoperability, and reduce compatibility risks before fully switching to post-quantum encryption.
4. Build crypto-agility into new architectures
Crypto-agility is the ability to replace cryptographic algorithms without redesigning systems. It should extend beyond software libraries to APIs, firmware, and key-management interfaces. This flexibility is essential for adapting to evolving standards and addressing newly discovered vulnerabilities.
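One common pattern is to hide algorithm choices behind a small registry so callers reference a policy name rather than an implementation. The sketch below shows the shape of that abstraction; the algorithm names and provider hooks are placeholders, not a specific product's API.

```python
# Crypto-agility sketch: route all signing through a registry keyed by policy
# name, so swapping algorithms is a configuration change, not a code rewrite.
from typing import Callable, Dict

SIGNERS: Dict[str, Callable[[bytes], bytes]] = {}

def register(name: str):
    def wrap(fn: Callable[[bytes], bytes]):
        SIGNERS[name] = fn
        return fn
    return wrap

@register("ecdsa-p256")            # today's default
def _sign_classical(data: bytes) -> bytes:
    raise NotImplementedError("hook up the existing ECDSA provider here")

@register("ml-dsa-65")             # post-quantum option, enabled by policy
def _sign_pqc(data: bytes) -> bytes:
    raise NotImplementedError("hook up an ML-DSA provider (e.g., liboqs) here")

def sign_with(policy_algorithm: str, data: bytes) -> bytes:
    # Callers name a policy, not an implementation; flipping the policy from
    # "ecdsa-p256" to "ml-dsa-65" migrates every caller at once.
    return SIGNERS[policy_algorithm](data)
```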
5. Coordinate across vendors and ecosystems
Quantum readiness isn't a single-organization project. It requires coordination across suppliers, cloud providers, and hardware vendors to ensure consistent algorithm support and timely updates. Large-scale migration will unfold gradually over the next decade, demanding sustained collaboration across the entire technology ecosystem.
When it comes down to it, quantum readiness isn't a one-time upgrade. It's an evolving process that combines inventory, hybrid deployment, and long-term collaboration. Standards bodies, vendors, and governments are still finalizing guidance, but the organizations preparing now will adapt fastest when the transition becomes mandatory.
How is PQC different from quantum cryptography?
Post-quantum cryptography and quantum cryptography solve the same problem—securing data against quantum attacks—but in completely different ways.
PQC uses classical mathematics to build algorithms that run on existing hardware. Quantum cryptography, on the other hand, relies on the laws of quantum physics.
Comparison: Post-quantum cryptography vs. quantum cryptography

| Aspect | Post-quantum cryptography (PQC) | Quantum cryptography (QC) |
|---|---|---|
| Foundational principle | Based on classical mathematics (e.g., lattices, hashes, error-correcting codes) | Based on quantum physics (e.g., photon behavior, quantum states) |
| Primary examples | ML-KEM, ML-DSA, SLH-DSA | Quantum key distribution (QKD), quantum random number generators (QRNGs) |
| Hardware requirements | Runs on existing classical hardware | Requires specialized optical or quantum hardware |
| Deployment scope | Software-deployable across enterprise and government systems | Limited to specialized research, telecom, or defense environments |
| Scalability | Highly scalable and interoperable with current internet infrastructure | Constrained by distance, cost, and environmental sensitivity |
| Purpose | Replaces vulnerable public-key encryption and signature systems | Secures key exchange or randomness generation |
| Maturity and standardization | Standardized by NIST (FIPS 203–205); ready for adoption | Limited standardization; deployment-constrained |
| Goal | Quantum-resistant algorithms for long-term security | Physics-based security for specific communication links |
Quantum key distribution (QKD), for instance, uses photons to exchange encryption keys. If an eavesdropper tries to intercept the photons, their quantum state changes and the intrusion is detected. It's a powerful concept—but it depends on specialized optical hardware and works only over limited distances.
The same applies to quantum random number generators (QRNGs), which use quantum effects to improve randomness but don't replace core encryption functions.
To sum up: quantum cryptography remains a niche solution suited to specific research and telecom use cases. PQC, by contrast, is universal. It's software-based, scalable, and ready to deploy in enterprise and government systems using today's infrastructure.
What's next for post-quantum cryptography?
Post-quantum cryptography is now shifting from standardization to large-scale implementation. With the first FIPS standards complete and HQC selected for standardization, the next focus is integration: embedding PQC across products, networks, and cloud ecosystems.
Over the next few years, expect NIST and global partners to refine interoperability, performance benchmarks, and migration tooling. Governments and vendors will expand hybrid deployments while preparing for fully post-quantum environments.
The long-term goal extends beyond replacing algorithms. The aim is a quantum-resilient cryptographic ecosystem. One that’s agile enough to adapt as new threats, standards, and hardware advances emerge. Achieving that resilience will require ongoing coordination among standards bodies, hardware manufacturers, and service providers throughout the next decade.