Google says 2029. EMVCo says 2040. They are looking at the same threat — a future quantum computer powerful enough to break the encryption that protects most of the internet — and they cannot agree on what decade the danger arrives in.
In March 2026, Google announced it would complete its post-quantum cryptography migration by 2029. The company has been working on the problem since 2016 — a thirteen-year runway. EMVCo, the consortium that sets payment-card standards on behalf of Visa, Mastercard, JCB, Discover, China UnionPay, and American Express, points to 2040 for full readiness. The U.S. government’s CNSA 2.0 mandate for national security systems sits at 2033. The European Union’s Phase 3 timeline lands at 2035. The G7 Cyber Expert Group’s recommendation for finance is 2030–2032. The UK’s NCSC and the NIST IR 8547 draft both call for the disallowance of all quantum-vulnerable asymmetric cryptography after 2035.
The spread — 2029 to 2040 — is eleven years. It is not a difference of risk appetite. It is a coordination failure where the participants cannot agree on what decade to defend against.
This is the actual state of post-quantum cryptography in May 2026: the standards exist, the algorithms work, the threat is well-understood, and the migration is barely starting. The bottleneck is not technology. It is coordination — the same kind of collective-action problem that delayed metric-system adoption, IPv6 deployment, and Y2K remediation, only with a clock that runs on quantum-hardware progress instead of regulatory patience.
By way of grounding: the cryptographic standards are real and final. NIST published FIPS 203 (ML-KEM, derived from CRYSTALS-Kyber), FIPS 204 (ML-DSA, from CRYSTALS-Dilithium), and FIPS 205 (SLH-DSA, from SPHINCS+) in August 2024. FIPS 206 (FN-DSA, derived from FALCON) is in draft. HQC was selected in 2025 as a backup key-encapsulation mechanism, and a fresh round of digital-signature finalists is in evaluation.
The largest operators of the public web shipped these algorithms within roughly eighteen months. Cloudflare reports that more than 65% of human-visible HTTPS traffic now uses post-quantum key exchange. Google enabled X25519MLKEM768 by default in Chrome 124 in April 2024. AWS, Microsoft Azure, and Apple iCloud Mail integrated hybrid post-quantum protocols across 2024 and 2025. Signal deployed PQXDH for its handshake. Meta has staged a public PQC maturity framework across its services.
This is the half of the migration that is going well. It is also the easy half. The hard half — the half running eleven years late — is everything that is not a TLS handshake on a content-delivery network.
There are nearly fourteen billion payment cards in circulation. A high-end payment-card processor runs on a 32-bit single-core chip at 100 MHz with 48 KB of RAM. A contactless transaction must complete in under 300 milliseconds.
The smallest ML-DSA variant, ML-DSA-44, produces signatures of 2,420 bytes. ECDSA, the algorithm it is meant to replace, produces 64-byte signatures. The ML-DSA signature is 37.8 times larger.
ISO 8583, the messaging standard every bank in the world uses to authorize a card transaction, has authentication fields that typically cap at 256 bytes. A single ML-DSA signature exceeds the entire field by roughly 9.5 times.
Classic McEliece, an alternative key-encapsulation scheme considered for high-security deployments, requires more than 70 KB of memory for its public keys. The card has 48 KB total — the algorithm does not fit. FALCON, depending on parameter choice and side-channel masking, requires 25 KB or more — roughly half the card’s memory budget for one signature operation.
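The constraints above reduce to arithmetic. A back-of-envelope sketch using only the figures quoted in this article (the 256-byte ISO 8583 authentication cap and 48 KB card RAM are the limits it cites; nothing here is measured):

```python
# Fit check: do post-quantum artifacts fit the payment stack's hard limits?
# All byte counts are the figures quoted above.
ECDSA_SIG = 64                   # bytes, classical baseline
ML_DSA_44_SIG = 2420             # bytes, smallest ML-DSA variant
ISO8583_AUTH_FIELD = 256         # typical authentication-field cap, bytes
CARD_RAM = 48 * 1024             # 48 KB secure-element budget
MCELIECE_PUBKEY = 70 * 1024      # >70 KB Classic McEliece public key

print(f"size ratio vs ECDSA: {ML_DSA_44_SIG / ECDSA_SIG:.1f}x")                  # 37.8x
print(f"overflows ISO 8583 field by {ML_DSA_44_SIG / ISO8583_AUTH_FIELD:.1f}x")  # 9.5x
print(f"McEliece key fits in card RAM: {MCELIECE_PUBKEY <= CARD_RAM}")           # False
```

The last line is the whole hardware-refresh argument in one boolean: the key alone exceeds the card's entire memory.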
This is not a software update. It is a physical hardware refresh for every wallet on Earth, every transit pass, every chip-and-PIN terminal, every SIM card with a security element. The protocol that carries the message has to grow. The chip that signs has to grow. The terminal that verifies has to grow. The hardware security module in the data center that signs the card-personalization keys has to grow.
PostQuantum.com estimates that a single large payments institution faces over 120,000 discrete program tasks for full PQC migration. That is not headcount. It is workstream count — distinct pieces of code, configuration, certificate, or contract that must each be touched, tested, and made backward-compatible.
Citi Institute (January 2026, cited via PostQuantum.com) modeled the failure case. A successful quantum-enabled attack on a top-five U.S. bank’s Fedwire access could cause $2.0–3.3 trillion in indirect economic losses — 10–17% of annual U.S. GDP — and trigger a six-month recession. Fedwire settles over $4 trillion daily. TARGET2, the European interbank settlement system, clears more than €2 trillion daily. The dollar number on coordination failure is the cost of one bad day in a system that processes a year of GDP every fortnight.
A 2025 enterprise-readiness survey (arXiv:2509.01731, “Are Enterprises Ready for Quantum-Safe Cybersecurity?”) asked technology professionals about quantum risk. The headline pattern is broad awareness and minimal action: roughly 60% aware of the threat, about 5% with funded migration work underway, and 37% who had not discussed quantum risk internally at all.
The awareness-to-action ratio is roughly twelve to one. This is not ignorance. It is the signature of a coordination problem: everyone knows they should move; almost no one wants to go first. The cost of being early is concrete (more migration cycles, fragile early-deployment infrastructure, retraining). The cost of being late is mostly external — the threat is borne by the encrypted payload, not the company that was slow.
Sector readiness reflects this. Banking and telecom, the two industries with regulators breathing down their necks, hit 45–47% with budgeted PQC plans. Defense and high-tech reach roughly 43%. Consumer products, retail, and manufacturing hover near “no business case” or “exploration without timelines.”
The sectors most exposed are most prepared. The sectors that consider themselves least exposed are the ones that will discover their supply chain depended on the financial infrastructure all along. A factory’s payment terminal, a clinic’s appointment-confirmation SMS, a utility’s billing portal: the companies behind them build no PQC roadmaps, yet all of them inherit the readiness of the institutions whose cryptography they ride on.
The Asian payment sector survey is starker still: only 20% of leadership are “very familiar” with the quantum threat; about 44% are “not familiar at all.”
Trace the dependency chain to deploy PQC for a single enterprise application: the application waits on its TLS library, the library on PQC certificates, the certificates on a CA with a PQC roadmap, the CA's keys on an HSM with PQC firmware, the firmware on the vendor's release schedule, and the vendor on the certification body's validation regime.
Every prerequisite has a prerequisite. The cumulative dependency graph is the threat. You cannot migrate your application until you migrate the HSM that signs its keys, and you cannot migrate the HSM until the vendor ships PQC firmware, and the vendor cannot ship PQC firmware until the certification body completes the validation regime. The algorithms are frozen. The chain still has to shake out year by year.
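The chain can be treated as literal graph data. A minimal sketch using Python's standard-library topological sorter, with illustrative workstream names (not a real program plan), to order tasks so nothing starts before its prerequisites:

```python
# Order migration workstreams so no task starts before its prerequisites.
# The edges mirror the chain described above; the names are illustrative.
from graphlib import TopologicalSorter

prereqs = {  # task -> set of tasks it depends on
    "app: enable PQC cipher suites": {"tls library: ML-KEM support"},
    "tls library: ML-KEM support":   {"ca: issue PQC certificates"},
    "ca: issue PQC certificates":    {"hsm: PQC firmware installed"},
    "hsm: PQC firmware installed":   {"vendor: ship PQC firmware"},
    "vendor: ship PQC firmware":     {"cert body: validation regime"},
}

order = list(TopologicalSorter(prereqs).static_order())
for step in order:
    print(step)
# The certification body's work comes out first; the application, last.
```

On a real inventory the graph is wide rather than linear, and the sorter's output makes the "which dominoes fall first" question mechanical instead of rhetorical.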
Here is the part that surprised the engineers who built the standards: post-quantum signatures verify faster than RSA on most modern hardware.
ML-DSA signatures are 37.8 times larger than ECDSA. Hybrid TLS handshakes inflate from roughly 1.2 KB to 14.7 KB. Signature-verification slowdowns measured in Project Leap Phase 2, the BIS Innovation Hub central-bank PQC pilot, reached 7.5x (209.9 ms versus 28.1 ms), but only in a payment-processing context where the signing chain ran through hardware that had not been updated.
In well-tuned modern environments, the picture inverts. AWS measured a 0.05% throughput reduction on hybrid ML-KEM TLS handshakes — essentially noise. Cloudflare reports negligible latency impact on the post-quantum traffic that now serves most of its human users.
The takeaway is structural. The algorithms are not the problem. The wire is. The chip is. The protocol field width is. The CPU does the new math just fine; the network and the silicon and the standards body are where the cost lives.
This is the cleanest possible diagnosis of a coordination problem: when the technology works in isolation but fails in deployment, the failure is not technical. It is institutional.
In 2012, breaking RSA-2048 was estimated to require approximately one billion physical qubits running for many days. In 2025, Craig Gidney published an algorithmic improvement that brought the estimate to under one million physical qubits in fewer than seven days: roughly a thousandfold reduction in qubit requirements over thirteen years. Every algorithmic refinement compresses the timeline; every quantum-hardware milestone — Google’s Willow chip, IBM’s roadmap toward Kookaburra, Quantinuum’s logical-qubit demonstrations, IonQ’s modular trapped-ion progress — ratifies that the work continues at industrial scale.
Expert surveys give a 19–34% probability of a cryptographically relevant quantum computer (CRQC) within ten years, and a 60–82% probability by 2044. Individual expert estimates range from four to sixteen years.
These are probabilities, not deadlines. But “harvest now, decrypt later” makes the probability irrelevant for any data with a long sensitivity window. Diplomatic cables, medical records, trade secrets, financial settlement data — encrypted today, captured today, decrypted whenever the hardware crosses the threshold. The cost of coordination failure is not paid in 2032 when a quantum computer arrives. It is being paid right now, in every captured ciphertext that an adversary is patient enough to store.
Three honest counter-positions deserve a hearing.
First, “migration timelines always slip — this is normal enterprise IT.” Partly true. The difference is that most enterprise migrations have no external clock. Quantum hardware progress does not wait for procurement cycles. Google’s thirteen-year runway is the honest timeline; the organizations starting in 2026 will need to compress that into three to five years, or accept the residual risk explicitly.
Second, “the 5% acting figure is misleadingly low — many are in planning.” Planning without budgets is not preparation. The planning-to-action conversion rate in cybersecurity migrations has historically been poor. The 37% who have not discussed quantum internally are not in planning by any reasonable definition.
Third, “harvest-now-decrypt-later is overhyped — most encrypted data is not sensitive for ten-plus years.” For most consumer traffic, true. For diplomatic communications, medical records, trade secrets, and financial-settlement data, it is plainly false. Nation-state collectors do not sort first; they capture and decide later.
The argument that survives is narrower than the alarmism and broader than the dismissal: organizations whose data has long sensitivity windows, whose protocols have hard format constraints, or whose hardware has long replacement cycles need to be in motion now. Most are not.
If you operate technical infrastructure and have not yet started a PQC inventory, here is the concrete first move. Run a cryptographic inventory: for every external-facing service, every code-signing pipeline, every API key, every VPN tunnel, every database-at-rest encryption deployment, list the algorithm, the key length, the certificate authority, the HSM model, and the data-sensitivity window. Most teams discover that the inventory itself is a multi-week project — and that they own quantum-vulnerable cryptography in places no one had documented (a forgotten S/MIME deployment, a payment-processor SDK pinned to a 2019 version, a self-signed root in a CI runner).
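A minimal shape for that inventory, sketched in Python: one record per cryptographic asset, plus a triage pass that flags quantum-vulnerable algorithms protecting long-lived data. The field names and the ten-year horizon are illustrative assumptions, not a standard schema.

```python
# One record per cryptographic asset, mirroring the fields listed above.
from dataclasses import dataclass

QUANTUM_VULNERABLE = {"RSA", "ECDSA", "ECDH", "DH", "DSA"}  # classical asymmetric

@dataclass
class CryptoAsset:
    service: str
    algorithm: str          # e.g. "RSA", "ECDSA", "ML-KEM"
    key_bits: int
    ca: str                 # issuing certificate authority
    hsm_model: str
    sensitivity_years: int  # how long the protected data stays sensitive

def needs_priority_migration(asset: CryptoAsset, horizon_years: int = 10) -> bool:
    """Flag assets that are quantum-vulnerable AND protect long-lived data."""
    return (asset.algorithm in QUANTUM_VULNERABLE
            and asset.sensitivity_years >= horizon_years)

inventory = [
    CryptoAsset("payments-api", "ECDSA", 256, "InternalCA", "HSM-X", 15),
    CryptoAsset("marketing-site", "ECDSA", 256, "PublicCA", "none", 1),
]
urgent = [a.service for a in inventory if needs_priority_migration(a)]
print(urgent)  # ['payments-api']
```

The filter encodes the harvest-now-decrypt-later logic directly: short-lived marketing traffic can wait; anything with a multi-year sensitivity window cannot.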
Then identify the prerequisite chain for each item. TLS 1.3 readiness, HSM PQC roadmap, CA PQC roadmap, application-layer protocol field widths. The point is not to move everything at once — it is to know which dominoes have to fall before yours can. NIST’s NCCoE migration playbook and the FS-ISAC PQC Working Group “Future State” technical paper are the two best public starting points; both are free.
For data with multi-year sensitivity (anything regulated, anything diplomatic, anything proprietary), assume the harvest is already happening. Hybrid encryption — classical and post-quantum, where breaking either alone is insufficient — is the defensive posture in the meantime. Most major cloud key-management services now offer it, including AWS KMS, Google Cloud KMS, and Azure Key Vault.
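The combining step of that hybrid posture can be sketched in stdlib Python: derive the session key from both a classical and a post-quantum shared secret, so breaking either alone yields nothing. The two input secrets here are random-byte stand-ins (real deployments would take them from an X25519 exchange and an ML-KEM encapsulation); the combiner is a plain HKDF-extract/expand (RFC 5869) over their concatenation.

```python
# Hybrid key derivation: session key depends on BOTH shared secrets.
import hashlib, hmac, os

def hkdf_sha256(ikm: bytes, info: bytes, length: int = 32) -> bytes:
    prk = hmac.new(b"\x00" * 32, ikm, hashlib.sha256).digest()  # extract (zero salt)
    okm, block, counter = b"", b"", 1
    while len(okm) < length:                                    # expand
        block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

classical_ss = os.urandom(32)  # stand-in for an X25519 shared secret
pq_ss = os.urandom(32)         # stand-in for an ML-KEM shared secret

session_key = hkdf_sha256(classical_ss + pq_ss, b"hybrid-kex-example")
print(len(session_key))  # 32
```

Because both secrets feed one extraction, an attacker who breaks only the elliptic-curve half (or only the lattice half) still cannot reconstruct the key.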
If you build software that other people deploy, audit your dependency tree for cryptography assumptions. The pure algorithm is the easy part. The protocol fields, the on-disk formats, the message sizes you assume are small — those are the hidden coupling that will break first.
If you are buying enterprise software in 2026, ask the vendor for their PQC roadmap and their HSM dependency. The vendors that cannot answer the question are the ones that will surprise you in 2029.
Google’s 2029 deadline is not Google being aggressive. It is the only honest deadline a company that builds quantum computers can offer, given what they know. EMVCo’s 2040 is not EMVCo being lazy — it reflects how much physical card infrastructure has to be replaced and how slowly billions of plastic rectangles cycle through reissue.
Both are correct for their own mission. Both are wrong for the other party’s. The eleven-year gap is what the world’s encryption looks like when the threat is one clock and the response is many.
The participants who started in 2016 will finish on time. The participants who started in 2024 will be late but defensible. The participants who haven’t started will be the next decade’s news cycle of breaches retroactively traced to ciphertext captured today.
The harvest is happening now. The migration is the only response. The coordination problem is the actual threat — not the quantum computer.
A cryptographic inventory is the first move. A signed, replaceable record of who signed what, with which algorithm, on which date is the second.
Chain of Consciousness is an append-only signed log of every action a system takes — including which signature scheme produced the receipt. When ML-DSA replaces ECDSA in your stack, the chain notes the swap; the old signatures stay verifiable, the new ones layer on top, and the audit trail survives the algorithm transition. Crypto-agility is not a slogan; it is a property of logs that were designed to outlive the cipher that signed their first entry.
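The property being claimed, old signatures staying verifiable after an algorithm swap, can be sketched in a few lines. Each entry records which scheme signed it, so verification dispatches on the recorded name rather than assuming one global cipher. This is an illustration with HMAC stand-ins for ECDSA/ML-DSA, not the actual Chain of Consciousness SDK API.

```python
# Crypto-agile append-only log: each entry names the scheme that signed it.
import hashlib, hmac, json

SCHEMES = {  # scheme name -> tag function (stand-ins, keyed differently)
    "ecdsa-p256": lambda key, msg: hmac.new(key, b"v1" + msg, hashlib.sha256).hexdigest(),
    "ml-dsa-44":  lambda key, msg: hmac.new(key, b"v2" + msg, hashlib.sha512).hexdigest(),
}

def append(chain: list, key: bytes, action: str, alg: str) -> None:
    prev = chain[-1]["entry_hash"] if chain else ""
    body = json.dumps({"action": action, "alg": alg, "prev": prev}, sort_keys=True)
    sig = SCHEMES[alg](key, body.encode())
    entry_hash = hashlib.sha256((body + sig).encode()).hexdigest()
    chain.append({"body": body, "alg": alg, "sig": sig, "entry_hash": entry_hash})

def verify(chain: list, key: bytes) -> bool:
    prev = ""
    for entry in chain:
        if json.loads(entry["body"])["prev"] != prev:      # hash-chain link intact?
            return False
        expected = SCHEMES[entry["alg"]](key, entry["body"].encode())
        if not hmac.compare_digest(expected, entry["sig"]):  # dispatch on recorded alg
            return False
        prev = entry["entry_hash"]
    return True

key = b"demo-key"
chain = []
append(chain, key, "payment authorized", "ecdsa-p256")
append(chain, key, "algorithm rotated",  "ml-dsa-44")  # the swap is itself logged
print(verify(chain, key))  # True: the old entry still verifies under its old scheme
```

The design choice is the `alg` field: the log outlives any one cipher because the verifier reads the scheme from the entry, not from the current configuration.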
Install the SDK:

pip install chain-of-consciousness
npm install chain-of-consciousness
Hosted Chain of Consciousness · See a live chain · Verify a signed action