Quantum Threat: Party Like It’s BB84 Tonight

March 19, 2026


A quantum computing postdoc in Berlin, a researcher focused on the theoretical mathematics of error correction, told me over coffee that there is no possible way Shor's algorithm breaks cryptography before 2035. The physicists he knows who work on it are still wrestling with the fundamentals, he said, and can't even commit to timelines. Engineers can't build what physicists haven't solved.

Then I showed him pqprobe's scanner output. He didn't know what TLS was, or why version 1.0 is weak.

The Gap

This isn’t a gotcha post. His credentials are real and impressive. Error correction math for quantum computing is genuinely hard, genuinely important work, and clearly he knows his stuff. However, a researcher confident enough about Shor’s algorithm being far away — who doesn’t know what protocol it arrives for — is making a claim about risk using only some of the variables. He knows the weapon’s estimated timeline. He doesn’t know the surface it hits, or how long it takes to harden that surface, or who’s already leaking data to decrypt later.

TLS is how your browser, your email server, your database, your VPN, and your payment system negotiate encryption. TLS 1.0, formally deprecated by RFC 8996 in 2021, is still running on production infrastructure across Fortune 500 companies. The distance between "I know when Shor's hits" and "I don't know what Shor's hits" is the distance between theoretical physics and operational security. Both are valid disciplines. Only one of them decides whether your organization is ready.
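The check at the heart of this is small. A hypothetical sketch in the spirit of what a scanner like pqprobe reports — not pqprobe's actual code — where the protocol names match the strings Python's `ssl.SSLSocket.version()` returns after a handshake:

```python
# Hypothetical legacy-protocol check (illustration, not pqprobe's code).
# TLS 1.0 and 1.1 were formally deprecated by RFC 8996 (March 2021);
# SSLv2 and SSLv3 were deprecated long before that.
DEPRECATED = {"SSLv2", "SSLv3", "TLSv1", "TLSv1.1"}
ACCEPTABLE = {"TLSv1.2", "TLSv1.3"}

def classify_protocol(name: str) -> str:
    """Classify a negotiated protocol version string as
    'deprecated', 'acceptable', or 'unknown'."""
    if name in DEPRECATED:
        return "deprecated"
    if name in ACCEPTABLE:
        return "acceptable"
    return "unknown"
```

Paired with a live handshake (`ssl.create_default_context()` wrapping a `socket.create_connection()`), the negotiated version string feeds straight into a check like this.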

Google’s Own Numbers

The postdoc’s “very long way away” framing doesn’t match the trajectory of the field’s own research. Craig Gidney at Google Quantum AI published an updated analysis showing that breaking RSA-2048 could be achieved in under a week using fewer than one million noisy qubits. That’s a 95% reduction from his own 2019 estimate of 20 million qubits. The resource requirement didn’t drop incrementally over those six years. It collapsed.

In February 2026, Iceberg Quantum published their Pinnacle Architecture, claiming fewer than 100,000 physical qubits could factor RSA-2048 using quantum low-density parity-check codes instead of surface codes. Another order of magnitude below Gidney’s already-reduced estimate. These aren’t fringe claims — they’re peer-reviewed architecture proposals under realistic error assumptions, using error rates consistent with what leading hardware platforms already achieve.
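The collapse in resource estimates is plain arithmetic. The years below are my reading of the sources cited above (Gidney's updated analysis is from 2025, six years after the 2019 figure):

```python
# Physical-qubit estimates for factoring RSA-2048, per the sources above.
estimates = {
    2019: 20_000_000,  # Gidney's earlier estimate
    2025: 1_000_000,   # Gidney's updated analysis (upper bound)
    2026: 100_000,     # Iceberg Quantum's Pinnacle claim (upper bound)
}

# 2019 -> 2025: the 95% reduction cited in the text.
drop = 1 - estimates[2025] / estimates[2019]   # 0.95

# 2025 -> 2026: another order of magnitude.
factor = estimates[2025] / estimates[2026]     # 10.0
```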

Even the million-qubit figure is already outdated.

The Hardware Is Converging on 2029

The postdoc characterized the pace as slow. The public roadmaps say otherwise.

IBM’s quantum roadmap targets Starling in 2029: 200 logical qubits, 100 million quantum gates, housed in a new data center in Poughkeepsie, New York. By 2033, Bluejay scales to 2,000 logical qubits and one billion gates — specifications that exceed what current estimates require for cryptanalysis. IBM’s own framing: “It is now a question of engineering, not science.”

Quantinuum’s Apollo system, also targeting 2029, aims to be the first universal fully fault-tolerant quantum computer. They’ve demonstrated the lowest error rates in the industry and completed the full fault-tolerant gate set — what the field calls “the last major hurdle.” DARPA selected Quantinuum for its Quantum Benchmarking Initiative, with a concrete path to utility-scale quantum computing by 2033.

Google itself has completed the first two milestones on its own six-milestone roadmap: beyond-classical computation in 2019 and error-corrected qubit prototype in 2023. With Willow in 2024, they crossed below-threshold error correction — the point where adding more qubits reduces errors rather than compounding them. That is precisely the physics-to-engineering transition the postdoc said hasn’t happened yet. It already happened.

Microsoft and Atom Computing plan to deliver an error-corrected quantum computer to Denmark in 2026. QuEra is delivering error-correction-ready hardware to Japan’s AIST. A paper in Science in January 2026 described the current moment as quantum computing’s “transistor moment” — functional systems exist and scaling is now an engineering challenge.

Three companies targeting 2029 for fault tolerance. Multiple independent teams confirming error correction works. Nature reporting a “vibe shift” among researchers toward usable quantum computers within a decade, not decades. The view from the theory bench and the view from the hardware roadmap are diverging fast.

Shor’s Isn’t the Only Threat Model

Even if the postdoc is exactly right — even if full Shor’s on real hardware doesn’t land before 2035 — that confidence doesn’t bound the actual risk surface.

Shor’s algorithm is from 1994. It’s the canonical quantum threat to public-key cryptography, but it’s not the only possible one. Hybrid quantum-classical approaches keep getting refined. The Shanghai factoring paper and the JVG algorithm from earlier this month both failed under scrutiny, but the research direction — offloading computation to classical systems and using quantum hardware for a narrower step — keeps producing new proposals.

The resource estimates themselves are the clearest evidence against hard-date confidence. Six years ago, the number was 20 million qubits. Today it’s under a million, possibly under 100,000 with newer architectures. A researcher claiming “no possible way before 2035” is asserting stability in a variable that has dropped 95% in the time it took him to complete his postdoc.

And then there’s the classified dimension. NSA set CNSA 2.0 deadlines requiring post-quantum cryptography in new national security products by 2027, with RSA deprecated by 2030 and disallowed by 2035. The G7 Cyber Expert Group identified 2035 as the completion target, with critical systems prioritized for 2030–2032. The European Commission set milestones through 2035. Australia targets 2030 for RSA deprecation.

These aren’t dates set by optimists or vendors. They’re set by agencies with classified intelligence on adversary capabilities. When NSA tells the defense industrial base to stop using RSA by 2030, either they’re wasting taxpayer money on a premature migration or they know something. The postdoc’s view reflects one bench in one lab. The deadlines reflect Fort Meade, Canberra, Brussels, and Bonn.

The Binding Constraint Isn’t Shor’s

The most important thing the postdoc’s framing misses is that even if he’s right, the operational conclusion doesn’t change.

The harvest-now-decrypt-later threat doesn’t require a cryptanalytically relevant quantum computer to exist today. It requires that adversaries collect encrypted traffic now and store it until they can decrypt it later. The question isn’t when Shor’s runs. It’s whether the shelf life of your secrets exceeds the time until decryption becomes possible, minus the time it takes your organization to complete migration.
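That question has a standard compact form, Mosca's inequality: if x is how long the data must stay secret, y is how long migration takes, and z is the time until a cryptanalytically relevant quantum computer, you are exposed whenever x + y > z. A minimal sketch:

```python
def at_risk(shelf_life_years: float,
            migration_years: float,
            years_to_crqc: float) -> bool:
    """Mosca's inequality: traffic harvested today is exposed if its
    required secrecy lifetime plus your migration time exceeds the
    time until decryption becomes possible (x + y > z)."""
    return shelf_life_years + migration_years > years_to_crqc

# Ten-year records, a seven-year migration, and a machine nine years
# out: already at risk.
at_risk(10, 7, 9)  # True
```

The sobering property of the inequality is that z is the only term an organization doesn't control — and it is the one that has been shrinking.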

Large-scale cryptographic migrations take five to ten years for complex enterprises. An organization starting today finishes somewhere between 2031 and 2036 — on the conservative end, already past the 2035 deadline.

If your data needs to stay confidential for a decade — financial records, healthcare data, classified communications, intellectual property — and migration takes five to ten years, you’re already behind on any credible timeline.

This is what pqprobe measures. Not whether Shor’s algorithm runs today, but whether your organization is making measurable progress on migration — improving, degrading, stable, or oscillating against the deadlines that the people with classified threat assessments have set. The postdoc’s confidence about when the flood arrives is a physics question. Whether your walls are going up fast enough is an engineering and organizational question. That’s the one that matters.
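Those four trend labels can be made precise. A hypothetical sketch — not pqprobe's actual metric — over a series of periodic readiness scores, where a higher score means more of the estate has migrated:

```python
def classify_trend(scores: list[float]) -> str:
    """Label a series of migration-readiness scores as 'improving',
    'degrading', 'stable', or 'oscillating'. Hypothetical metric:
    higher score = more of the inventory migrated."""
    deltas = [b - a for a, b in zip(scores, scores[1:])]
    if not deltas or all(d == 0 for d in deltas):
        return "stable"
    if all(d > 0 for d in deltas):
        return "improving"
    if all(d < 0 for d in deltas):
        return "degrading"
    return "oscillating"
```

The point of tracking the series rather than a single snapshot is the last branch: an organization that fixes endpoints and then regresses looks fine on any one scan and only shows up as oscillating over time.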

The Turing Award Knows

On March 18, 2026 — this week — the A.M. Turing Award went to Charles Bennett and Gilles Brassard for their 1984 work on quantum key distribution. BB84, the protocol they invented, was designed to protect against exactly the threat that quantum computing poses to classical cryptography. They built the defense before the weapon existed.

The timing is pointed. The highest honor in computer science, awarded for work that anticipated the quantum threat 42 years ago, arriving in the same month that resource estimates for quantum factoring drop another order of magnitude and three companies publicly converge on 2029 fault-tolerant machines.

Bennett and Brassard didn’t wait for the quantum computer to exist before building the countermeasure. They moved on the walls before the flood. The postdoc agreed with me on this point — migration should happen regardless of timeline confidence. Which makes his hard-date claim operationally irrelevant. Whether Shor’s arrives in 2032 or 2038, the walls go up now.

The question is which organizations are measuring progress.

That’s what pqprobe answers.