Quantum Threat: The JVG Algorithm Does Not Break RSA

March 5, 2026


A group of engineering professors and PQC hardware executives claim they’ve built a quantum algorithm that can break RSA-2048 in 11 hours using fewer than 5,000 qubits. Their press release calls it a “Cybersecurity Apocalypse in 2026.” SecurityWeek ran it. Yahoo Finance syndicated it. Quantum Zeitgeist covered it.

The claim does not survive technical scrutiny. But the episode reveals something important about how PQC migration is being distorted by commercial incentives.

What JVG Actually Does

The JVG algorithm (named after its three authors: Jesse Van Griensven Thé, Victor Oliveira Santos, and Bahram Gharabaghi) restructures Shor’s integer factoring pipeline. It offloads modular exponentiation — Shor’s most resource-intensive phase — to classical processors, and substitutes the Quantum Fourier Transform (QFT) with a Quantum Number Theoretic Transform (QNTT) for period finding.

The paper reports simulation results showing roughly 99% reductions in gate counts and memory usage versus baseline Shor implementations, and demonstrates execution on real IBM quantum hardware. These are measurable results on the composites tested.

The composites tested are 15, 21, 143, 1,363, and 67,297 — all smaller than 17 bits. The RSA-2048 projections (11-hour runtime, fewer than 5,000 qubits) are extrapolations from log-log trend lines fitted to those five data points. The paper has not undergone peer review.

Three Unresolved Technical Problems

1. The state-preparation cost is excluded from the benchmarks.

Shor’s algorithm performs modular exponentiation in quantum superposition — that’s the source of its exponential advantage. JVG moves this step to classical computation, then encodes the results into a quantum register via amplitude encoding. This trades quantum computation for classical pre-computation plus data loading. To encode 2^n classical values into a quantum state (where n = 2048), you need an encoding circuit with either 2^n depth or 2^n width. For small composites, this cost is trivial. For RSA-2048, it is astronomical.
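The scale gap is easy to make concrete. The sketch below, using only the generic bound stated above (loading 2^n arbitrary classical values needs roughly 2^n circuit depth or width), compares the 17-bit regime the paper tested against RSA-2048. The numbers are back-of-the-envelope, not figures from the JVG paper.

```python
import math

def log10_values_to_encode(n_bits: int) -> float:
    """log10 of 2^n: the number of classical values an n-bit run must load
    into the quantum register via amplitude encoding."""
    return n_bits * math.log10(2)

for n in (17, 2048):
    print(f"n = {n:>4} bits: ~10^{log10_values_to_encode(n):.1f} values to encode")
```

At 17 bits the register holds on the order of 10^5 values — trivially classically enumerable. At 2,048 bits it is roughly 10^616, vastly more than the number of atoms in the observable universe.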

There is a deeper problem. If you have already classically computed the modular exponentiation results for every possible exponent, you haven’t just prepared the quantum circuit — you’ve already solved the factoring problem classically. At that point you don’t need a quantum computer. You need a very large database. The paper begins its quantum resource benchmarks after state preparation is complete, excluding the step where the actual computational cost lives.

2. Five toy-scale data points do not project to RSA-2048.

The paper projects from sub-17-bit composites to 2,048-bit keys via fitted trend lines, without a formal asymptotic complexity proof. The paper describes these as “indicative trends” — a caveat the press release omits entirely. Noise scaling and error accumulation in quantum circuits behave nonlinearly as circuits deepen. NISQ-era algorithms routinely hit exponential walls at scale that small-number benchmarks don’t reveal.

3. The “5,000 qubits” figure conflates logical and physical qubits.

The number refers to algorithmic register width — logical qubits. Running a deep quantum circuit reliably on real hardware requires quantum error correction, grouping hundreds to thousands of physical qubits per logical qubit. Peer-reviewed estimates for breaking RSA-2048 via Shor’s algorithm range from 4 to 20 million physical qubits depending on the error-correction scheme. The press release’s “5,000” figure elides this distinction.
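To see how far “5,000 qubits” is from hardware reality, the sketch below applies a standard surface-code rule of thumb — on the order of 2·d² physical qubits per logical qubit at code distance d. The distances chosen are assumptions for illustration, not figures from the paper or from any specific error-correction analysis.

```python
def physical_qubits(logical: int, code_distance: int) -> int:
    """Rough surface-code overhead: ~2 * d^2 physical qubits (data plus
    ancilla) per logical qubit at code distance d."""
    return logical * 2 * code_distance ** 2

for d in (15, 25, 35):  # plausible distances for a deep factoring circuit
    print(f"d = {d}: 5,000 logical -> {physical_qubits(5000, d):,} physical")
```

Even modest code distances push the total into the millions — consistent with the 4-to-20-million range in peer-reviewed RSA-2048 estimates, and three orders of magnitude above the headline figure.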

There is also a question about what the 99% gate reduction actually measures. The paper claims QNTT is more “noise-tolerant” than QFT because it avoids complex rotations. But QNTT introduces modular arithmetic — multipliers and adders — within the quantum circuit. Implementing modular arithmetic on quantum hardware is notoriously gate-heavy. The reported 99% reduction likely compares mathematical operations rather than the physical gate implementations required to execute those operations on a real processor. At RSA-2048 scale, the physical gate cost of QNTT modular arithmetic may erode or eliminate the claimed advantage.
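The gap between counting mathematical operations and counting physical gates can be sketched numerically. Textbook constructions put a quantum modular multiplier at on the order of n² Toffoli gates for an n-bit modulus; the constant factor below is an assumption for illustration, not a JVG figure.

```python
def modmul_toffolis(n_bits: int, const: int = 8) -> int:
    """Order-of-magnitude Toffoli count for one in-circuit modular
    multiplication of n-bit values (const is an assumed prefactor)."""
    return const * n_bits ** 2

for n in (17, 2048):
    print(f"n = {n:>4}: ~{modmul_toffolis(n):,} Toffolis per modular multiply")
```

One modular multiply at 17 bits costs a few thousand Toffolis; at 2,048 bits it is tens of millions — per multiplication, before error correction. A transform that replaces rotations with in-circuit modular arithmetic pays this price at every step.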

Who Made These Claims

Prof. Van Griensven Thé is an adjunct professor in Mechanical and Mechatronics Engineering at the University of Waterloo, where he teaches quantum computing and artificial intelligence at the doctoral level. His published research career spans environmental engineering, air quality modeling, and computational fluid dynamics. He has co-authored a Springer textbook on quantum computing and quantum machine learning. He is also the CEO of EigenQ, which develops and sells post-quantum cryptography hardware, and the founder of TAURIA, another PQC company. EigenQ has partnerships with HPE and WNC, is a member of the NVIDIA Inception Program, and is currently running a crowdfunding campaign on Republic.

Prof. Gharabaghi is a full professor at the University of Guelph’s School of Engineering, with over 250 publications — predominantly in hydrology, water quality modeling, and watershed management.

The “Advanced Quantum Technologies Institute” — which alternately identifies itself as “Applied” in its own press release — has minimal public presence. Its domain was registered months before the paper appeared, and it does not appear in established quantum research directories. The paper was not posted to arXiv, the standard preprint server for quantum computing and cryptography. AQTI appears to function as an internal brand for the authors’ own group rather than an independent research institution.

The authors have legitimate academic affiliations and a real commercial presence in PQC. They also have a direct financial interest in the conclusion that quantum threats are more imminent than current consensus suggests. This pattern has precedent. Schnorr’s 2021 claim to have destroyed RSA via lattice methods was amplified through press before the community quietly identified fatal scaling flaws. The 2022 Chinese QAOA factoring paper generated similar headlines before independent analysis showed it could not scale to cryptographically relevant key sizes. In each case, dramatic claims were made, media attention followed, and the community’s patient scrutiny eventually separated signal from noise.

The Only Outside Endorsement

The sole external voice quoted across media coverage is Nir Ben-David, CEO of Qombat Ltd, an Israel-based quantum defense company that operates in the same PQC commercial ecosystem as EigenQ and TAURIA. Independent cryptographic or quantum computing researchers have not, as of this writing, publicly validated or engaged with the JVG claims.

Community Response

The formal quantum security research community has not engaged: no arXiv commentary, no cryptography mailing list responses, no working group discussion. Outside those channels, Marin Ivezic published a detailed technical rebuttal identifying the same state-preparation and extrapolation problems, and the Hacker News thread is uniformly skeptical.

This silence doesn’t prove the algorithm is wrong. It tells you that the people who spend their careers on quantum factoring don’t see a reason to revise their models.

Summary

| Claim | The Reality | The Catch |
|---|---|---|
| “11-hour runtime” | Counts only the quantum period-finding phase | Excludes the classical pre-computation and amplitude encoding, which scale exponentially for RSA-2048 |
| “5,000 qubits” | Refers to logical qubits, not physical hardware | Would still require millions of physical qubits to survive noise at that circuit depth |
| “99% gate reduction” | Observed on composites smaller than 17 bits | QNTT modular arithmetic is gate-heavy on real hardware; extrapolation to 2,048 bits lacks formal proof |

What’s True Regardless

Algorithms do improve. Shor’s isn’t the last word on quantum factoring, and the QNTT substitution is a real idea worth exploring through proper peer review. Regev’s 2023 lattice reduction work demonstrates that algorithmic progress can compress quantum resource requirements — independent of hardware timelines. The question is always whether a specific claim survives independent validation.

Harvest-now-decrypt-later is happening today. Nation-state and criminal actors are collecting encrypted data for future decryption. PQC migration is urgent regardless of whether JVG or any other specific algorithm matures.

The recommended actions in the paper’s conclusion — inventory public-key usage, demand vendor PQC roadmaps, deploy crypto-agile architectures — are sound. They’ve been sound since NIST began its PQC standardization process. That the advice is correct doesn’t validate the alarm attached to it.

The Cost of Manufactured Urgency

Every manufactured crisis erodes the credibility that PQC migration actually needs. A paper that hasn’t survived peer review, issued by an institute that didn’t exist six months ago, authored by executives who sell PQC hardware, claiming “Cybersecurity Apocalypse” — this is how legitimate urgency gets turned into background noise.

The quantum threat to classical cryptography is real. CNSA 2.0 timelines are real: software and firmware signatures by 2027, most systems by 2030, full deprecation of legacy algorithms by 2035. BSI TR-02102 requirements are real. The G7 Cyber Expert Group roadmap is real. These deadlines were set by people with no PQC products to sell.

The question that matters is not whether RSA will eventually fall. It will. The question is whether your organization is making measurable progress replacing it — and whether that progress is keeping pace with published deadlines. That’s a trajectory problem, not a headline problem. Trajectory is what you should be measuring.