Sebastien Rousseau

Quantum Thresholds Are Moving: 10,000-Qubit Shor Risk

A new paper suggests Shor's algorithm could run on as few as 10,000 qubits, and the threshold for cryptographically relevant quantum computing is dropping faster than most had assumed. The implications for cryptography are hard to ignore.

Key Takeaways

  • A new paper proposes Shor's algorithm could execute on as few as 10,000 physical qubits, roughly one hundred times fewer than previous consensus estimates.
  • The reduction is driven by three converging advances: high-rate quantum error-correcting codes, reconfigurable neutral atom arrays, and increased parallelism.
  • The threat is not uniform. Elliptic Curve Cryptography (ECC) is more vulnerable at lower qubit counts; RSA-2048 requires significantly longer runtimes at comparable scales.
  • This is a theoretical projection, not a working demonstration. A substantial engineering gap remains between current hardware and fault-tolerant operation at this scale.
  • Post-quantum cryptographic standards are already finalised. The priority now is accelerating migration, not waiting for a cryptographically relevant quantum system to appear.

A Familiar Assumption, Now Under Pressure

Over the past decade, discussions around quantum computing and cryptography have followed a familiar arc. Quantum machines were acknowledged as theoretically powerful, yet considered impractical at scale. Breaking modern cryptographic systems would require millions of physical qubits, and the timeline remained comfortably distant. That assumption is now under serious pressure.

A recent paper, Shor's algorithm is possible with as few as 10,000 reconfigurable atomic qubits, proposes something more consequential than a single breakthrough. It suggests that the threshold for cryptographically relevant quantum computation may be two orders of magnitude lower than previously believed. Not millions of qubits, but tens of thousands. The distinction matters, and the direction it implies is difficult to ignore.
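
To see why qubit counts dominate these discussions, it helps to recall what Shor's algorithm actually computes. The quantum hardware is needed only for order finding: given a modulus N and a random base a, find the period r of a^x mod N; classical post-processing then extracts the factors. A minimal sketch in Python, brute-forcing classically the period a quantum computer would find exponentially faster (the toy numbers are illustrative, not drawn from the paper):

```python
from math import gcd

def find_order(a, n):
    """Brute-force the period r of a^x mod n: the smallest r > 0
    with a^r = 1 (mod n). This is the step Shor's algorithm
    delegates to the quantum computer."""
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n, a):
    """Classical post-processing: turn the period of a^x mod n
    into a non-trivial factorisation of n (or None on a bad base)."""
    g = gcd(a, n)
    if g != 1:
        return g, n // g            # lucky: a shares a factor with n
    r = find_order(a, n)
    if r % 2 == 1 or pow(a, r // 2, n) == n - 1:
        return None                 # unusable period; retry with another a
    p = gcd(pow(a, r // 2, n) - 1, n)
    return p, n // p

print(shor_factor(15, 7))           # -> (3, 5)
```

Scaling that period-finding step to a 2,048-bit modulus is precisely what drives the qubit and runtime estimates discussed below.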

The Convergence Driving the Shift: Error Correction, Architecture, and Parallelism

The result does not emerge from a single discovery. It reflects a convergence of improvements across several layers of the quantum computing stack that, taken together, shift the boundary of what appears feasible.

The first improvement concerns error correction. Traditional approaches required large overheads, often hundreds of physical qubits to represent a single logical qubit. The paper instead relies on high-rate quantum error-correcting codes, which significantly reduce that overhead. (Emergent Mind)

The second concerns architecture. The system is built on reconfigurable arrays of neutral atoms, which can be rearranged during computation to allow for more flexible connectivity and more efficient execution. (The Quantum Insider)

The third is parallelism: increasing the number of qubits allows more operations to run simultaneously, reducing overall execution time.
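
The effect of the first improvement is easiest to see as back-of-envelope arithmetic. A sketch comparing the two overhead regimes, with assumed parameters chosen only for illustration (none of these figures come from the paper):

```python
# Back-of-envelope overhead comparison. All parameters are assumptions
# for illustration, not figures taken from the paper.

LOGICAL_QUBITS = 5_000        # assumed logical qubits for a Shor-scale circuit

# Surface code: roughly 2 * d^2 physical qubits per logical qubit
# at code distance d.
d = 25                        # assumed code distance
surface_total = LOGICAL_QUBITS * 2 * d**2

# High-rate qLDPC-style code: n physical qubits encode k logical qubits.
n, k = 144, 12                # assumed [[n, k, d]] code parameters
high_rate_total = LOGICAL_QUBITS * n // k

print(f"surface code:   ~{surface_total:,} physical qubits")   # ~6,250,000
print(f"high-rate code: ~{high_rate_total:,} physical qubits") # ~60,000
```

The roughly hundredfold gap between those totals is the order-of-magnitude shift behind the paper's headline number.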

None of these ideas are new in isolation. Combined, however, they reframe what was previously treated as a hard limit.

From Millions to Tens of Thousands: What the Numbers Actually Mean

For years, the consensus was that running Shor's algorithm at cryptographic scales would require millions of physical qubits. The new analysis suggests that, under certain assumptions, this number could fall to approximately 10,000. (arXiv) That figure, however, is not the complete picture.

At the lower end of that range, runtimes remain long. Factoring RSA-2048 at minimal qubit counts could still take years of continuous operation. Faster execution requires more qubits, potentially in the tens of thousands. The relationship between qubit count and runtime is not linear, and the paper is careful to present this as a spectrum rather than a fixed threshold. What changes is the direction: the barrier is no longer purely theoretical. It is now a question of engineering.
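
One way to build intuition for that spectrum is a deliberately crude toy model: treat the computation as a fixed budget of qubit-days, so that qubits above the minimum footprint buy proportionally shorter runtimes. Every value below is an assumption for illustration, not an estimate from the paper.

```python
# Toy space-time trade-off for Shor-scale factoring. All values are
# illustrative assumptions, not estimates taken from the paper.

MIN_QUBITS = 10_000                     # assumed minimum footprint
BUDGET_QUBIT_DAYS = 10_000 * 365 * 5    # assumed total work: ~5 years at minimum

def runtime_days(qubits):
    """More qubits => more parallelism => shorter (idealised) runtime."""
    if qubits < MIN_QUBITS:
        raise ValueError("below the assumed minimum footprint")
    return BUDGET_QUBIT_DAYS / qubits

for q in (10_000, 26_000, 50_000):
    print(f"{q:>6,} qubits -> ~{runtime_days(q):,.0f} days")
```

Real estimates degrade faster than this idealised inverse scaling, since routing and error-correction cycles add their own overheads, but the qualitative point stands: the minimum qubit count buys feasibility, not speed.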

Old Assumptions vs. New Realities

| Dimension | Old Assumption | New Reality |
| --- | --- | --- |
| Physical qubits required (Shor's algorithm) | ~1,000,000+ | ~10,000–26,000 |
| Time to break RSA-2048 (at minimum qubits) | Not feasible this decade | Years (at 10K qubits); faster with more |
| Time to break ECC-256 | Not feasible this decade | Days (estimated at ~26K qubits) |
| Dominant hardware paradigm | Superconducting qubits | Reconfigurable neutral atom arrays |
| Error correction overhead | Hundreds of physical qubits per logical qubit | Significantly reduced via high-rate codes |
| Nature of the barrier | Theoretical | Engineering |
| Migration urgency | Long-term planning | Active deployment required now |

Source: Analysis based on arXiv:2603.28627 and prior literature.

Time, Scale, and the Uneven Vulnerability of Cryptographic Systems

One of the more significant contributions of the paper is the nuance it introduces around time. Quantum advantage does not arrive all at once. It exists along a spectrum determined by the scale of the system and the nature of the cryptographic target.

With approximately 26,000 qubits, the authors estimate that breaking elliptic curve cryptography could take days under favourable conditions. (arXiv) For RSA-2048, the timelines are considerably longer. This asymmetry is important. It suggests that different cryptographic systems may become vulnerable at different points in time, rather than simultaneously, and that the transition to post-quantum standards is unlikely to be a single event with a single deadline.

This pattern is consistent with broader reporting. Analyses from recent months suggest that quantum systems capable of challenging widely used encryption could emerge before the end of the decade. (Nature) Governments and standards bodies are already planning transitions to post-quantum cryptography, with implementation timelines extending into the 2030s. (The Quantum Insider) The discussion has moved from whether to when.

The Engineering Gap That Remains

It is important to be precise about what this paper represents. It is a projection, not a demonstration. The proposed systems depend on assumptions about error rates, hardware stability, and scaling behaviour that have not yet been validated at the required scale. Current experiments operate at the level of hundreds to low thousands of qubits, not tens of thousands operating fault-tolerantly over extended periods. (Phys.org)

A substantial engineering gap remains. The path from a compelling theoretical model to a functioning system capable of sustained, fault-tolerant operation at this scale involves challenges that are not yet fully understood, let alone solved. What has changed is not the proximity of a working machine, but the credibility of the target. The gap is narrowing, and the direction of progress is consistent.

Why the Compressing Timeline Demands Attention Now

The significance of this work is not that cryptography will be broken in the near term. It is that the timeline is compressing in ways that affect decisions being made today. Security systems are designed with long lifecycles in mind. Data encrypted now may need to remain confidential for decades. Infrastructure decisions made this year will be difficult to reverse within a five-year window. If quantum capabilities arrive sooner than expected, those assumptions become fragile.

This is why post-quantum cryptography is already being deployed across critical sectors. Not because the threat is immediate, but because the transition takes time and the cost of being late is asymmetric. There is a recurring pattern in the history of computing: progress appears slow until it is suddenly not. What begins as a theoretical improvement becomes a practical constraint, and what was once dismissed as distant becomes something that must be planned for. Quantum computing may be following exactly that trajectory, not through a single dramatic breakthrough, but through steady reductions in cost, complexity, and scale.

What This Means by Industry: A Practical Guide

The implications of this research are not uniform across sectors. The appropriate response depends on the type of cryptographic assets at risk, the sensitivity and longevity of the data involved, and the pace at which regulatory expectations are moving.

Financial Services and FinTech

Financial institutions face a compounded risk: they hold long-lived sensitive data, operate on infrastructure with slow replacement cycles, and are subject to increasing regulatory scrutiny around cryptographic resilience. ECC is widely used in TLS connections, mobile authentication, and digital signatures across payment rails, precisely the cryptographic category the paper identifies as most vulnerable at lower qubit counts. Institutions that have not yet begun a cryptographic inventory or initiated a post-quantum migration roadmap should treat this paper as a prompt to accelerate, not a reason to panic. CRYSTALS-Kyber and CRYSTALS-Dilithium, both now standardised by NIST (as ML-KEM and ML-DSA respectively), are the appropriate migration targets for key encapsulation and digital signatures.
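
For teams beginning that migration, a key-encapsulation round trip is a sensible first experiment. A minimal sketch, assuming the open-source liboqs-python bindings (a tooling choice of mine, not a mandate; older liboqs releases name the algorithm "Kyber768" rather than "ML-KEM-768"):

```python
# Minimal ML-KEM (Kyber) round trip using liboqs-python.
# pip install liboqs-python  (requires the liboqs C library)
import oqs

ALG = "ML-KEM-768"  # older liboqs releases name this "Kyber768"

with oqs.KeyEncapsulation(ALG) as receiver:
    public_key = receiver.generate_keypair()

    # Sender encapsulates a shared secret against the receiver's public key.
    with oqs.KeyEncapsulation(ALG) as sender:
        ciphertext, secret_sender = sender.encap_secret(public_key)

    # Receiver recovers the same shared secret from the ciphertext.
    secret_receiver = receiver.decap_secret(ciphertext)

assert secret_sender == secret_receiver
print(f"shared secret established ({len(secret_sender)} bytes)")
```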

Government and Defence

State-level actors have the strongest motivation, and in many cases the resources, to accelerate quantum hardware development beyond what is publicly known. Governments holding sensitive communications, intelligence data, or critical infrastructure keys must assume that adversaries are already harvesting encrypted data for future decryption, a strategy commonly known as "harvest now, decrypt later." For public sector organisations, compliance with national quantum-readiness mandates is increasingly unavoidable, and the window for proactive migration is narrowing.
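
The standard near-term defence against that strategy is hybrid key agreement: combining a classical exchange with a post-quantum KEM so that recorded traffic stays confidential unless both schemes are broken. A sketch extending the round trip above, again assuming liboqs-python plus the pyca/cryptography package:

```python
# Hybrid key agreement: combine a classical X25519 secret with an
# ML-KEM secret, so recorded traffic stays confidential unless BOTH
# schemes are broken. Library choices here are illustrative.
# pip install liboqs-python cryptography
import oqs
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Classical half: ephemeral X25519 exchange.
client_x, server_x = X25519PrivateKey.generate(), X25519PrivateKey.generate()
classical_secret = client_x.exchange(server_x.public_key())

# Post-quantum half: ML-KEM encapsulation against the server's key.
with oqs.KeyEncapsulation("ML-KEM-768") as server_kem:
    server_pk = server_kem.generate_keypair()
    with oqs.KeyEncapsulation("ML-KEM-768") as client_kem:
        ciphertext, pq_secret = client_kem.encap_secret(server_pk)
    assert server_kem.decap_secret(ciphertext) == pq_secret

# One session key derived from both secrets via HKDF.
session_key = HKDF(
    algorithm=hashes.SHA256(), length=32, salt=None, info=b"hybrid-demo",
).derive(classical_secret + pq_secret)
print(f"hybrid session key: {session_key.hex()}")
```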

Healthcare and Critical Infrastructure

Healthcare records, utility control systems, and industrial networks share a common vulnerability: data and systems with very long operational lifespans, protected by cryptographic standards that were designed for a pre-quantum threat model. A medical record encrypted today may need to remain private for fifty years. A control system certified this year may remain in service for two decades. For these sectors, the compressing timeline is not an abstract concern. It is a direct challenge to the foundational assumptions behind current security architectures.

Conclusion

The most important aspect of this paper is not the specific qubit count it presents. It is the direction that count implies. The question is no longer whether quantum computers can challenge modern cryptography. It is how quickly the required systems can be built, and whether the organisations that depend on current standards are moving fast enough in response.

For now, the answers remain uncertain. But the margin for deferring the question is narrowing, and the cost of waiting grows with each credible reduction in the theoretical threshold. The cryptographic community, security planners, and the industries that rely on them would do well to treat this paper not as cause for alarm, but as a serious prompt to accelerate transitions that are already underway.

Frequently Asked Questions

Can 10,000 qubits really break RSA encryption?

Theoretically, yes, but with important caveats. While previous estimates suggested millions of physical qubits were required, new research into high-rate error correction codes and reconfigurable neutral atom arrays suggests the threshold is significantly lower. However, at 10,000 qubits, the estimated runtime for factoring RSA-2048 remains extremely long, potentially years of continuous operation. Faster attacks require more qubits, likely in the range of tens of thousands. The paper represents a projection based on modelled assumptions, not a demonstration on a working system.

Which encryption is most at risk from quantum computing?

Elliptic Curve Cryptography (ECC) is generally more vulnerable at lower qubit counts than RSA-2048. The paper estimates that breaking ECC could take days using approximately 26,000 reconfigurable qubits under favourable conditions. RSA-2048 requires a significantly longer runtime at comparable qubit counts. This asymmetry means ECC-dependent systems, common in TLS, mobile authentication, and blockchain, may face risk on a shorter timeline than RSA-based infrastructure.

What is a reconfigurable neutral atom qubit?

Neutral atom qubits are individual atoms, typically rubidium or caesium, trapped and manipulated using laser light in a vacuum chamber. "Reconfigurable" means the arrangement of atoms can be changed dynamically during computation, allowing more efficient execution of complex quantum circuits. This flexibility reduces the number of physical qubits needed to implement fault-tolerant logical operations, and is a key reason the new paper achieves lower qubit estimates than earlier work based on superconducting qubit architectures.

What is post-quantum cryptography and why is it being deployed now?

Post-quantum cryptography (PQC) refers to cryptographic algorithms believed to be secure against both classical and quantum computers. NIST finalised its first set of PQC standards in 2024, including CRYSTALS-Kyber for key encapsulation and CRYSTALS-Dilithium for digital signatures. Deployment is beginning now, well before quantum computers pose an immediate threat, because cryptographic transitions are slow: replacing embedded standards across global infrastructure typically takes a decade or more, and data encrypted today may need to remain confidential long after quantum capabilities mature.

How many qubits does the most powerful quantum computer have today?

As of early 2026, leading quantum systems operate in the range of hundreds to low thousands of physical qubits. Crucially, most are not yet fault-tolerant: their error rates and overheads do not yet meet the requirements for sustained, reliable logical computation. The gap between today's hardware and the tens of thousands of fault-tolerant physical qubits described in the new paper remains significant, though the pace of progress across superconducting, neutral atom, and trapped-ion platforms is accelerating.
