Key takeaways
Public-key cryptography can become unreliable in certain post-Q-Day scenarios.
The earliest onchain stress is likely to concentrate in digital signatures (authorization and ownership) because they prove control.
What fails first depends heavily on public-key exposure time and how quickly systems can coordinate upgrades.
Post-quantum migration is primarily a standards and coordination challenge involving wallets, infrastructure and governance.
Quantum computers might not arrive this decade, or even the next. But that has not stopped people from obsessing over their potential impact on crypto.
After all, a cryptographically relevant quantum computer (CRQC) would threaten specific cryptographic primitives that many digital systems rely on today, especially public-key cryptography used for identity and authorization.
The timing of a CRQC remains uncertain, which is why public discussions often blend confident claims with genuine unknowns.
Let’s separate myth from fact and ask what is likely to fail first in crypto on Q-Day.
Did you know? “Q-Day” is shorthand for the point at which quantum computers become capable of breaking widely used public-key cryptography, typically referring to Shor-type capabilities. It is not a fixed date; some national guidance uses planning horizons such as 2035.
The two quantum superpowers: Shor vs. Grover
Most of the “quantum breaks crypto” discussion comes down to two algorithms that target different parts of modern cryptography.
Shor’s algorithm is the disruptive one for public-key systems. The National Institute of Standards and Technology (NIST) has repeatedly noted that widely deployed public-key cryptosystems based on integer factorization and discrete logarithms — the foundations behind Rivest-Shamir-Adleman (RSA) and common elliptic-curve approaches — are known to be vulnerable if a sufficiently large, fault-tolerant quantum computer becomes practical.
In short, digital signatures and key agreement methods that rely on those hardness assumptions come under threat.
Grover’s algorithm is different: It speeds up brute-force search. NIST’s reports on post-quantum cryptography describe this as a quadratic speedup and explicitly frame its practical relevance as uncertain. The impact is typically modeled as reducing the effective security level of symmetric cryptography and hash functions rather than breaking them outright.
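The quadratic-speedup framing can be made concrete with simple arithmetic: a brute-force search over 2^n keys drops to roughly 2^(n/2) quantum iterations, which is why the usual heuristic is to double symmetric key sizes. A rough sketch (the halving rule is a modeling convention, not a hardware forecast):

```python
# Grover's speedup is quadratic: searching a space of 2**n keys takes
# on the order of 2**(n/2) quantum iterations, so the effective security
# level in bits is roughly halved (a common heuristic, not an exact bound).
def grover_effective_bits(classical_bits: int) -> int:
    """Heuristic post-Grover security level, in bits."""
    return classical_bits // 2

for name, bits in [("AES-128", 128), ("AES-256", 256), ("SHA-256 preimage", 256)]:
    print(f"{name}: {bits}-bit classical -> ~{grover_effective_bits(bits)}-bit vs Grover")
```

This is why AES-256 is usually framed as comfortable even under Grover-style assumptions: its effective level stays around 128 bits.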
The first-order quantum concern is authentication and authorization (the cryptography used to prove that a transaction was signed by its owner), not a sudden, universal failure of all encryption.
What fails first onchain?
On most blockchains, the right to move funds is enforced by digital signatures.
Instead of checking identity, the network checks whether a transaction includes a valid signature under the rules of that chain. In Bitcoin’s common transaction flow, for example, spending is authorized by proving control of a key pair. Bitcoin uses the Elliptic Curve Digital Signature Algorithm (ECDSA) with the secp256k1 curve for that purpose.
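To make “proving control of a key pair” concrete, the sketch below is a minimal, educational ECDSA implementation over secp256k1 in pure Python. It is for intuition only: real Bitcoin signing adds encoding, hashing conventions, and a fresh random nonce per signature (the fixed nonce here must never be used in practice).

```python
import hashlib

# secp256k1 domain parameters (the curve Bitcoin uses for ECDSA)
p = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEFFFFFC2F
n = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141
G = (0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798,
     0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8)

def point_add(P, Q):
    """Elliptic-curve point addition; None represents the point at infinity."""
    if P is None: return Q
    if Q is None: return P
    if P[0] == Q[0] and (P[1] + Q[1]) % p == 0:
        return None
    if P == Q:
        lam = (3 * P[0] * P[0]) * pow(2 * P[1], -1, p) % p
    else:
        lam = (Q[1] - P[1]) * pow(Q[0] - P[0], -1, p) % p
    x = (lam * lam - P[0] - Q[0]) % p
    return (x, (lam * (P[0] - x) - P[1]) % p)

def scalar_mul(k, P):
    """Double-and-add scalar multiplication k*P."""
    R = None
    while k:
        if k & 1:
            R = point_add(R, P)
        P = point_add(P, P)
        k >>= 1
    return R

def sign(msg: bytes, priv: int, k: int):
    z = int.from_bytes(hashlib.sha256(msg).digest(), "big") % n
    r = scalar_mul(k, G)[0] % n
    s = pow(k, -1, n) * (z + r * priv) % n
    return r, s

def verify(msg: bytes, pub, sig) -> bool:
    r, s = sig
    z = int.from_bytes(hashlib.sha256(msg).digest(), "big") % n
    w = pow(s, -1, n)
    R = point_add(scalar_mul(z * w % n, G), scalar_mul(r * w % n, pub))
    return R is not None and R[0] % n == r

priv = 0xC0FFEE                 # toy private key for illustration only
pub = scalar_mul(priv, G)       # the public key derived from it
sig = sign(b"pay alice 1 BTC", priv, k=0x1234567)  # fixed nonce: demo only!
print(verify(b"pay alice 1 BTC", pub, sig))    # a valid signature authorizes
print(verify(b"pay mallory 1 BTC", pub, sig))  # any change invalidates it
```

The network never asks who you are; it only runs a check like `verify` against the rules of the chain. A CRQC running Shor’s algorithm would threaten exactly this step, by recovering `priv` from `pub`.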

Ethereum’s base account model is also signature-driven. The protocol specification describes a precompiled contract for ECDSA public-key recovery, the “ecrecover” function, which is a building block for verifying who signed a message or transaction.
This is why standards discussions often treat signatures as a primary onchain breakpoint in a CRQC scenario. NIST maintains that Shor’s algorithm would undermine the hardness assumptions behind widely used public-key systems, including the elliptic-curve schemes these chains rely on, if sufficiently scaled quantum hardware becomes practical.
None of that specifies when such a machine will exist. It only clarifies which kinds of cryptography would be pressured first and why.
Did you know? BlackRock’s iShares Bitcoin Trust disclosures flag quantum computing as an unpredictable emerging risk and warn that sufficiently advanced quantum technology could undermine the cryptography Bitcoin relies on, potentially impacting the network, wallets and shareholders.
Who gets hit earliest: The “exposure window” ranking
If a CRQC ever arrives, the earliest pressure points are likely to be the public keys that have been visible for the longest time, especially in places where there is limited ability to coordinate a fast, clean upgrade across wallets, users and infrastructure.
Keys with long, already-public exposure
In Bitcoin’s common pay-to-public-key-hash (P2PKH) flow, the full public key is revealed in the unlocking data when an output is spent, as the signature script includes the unhashed public key. Analyses treat this as a key variable. Once a public key is published, the question becomes how feasible it is to derive the corresponding secret key under different CRQC assumptions.
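The exposure sequence can be sketched as follows. The snippet simplifies Bitcoin’s real construction, which commits to RIPEMD160(SHA256(pubkey)) with Base58Check or Bech32 encoding, down to a single SHA-256 so it stays stdlib-only; the structure of the exposure is the same.

```python
import hashlib

def commit(pubkey: bytes) -> bytes:
    # Simplification: real P2PKH commits to RIPEMD160(SHA256(pubkey));
    # plain SHA-256 illustrates the same hide-then-reveal pattern.
    return hashlib.sha256(pubkey).digest()

pubkey = bytes.fromhex("02" + "11" * 32)  # placeholder compressed-format key

# Before spending: only the hash appears onchain; the key itself is hidden.
locked_to = commit(pubkey)

# At spend time: the unlocking data reveals the full public key, and
# verifiers check it against the committed hash before checking the signature.
revealed_pubkey = pubkey
assert commit(revealed_pubkey) == locked_to
print("public key now permanently exposed onchain:", revealed_pubkey.hex()[:16], "...")
```

Until the spend, a Shor-style attacker has only a hash to work with; after it, the public key is public forever, which is why reused or long-spent addresses rank as the earliest exposure.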
In-flight spends (time-window exposure)
Even when public keys are only revealed at spend time, there is a narrow window between broadcast and durable inclusion in the ledger. That window, from broadcast until a transaction is buried under enough blocks, is often framed as the main vector for a live signature-break attack.
Whether this window is meaningful depends on how fast quantum key recovery could be in a CRQC scenario, something the literature treats as uncertain and highly assumption-sensitive.
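That assumption-sensitivity is easy to see with back-of-the-envelope numbers. Every input below is an illustrative assumption, not a measured or forecast figure:

```python
# Illustrative only: whether the in-flight window matters depends entirely
# on assumed CRQC key-recovery speed versus assumed settlement time.
def window_exploitable(recovery_minutes: float, settlement_minutes: float = 60) -> bool:
    """True if an assumed key-recovery time beats an assumed settlement time."""
    return recovery_minutes < settlement_minutes

# ~60 minutes stands in for roughly six Bitcoin confirmations (assumption).
for recovery_minutes in (5, 60, 60 * 24, 60 * 24 * 30):
    verdict = "exploitable" if window_exploitable(recovery_minutes) else "too slow"
    print(f"assumed key recovery in {recovery_minutes:>6} min -> in-flight window {verdict}")
```

Shift either assumption by an order of magnitude and the conclusion flips, which is exactly why the literature refuses to rank this vector with confidence.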
Ethereum’s “externally owned accounts” at scale
On Ethereum, externally owned accounts (EOAs) are controlled by private keys, and the protocol’s account model specifies a particular signature scheme for EOAs (secp256k1 ECDSA). If that signature assumption were weakened by a CRQC, the potential exposure would be broad simply because EOAs are the default authorization surface for a large portion of user activity.
The ordering would ultimately depend on CRQC capability, network conditions, wallet and account patterns, and how quickly ecosystems can coordinate migrations.
What doesn’t instantly fail: Hashing, PoW and “harvest now, decrypt later”
A useful corrective is that a quantum breakthrough would not switch off blockchains in the way people sometimes imply.
The most discussed quantum algorithm for asymmetric cryptography, Shor’s algorithm, targets the mathematics behind widely used public-key systems, not the general idea of hashing.
By contrast, Grover’s algorithm is typically described as a quadratic speedup for brute-force search, which can reduce the effective security margin of symmetric cryptography and hash functions rather than making them trivially breakable overnight.
NIST’s post-quantum reports note that if Grover’s algorithm were practically relevant, the common heuristic would be to increase symmetric key sizes, for example moving from AES-128 to AES-256, though they also flag uncertainty and the limits of current quantum cryptanalysis.
For blockchain systems, that means components such as hashing and proof-of-work (PoW) are not usually framed as the “first domino” in standards discussions. The sharper early pressure is still on signature-based authorization, meaning who can validly sign.
Meanwhile, a quieter risk NIST discusses is offchain: adversaries can capture encrypted data today and potentially decrypt it later if quantum capabilities advance, a pattern called “harvest now, decrypt later.” This matters for long-lived sensitive records regardless of what happens onchain.
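A common way to reason about “harvest now, decrypt later” is Mosca’s inequality: if the time data must stay secret (x) plus the time a migration takes (y) exceeds the time until a CRQC arrives (z), ciphertext captured today is already at risk. The rule is a well-known planning heuristic; all three inputs below are illustrative assumptions, not predictions:

```python
def at_risk(secrecy_years: float, migration_years: float, crqc_years: float) -> bool:
    """Mosca's inequality: captured data is at risk when x + y > z."""
    return secrecy_years + migration_years > crqc_years

# Illustrative planning inputs only:
print(at_risk(secrecy_years=25, migration_years=8, crqc_years=15))  # True: at risk
print(at_risk(secrecy_years=2,  migration_years=3, crqc_years=15))  # False
```

The point of the heuristic is that records with decades of required secrecy can be at risk today even if a CRQC is far away, because the harvesting happens now.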
Did you know? The first cryptographically relevant quantum computers would likely be controlled by nation-states or major, well-funded labs. Bad-actor risk mostly comes from state-aligned misuse or access concentration. A US Federal Reserve paper notes that this could involve a rogue nation-state or a malicious consortium acquiring the capability.
The widely discussed fix: Migration
If quantum-capable attacks ever become practical, the fix lies in migration. New cryptography would need to be standardized, implemented in wallets and infrastructure, and then adopted in a way that preserves backward compatibility and minimizes breakage. That is why standards bodies focus so heavily on transition mechanics and risk mapping.
On the standards side, NIST has already published finalized post-quantum standards for key establishment (ML-KEM) and signatures (ML-DSA and SLH-DSA). It also emphasizes that timelines for a cryptographically relevant quantum computer remain uncertain and that transition planning should proceed under that uncertainty.
In blockchain terms, the migration patterns being discussed publicly tend to look like this:
Protocol-level support for new spend conditions that can verify post-quantum signatures, such as new output or script types
Hybrid periods where legacy and post-quantum authorization can coexist, allowing systems to phase in new verification rules without assuming hard deadlines
Account abstraction on Ethereum, where authorization logic lives in programmable account contracts rather than being permanently fixed to one signature scheme, enabling evolvable validation as standards mature.
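The hybrid-period pattern in the list above can be sketched as a verification dispatch. The verifier functions and the sunset height below are hypothetical stand-ins, not any real chain’s API; a real node would run actual secp256k1 ECDSA and ML-DSA verification where the stubs sit.

```python
# Sketch of a hybrid verification phase. All names are hypothetical
# stand-ins for illustration, not a real protocol's interface.
LEGACY_SUNSET_HEIGHT = 1_000_000  # assumed governance-chosen cutoff

def verify_legacy_ecdsa(tx: dict) -> bool:
    # Stand-in: a real node verifies a secp256k1 ECDSA signature here.
    return tx.get("ecdsa_sig_ok", False)

def verify_pq_signature(tx: dict) -> bool:
    # Stand-in: a real node would verify e.g. an ML-DSA signature here.
    return tx.get("pq_sig_ok", False)

def authorize(tx: dict, block_height: int) -> bool:
    """Accept either scheme during the hybrid period, post-quantum only after."""
    if verify_pq_signature(tx):
        return True
    if block_height < LEGACY_SUNSET_HEIGHT:
        return verify_legacy_ecdsa(tx)
    return False

print(authorize({"ecdsa_sig_ok": True}, block_height=500_000))    # True: hybrid phase
print(authorize({"ecdsa_sig_ok": True}, block_height=2_000_000))  # False: legacy sunset
print(authorize({"pq_sig_ok": True},    block_height=2_000_000))  # True: PQ still valid
```

The design choice worth noticing is that the cutoff is a coordination decision, not a cryptographic one, which is why the migration literature treats governance as the bottleneck.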
The hard part, consistently, is coordination, making upgrades usable and widely adopted before any real risk materializes.
Key signals analysts and standards bodies monitor
To recap: If a CRQC ever becomes practical, the earliest onchain stress would concentrate in signature-based authorization, while the timeline itself remains uncertain.
Instead of treating quantum risk as a single break moment, analysts and standards bodies point to signs of transition:
Finalized PQC building blocks, such as NIST’s first post-quantum standards for key establishment and signatures
Migration guidance for internet protocols, including IETF drafts
Explicit warnings about long-lived confidentiality through “harvest now, decrypt later” scenarios in national cybersecurity guidance.

