Cardano’s Charles Hoskinson on quantum computing and blockchain security

Cardano founder Charles Hoskinson believes the cryptocurrency industry is heading toward a collision with quantum computing—and that getting ahead of the threat won’t be painless.

He argues that the main challenge is not figuring out *how* to become post‑quantum secure, but *when* to actually implement those protections. Move too late, and blockchains could be vulnerable to powerful new attacks. Move too early, and networks may suffer from serious performance and cost penalties that users and validators are not prepared to absorb.

Hoskinson notes that the cryptographic building blocks for a post‑quantum future already exist. In 2024, the U.S. National Institute of Standards and Technology (NIST) published a set of post‑quantum cryptography (PQC) standards designed to withstand attacks from large‑scale quantum computers. These include new algorithms for securing digital signatures and key exchanges—the very foundations of modern blockchain security.
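For reference, the NIST package finalized in August 2024 consists of three standards. The listing below is factual; the short descriptions are paraphrased for illustration:

```python
# The three post-quantum standards NIST finalized in August 2024.
NIST_PQC_2024 = {
    "FIPS 203": ("ML-KEM",  "lattice-based key encapsulation (derived from Kyber)"),
    "FIPS 204": ("ML-DSA",  "lattice-based digital signatures (derived from Dilithium)"),
    "FIPS 205": ("SLH-DSA", "stateless hash-based digital signatures (derived from SPHINCS+)"),
}

for fips, (name, role) in NIST_PQC_2024.items():
    print(f"{fips}: {name}: {role}")
```

Two of the three (ML-DSA and SLH-DSA) are signature schemes, which is why signature size and verification cost dominate the blockchain-migration discussion below.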

From a purely theoretical standpoint, this means blockchains can already start migrating away from classical schemes like ECDSA and EdDSA toward quantum‑resistant alternatives. But Hoskinson stresses that the issue is not the *availability* of these algorithms, but the deeper economic and technical consequences of deploying them at scale on live, permissionless networks.

According to him, post‑quantum cryptography typically comes with a steep performance bill. “Post‑quantum crypto oftentimes is about 10 times slower, 10 times larger proof sizes, and 10 times more inefficient,” he said. That inefficiency doesn’t just show up on paper: it directly impacts block size, transaction throughput, storage requirements, and the hardware specs needed for nodes and validators to stay in sync.

Larger proofs and signatures would inflate the size of each transaction, meaning fewer transactions can fit into a block. This could reduce throughput or force networks to increase block sizes, which in turn raises bandwidth and storage demands. Light clients could become heavier, archival nodes more expensive, and full node participation increasingly out of reach for ordinary users. In other words, hardening blockchains against quantum attacks risks undermining one of their key values: decentralization.
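The throughput argument can be made concrete with back-of-the-envelope arithmetic. The block size and per-transaction overhead below are hypothetical round numbers chosen for illustration; the signature sizes are approximate figures for Ed25519 (64 bytes) versus ML-DSA-44 (roughly 2,420 bytes):

```python
# Illustrative only: how a larger signature shrinks per-block capacity
# at a fixed block size. Real transactions carry more fields than this.
BLOCK_SIZE = 90_000      # bytes; hypothetical block budget
TX_OVERHEAD = 120        # bytes of non-signature payload per tx; hypothetical
ED25519_SIG = 64         # bytes, classical signature
ML_DSA_44_SIG = 2_420    # bytes, approximate post-quantum signature

def txs_per_block(sig_size: int, block: int = BLOCK_SIZE,
                  overhead: int = TX_OVERHEAD) -> int:
    """Number of simple transactions that fit in one block."""
    return block // (overhead + sig_size)

classical = txs_per_block(ED25519_SIG)     # 489 txs per block
post_quantum = txs_per_block(ML_DSA_44_SIG)  # 35 txs per block
print(classical, post_quantum, round(classical / post_quantum, 1))
```

Under these toy numbers, capacity drops by roughly 14x, which is the same order of magnitude as the "10 times larger" figure Hoskinson cites.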

Validators and miners would also face higher computational loads. If every signature verification or zero‑knowledge proof takes significantly more time and resources, block production becomes more demanding. Node operators might have to upgrade their hardware or rely on specialized accelerators, shifting the ecosystem toward those who can afford more powerful machines. For Hoskinson, this is where the “timing” question becomes existential: upgrading too early could centralize the network long before quantum computers pose a real‑world threat.

Hardware support sits at the center of this dilemma. Many of today’s cryptographic operations are heavily optimized at the hardware level, with CPUs and specialized chips designed to process classical algorithms efficiently. Post‑quantum schemes, by contrast, are far less optimized and often lack dedicated hardware acceleration. Until chipmakers and hardware vendors catch up, running post‑quantum algorithms at scale will remain expensive and slow.

Hoskinson’s warning implies that the industry will likely need a transitional phase, during which both classical and post‑quantum mechanisms co‑exist. Hybrid signatures, layered key schemes, or dual‑address systems could allow users to gradually move funds to quantum‑safe setups without forcing an abrupt, network‑wide migration. But these designs introduce their own complexity, increase code surface area, and potentially open new attack vectors.
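One way to picture a hybrid signature is as an AND of two independent schemes: a transaction is valid only if both the classical and the post-quantum signature verify. The sketch below is purely structural; HMAC-SHA256 stands in for both schemes, whereas a real design would pair something like Ed25519 with ML-DSA:

```python
import hashlib
import hmac
from dataclasses import dataclass

@dataclass
class HybridSignature:
    classical: bytes      # stand-in for e.g. an Ed25519 signature
    post_quantum: bytes   # stand-in for e.g. an ML-DSA signature

def sign_hybrid(msg: bytes, classical_key: bytes, pq_key: bytes) -> HybridSignature:
    # HMAC is only a placeholder so the sketch is runnable.
    return HybridSignature(
        classical=hmac.new(classical_key, msg, hashlib.sha256).digest(),
        post_quantum=hmac.new(pq_key, msg, hashlib.sha256).digest(),
    )

def verify_hybrid(msg: bytes, sig: HybridSignature,
                  classical_key: bytes, pq_key: bytes) -> bool:
    ok_classical = hmac.compare_digest(
        sig.classical, hmac.new(classical_key, msg, hashlib.sha256).digest())
    ok_pq = hmac.compare_digest(
        sig.post_quantum, hmac.new(pq_key, msg, hashlib.sha256).digest())
    # AND, not OR: breaking one layer is not enough to forge a transaction.
    return ok_classical and ok_pq

sig = sign_hybrid(b"tx-payload", b"classical-key", b"pq-key")
print(verify_hybrid(b"tx-payload", sig, b"classical-key", b"pq-key"))  # True
print(verify_hybrid(b"tx-payload", sig, b"classical-key", b"wrong"))   # False
```

The conjunction is the point of the design: even if quantum computers later break the classical scheme, the post-quantum layer still has to be satisfied, and vice versa if the newer scheme turns out to have flaws.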

Cardano, which places heavy emphasis on formal methods and long‑term protocol research, is watching the quantum horizon closely. However, Hoskinson’s comments suggest that its approach will be conservative: invest in research and design now, but avoid locking the network into heavy post‑quantum choices before the surrounding ecosystem—especially hardware and tooling—can support them efficiently.

The timing problem is amplified by the uncertainty around quantum computing itself. Estimates of when a practical, cryptographically relevant quantum computer will exist vary widely, from “within a decade” to “not anytime soon.” For blockchain architects, this means designing for a moving target: they must anticipate a future threat without knowing precisely when it will materialize or how powerful it will be at the moment of first impact.

At the same time, there is a risk of complacency. Even if powerful quantum machines are years away, data harvested today can be stored and decrypted later using quantum capabilities—a scenario often described as “harvest now, decrypt later.” Long‑lived secrets, state‑level adversaries, and high‑value addresses could be particularly vulnerable. Networks that wait too long to adopt migration paths might find that historic data or old keys become retroactively exploitable once quantum computers mature.

From a governance standpoint, the shift to post‑quantum security will likely require major protocol upgrades, possibly hard forks. These changes affect every participant: users, wallet providers, exchanges, node operators, and infrastructure firms. Coordinating such a large‑scale transition demands clear roadmaps, extensive testing, and strong social consensus. If different stakeholder groups disagree on the right timing or technical choices, networks could see fragmentation or parallel chains with incompatible security assumptions.

Economically, the trade‑offs extend beyond raw performance. Higher on‑chain costs could change which use cases remain viable. Small payments and micro‑transactions might become uneconomical if transaction sizes and verification times balloon. DeFi protocols that rely on dense, complex smart contracts could see gas costs spike under a post‑quantum regime. Projects may be forced to reconsider which logic runs on‑chain and which can be safely handled off‑chain or via rollups designed with quantum resistance in mind.

There is also the user experience dimension. Migrating to quantum‑safe keys may require users to rotate their addresses, back up new seed phrases, or move funds through special migration contracts. Every additional step is a potential point of failure or confusion. For mass adoption, the industry will have to hide much of this complexity beneath intuitive wallet interfaces and automated flows, while still giving power users and institutions the control and assurances they require.

In practice, the race to post‑quantum security is likely to unfold in layers. Core consensus mechanisms, account key schemes, and cross‑chain bridges will be the first places where quantum resistance becomes critical, because a compromise there could endanger the entire network. Application‑level protocols, messaging schemes, and optional privacy layers may follow at different speeds, depending on their risk profiles and the sensitivity of the data they protect.

Hoskinson’s broader message is that becoming post‑quantum secure is not a simple “upgrade and forget” event. It is a systemic transformation that will ripple across protocol design, hardware, economics, governance, and user behavior. The industry will have to accept real trade‑offs: some combination of lower efficiency, higher complexity, and short‑term centralization pressures in exchange for resilience against a future class of attacks that cannot yet be fully tested in the wild.

For now, his stance suggests a middle path: intensify research and prototyping, track NIST and academic progress, pressure hardware vendors to prioritize post‑quantum acceleration, and design migration strategies well in advance—while resisting the temptation to prematurely burden live networks with heavy cryptographic machinery before the infrastructure and incentives are ready to handle it.