A Cryptography Engineer’s Perspective on Quantum Computing Timelines

33 points by fkooman


andrewrk

I think this is a well-articulated point, and it's pretty damn important for our industry. I don't have an opinion to offer but I look forward to reading lobste.rs commentary about this.

nadim

I spent the evening looking into quantum computing timelines as a non-expert in quantum computing. Here is what I’ve learned:

We currently have machines with ~1,000–1,500 physical qubits at error rates around 10⁻³, and Google’s algorithm requires ~500,000 physical qubits operating coherently together with surface code error correction, yoked qubit storage, magic state cultivation producing ~500K T states per second, and reaction-limited execution at 10μs cycle times — none of which has been demonstrated beyond small-scale proof-of-concept experiments.

Scaling from where we are to where this needs to be isn’t a matter of incremental improvement along a Moore’s Law curve; it requires solving qualitatively new engineering problems in qubit fabrication yield, correlated error suppression across a massive chip (or multi-chip interconnects that don’t exist yet), cryogenic wiring and control electronics for half a million qubits, real-time classical decoding at the required throughput, and sustained coherence of a “primed” quantum state across minutes of wall-clock time — any one of which could prove to be a multi-year bottleneck, and all of which must be solved simultaneously.

Given the above, I just don’t see how we’re going to get to a cryptographically relevant quantum computer by 2030, especially given that we need a ~350× increase in physical qubit count with simultaneously tighter error correlations, an entirely new cryogenic control and wiring architecture to address half a million qubits, real-time decoding infrastructure that doesn’t exist yet, magic state distillation factories operating at industrial throughput, and multi-minute coherent idle times for primed states — and historically, solving even one of these at scale has taken the field the better part of a decade.
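The ~350× figure above follows directly from the numbers in the previous comment; a quick back-of-envelope check (using those figures, which are the comment's own estimates, not independently verified):

```python
# Back-of-envelope scale gap, using the figures cited above.
physical_now = 1_500       # upper end of today's machines (physical qubits)
physical_needed = 500_000  # cited estimate for a cryptographically relevant machine

gap = physical_needed / physical_now
print(f"{gap:.0f}x")  # ≈ 333x, in line with the ~350x cited
```

And that ratio understates the difficulty, since the target qubits must also be *better* (lower, less correlated error rates), not just more numerous.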

bitshift

“Doesn’t the NSA lie to break our encryption?” No, the NSA has never intentionally jeopardized US national security with a non-NOBUS backdoor.

I believe the author when they say ML-KEM and ML-DSA are trustworthy. And I believe the NSA does good things sometimes! But there's a little too much hedging in that statement. Maybe they're just being precise, but to my layman's eyes it doesn't support the author's argument very well.

das

I can't find the source, but I remember reading an article that argued maintaining backward compatibility with legacy crypto while transitioning to PQ (a.k.a. "crypto-agility") is a bad idea, since it exposes everyone to protocol downgrade attacks. I'd thought of that claim as uncontroversial, but I'd be interested in other perspectives, if there are any.
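For anyone unfamiliar with the downgrade concern: the classic failure mode is a server that trusts an unauthenticated client offer when picking an algorithm. A minimal sketch (the protocol and algorithm names here are illustrative, not any real negotiation scheme):

```python
# Hypothetical sketch of a downgrade attack on naive algorithm
# negotiation. Names are illustrative only.

def negotiate(client_offer, server_supported):
    # Server picks the first mutually supported algorithm,
    # trusting the (unauthenticated) order of the client's offer.
    for alg in client_offer:
        if alg in server_supported:
            return alg
    raise ValueError("no common algorithm")

client_offer = ["ML-KEM-768+X25519", "X25519"]  # client prefers hybrid PQ
server_supported = {"ML-KEM-768+X25519", "X25519"}

# Honest run: the hybrid PQ suite is chosen.
assert negotiate(client_offer, server_supported) == "ML-KEM-768+X25519"

# An active attacker strips the PQ option from the offer in transit.
tampered_offer = [a for a in client_offer if "ML-KEM" not in a]
assert negotiate(tampered_offer, server_supported) == "X25519"  # silent downgrade
```

The usual mitigation is to authenticate the full negotiation transcript (as TLS 1.3 does with its transcript hash), so tampering with the offer is detected; the argument in the article, as I remember it, was that legacy-compat paths make it easy to get this wrong.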

YogurtGuy

Authentication is not like that, and even with draft-ietf-lamps-pq-composite-sigs-15 with its 18 composite key types nearing publication, we’d waste precious time collectively figuring out how to treat these composite keys and how to expose them to users.

Here's the list of the key types. There are so many because of the combinatorial explosion of classical signature algorithms × post-quantum signature algorithms.

But it's not clear that all those types necessarily need to be supported! e.g. people have been suggesting that RSA-2048 be avoided for awhile now. If we said that the exclusive classical signature algorithm was Ed25519 (or Ed448, if it'd make people feel better to pick a different FIPS-certified signature algorithm, but with >128-bits of security), then we avoid the issue of a million different key types altogether.