A bet on whether ML-KEM-768 or X25519 will break first
55 points by figsoda
It's always fun to see friendly public bets like this.
If neither is broken [by the end of 2040], the main wager is a push and no donation is made.
Personally, I think that's the most likely outcome, maybe about a 60% chance no money changes hands? Interested to hear what other folks think.
How I made up that number:
(1 - 0.15) * (1 - 0.3) = 0.595 ≈ 60%.

Filippo Valsorda buys Matthew Green a reasonable round of drinks if, by the deadline, ML-KEM-512 is no longer considered secure for new deployments.
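The estimate above can be checked in a couple of lines. This assumes the commenter's figures of 15% and 30% for each algorithm being broken by 2040, treated as independent events:

```python
# Back-of-the-envelope estimate of the "push" outcome (neither algorithm
# broken by the deadline), assuming independent break probabilities of
# 15% and 30% as in the comment above.
p_break_a = 0.15
p_break_b = 0.30
p_push = (1 - p_break_a) * (1 - p_break_b)
print(f"{p_push:.3f}")  # 0.595, i.e. roughly a 60% chance no money changes hands
```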
I kind of want to see this happen—just to force the issue of what constitutes a "reasonable" round!
It's arguable that in the context of cryptography, a single round is never really reasonable! Best practice seems to be 12-24 rounds.
The fact that implementation-level defects are excluded sidesteps the main motivating factor for the advocates of "only hybrids, everywhere": all PQC implementations are quite novel compared to ECC implementations, and new defects in new or existing ECC implementations are not unheard of either.
We want hybrids because, while their cost is negligible, they require the attacker to exploit vulnerabilities in two implementations at the same time, providing an extra layer of security for everyone.
If implementation bugs are factored in, I would bet triple on pure PQ. The ML primitives are just a lot easier to implement safely, two primitives means double the risk of memory safety issues, and the hybridization layer and logic (and all the bookkeeping around key types and OIDs etc.) are 10x more likely to have bugs than the ML primitives.
But are safety issues in the hybridization layer and logic really so much more common, as your intuition suggests? So far we have ample evidence of many bugged algorithm implementations and far fewer in the combiners, provided the hybridization is done right.
In the wide spectrum of all possible hybrid constructions, I do fully agree with you: that is why I believe Composite signatures, where the combiner acts as a standalone algorithm and does not expose to applications the combination logic, the decisions on what to do when either part fails, how to handle parallel chains of trust, etc., are preferable to the other hybrid signature proposals that are more focused on backward compatibility or performance.
The currently preferred hybrid solutions, like the TLS KEM hybrids already deployed, X-Wing for more general use, or the Composite ML-DSA construction for signatures, also take ease of implementing the combiner logic into account: their complexity is orders of magnitude lower than that of lattices, elliptic curves, or even modular integer arithmetic.
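To make the "low-complexity combiner" point concrete, here is a toy concatenation-and-hash combiner in the spirit of X-Wing. The label and exact input ordering are illustrative placeholders, not the values from the spec; the point is only that the combiner is a few lines of hashing over fixed-size byte strings:

```python
import hashlib

def combine(ss_pq: bytes, ss_ec: bytes, ct_ec: bytes, pk_ec: bytes) -> bytes:
    """Toy hybrid-KEM combiner sketch (X-Wing-style, not the actual spec).

    The final shared secret depends on both the ML-KEM and X25519 shared
    secrets (plus the X25519 ciphertext and public key), so an attacker
    must break both components to recover it.
    """
    label = b"hybrid-combiner-demo"  # placeholder, NOT the X-Wing label
    return hashlib.sha3_256(label + ss_pq + ss_ec + ct_ec + pk_ec).digest()

# Dummy fixed-size inputs standing in for real KEM outputs.
ss = combine(b"\x01" * 32, b"\x02" * 32, b"\x03" * 32, b"\x04" * 32)
print(len(ss))  # 32
```

Note that the combiner itself only ever hashes fixed-length byte strings, which is exactly the property the parent comment highlights.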
For example, you know first hand what it takes to write implementations that abide by constant-time programming practices, and how much more it takes to make sure the whole stack (from compilers to the various hardware features of modern platforms) does not introduce data-timing dependencies into constant-time-as-written code. These have been a recurring source of key-recovery attacks against old and new implementations of old and new algorithms, even today. This entire class of very frequent defects hardly applies to the hybrid combiners in the IETF pipeline: their designs strive to process only fixed-size byte strings through primitives immune to data-timing dependencies beyond the byte length of their inputs (either naturally oblivious or already heavily hardened against them).
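A minimal illustration of the defect class being described (in Python, purely for exposition; real constant-time work happens at the C/assembly level): an early-exit comparison whose runtime depends on secret data, versus a comparison whose runtime depends only on input length:

```python
import hmac

def leaky_equal(a: bytes, b: bytes) -> bool:
    # Early-exit comparison: runtime depends on the position of the first
    # differing byte, which can leak secret data through timing.
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False
    return True

def ct_equal(a: bytes, b: bytes) -> bool:
    # Standard-library constant-time comparison: runtime depends only on
    # the byte length of the inputs, not their contents.
    return hmac.compare_digest(a, b)

print(ct_equal(b"secret-tag", b"secret-tag"))    # True
print(leaky_equal(b"secret-tag", b"public-tag")) # False, but with a timing leak
```

The combiners discussed above avoid this class by construction: they only process fixed-size byte strings through primitives whose timing does not depend on the data.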
I agree with you: hybrids done wrong can be terrible for security. But as a community we have put in the effort to build good hybrid combiners, and wherever their cost is negligible, they provide users an extra layer of protection by requiring an adversary to find defects in both combined algorithms at the same time, rather than in just one.
Security does need to be affordable, and it's inevitable that there will be use cases in which a hybrid solution won't be viable. But now that we have solid hybrid constructions to pick from, I believe we should strive for hybrid solutions for the long term unless there is strong evidence that they are not affordable.
We almost agree, but I break the complexity vs benefit tradeoff in favor of hybrid KEMs (for now) and against composite signatures. The fully abstracted hybridization of the latter is an illusion. That would mean 18 new key types and private key formats. For an example of the cost of carrying more types, look at the most recent WolfSSL sev:crit, which is arguably the result of supporting just ECDSA and Ed25519.
There’s also a maintainer bandwidth argument: my time is limited and I can either spend it making ML-DSA safer, or maintaining ECDSA and the hybrid layer.
So far we have ample evidence of many bugged algorithm implementations
Can you give me any examples? I’m only aware of one security vulnerability in production ML-KEM or ML-DSA libraries ever, and it’s a side channel in ML-KEM that doesn’t impact ephemeral key exchange like in TLS.
The funniest outcome for the secondary wager would be a quantum algorithm to attack current-generation lattice cryptosystems, followed by a quantum computing implementation advance.