<!DOCTYPE html>
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
</head>
<body>
<div class="moz-cite-prefix">On 1/11/26 5:01 AM, Viktor S.
Kristensen via cryptography wrote:<br>
</div>
<blockquote type="cite"
cite="mid:LlXj5FRqnIdQLmpbtAxoVjohpXLPf_gINt6FPSDKFYuJJJf7n6PGr4jGeit6yFlJZEvbVz3L0V6g73ax-Ycjzc8JJLZop45begpg5X6fTBc=@pm.me">
<pre wrap="" class="moz-quote-pre">John,
Your DES parallel is apt, and the "sucker bet" framing deserves a serious response.
You're right that the DES history was: NSA knew 56 bits was attackable, told everyone it was fine, and reaped the intelligence benefits. The pattern repeated with Dual_EC (backdoor) and arguably Crypto AG (wholesale compromise). Seven decades of documented adversarial behavior toward civilian cryptography.</pre>
</blockquote>
    <p>It's clear that, whether it is a best effort or deliberate
      sabotage, any standard for quantum-resistant cryptography is
      ludicrously premature, made without input from real-world systems
      or threats. As such it can only be harmful. </p>
    <p>The real damage DES did wasn't done by its being completely
      inadequate starting just three or four years after its
      introduction. The real and lasting damage was done by its being
      standardized.</p>
    <p>Because there was a standard, its use was mandated for many
      users. Because there was a standard, other users wound up using it
      because of FUD. Because there was a standard, research in
      cryptography was hobbled for a decade or more. Because there was
      a standard, American businesses (and much of the rest of the
      world) were left vulnerable to every spy and crook for almost
      twenty years. Because there was a standard, an entire generation
      of software and byte-level protocols was written with just seven
      or eight bytes of space for keys, and had to be refactored,
      patched, and shored up with improvised solutions in light of cold
      reality. And key management is the hardest part of cryptographic
      software, so inevitably a lot of those patch jobs were
      ill-conceived, badly designed, overly complex, and leaky.</p>
    <p>DES was a ludicrously premature standard that protected, just
      barely, against the threats facing civilian businesses as they
      were understood in 1977. IBM and the NSA either didn't
      understand, or deliberately ignored, the greatest threat facing
      cryptographically secured data: that computer speed and memory
      increase exponentially over time. It was secure, against
      attacks that actually mattered to most business users, for maybe
      five years, and making it a standard kept it in use for almost
      thirty. </p>
<p>And that is what ludicrously premature standards do, and that IMO
is why NIST's suggestions in this matter ought to be ignored. </p>
    <p>Premature standards are made without regard to the things that
      turn out to be the crucial risks facing real users, and they do
      not stand up when the real world differs from the theory that
      informed the standard. This is true even when the standard is an
      honest best effort in the first place. As several people have
      pointed out, this one might not be. In fact, considering the
      history, the odds aren't even very good.</p>
    <p>The odd fixation on using the "QC-resistant" lattice algorithm
      _by_itself_ is particularly suspicious. That algorithm hasn't yet
      had sufficient attention, expertise, and time devoted to its
      analysis. As such it should ONLY be deployed in multi-layered
      (hybrid) implementations alongside well-studied classical
      algorithms, as in the sketch below, and any recommendation to the
      contrary looks like a red flag that some kind of shenanigans are
      going on. </p>
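    <p>For concreteness, here is a minimal sketch of what
      "multi-layered" means in practice: derive the session key from
      both a classical exchange and the lattice KEM, so an attacker
      has to break both to recover it. The function names, the
      domain-separation label, and the concatenate-then-hash combiner
      are illustrative assumptions, not any particular standard's
      construction.</p>
    <pre>import hashlib

def combine_shared_secrets(classical_ss: bytes, pq_ss: bytes,
                           transcript: bytes) -> bytes:
    """Hypothetical hybrid combiner. The session key depends on BOTH
    shared secrets, so recovering it requires breaking both the
    classical exchange (e.g. X25519/ECDH) and the lattice KEM."""
    h = hashlib.sha3_256()
    h.update(b"hybrid-kem-demo-v1")  # domain-separation label (made up)
    h.update(classical_ss)           # secret from the classical exchange
    h.update(pq_ss)                  # secret from the lattice KEM
    h.update(transcript)             # bind the key to the handshake
    return h.digest()
</pre>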
    <p>NIST is, at the very least, striking before the iron even begins
      to get warm. We have no way of knowing, at this point, what good
      protection against quantum attacks will look like, and premature
      standards are _actively_counterproductive_ because they are
      inevitably formed on too little information and inevitably inhibit
      the development of diverse systems. When the iron does start to
      get hot, it would be nice to look over a stable of a few hundred
      different attempted solutions, weigh whatever the best ten or
      twelve turn out to be against whatever reality happens to turn out
      to be, and THEN try to discern the criteria a standard needs to
      address. Otherwise we have only one attempted solution (i.e., one
      premature standard) and all the eggs are in one basket. If that
      standard turns out to be wrong (or an act of deliberate security
      sabotage), then the world is one big omelette. </p>
    <p>The only take-home lesson I've got from quantum computing so
      far is that if, in some future version of the world, quantum
      computers are as reliable, fast, and scalable as classical
      computers, then ordinary symmetric-key crypto algorithms would
      need keys and internal state twice as long.</p>
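    <p>A back-of-the-envelope sketch of where that doubling comes
      from, assuming Grover's quadratic search speedup is the only
      quantum advantage that applies to symmetric keys (an idealized
      model that ignores the cost and depth of the quantum oracle):</p>
    <pre>def effective_bits_vs_grover(key_bits: int) -> int:
    """Grover searches N = 2**k keys in about sqrt(N) = 2**(k/2)
    queries, so a k-bit key gives roughly k/2 bits of security
    against an ideal quantum searcher."""
    return key_bits // 2

for k in (128, 256):
    print(f"{k}-bit key: ~{effective_bits_vs_grover(k)} bits vs. Grover")
# 128-bit key: ~64 bits vs. Grover
# 256-bit key: ~128 bits vs. Grover
</pre>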
    <p>And as some have pointed out, Quantum Cryptography Is Pure
      Bollocks so far as our practical experience today can tell us.</p>
<p>Bear</p>
</body>
</html>