[Cryptography] quantum computers & crypto

Ray Dillinger bear at sonic.net
Fri Oct 29 21:21:24 EDT 2021


Many of the software fixes for key distribution infrastructure relate to
the sheer size of post-quantum keys.  But those aren't the biggest
problems or the most difficult changes.

As I see it, the biggest problem we face is one of the biggest problems
we've always faced - key management.  QC presents problems that will
require a return to at least some form of algorithm agility, and we
don't want to hand control of that agility to "cipher and key
negotiation" that allows users to fail to upgrade, or gives attackers
their choice of attacks against users who haven't been warned or who've
disregarded the warnings.

Post-QC math is not a mature field, and discoveries are being made all
the time.  We have very little assurance that any algorithm we've
already thought of won't be found next year to be equivalent, after
some transformation, to a problem we already know how to solve.  So
it's going to require some degree of agility.  Here is my proposal.

Software ought to have three to seven algorithms, at most, built in.  No
plugins of any kind; plugins are an attack surface.  Not even any
use-time negotiation among those algorithms: they're strictly ordered,
they become unavailable when de-certified, and the software always uses
the first one still available.  If all of those algorithms have been
de-certified, that program, or at least that version of it, has reached
the end of its useful life and ceases to work, period.

Ideally that's implemented as contagious de-certification - with some
form of accepted, certified proof that the de-certification is real.
Refusing the current "standard" algorithm requires showing proof of
de-certification, so the refusal can't be spoofed, and having seen that
proof the other party to the connection must immediately start refusing
that algorithm flatly, roll forward to the designated "next"
still-certified algorithm, and show the de-certification proof to any
contactee that attempts to use the old one.
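Roughly, the connection-side behavior I'm describing might look like
this in Python.  How de-certification proofs get signed, and by whom, I
am leaving open - the verify() step below is just a placeholder for
whatever trusted mechanism ends up certifying the proofs, and every name
here is made up:

    from typing import Optional

    class DecertProof:
        """A verifiable statement that an algorithm has been de-certified."""
        def __init__(self, algorithm: str, signature: bytes):
            self.algorithm = algorithm
            self.signature = signature

        def verify(self) -> bool:
            # Placeholder: in a real system this checks the signature against
            # a de-certification authority key pinned into the software.
            raise NotImplementedError

    known_proofs = {}   # algorithm name -> accepted DecertProof

    def on_peer_refusal(proof: DecertProof) -> None:
        """The peer refused our current algorithm and presented a proof."""
        if not proof.verify():
            return                              # unproven refusal: ignore it
        known_proofs[proof.algorithm] = proof   # remember it and pass it along
        # Rolling forward is implicit: the selection rule above now skips it.

    def on_peer_offer(offered: str) -> Optional[DecertProof]:
        """If a contactee offers a de-certified algorithm, answer with the proof."""
        return known_proofs.get(offered)

Note that an unproven refusal is simply ignored: nothing a peer says,
short of a verifiable proof, can talk the software back into an older
algorithm.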

Users get no choice and no "go-backs", and they convert whether or not
they've had any warning, because the attacker will make choices
maliciously and strike hoping to find users who haven't converted.

But if de-certification is contagious and automatic, then it hits when
people aren't ready for it - the same way the new attacks will hit if
de-certification isn't contagious and automatic.  The problem with that
is all the key servers holding certs for de-certified algorithms, and
all the customers whose certs are in terms of de-certified algorithms.
This is where we have always gotten the hard pushback from willing
victims who won't give up their ass-backward compatibility.

In order to prevent willing victims from demanding that their info and
money be handed to crooks, software must be designed for glitch-free
continued operation across algorithm de-certifications.  First we need
to redesign security certs.  Each and every security cert compatible
with a given program must carry keys for not just one algorithm but for
every algorithm that program might use.  When contagious
de-certification hits a key server, it ceases to provide any cert
information about de-certified keys and rolls onward to the same "next"
non-de-certified algorithm that the software will use (becoming a new
vector of contagion itself in the process).  That way, as long as
there's even one certified algorithm remaining, users don't get caught
flatfooted and programs continue without crashes.  When a new software
version adds algorithms, users should be invited to update their certs
to the new version to be compatible with the new ciphers.
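In Python-ish terms, and still with hypothetical names, the cert and the
key server's roll-forward behavior amount to something like this:

    from dataclasses import dataclass

    @dataclass
    class MultiAlgCert:
        subject: str
        keys: dict          # algorithm name -> public key for that algorithm

    def serve_cert(cert: MultiAlgCert, decertified: set) -> dict:
        """Key-server view: withhold entries for de-certified algorithms entirely."""
        return {alg: key for alg, key in cert.keys.items()
                if alg not in decertified}

As long as even one built-in algorithm is still certified, the served
cert is non-empty and the client's selection rule above lands on a
usable key with no interruption.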

That's all just my opinion.  But I don't mind if other folks use it.

            Bear
