[Cryptography] quantum computers & crypto
bear at sonic.net
Sat Oct 30 18:03:08 EDT 2021
On 10/30/21 3:55 AM, Joshua Marpet wrote:
> Adam and Victor have it right. Ray, fascinating idea!!! Revolver
> cryptography! There, I've named your new startup, you owe me at least
> a couple points on it. ;)
> Peter, holy crap, I'm totally stealing the homeopathic lasagna!!! ROTLFMAO
> Look, I think it's more a business issue than a technical issue right
> now. Again, in a few years, let's reassess. Hell, if it's a horrifying
> monstrosity of a problem in three years, I'll sponsor a dinner for the
> whole list at one of the conferences I help run. BSidesDE is mine, and
> I'm on the board of BSidesDC. Steak all round? :)
The technical end of it, though, is that short of actually deploying
QC-specific encryption algorithms, the things we need to do to get ready
for a post-QC world are harmless or beneficial in a pre-QC world - or
even in a non-QC world, if, as some expect, QC never becomes viable.
Everything we need to do to get ready to deploy post-QC encryption
algorithms is stuff that we arguably needed to do anyway.
All software with symmetric keys upgraded to handle keys twice as long
so symmetric crypto can be secure in a post-QC world? Harmless. If you
spot cases where key length doubling has a practical benefit in a non-QC
world, consider key length tripling instead.
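For instance, under the usual assumption that Grover's algorithm roughly halves the effective strength of a symmetric cipher, doubling from 128-bit to 256-bit keys restores the pre-QC security margin. A minimal sketch of that arithmetic (the constants and helper name here are my own illustration, not from the post above):

```python
import secrets

# Grover's algorithm roughly halves effective symmetric key strength,
# so a key twice as long restores the original security margin.
PRE_QC_KEY_BITS = 128                    # e.g. AES-128, fine classically
POST_QC_KEY_BITS = 2 * PRE_QC_KEY_BITS   # e.g. AES-256, Grover-resistant

def generate_key(bits: int) -> bytes:
    """Draw a fresh symmetric key of the requested size from the OS CSPRNG."""
    if bits % 8 != 0:
        raise ValueError("key size must be a whole number of bytes")
    return secrets.token_bytes(bits // 8)

key = generate_key(POST_QC_KEY_BITS)
assert len(key) == 32   # 256 bits
```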
The main threat of QC is to public-key systems, but QC public-key
algorithms do exist so we have some idea what kind of infrastructure
changes we need - and all of that infrastructure is either useful or
harmless. Mostly the infrastructure problem is that post-QC public-key
algorithms need MUCH longer keys than our key management software is
written to accommodate. Being able to handle ridiculously long keys in
key servers and pubkey software is again harmless, and in some cases
beneficial.
As an Old Phart I'm going to issue a reminder - in fact a dire warning -
about new algorithms that need longer keys getting stuck into software
that can't handle anything longer than what it's already handling. Y'all
other Old
Pharts still remember what debacle I'm talking about, right? If you
have protocols or libraries that reserve 8192 bits for a "fat" RSA key
and assume that's all the key any algorithm will ever need, it's time to
fix them - but it's BEEN time to fix them for years.
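A sketch of the kind of fix involved: replace a fixed-size key field with an explicit length prefix, so nothing in the format caps how big a key can be. The 4-byte prefix and helper names below are my own illustration, not any particular protocol:

```python
import struct

def encode_key(key: bytes) -> bytes:
    """Length-prefix the key instead of padding it into a fixed-size field."""
    return struct.pack(">I", len(key)) + key

def decode_key(blob: bytes) -> bytes:
    """Read the prefix, then exactly that many key bytes - no built-in cap."""
    (length,) = struct.unpack(">I", blob[:4])
    key = blob[4:4 + length]
    if len(key) != length:
        raise ValueError("truncated key field")
    return key

# A 1 MB post-QC public key round-trips as easily as a 2048-bit RSA key.
big_key = b"\x42" * (1024 * 1024)
assert decode_key(encode_key(big_key)) == big_key
```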
Longer keys affect other things too. For example, if you are responsible
for RNGs or PRNGs, it's time to audit code and make sure things stay
okay when software starts grabbing six-megabyte chunks of "pure"
randomness because it needs to generate a six-megabyte key.
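As a smoke test of that scenario - asking the OS CSPRNG for a six-megabyte chunk in a single call and checking it comes back whole (a sanity check of the plumbing, not a statistical randomness test):

```python
import os

SIX_MB = 6 * 1024 * 1024

# Pull one large chunk, the way generating a six-megabyte key might.
chunk = os.urandom(SIX_MB)

assert len(chunk) == SIX_MB      # no silent short read
assert chunk != bytes(SIX_MB)    # not all zeros (astronomically unlikely)
```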
And the last item - certs with keys for multiple algorithms, allowing
quick, easy, transparent upgrades if an algorithm is decertified? Best
idea EVER for protecting users from backward compatibility, and backward
compatibility has been something like 90% of our real-world day-to-day
security problems for decades. Why haven't we already been doing this
for the last 30 years? I thought of it to support the "contagious
decertification" idea, but it's useful for almost everything you need.
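One way to picture the multi-algorithm cert idea: a cert carries one key per algorithm in preference order, so decertifying an algorithm just removes that entry and verifiers transparently fall through to the next one. The structure and algorithm names below are a hypothetical illustration, not any existing cert format:

```python
from dataclasses import dataclass, field

@dataclass
class MultiAlgCert:
    """Toy certificate carrying keys for several algorithms at once."""
    subject: str
    # Algorithm name -> public key bytes, in preference order.
    keys: dict = field(default_factory=dict)
    decertified: set = field(default_factory=set)

    def usable_keys(self) -> dict:
        """Keys whose algorithms are still certified, in preference order."""
        return {alg: k for alg, k in self.keys.items()
                if alg not in self.decertified}

cert = MultiAlgCert(
    subject="example.org",
    keys={"rsa-2048": b"...", "pq-alg-A": b"...", "pq-alg-B": b"..."},
)
cert.decertified.add("rsa-2048")   # algorithm broken or withdrawn
assert list(cert.usable_keys()) == ["pq-alg-A", "pq-alg-B"]
```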