[Cryptography] Crypto Standards vs. Engineering habits - Was: NIST about to weaken SHA3?

Jerry Leichter leichter at lrw.com
Tue Oct 1 11:59:26 EDT 2013


On Oct 1, 2013, at 3:29 AM, Dirk-Willem van Gulik <dirkx at webweaving.org> wrote:
> ...I do note that in crypto (possibly driven by the perceived expense of too many bits) we tend to very carefully observe the various bit lengths found in 800-78-3, 800-131A, etc., and rarely go much beyond them*.
> 
> While in a lot of other fields it is very common, for 'run of the mill' constructions such as a floor, a wooden support beam, or a joist, to take the various standards and liberally apply safety factors. A factor of 10 or 20x stronger than needed is quite common, *especially* in 'consumer' constructions....
It's clear what "10x stronger than needed" means for a support beam:  We're pretty good at modeling the forces on a beam and we know how strong beams of given sizes are.  We have *no* models for "the strength of a crypto system" that would allow one to meaningfully make such comparisons in general.  It's like asking that houses be constructed to survive intact even when hit by the Enterprise's tractor beam.

Oh, if you're talking brute force, sure, 129 bits takes twice as long as 128 bits.  But even attacking a 128-bit cipher by brute force is way beyond anything we can even sketch today, and 256 bits is getting into "if you could use the whole known universe as a computer, it would take you more than the life of the universe" territory.
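To put rough numbers on that, here's a back-of-the-envelope sketch in Python.  The keys-per-second rate is a made-up, generously fast attacker - the point is the scaling, not the constant:

    # Expected brute-force search time.  The 10**18 keys/second rate is
    # a hypothetical, very generous attacker; only the scaling matters.
    SECONDS_PER_YEAR = 31_557_600

    def brute_force_years(key_bits, keys_per_second=10**18):
        # On average the key turns up after searching half the keyspace.
        return 2 ** (key_bits - 1) / keys_per_second / SECONDS_PER_YEAR

    for bits in (56, 128, 129, 256):
        print(f"{bits}-bit key: ~{brute_force_years(bits):.2e} years")

At that rate 56 bits falls in a fraction of a second, 128 bits takes on the order of 5e12 years - a few hundred times the age of the universe - and 256 bits is around 2e51 years.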

If, on the other hand, you're talking analytic attacks, there's no way to know ahead of time what matters.  The ultimate example of this occurred back when brute force attacks against DES, at 56 bits, were clearly on the horizon - so people proposed throwing away the key schedule and making the key the full expanded schedule of 768 bits.  Many times more secure - except then differential cryptanalysis was (re-)discovered and it turned out that 768-bit DES was essentially no stronger than 56-bit DES.
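For the record, the arithmetic behind that proposal (a quick sketch; the constants are just DES's published parameters):

    # DES: 16 rounds, each keyed by a 48-bit subkey that is normally
    # derived from the 56-bit master key by the key schedule.
    ROUNDS, SUBKEY_BITS, MASTER_KEY_BITS = 16, 48, 56

    print(ROUNDS * SUBKEY_BITS, "vs.", MASTER_KEY_BITS)  # 768 vs. 56
    # Differential cryptanalysis attacks the round function itself, so
    # the independent-subkey variant bought essentially nothing despite
    # the vastly larger nominal keyspace.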

There are three places I can think of where the notion of "adding a safety factor" makes sense today; perhaps someone can add to the list, but I doubt it will grow significantly longer:

1.  Adding a bit to the key size when the key is small enough that brute force is a realistic threat;
2.  Using multiple encryption with different mechanisms and independent keys (a sketch of this appears after the list);
3.  Adding rounds to a round-based symmetric encryptor of the design we currently use pretty universally (multiple S and P transforms with some keying information mixed in at each round, repeated for some number of rounds).  In a good cipher designed according to our best practices today, the best attacks we know of extend to some number of rounds and then just die - i.e., beyond that point they do no better than brute force.  Adding a few more rounds beyond that point makes sense.  But ... if you think adding many more makes sense, you're into tin-foil hat territory.  We understand what certain attacks look like and we understand how they (fail to) extend beyond some number of rounds - but the next attack down the pike, about which we have no theory, might not be sensitive to the number of rounds at all.
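As an illustration of case 2, a two-cipher cascade might look something like the following minimal Python sketch, using the pyca/cryptography package.  This is not a vetted construction - a real design would also need authentication and careful nonce management; it only shows the "independent mechanisms, independent keys" shape:

    import os
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    def cascade_encrypt(plaintext):
        # Independent random keys and nonces for each layer.
        k1, n1 = os.urandom(32), os.urandom(16)   # AES-256 key, CTR nonce
        k2, n2 = os.urandom(32), os.urandom(16)   # ChaCha20 key, 16-byte nonce
        aes = Cipher(algorithms.AES(k1), modes.CTR(n1)).encryptor()
        inner = aes.update(plaintext) + aes.finalize()
        chacha = Cipher(algorithms.ChaCha20(k2, n2), mode=None).encryptor()
        outer = chacha.update(inner) + chacha.finalize()
        return outer, (k1, n1, k2, n2)

Decryption just runs the two keystreams in the opposite order.  Since both layers are additive stream ciphers, the ciphertext is the plaintext XORed with two independently keyed keystreams, so recovering the plaintext means defeating both layers.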

These arguments apply to some other primitives as well, particularly hash functions.  They *don't* apply to asymmetric cryptography, except perhaps for case 2 above - though it may not be so easy to apply.  For asymmetric crypto, the attacks are all algorithmic and mathematical in nature, and the game is different.
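To see just how different, compare the comparable-strength table from NIST SP 800-57 Part 1: symmetric strength scales a bit per key bit, while the RSA modulus needed to keep pace grows much faster than linearly:

    # Comparable strengths per NIST SP 800-57 Part 1
    # (symmetric security bits -> RSA modulus size in bits).
    RSA_EQUIVALENT = {80: 1024, 112: 2048, 128: 3072, 192: 7680, 256: 15360}

    for sym, rsa in RSA_EQUIVALENT.items():
        print(f"~{sym}-bit symmetric strength ~= RSA-{rsa}")

Doubling the symmetric level from 128 to 256 bits means quintupling the RSA modulus from 3072 to 15360 bits - "adding a few bits" is simply not how safety margins work there.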
                                                        -- Jerry


