[Cryptography] Crypto Standards vs. Engineering habits - Was: NIST about to weaken SHA3?

Dirk-Willem van Gulik dirkx at webweaving.org
Tue Oct 1 12:27:54 EDT 2013


On 1 Oct 2013, at 17:59, Jerry Leichter <leichter at lrw.com> wrote:

> On Oct 1, 2013, at 3:29 AM, Dirk-Willem van Gulik <dirkx at webweaving.org> wrote:
>> ...I do note that in crypto (possibly driven by the perceived expense of too many bits) we tend to very carefully observe the various bit lengths found in 800-78-3, 800-131A, etc., and rarely go much beyond them*.
>> 
>> While in a lot of other fields it is very common, for 'run of the mill' constructions such as a floor, a wooden support beam, or a joist, to take the various standards and liberally apply safety factors. A factor of 10 or 20x stronger than needed is quite common, *especially* in 'consumer' constructions…

> It's clear what "10x stronger than needed" means for a support beam:  We're pretty good at modeling the forces on a beam and we know how strong beams of given sizes are.  

Actually - do we? I picked this example because it is one of those where the 'we know' falls apart on closer examination. Wood varies a lot, and our ratings are very rough. We drill holes through it; we use hugely varying ways to glue/weld/join it. And we liberally apply safety factors everywhere, with a lot of 'otherwise it does not feel right' throughout. In all fairness, while you can get a bunch of engineers to agree that 'it is strong enough', they would argue endlessly, and give 'it depends' sorts of answers, if you asked them "how strong is it, really?"

> Oh, if you're talking brute force, sure, 129 bits takes twice as long as 128 bits.  
...
> If, on the other hand, you're talking analytic attacks, there's no way to know ahead of time what matters.  
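
(As a back-of-the-envelope illustration of that brute-force scaling - a minimal Python sketch; the ~9e10 keys/second throughput is an assumption, roughly what the EFF's 1998 'Deep Crack' machine managed against 56-bit DES:)

    # Back-of-the-envelope: expected brute-force time vs. key length.
    # Assumes ~9e10 keys/second (about Deep Crack's 1998 throughput);
    # on average half the keyspace must be searched before a hit.
    KEYS_PER_SECOND = 9e10  # assumed hardware throughput

    def expected_years(key_bits: int) -> float:
        """Average exhaustive-search time, in years."""
        expected_tries = 2 ** (key_bits - 1)  # half the keyspace on average
        return expected_tries / KEYS_PER_SECOND / (3600 * 24 * 365)

    for bits in (56, 57, 64, 128, 129):
        print(f"{bits:3d}-bit key: ~{expected_years(bits):.3g} years")

56-bit DES comes out at a few days; each extra bit doubles that, so 129 bits is indeed just twice 128 - and both are comfortably out of exhaustive-search range.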

So I think you are hitting the crux of the matter - the material we work with, like most materials, is not that easy to gauge. But then consider your example of DES:

> The ultimate example of this occurred back when brute force attacks against DES, at 56 bits, were clearly on the horizon - so people proposed throwing away the key schedule and making the key the full expanded schedule of 448 bits, or whatever it came to.  Many times more secure - except then differential cryptanalysis was (re-)discovered and it turned out that 448-bit DES was no stronger than 56-bit DES.

with hindsight we can conclude that, despite all this - despite all the various institutions and interests conspiring, fighting and collaborating - the process yielded us a fair level of safety for a fair number of years. And that is roughly what we got.

Sure - that relied on 'odd' things, like the S-boxes getting strengthened behind the scenes, and the EFF stressing that a brute-force hardware device was 'now' cheap enough to build. But by and large, these were more or less done 'on time'.

So I think we roughly got the minimum about right with DES. 

The thing which fascinates/strikes me as odd is that this minimum is then exactly what we all implemented. Not more. Not less. No safety margin; nothing. Just a bit of hand-waving about how complex it all is and how hard it is to predict - so we listen to NIST* et al., and that is it.

*Despite* the fact that, as you so eloquently argue, the material we work with is notoriously unpredictable, finicky, and full of uncontrolled unknowns.

And any failures or issues come back to haunt us, not NIST et al.

> There are three places I can think of where the notion of "adding a safety factor" makes sense today; perhaps someone can add to the list, but I doubt it will grow significantly longer:
> 
> 1.  Adding a bit to the key size when that key size is small enough;
> 2.  Using multiple encryption with different mechanisms and independent keys;
> 3.  Adding rounds to a round-based symmetric encryptor of the design we currently use pretty universally (multiple S and P transforms with some keying information mixed in per round, repeated for multiple rounds).  In a good cipher designed according to our best practices today, the best attacks we know of extend to some number of rounds and then just die - i.e., after some number of rounds they do no better than brute force.  Adding a few more beyond that makes sense.  But ... if you think adding many more beyond that makes sense, you're into tin-foil hat territory.  We understand what certain attacks look like and we understand how they (fail to) extend beyond some number of rounds - but the next attack down the pike, about which we have no theory, might not be sensitive to the number of rounds at all.

Agreed - and perhaps we should develop some routine practices around which way you layer; i.e., what is best wrapped inside what, where you apply (or avoid) padding, and how you get the most out of IVs.
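
(For illustration, a minimal sketch of such layering - your point 2, two unrelated AEAD ciphers with independent keys - using the Python 'cryptography' package; the helper names are mine, and key/nonce management is deliberately glossed over:)

    # Cascade-encryption sketch: AES-GCM wrapped inside ChaCha20-Poly1305,
    # with independent keys and fresh nonces. Illustrative only - not a
    # vetted construction.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM, ChaCha20Poly1305

    def cascade_encrypt(plaintext: bytes):
        k_inner = AESGCM.generate_key(bit_length=256)      # independent key 1
        k_outer = ChaCha20Poly1305.generate_key()          # independent key 2
        n_inner, n_outer = os.urandom(12), os.urandom(12)  # fresh 96-bit nonces
        inner = AESGCM(k_inner).encrypt(n_inner, plaintext, None)
        outer = ChaCha20Poly1305(k_outer).encrypt(n_outer, inner, None)
        return outer, (k_inner, n_inner, k_outer, n_outer)

    def cascade_decrypt(ciphertext: bytes, keys) -> bytes:
        k_inner, n_inner, k_outer, n_outer = keys
        inner = ChaCha20Poly1305(k_outer).decrypt(n_outer, ciphertext, None)
        return AESGCM(k_inner).decrypt(n_inner, inner, None)

    ct, keys = cascade_encrypt(b"attack at dawn")
    assert cascade_decrypt(ct, keys) == b"attack at dawn"

The classic Maurer-Massey result is that such a cascade is at least as strong as its innermost cipher, provided the keys are truly independent - so the extra layer can only help.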
 
> These arguments apply to some other primitives as well, particularly hash functions.  They *don't* apply to asymmetric cryptography, except perhaps for case 2 above - though it may not be so easy to apply.  For asymmetric crypto, the attacks are all algorithmic and mathematical in nature, and the game is different.

Very good point (I did not consider PKIs at all when I wrote the above).

Thanks,

Dw

*: s/NIST/your local applicable standards body or politically correct regulator/g.