[Cryptography] Sha3

Jerry Leichter leichter at lrw.com
Fri Oct 4 10:38:20 EDT 2013


On Oct 1, 2013, at 5:34 AM, Ray Dillinger <bear at sonic.net> wrote:
> What I don't understand here is why the process of selecting a standard algorithm for cryptographic primitives is so highly focused on speed. 
If you're going to choose a single standard cryptographic algorithm, you have to consider all the places it will be used.  These range from very busy front ends - where people complain to this day (perhaps with little justification, but they believe it's a problem for them) that doing an RSA operation per incoming connection is too expensive, and larger keys will only make it worse - to phones, where power requirements matter more than raw speed; to embedded devices, where people still use cheap, slow parts because every penny counts; to all kinds of devices that have to run for a long time off a local battery, or even off the microwatts of power transferred to an unpowered device by a reader.
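
As a rough illustration of why front-end operators worry about per-connection asymmetric cost: a minimal timing sketch, assuming the third-party Python "cryptography" package (the key sizes and iteration count here are arbitrary choices for illustration, not anything from this thread):

    # Sketch: compare RSA private-key operation cost at two key sizes.
    # Assumes the third-party "cryptography" package (pip install cryptography).
    import time
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    def time_signing(key_size, iterations=50):
        key = rsa.generate_private_key(public_exponent=65537, key_size=key_size)
        message = b"per-connection handshake payload"
        start = time.perf_counter()
        for _ in range(iterations):
            key.sign(message, padding.PKCS1v15(), hashes.SHA256())
        elapsed = time.perf_counter() - start
        print(f"RSA-{key_size}: {elapsed / iterations * 1000:.2f} ms per private-key op")

    for size in (2048, 4096):
        time_signing(size)

The private-key operation cost grows roughly cubically with the modulus size, which is the sense in which larger keys "only make it worse."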
 
> We have machines that are fast enough now that while speed isn't a non-issue, it no longer deserves the precedence the process gives it.
We do have such machines, but we also have - and will have, for the foreseeable future - machines for which this is *not* the case.

Deciding where to draw the line and saying "I don't care if you can support this algorithm in a sensor designed to be put in a capsule and swallowed to transmit pictures of the GI tract for medical analysis" is not a scientific question; it's a policy question.

> Our biggest problem now is security, not speed. I believe it's a bit silly to aim for the minimum acceptable security achievable within speed constraints, when experience shows that each new class of attack is usually first seen against some limited form of the cipher, or found to be effective only when the cipher is not carried through its full number of rounds.
The only problem with this argument is that "the biggest problem" is hard to pin down.  There's little evidence that the symmetric algorithms we have today are significant problems.  There is some evidence that some of the asymmetric algorithms may have problems due to key size, or deliberate subversion.  Fixing the first of these does induce significant costs; fixing the second requires, first of all, some knowledge of the nature of the subversion.  But beyond all this, the "biggest problems" we've seen have to do with other components - random number generators, protocols, infiltration of trusted systems, and so on.  None of these is amenable to defense by removing constraints on performance.  (The standardized random number generator that "ignored performance to be really secure" turned out to be anything but!)
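
For what it's worth, the generator in question - presumably Dual_EC_DRBG - also shows that the defense is sourcing, not cycles: draw randomness from a vetted OS interface rather than from an application-level DRBG whose provenance you cannot audit.  A minimal sketch (the interface choice is mine, not anything proposed in this thread):

    # Sketch: prefer the operating system's CSPRNG over any application-level
    # DRBG you cannot audit.  The secrets module draws from the kernel's
    # entropy pool on modern platforms.
    import secrets

    session_key = secrets.token_bytes(32)   # 256-bit symmetric key material
    nonce = secrets.token_bytes(16)         # per-message nonce
    print(session_key.hex(), nonce.hex())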

We're actually moving in an interesting direction.  At one time, the cost of decent crypto algorithms was high enough to be an issue for most hardware.  DES at the speed of the original 10Mb/sec Ethernet was a significant engineering accomplishment!  These days, even the lowest-end "traditional computer" has plenty of spare CPU to run even fairly expensive algorithms - but at the same time we're pushing more and more into a world of tiny, low-powered machines everywhere.  The gap in speed, supportable power consumption, and memory between the average "large" machine and the average "tiny" machine is wider than it's ever been.  At the low end, the exposure is different:  an attacker typically has to be physically close even to talk to the device, there are only a small number of communications partners, and any given device holds relatively little information.  Perhaps a lower level of security is appropriate in such situations.  (Of course, as we've seen with SCADA systems, there's a temptation to just put these things directly on the Internet - in which case all these assumptions fail.  A higher-level problem:  if you take this approach, you need to make sure the devices are only reachable through a gateway with sufficient power to run stronger algorithms.  How do you do that?)
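
To make the gateway pattern concrete, here is a minimal sketch: the gateway terminates strong crypto on the Internet side and speaks a cheap local protocol to the device, which is never directly reachable.  All addresses, ports, and file names are hypothetical placeholders, not part of any standard:

    # Sketch: TLS-terminating gateway in front of a low-power device.
    # The device listens only on a link the gateway controls; the Internet
    # side sees only the gateway's TLS endpoint.
    import socket, ssl

    DEVICE_ADDR = ("192.168.10.2", 9000)     # hypothetical local-only device

    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.load_cert_chain("gateway-cert.pem", "gateway-key.pem")  # hypothetical files

    with socket.create_server(("0.0.0.0", 8443)) as listener:
        with ctx.wrap_socket(listener, server_side=True) as tls_listener:
            conn, _ = tls_listener.accept()          # strong crypto ends here
            request = conn.recv(4096)
            # Cheap hop on the controlled local link; the device needs only
            # whatever lightweight protection its power budget allows.
            with socket.create_connection(DEVICE_ADDR) as device:
                device.sendall(request)
                conn.sendall(device.recv(4096))
            conn.close()

Nothing in such a design by itself prevents someone from plugging the device straight into the Internet - which is exactly the SCADA failure mode Jerry describes.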

So perhaps the assumption that needs to be reconsidered is that we can design a single algorithm suitable across the entire spectrum.  Currently we have AES-128 and AES-256, but exactly why one should choose one over the other has never been clear - AES-256 is somewhat more expensive, but I can't think of any examples where AES-128 would be practical and AES-256 would not.  In practice, when CPU is thought to be an issue (rightly or wrongly), people have gone with RC4 - standards be damned.
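
For a sense of scale of "somewhat more expensive": AES-256 runs 14 rounds to AES-128's 10, so the per-block work differs by roughly 40%.  A rough throughput sketch, again assuming the third-party "cryptography" package (buffer size and iteration count are arbitrary):

    # Sketch: relative cost of AES-128 vs AES-256 in CTR mode.
    # Assumes the third-party "cryptography" package.
    import os, time
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    def throughput(key_bits, mebibytes=64):
        key = os.urandom(key_bits // 8)
        nonce = os.urandom(16)
        data = os.urandom(1024 * 1024)          # 1 MiB buffer
        enc = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
        start = time.perf_counter()
        for _ in range(mebibytes):
            enc.update(data)
        elapsed = time.perf_counter() - start
        print(f"AES-{key_bits}: {mebibytes / elapsed:.0f} MiB/s")

    for bits in (128, 256):
        throughput(bits)

On hardware with AES-NI the absolute numbers are high enough that the gap rarely matters; on a microcontroller without it, it can.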

It is worth noting that NSA seems to produce suites of algorithms optimized for particular uses and targeted at different levels of security.  Maybe it's time for a similar approach in public standards.
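
A public version of that approach might look like nothing more exotic than named profiles binding algorithms to device classes.  A toy sketch of the idea (every name and choice below is illustrative, not a proposal):

    # Toy sketch: named algorithm profiles keyed to device class, in the
    # spirit of per-use-case suites.  All names and choices are illustrative.
    PROFILES = {
        # busy front ends: favor hardware-accelerated primitives
        "server": {"cipher": "AES-256-GCM", "hash": "SHA-256", "kex": "ECDHE-P256"},
        # phones: power matters more than raw speed
        "mobile": {"cipher": "AES-128-GCM", "hash": "SHA-256", "kex": "ECDHE-P256"},
        # microwatt-class sensors behind a mandatory gateway
        "sensor": {"cipher": "lightweight-cipher-TBD",
                   "hash": "truncated-hash-TBD",
                   "kex": "pre-shared-key"},
    }

    def negotiate(device_class):
        """Return the approved profile for a device class; refuse unknowns."""
        try:
            return PROFILES[device_class]
        except KeyError:
            raise ValueError(f"no approved profile for {device_class!r}")

    print(negotiate("mobile"))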

                                                        -- Jerry



