brute force physics Was: cleversafe...

Jerry Leichter leichter at lrw.com
Tue Aug 11 19:47:54 EDT 2009


On Aug 10, 2009, at 4:42 AM, Alexander Klimov wrote:

> On Sun, 9 Aug 2009, Jerry Leichter wrote:
>> Since people do keep bringing up Moore's Law in an attempt to justify
>> larger keys to make our systems "stronger than cryptography," it's worth
>> keeping in mind that we are approaching fairly deep physical limits.
>> I wrote about this on this list quite a while back.  If current
>> physical theories are even approximately correct, there are limits to
>> how many "bit flips" (which would encompass all possible binary
>> operations) can occur in a fixed volume of space-time.
>
> A problem with this reasoning is that the physical world and the usual
> digital computers have an exponential simulation gap (it is known at
> least in one direction: to simulate N entangled particles on a digital
> computer one needs computation exponential in N). This can be
> considered a reason to suspect that the physical world is
> non-polynomially faster than the usual computers (probably even to an
> extent that all NP computations can be simulated).
When the first results about exponential quantum speedup - for
factoring - came out, people assumed that this was true in general.
But it isn't.  In particular, simple search - where you have only an
equality test, so you can't build a hash table or any kind of ordered
structure - is O(N) on a traditional computer and O(sqrt(N)) on a
quantum computer (that's Grover's algorithm, and the sqrt(N) bound is
known to be optimal).  I'm not sure what the current state of
knowledge is about what a quantum machine can do for NP computations
in general, but there's no "probably" here.
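
To put rough numbers on that quadratic speedup, here is a
back-of-the-envelope sketch (Python; purely illustrative).  Grover's
square-root search means a k-bit key withstands only about 2^(k/2)
quantum search steps - which is exactly why doubling the key length
restores the classical security margin:

    import math

    # Grover searches an unstructured space of size N in ~sqrt(N)
    # quantum queries, so a k-bit key gives roughly k/2 bits of
    # security against a quantum brute-force search.
    def grover_effective_bits(key_bits: int) -> float:
        n = 2 ** key_bits                 # size of the key space
        quantum_queries = math.isqrt(n)   # ~sqrt(N) Grover iterations
        return math.log2(quantum_queries)

    for k in (128, 192, 256):
        print(f"AES-{k}: ~2^{grover_effective_bits(k):.0f} quantum queries")
    # AES-128: ~2^64, AES-192: ~2^96, AES-256: ~2^128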

> While it is possible to use the physical world to simulate usual
> computers in the straightforward way (namely, by using voltage levels
> of a circuit to represent separate bits), it is not clear that this is
> the best way to compute. For example, if the meaning of our
> computations is simulation of the physical world, then it can be
> better to use a direct physical-to-physical mapping instead of
> physical-to-usual followed by usual-to-physical: analog computers,
> such as wind tunnels, are still in use.
>
> I am not aware of any plausible argument why a brute force search in
> general (the quintessence of the NP class, by the way) or a key search
> against any particular algorithm cannot be implemented in the direct
> way significantly faster than in the indirect way, that is,
> NP-to-physical instead of NP-to-usual followed by usual-to-physical.
> All the fuss about quantum computing exists exactly because people
> believe that a different mapping (not through usual computers) can be
> more efficient (if I understand correctly, right now neither the class
> of algorithms that can be sped up this way is understood, nor do
> quantum computers of practical capacity exist).
The physical arguments to which I was referring say *nothing* about
how the computation is done.  It can be a QM computation as well.

In any case, the simple search result above applies directly to brute
force:  for that problem, you only get a polynomial (in fact,
quadratic) speedup anyway.
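
To make "deep physical limits" concrete, here is a rough sketch of the
standard thermodynamic argument (Landauer's bound of kT*ln(2) joules
per irreversible bit operation; the temperature and the world energy
figure are illustrative assumptions, and real hardware runs many
orders of magnitude above this floor):

    import math

    # Landauer's principle: erasing one bit costs at least k*T*ln(2)
    # joules.  Charge each step of a brute-force counter one bit flip,
    # a deliberately generous lower bound.
    BOLTZMANN = 1.380649e-23          # J/K
    T = 300.0                         # kelvin; room temperature (assumed)
    ENERGY_PER_FLIP = BOLTZMANN * T * math.log(2)   # ~2.9e-21 J

    WORLD_ANNUAL_ENERGY = 6e20        # J/year; rough illustrative figure

    for bits in (128, 192, 256):
        joules = (2 ** bits) * ENERGY_PER_FLIP
        years = joules / WORLD_ANNUAL_ENERGY
        print(f"2^{bits} flips: {joules:.1e} J "
              f"(~{years:.1e} years of world energy output)")

Even at this unreachable thermodynamic floor, 2^256 bit flips would
cost on the order of 10^35 years of the world's entire energy output -
which is the sense in which key sizes past a certain point buy nothing
against brute force.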

>> All the protocols and standards out there calling for AES-256 - it's
>> obviously "better" than AES-128 because after all 256 is *twice as
>> large* as 128! - were just a bunch of nonsense.  And, perhaps,
>> dangerous nonsense.
>
> I see the situation in a positive way: the recent AES attacks
> stress the fact that key management should be done
> correctly; in particular, keys should be derived through a KDF (not
> by a simple xor) and must be authenticated. With this attack in
> hand it is much easier for us now to say why one should not use
> K to encrypt messages of one type and K+1 for another type, or
> why it is a bad idea to encrypt a key in CTR mode and store the
> result without a MAC. I doubt it is possible to find any
> professionally designed protocol or standard that becomes weak
> due to the recent discovery.
That's a ... bizarre point of view.  :-)  Should freedom from
related-key attacks be part of the definition of a "secure" encryption
algorithm?  We should decide that on some rational basis, not on
whether, with care, we can avoid such attacks.  Clearly, a system that
*is* secure against such attacks is more robust.  Do we know how to
build such a thing?  What's the cost of doing so?  But to say it's an
*advantage* to have a weakness seems like some kind of odd moral
argument:  if you're hurt by this, it's because you *deserve* to be.
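
For what it's worth, the key-separation discipline Alexander describes
is cheap to get right.  Here is a minimal sketch of deriving
independent per-purpose keys through a KDF (HKDF per RFC 5869, built
from Python's standard hmac module; the master key and the labels are
illustrative placeholders):

    import hmac, hashlib

    # HKDF (RFC 5869): derive independent subkeys from one master key,
    # instead of using related keys such as K and K+1.
    def hkdf_sha256(master_key: bytes, info: bytes, length: int = 32,
                    salt: bytes = b"") -> bytes:
        # Extract: concentrate the master key's entropy into a PRK.
        prk = hmac.new(salt or b"\x00" * 32, master_key,
                       hashlib.sha256).digest()
        # Expand: stretch the PRK into `length` bytes bound to `info`.
        okm, block, counter = b"", b"", 1
        while len(okm) < length:
            block = hmac.new(prk, block + info + bytes([counter]),
                             hashlib.sha256).digest()
            okm += block
            counter += 1
        return okm[:length]

    master = b"\x00" * 32                       # placeholder master key
    enc_key = hkdf_sha256(master, b"encrypt")   # one key per purpose...
    mac_key = hkdf_sha256(master, b"mac")       # ...never K, K+1, K^c

Distinct info labels yield keys with no exploitable relation, so the
cipher's related-key behavior never comes into play - but that's an
argument for doing key management well, not for calling the weakness
an advantage.
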
                                                         -- Jerry

---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to majordomo at metzdowd.com


