[Cryptography] How does the size of a set of target results influence the complexity of a preimage attack?

Natanael natanael.l at gmail.com
Sun Oct 18 06:44:53 EDT 2015


Den 17 okt 2015 11:22 skrev "Natanael" <natanael.l at gmail.com>:
> 128-bit hashes can be attacked the same way with standard birthday
collisions at 2^64 operations. Also note that a larger target set reduces
the work it takes almost linearly IIRC, as 2^64 applies for one single
target hash to match. 2^16 hashes in the set to match thus reduces it to
2^40, I think.
>
> Just stick with strong 192-bit hashes or longer. SHA-256 is good enough,
but SHA-3 ought to be a good choice too if you're starting from scratch.

I misremembered things here, sorry.

The pre-existing set of files can be considered precomputation done for
you, but on average about 2^(128/2) = 2^64 total hashes still need to be
computed to find a collision. With, for example, 2^32 existing hashes for
4 billion files, the effect isn't really noticeable:

2^(log2(2^64 - 2^32)) ≈ 2^63.9999999996641 work factor

However, you do get collisions more often once you've passed this number of
computed hashes: the expected number of colliding pairs grows roughly
quadratically, so at 2^65 hashes you can expect about 2 collisions already,
and so on.
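
For concreteness, here's a small Python sketch (just my own back-of-the-envelope
arithmetic, using the 2^32 figure from above) that reproduces these numbers:

    # Rough birthday-bound arithmetic for a 128-bit hash.
    from math import log2

    N = 2**128           # size of the hash output space
    precomputed = 2**32  # hashes that already exist "for free"

    # Work factor to reach ~2^64 total hashes when 2^32 already exist:
    print(log2(2**64 - precomputed))  # ~63.9999999997, a negligible head start

    # Expected number of colliding pairs after computing n hashes,
    # roughly n^2 / (2 * N) for n much smaller than N:
    for n in (2**64, 2**65):
        print(n, n**2 / (2 * N))      # ~0.5 at 2^64, ~2 at 2^65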

Since 2^(192/2) = 2^96, which will likely remain out of reach for decades,
that recommendation still stands. Don't go below that number.
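
For comparison, a quick sketch (again just my own arithmetic, assuming only
generic birthday attacks and no structural weaknesses) of the collision work
factor at common digest sizes:

    # Generic birthday bound: ~2^(n/2) work for an n-bit hash,
    # ignoring structural attacks like the MD5 one mentioned below.
    for bits in (128, 160, 192, 256):
        print(f"{bits}-bit hash: ~2^{bits // 2} collision work")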

Also, I just remembered the Flame MD5 certificate collision attack. That's
yet another attack algorithm faster than brute force.