[Cryptography] Google announces practical SHA-1 collision attack

Kevin W. Wall kevin.w.wall at gmail.com
Tue Feb 28 21:52:18 EST 2017


On Tue, Feb 28, 2017 at 9:18 AM, Theodore Ts'o <tytso at mit.edu> wrote:
> On Sun, Feb 26, 2017 at 09:04:45PM -0600, Nikita Borisov wrote:
>> On Sat, Feb 25, 2017 at 5:06 PM, Peter Gutmann <pgut001 at cs.auckland.ac.nz>
>> wrote:
>>
>> > They announced an attack that requires a nation-state's worth of resources
>> > and
>> >
>>
>> The cost estimates were around $500K at normal EC2 prices and $100K at spot
>> prices. I'd have imagined that nation states command rather more resources
>> than that!
>
> If I'm not mistaken, those are the costs for the *second* phase of the
> attack (110 GPU years).  However, you have to first carry out the
> *first* phase of the attack, which takes 6200 CPU years.
>
> Aside from throwing out numbers which are much scarier, which make for
> good headlines and scare clients into buying more consulting time, is
> there a reason why people are fixated on the 110 GPU year "second
> phase" number, and not the 6200 CPU year "first phase" number?

I've wondered that as well, unless it is somehow expected that the
first phase produces results that can be reused across multiple
collisions.
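
For what it's worth, the dollar figures quoted above are easy enough to
sketch from the 110 GPU-year number; the per-GPU-hour prices below are
my own guesses (picked to roughly reproduce the quoted $500K / $100K),
not anything Google published:

    # Back-of-the-envelope: turn the quoted "110 GPU years" into an EC2 bill.
    # Both hourly prices are assumptions chosen to roughly match the figures
    # quoted upthread; they are not from Google's announcement.
    GPU_YEARS = 110
    HOURS_PER_YEAR = 365 * 24

    ON_DEMAND_PRICE = 0.50   # assumed $/GPU-hour at normal EC2 prices
    SPOT_PRICE = 0.10        # assumed $/GPU-hour at spot prices

    gpu_hours = GPU_YEARS * HOURS_PER_YEAR                      # ~963,600
    print("on-demand: $%.0f" % (gpu_hours * ON_DEMAND_PRICE))   # ~$482K
    print("spot:      $%.0f" % (gpu_hours * SPOT_PRICE))        # ~$96K

Working backwards like that, the quoted estimates only tell us what
per-GPU-hour price was assumed, not what GPU it was.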

But more specifically, a question that I've tried to get answered and
so far have been unable to turn up is:
    Exactly what type of CPU / GPU is Google basing these ill-defined
    "CPU year" and "GPU year" terms on?

For example, if the "CPU years" were based on a 1 (bogo)MIPS VAX 11/780,
then garnering 6200 "CPU years" of that would seem several orders of
magnitude easier than if they are basing it on (say) whatever is today's
fastest supercomputer flavor-of-the-month, or even a single fast CPU.
It seems there's an awful lot of wiggle room for variation here, even
with GPUs: a low-end, on-board Intel GPU versus the latest high-end
Nvidia GeForce or AMD Radeon GPU is a huge spread. I mean, what exactly
is Google basing their CPU and GPU figures on? Does anyone know?
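
To make that wiggle room concrete: if you guess a per-core SHA-1
compression rate for whatever "reference CPU" they had in mind, you can
turn 6200 "CPU years" into a total number of compressions and then
re-cost that work on other hardware. Every rate below is purely my own
assumption, not a figure from Google:

    # How much work "6200 CPU years" really is depends entirely on the
    # assumed per-core SHA-1 throughput. All rates are guesses for
    # illustration only.
    SECONDS_PER_YEAR = 365 * 24 * 3600
    CPU_YEARS = 6200   # first-phase figure quoted upthread

    rates = {                       # SHA-1 compressions per second (assumed)
        "older laptop core":      20e6,
        "modern server core":    100e6,
        "high-end discrete GPU":   5e9,
    }

    # Treat the "modern server core" as the hypothetical reference CPU.
    total = CPU_YEARS * SECONDS_PER_YEAR * rates["modern server core"]

    for name, rate in rates.items():
        years = total / rate / SECONDS_PER_YEAR
        print("%-22s %10.0f device-years" % (name, years))

Swap in a different reference rate and the same "6200 CPU years" becomes
a wildly different amount of real-world work, which is exactly the
problem.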

Even the fact that it's on an "average CPU" or "average GPU" tells me
little. For one, *my* "average" is a 5+ year old mid-range laptop with
a relatively low-end GPU (even for 2011). I'm sure that for the CPU they
are talking about a server, but how many cores? What clock speed? Etc.
And are they referring to a single CPU (which is what I thought) or a
server with multiple processors?
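
The core-count question matters just as much, because presumably a
"CPU year" really means a core-year, so the calendar time depends on how
much hardware you run in parallel. A quick sketch with made-up cluster
sizes:

    # Wall-clock time for the first phase under assumed cluster sizes.
    # "CPU year" is taken to mean one core-year; both numbers below are
    # made up for illustration.
    CPU_YEARS = 6200
    cores_per_server = 32     # assumed dual-socket server
    servers = 500             # assumed cluster size

    years = CPU_YEARS / (cores_per_server * servers)
    print("wall-clock: %.2f years (~%.0f days)" % (years, years * 365))

Which is just to say the headline number tells you very little until
those assumptions are pinned down.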

Maybe it was there in the Google blog or one of the links and I just
read past it; it wouldn't be the first time. Anyway, if anyone has the
answer, I surely would like to know, as would several of my colleagues.

Thanks,
-kevin
-- 
Blog: http://off-the-wall-security.blogspot.com/    | Twitter: @KevinWWall
NSA: All your crypto bit are belong to us.
