[Cryptography] Are Momentum and Cuckoo Cycle PoW algorithms broken?
Bill Cox
waywardgeek at gmail.com
Mon Jul 6 14:18:28 EDT 2015
Is it known whether Momentum and/or Cuckoo Cycle are memory-hard? I don't
think either is.
Both claim to be "memory-hard". I would consider any algorithm that
defeats the memory*time defense of a memory-hard algorithm like Scrypt to
be a complete break, in the sense that we can no longer claim it is
memory-hard. Such algorithms are prone to effective custom-hardware
attacks, and should probably be avoided in a PoW system that is trying to
discourage ASIC implementations.
Momentum <http://www.hashcash.org/papers/momentum.pdf> is a cool
algorithm. I particularly like its simplicity. Unfortunately, we can
arbitrarily speed up the computation with parallel processing, without
increasing memory, violating Momentum's memory*time hardness. To speed up
Momentum with custom hardware, simply run any effective parallel sorting
algorithm <http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.211.5725>.
Detecting repeated entries is then simply comparing adjacent values in
parallel.
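For reference, here is a minimal sequential sketch of Momentum's birthday
search (the hash function, nonce width, and sizes below are illustrative
stand-ins, not the parameters from the paper):

```python
import hashlib

def momentum_collisions(block_header, n_nonces=1 << 16, hash_bits=24):
    """Sketch of Momentum's birthday search: find nonce pairs (a, b)
    with BirthdayHash(a) == BirthdayHash(b). All parameters here are
    illustrative, not those of the Momentum paper."""
    seen = {}        # birthday hash -> first nonce that produced it
    collisions = []
    mask = (1 << hash_bits) - 1
    for nonce in range(n_nonces):
        data = block_header + nonce.to_bytes(8, "little")
        h = int.from_bytes(hashlib.sha256(data).digest()[:8], "little") & mask
        if h in seen:
            collisions.append((seen[h], nonce))
        else:
            seen[h] = nonce
    return collisions
```

The hash table is what makes the straightforward implementation
memory-intensive; the attack below is about filling an equivalent table in
parallel.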
For a concrete example (though not the most efficient) of a parallel
machine that breaks Momentum's memory-hardness, consider a hyper-cube of
nodes. Each node generates birthday hashes in its assigned range of input
nonces. Each node is also assigned the task of storing a specific range
of birthday hashes in a hash table. This is done in parallel by having
each node generate birthday hashes and transmit them through the network to
the nodes that will store them. Each node puts incoming hashes in a hash
table in constant time, looking for collisions.
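A sequential simulation of that hypercube attack, with the network routing
reduced to an array index (node count, hash width, and names are my own
illustrative choices):

```python
import hashlib

def simulate_hypercube_sort(block_header, n_nonces=1 << 14, p_nodes=8,
                            hash_bits=24):
    """Sequential simulation of the hypercube attack sketched above:
    nodes generate hashes for their nonce ranges, route each hash to the
    node owning that hash range (O(log p) hops on a real hypercube), and
    owners detect collisions in a local hash table in constant time."""
    tables = [dict() for _ in range(p_nodes)]   # one table per node
    collisions = []
    mask = (1 << hash_bits) - 1
    for nonce in range(n_nonces):
        data = block_header + nonce.to_bytes(8, "little")
        h = int.from_bytes(hashlib.sha256(data).digest()[:8], "little") & mask
        owner = h * p_nodes >> hash_bits        # which node owns this range
        if h in tables[owner]:
            collisions.append((tables[owner][h], nonce))
        else:
            tables[owner][h] = nonce
    return collisions
```

Because ownership is determined by hash range, a collision can only ever
occur within one node's local table, so no global coordination is needed.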
The runtime of this algorithm is proportional to the time to generate the
hashes, which is O(n/p), plus the time to transmit them through the
network, which is O(log(p)). The total is O(n/p + log(p)). Both terms can
be made arbitrarily small compared to n as n and p both increase. At the
same time, we used only n total memory. Therefore, we've reduced the
memory*time cost by an arbitrary factor, as n increases, with parallel
computation, breaking Momentum's memory-hardness.
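Plugging in concrete numbers shows how fast the parallel-to-sequential
memory*time ratio falls (a back-of-the-envelope sketch with constants
omitted):

```python
import math

def memory_time_ratio(n, p):
    """Parallel memory*time relative to sequential, up to constants:
    memory is n in both cases, so the ratio is just the time ratio
    (n/p + log2(p)) / n = 1/p + log2(p)/n."""
    return (n / p + math.log2(p)) / n

# As n and p grow together, the ratio shrinks without bound.
for k in (10, 20, 30):
    n, p = 2 ** (2 * k), 2 ** k
    print(f"n=2^{2*k}, p=2^{k}: ratio = {memory_time_ratio(n, p):.3g}")
```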
Cuckoo Cycle also claims to be memory-hard, but I don't buy it. They're
looking for cycles of length L by building a forest of trees, and detecting
when loops form, and measuring the length. If we do a parallel sort as
above, we can store entries in a Cuckoo hash table
<https://en.wikipedia.org/wiki/Cuckoo_hashing> rapidly in parallel.
Collisions force us to move entries from one node's table to another, but
only a small constant number of times on average. The total network
traffic per hash store is still O(log(p)) on a hyper-cube, just like
Momentum. On each failed store, we detect a cycle, along with its length.
The time to do this is already factored into the store time.
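For concreteness, a toy cuckoo-hash insert that counts how many evictions
each store needs (the hash choice and parameters are illustrative; in the
attack above, this table would be sharded across the hypercube's nodes):

```python
import hashlib

def _slots(key, size):
    """Two candidate table positions for a key (illustrative hash choice,
    nudged apart so the two candidates are always distinct)."""
    d = hashlib.sha256(key.to_bytes(8, "little")).digest()
    s1 = int.from_bytes(d[:4], "little") % size
    s2 = int.from_bytes(d[4:8], "little") % size
    if s2 == s1:
        s2 = (s1 + 1) % size
    return s1, s2

def cuckoo_insert(table, key, size, max_kicks=200):
    """Insert into a cuckoo hash table (a dict mapping slot -> key).
    A full slot evicts its resident to the resident's alternate slot.
    Returns the number of evictions, which stays small on average at
    modest load factors."""
    slot, _ = _slots(key, size)
    for moves in range(max_kicks):
        if table.get(slot) is None:
            table[slot] = key
            return moves
        # Kick out the resident; it must go to its other candidate slot.
        table[slot], key = key, table[slot]
        s1, s2 = _slots(key, size)
        slot = s2 if slot == s1 else s1
    raise RuntimeError("eviction cycle; table would need rehashing")
```

The eviction count is the "small constant number of times" above: each
kick is one extra entry moved between nodes' tables.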
Cuckoo is rather complex, and I am likely missing something, but I do not
see how it can be considered memory-hard. Did I get this wrong?
Thanks,
Bill