[Cryptography] cheap sources of entropy
John Denker
jsd at av8n.com
Mon Jan 20 13:59:47 EST 2014
On 01/20/2014 09:45 AM, Jerry Leichter wrote:
> Getting quality random bits when you have (a) almost any kind of
> high-rate real-world sensor and (b) a human being willing to help is
> an easy problem. Any modern cellphone can provide tons of randomness
> if a person moves it around, talks to it, waves his hands around in
> front of it.
That is not the smart way to think about it.
Let's consider a few basic types of signal:
1) At one extreme there is perfect randomness, by which I mean
100% entropy density, such as a source that produces 8 bits of
entropy in each 8-bit byte. There is no pattern whatsoever.
2) There is also pseudo-randomness, which means that there is
a pattern, but it is computationally infeasible for the adversary
to discover the pattern. For a wide range of purposes, the
adversary finds this to be just as unpredictable as real entropy.
3) At the opposite extreme, there is perfect determinism, i.e.
no randomness at all, such as a source that produces an endless
string of bytes all the same, or perhaps successive digits of π.
4) There is also something I call squish, which is neither reliably
predictable nor reliably unpredictable.
5) Combinations of the above. For example, there could be a
distribution over 32-bit words having a few bits of entropy in
each word, with the other bits being deterministic and/or squish.
For details, including diagrams and additional discussion, see
http://www.av8n.com/turbid/paper/turbid.htm#sec-def-random
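To make categories (1) and (3) concrete, here is a naive byte-frequency
entropy estimate (my illustration, not something from the turbid paper).
Note what it cannot do: a pseudo-random stream (category 2) scores just
as close to 8 bits/byte as a true entropy source, so a high empirical
score proves nothing about real entropy content.

```python
import math
import os
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """Naive Shannon entropy estimate of a byte stream, in bits per byte."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(byte_entropy(os.urandom(65536)))  # near 8.0: looks like category (1)
print(byte_entropy(b"\x42" * 65536))    # 0.0: category (3), pure determinism
```

A CSPRNG seeded with a known key would also print a number near 8.0,
which is exactly why measurement alone cannot substitute for a proof.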
============
In computer science and engineering there is the notion of
"proof of correctness". Can you /prove/ that a human waving
his arms around is random? More specifically, can you establish
a provable lower bound on the entropy-content?
All the literature I've seen says that humans are /not/ in
fact very good randomness generators.
Until there is a proof to the contrary, we should assume that
a human waving his hands around is squish, not high-quality
randomness. It is squish, not entropy. It is not reliably
predictable, but alas it is not reliably unpredictable.
==================================
When we want real entropy, as opposed to squish, we should
not depend on human behavior. It is much better to depend on
the second law of thermodynamics.
Any sensor will have a certain amount of Johnson noise. This
applies to audio inputs, video inputs, accelerometers,
thermometers, et cetera. This applies whether or not anybody
is talking to the device or waving it around.
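For a sense of scale, the standard Johnson-Nyquist formula gives the
open-circuit noise voltage as V_rms = sqrt(4 k T R B). A quick sketch
(the example resistance and bandwidth are mine, chosen to resemble an
audio input, not values from the post):

```python
import math

K_BOLTZMANN = 1.380649e-23  # J/K, exact by the 2019 SI definition

def johnson_noise_vrms(resistance_ohm: float, bandwidth_hz: float,
                       temperature_k: float = 300.0) -> float:
    """Open-circuit Johnson-Nyquist noise voltage: sqrt(4 k T R B)."""
    return math.sqrt(4 * K_BOLTZMANN * temperature_k
                     * resistance_ohm * bandwidth_hz)

# e.g. a 1 Mohm source impedance over a 20 kHz audio bandwidth, room temp:
print(johnson_noise_vrms(1e6, 20e3))  # ~1.8e-5 V, i.e. about 18 microvolts
```

The point is that this floor is set by temperature, impedance, and
bandwidth alone; it is there whether or not anyone waves the device around.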
An adversary could inject additional noise, but
a) In the linear regime, that would do him no good, because
the Johnson noise would still be there, and
b) Long before the system went nonlinear the attack would
be easily detectable. It would be nothing more than a
crude denial-of-service attack.
Any attempt to /remove/ the Johnson noise or replace it with
something else would either
a) violate the basic laws of physics,
b) render the sensor unusable for its nominal purpose,
c) be ridiculously easy to detect, and/or
d) be completely impractical, in the sense that it would be
so difficult to carry out that other, more direct attacks
would be a far better use of resources.
The proof of correctness is not easy, but it is entirely doable.
It requires an understanding of physics, electrical engineering,
computer science, and cryptography.
Any such device needs to be /calibrated/. That includes ascertaining
the input impedance and the gain*bandwidth product. This is not
trivial. Sometimes it is not doable ... but sometimes it is. Audio
inputs have the advantage of being relatively easy to calibrate.
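The turbid paper works out the real derivation; the following is only my
simplified sketch of how a calibration feeds a lower bound, assuming
Gaussian noise well above the ADC quantization step and statistically
independent samples (in practice, successive samples are correlated
unless the sample rate respects the noise bandwidth, which is one reason
the gain*bandwidth product must be calibrated):

```python
import math

def entropy_per_sample_bits(noise_vrms: float, adc_step_v: float) -> float:
    """Approximate entropy per ADC sample for Gaussian noise of RMS
    amplitude noise_vrms, quantized with step adc_step_v.
    Differential entropy of a Gaussian is 0.5*log2(2*pi*e*sigma^2);
    quantization with step Q subtracts log2(Q).
    Only meaningful when noise_vrms >> adc_step_v."""
    assert noise_vrms > adc_step_v, "approximation not valid in this regime"
    return (0.5 * math.log2(2 * math.pi * math.e * noise_vrms**2)
            - math.log2(adc_step_v))

# e.g. 18 uV of noise at the ADC, with a 1 uV effective step (my numbers):
print(entropy_per_sample_bits(18e-6, 1e-6))  # ~6.2 bits per sample
```

Even a crude bound like this, derived from measured impedance and
gain*bandwidth rather than from wishful thinking, is categorically
different from "it looks random to me."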
To say the same thing the other way: When I see a discussion
of "high-quality randomness" that does not include a calibrated,
provable lower bound on the entropy content, I have a hard time
taking it seriously.