entropy depletion (was: SSL/TLS passive sniffing)

Enzo Michelangeli em at em.no-ip.com
Thu Jan 6 03:35:05 EST 2005


----- Original Message ----- 
From: "John Denker" <jsd at av8n.com>
Sent: Thursday, January 06, 2005 3:06 AM

> Enzo Michelangeli wrote:
[...]
>  > If the PRNG uses a
>  > really non-invertible algorithm (or one invertible only
>  > with intractable complexity), its output gives no insight
>  > whatsoever on its internal state.
>
> That is an invalid argument.  The output is not the only source of
> insight as to the internal state.  As discussed at
>     http://www.av8n.com/turbid/paper/turbid.htm#sec-prng-attack
> attacks against PRNGs can be classified as follows:
>    1. Improper seeding, i.e. internal state never properly initialized.
>    2. Leakage of the internal state over time.  This rarely involves a
>       direct cryptanalytic attack on the one-way function that leaks
>       state through the PRNG’s output channel; more commonly it
>       involves side channels.
>    3. Improper stretching of limited entropy supplies, i.e. improper
>       reseeding of the PRNG, and other re-use of things that ought not
>       be re-used.
>    4. Bad side effects.
>
> There is a long record of successful attacks against PRNGs (op cit.).

Yes, but those are implementation flaws. A true RNG can also have
weaknesses and be attacked (e.g., by strong EM fields overwhelming the
noise from its sound card, not to mention the vulnerabilities induced by
the quirks you discuss at
http://www.av8n.com/turbid/paper/turbid.htm#sec-quirks).

Anyway, I was not saying "RNGs are useless because PRNGs are more than
enough": the scope of my question was much narrower, and concerned the
concept of "entropy depletion".

> I'm not saying that the problems cannot be overcome,
> but the cost and bother of overcoming them may be such
> that you decide it's easier (and better!) to implement
> an industrial-strength high-entropy symbol generator.

Sure, I don't disagree with that.

>  > As entropy is a measure of the information we don't have about the
>  > internal state of a system,
>
> That is the correct definition of entropy ... but it must be correctly
> interpreted and correctly applied;  see below.
>
>  > it seems to me that in a good PRNGD its value
>  > cannot be reduced just by extracting output bits. If there
>  > is an entropy estimator based on the number of bits extracted,
>  > that estimator must be flawed.
>
> You're letting your intuition about "usable randomness" run roughshod
> over the formal definition of entropy.  Taking bits out of the PRNG
> *does* reduce its entropy.

By how much exactly? I'd say, _under the hypothesis that the one-way
function can't be broken and other attacks fail_, exactly zero; in the
real world, maybe a little more. But in
/usr/src/linux/drivers/char/random.c I see that the extract_entropy()
function, directly called by the exported kernel interface
get_random_bytes(), contains:

        if (r->entropy_count / 8 >= nbytes)
                r->entropy_count -= nbytes*8;
        else
                r->entropy_count = 0;

...which appears to assume that the pool's entropy (whose upper bound is
POOLBITS, defined as 4096) drops by a figure equal to the number of bits
extracted (nbytes*8). That would only make sense if those bits were not
passed through one-way hashing before being output. Perhaps a great deal
of the blocking problems encountered with /dev/random would go away with
a more realistic estimate.
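
Just to make the difference concrete, here is a toy userspace sketch, not
actual kernel code: the pool type and both debit routines (toy_pool,
debit_literal, debit_relaxed) are invented for illustration. It contrasts
the literal "one bit of pool entropy per output bit" accounting quoted
above with a hypothetical relaxed policy that charges only a token toll
once the output has been through the one-way hash:

/*
 * Toy model only -- none of these names exist in random.c.
 */
#include <stdio.h>

#define TOY_POOLBITS 4096              /* mirrors POOLBITS in random.c */

struct toy_pool {
        int entropy_count;             /* estimated entropy, in bits */
};

/* Literal accounting, as in the snippet above: assume every output
 * bit takes one bit of pool entropy with it. */
static void debit_literal(struct toy_pool *r, int nbytes)
{
        if (r->entropy_count / 8 >= nbytes)
                r->entropy_count -= nbytes * 8;
        else
                r->entropy_count = 0;
}

/* Hypothetical relaxed accounting: if the hash is unbroken, the output
 * gives (computationally) no insight into the pool, so charge only a
 * small fixed toll per read.  The 8-bit figure is an arbitrary
 * placeholder, not a proposal. */
static void debit_relaxed(struct toy_pool *r, int nbytes)
{
        (void)nbytes;                  /* size of the read is irrelevant here */
        if (r->entropy_count >= 8)
                r->entropy_count -= 8;
        else
                r->entropy_count = 0;
}

int main(void)
{
        struct toy_pool a = { TOY_POOLBITS }, b = { TOY_POOLBITS };
        int i;

        /* 100 reads of 64 bytes each, as a busy /dev/random consumer
         * might issue. */
        for (i = 0; i < 100; i++) {
                debit_literal(&a, 64);
                debit_relaxed(&b, 64);
        }
        printf("literal accounting: %4d bits left\n", a.entropy_count);
        printf("relaxed accounting: %4d bits left\n", b.entropy_count);
        return 0;
}

Under the literal policy a 4096-bit pool is exhausted after the first
eight reads (and /dev/random would block); under the relaxed one it is
barely dented. Which of the two better reflects what an attacker actually
learns is, of course, exactly the point in dispute.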

Enzo




