Fri Oct 26 18:56:05 EDT 2007

Alex Pankratov wrote:
> Say, we have a random value of 4 kilobits that someone wants
> to keep secret by the means of protecting it with a password.

It would assist understanding, I feel, if we thought about 4 kilobits of
entropy, rather than a 4-kilobit value. I want to make this distinction
because I'd like to talk about secret keys, which have to be rather
larger than 4 kbits to have 4 kbits of entropy when used for modular
arithmetic.

> Empirical entropy estimate for an English text is 1.3 bits of
> randomness per character, IIRC.
>
> Assuming the password is an English word or a phrase, and the
> secret is truly random, does it mean that the password needs
> to be 3100+ characters in size in order to provide a "proper"
> degree of protection to the value ?
>
> Or, rephrasing, what should the entropy of the password be
> compared to the entropy of the value being protected (under
> whatever keying/encryption scheme) ?
>
> I realize that this is rather .. err .. open-ended question,
> and it depends on what one means by "protected", but I'm sure
> you can see the gist of the question. How would one deem a
> password random enough to be fit for protecting an equivalent
> of N bits of random data ? Is it a 1-to-1 ratio ?

Given the above, it seems there's an obvious formulation.

Let's say the cost of a brute-force attack on the secret itself is 2^(xn)
for n bits of entropy in the secret (it seems that this is actually the
interesting definition of entropy in this case, somewhat circularly).
Similarly, the cost of a brute-force attack on the encryption protecting
the secret is 2^(ym), where m is the entropy in the password.

So, when 2^(ym) < 2^(xn), it is worth attacking the password instead.

So, ym < xn and hence m < xn/y.

In other words, your password needs at least x/y times as many bits of
entropy as the secret, where x and y are the per-bit exponents of the
cost of attacking the secret and the password, respectively. If those
costs are equal, that's a 1-to-1 ratio of entropy, not of length.
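To make the arithmetic concrete, here's a small sketch (my numbers, not
from the thread) for the simplest case where the per-bit costs are equal
(x == y), so the passphrase needs as many bits of entropy as the secret:

```python
# Sketch: how long must an English passphrase be so that guessing it
# costs no less than brute-forcing the secret directly? Assumes equal
# per-bit attack costs, so required passphrase entropy == secret entropy.
import math

secret_entropy_bits = 4096   # the 4-kilobit random secret
entropy_per_char = 1.3       # empirical estimate for English text

required_chars = math.ceil(secret_entropy_bits / entropy_per_char)
print(required_chars)        # 3151 -- matching the "3100+" figure above
```

With unequal costs, scale the result by x/y before dividing by the
per-character entropy.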

--