[Cryptography] Entropy is forever ...
leichter at lrw.com
Sun Apr 19 18:54:46 EDT 2015
On Apr 19, 2015, at 5:09 PM, Florian Weimer <fw at deneb.enyo.de> wrote:
>> On the other hand, there is a notion of algorithmic complexity of an
>> individual string, i.e., the length of the shortest program that
>> produces it in some universal programming language. In this sense,
>> most strings are (Kolmogorov) random but those easy to generate (e.g.,
>> all nines, or the digits of pi) are not.
> When just talking about a single password, the universal
> programming language might have (among other things) just a special
> instruction to output it....
Kolmogorov complexity (and the essentially equivalent measures others came up with - I now recall Chaitin, but there's another name there, too) is "universal" only in some infinitistic sense. Any finite set of strings can be special-cased and dumped into the machine used to define complexity.

If you take all the passwords revealed in all the password leaks we've seen and make them special cases for the machine, you have a reasonable start at a model of the *effective* complexity of any password chosen from that set today: pretty much none, since the attackers try all of those first. Fold in dictionaries for a large group of languages, plus a few billion variations constructed using the kinds of algorithms hackers try (pairs of words, 133t speak, etc.), and you just have a larger, but still finite, set to dump into your base machine.
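To make the point concrete, here's a toy sketch (my own illustration, not from the post above; the leak list and charset size are made-up parameters): a description "machine" whose language special-cases a finite leak list, so any password in that list has near-zero effective complexity no matter how long or random it looks.

```python
import math

# Hypothetical toy leak list standing in for the special-cased finite set.
leaked = ["123456", "password", "qwerty", "correcthorsebatterystaple"]

def effective_bits(pw: str, leak_list, charset_size: int = 95) -> float:
    """Description length of pw under a machine that special-cases leak_list.

    A leaked password costs only log2 of its rank in the list, because the
    attacker tries those first; anything else falls back to naive brute
    force over printable ASCII (95 characters).
    """
    if pw in leak_list:
        return math.log2(leak_list.index(pw) + 1)
    return len(pw) * math.log2(charset_size)
```

Under this machine a 25-character passphrase that has appeared in a leak costs only a couple of bits to describe, while a much shorter string outside the set still costs dozens.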
Kolmogorov complexity is a beautiful theory, and the fact that it actually allows you to make sense of the feeling that some strings are "more random" than others is itself remarkable and unexpected. But beyond some hints about heuristic methods for judging password quality, by its nature it tells you nothing about the real-world complexity of any specific thing, as you can't pin down the relevant information about the real world specifically enough to get it off the ground.
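One such heuristic (my addition, not the author's): since the compressed size of a string is a computable upper bound on its Kolmogorov complexity, a general-purpose compressor can flag obviously regular strings like "all nines", even though the true measure is uncomputable.

```python
import zlib

def compressed_len(s: bytes) -> int:
    # zlib output size is a loose, computable upper bound on K-complexity,
    # plus a few bytes of fixed header/checksum overhead.
    return len(zlib.compress(s, 9))

# A highly regular string compresses far below its raw length; a string
# with no structure the compressor can find does not compress at all.
regular = b"9" * 64
irregular = b"kV7#mQ2xLp9!aWz$Tb3&nRf5@Jc8^Yd1*Hg6%Se4(Uq0)Io2-Px7=Zm5_Nv3+Ab"
```

This only ever catches regularity the compressor happens to model, which is exactly the post's point: you get hints about quality, never the real-world complexity of a specific string.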