# [Cryptography] Fwd: Whitening Algorithm

Sebastian Gesemann s.gesemann at gmail.com
Fri Jul 24 07:46:42 EDT 2015

```Oops, I mistakenly sent my response as PM and not to the list.
I see I'm one of three people who like cryptographic sponges. :)

---------- Forwarded message ----------
From: me
Date: Thu, Jul 23, 2015 at 3:36 PM
Subject: Re: [Cryptography] Whitening Algorithm
To: Rob Seward

On Thu, Jul 23, 2015 at 4:50 AM, Rob Seward wrote:
> Hi,
> I’m trying to whiten a random noise source (a reverse biased transistor)
> with a low-powered microprocessor. I figured out a technique that seems to
> work well, and I want to know if there is anything insecure or subpar about
> it.
>
> (Earlier, I had heard that XORing a random stream with alternating 0s and 1s
> could remove bias. However, this strikes me as very insecure, because an
> attacker could reverse the mask by XORing the mixed stream with the same 01
> mask. Also, I had used the Von Neumann algorithm, but this slows output
> significantly.)
> [...]

What you do is basically a convolution in a 256-element Galois field.
It's a linear filter with the following difference equation:

out[n] = in[n] + in[n-1] + in[n-4] + in[n-5]   (+ means XOR)

This filter is easily invertible. Sure, it might fool the statistical
tests, but that wouldn't mean you magically created some entropy out
of nothing. You didn't. The entropy per bit is still below 1.
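# Invertibility is easy to demonstrate: with the convention in[k] = 0
# for k < 0, the same recurrence solved front to back undoes the
# whitening. A quick sketch (my own Python, bit-level, hypothetical
# names):

def whiten(bits):
    # out[n] = in[n] ^ in[n-1] ^ in[n-4] ^ in[n-5]
    tap = lambda n, k: bits[n - k] if n >= k else 0
    return [bits[n] ^ tap(n, 1) ^ tap(n, 4) ^ tap(n, 5)
            for n in range(len(bits))]

def unwhiten(out):
    # in[n] = out[n] ^ in[n-1] ^ in[n-4] ^ in[n-5], recovered in order
    rec = []
    tap = lambda n, k: rec[n - k] if n >= k else 0
    for n in range(len(out)):
        rec.append(out[n] ^ tap(n, 1) ^ tap(n, 4) ^ tap(n, 5))
    return rec

# e.g. unwhiten(whiten(raw)) == raw for any bit list -- no entropy gained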

Depending on what your goal is, you might want the entropy per output
bit to be 1. This *requires* a reduction in the output rate.
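# The classic way to pay that price is the Von Neumann extractor you
# already tried: take non-overlapping bit pairs, map 01 -> 0 and
# 10 -> 1, and discard 00 and 11. Sketch in my own Python; for
# independent bits with a fixed bias the surviving bits are unbiased,
# at the cost of a variable, much lower output rate:

def von_neumann(bits):
    out = []
    for a, b in zip(bits[0::2], bits[1::2]):
        if a != b:
            out.append(a)  # 01 -> 0, 10 -> 1; equal pairs are dropped
    return out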

If you're interested in a high output rate and are fine with "entropy
stretching" (e.g. faking entropy in a "cryptographically secure" way),
you *can* do whitening and entropy stretching in one go. But it takes
more than just 3 XORs. I've grown quite fond of cryptographic
sponge-like constructions. They can "absorb" bits and, if you "squeeze"
them, they produce bits. Here's a pseudo code example:

S = 0^512;  # state vector initialized to 512 zero bits
e = 0.1;    # entropy estimate per input bit
r = 3;      # output rate relative to input rate

nin  = ceil(256/max(1,r));  # no more than 256 bits at a time
nout = floor(256*min(1,r)); # no more than 256 bits at a time

# Fill the state with "enough" entropy before outputting anything
for k in range(ceil(1/e)):
    S(0:256) = fetch_biased_bits(256);  # overwrite first 256 bits
    S = P(S);         # apply bijection P to the whole 512-bit state

while True:
    S(0:nin) = fetch_biased_bits(nin);  # overwrite first nin bits
    S = P(S);         # apply bijection P to the whole 512-bit state
    write(S(0:nout)); # output nout bits

Here, P is some "pseudo-random" bijection. The asymptotic entropy per
output bit should be min(1, e/r). If e/r < 1, this is whitening plus
entropy stretching; if e/r >= 1, you just do whitening. In the case
e/r < 1, P should be chosen to be "cryptographically good", for
example 8 rounds of ChaCha20's round function, which conveniently
happens to map 512 bits to 512 bits. But such a function might be
overkill for the case e/r >= 1.
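# For the e/r < 1 case, here is a concrete sketch in Python (my own
# toy code, not a vetted implementation): P is 8 rounds of ChaCha's
# permutation on a 512-bit state of 16 32-bit words, and the biased
# hardware source is simulated by OR-ing bytes from os.urandom.

import math
import os

MASK = 0xFFFFFFFF  # work on 32-bit words

def rotl32(x, n):
    return ((x << n) | (x >> (32 - n))) & MASK

def quarter_round(s, a, b, c, d):
    # ChaCha quarter round (RFC 8439, section 2.1)
    s[a] = (s[a] + s[b]) & MASK; s[d] = rotl32(s[d] ^ s[a], 16)
    s[c] = (s[c] + s[d]) & MASK; s[b] = rotl32(s[b] ^ s[c], 12)
    s[a] = (s[a] + s[b]) & MASK; s[d] = rotl32(s[d] ^ s[a], 8)
    s[c] = (s[c] + s[d]) & MASK; s[b] = rotl32(s[b] ^ s[c], 7)

def P(words):
    # 8 ChaCha rounds = 4 double rounds (column + diagonal).
    # Every step is invertible, so this is a bijection on 512 bits.
    s = list(words)
    for _ in range(4):
        quarter_round(s, 0, 4,  8, 12); quarter_round(s, 1, 5,  9, 13)
        quarter_round(s, 2, 6, 10, 14); quarter_round(s, 3, 7, 11, 15)
        quarter_round(s, 0, 5, 10, 15); quarter_round(s, 1, 6, 11, 12)
        quarter_round(s, 2, 7,  8, 13); quarter_round(s, 3, 4,  9, 14)
    return s

def fetch_biased_bytes(n):
    # stand-in for the reverse-biased transistor: OR-ing two uniform
    # bytes yields a stream biased towards 1 bits
    return bytes(x | y for x, y in zip(os.urandom(n), os.urandom(n)))

def sponge(e=0.1, nin_bytes=11, nout_bytes=32):
    # byte-granular version of the pseudo code above: absorb 88 bits,
    # squeeze 256 bits per step, i.e. r is roughly 3
    state = [0] * 16

    def absorb(nbytes):
        nonlocal state
        raw = bytearray(b"".join(w.to_bytes(4, "little") for w in state))
        raw[:nbytes] = fetch_biased_bytes(nbytes)  # overwrite first bits
        state = P([int.from_bytes(raw[i:i + 4], "little")
                   for i in range(0, 64, 4)])

    # fill the state with "enough" entropy before outputting anything
    for _ in range(math.ceil(1 / e)):
        absorb(32)                 # 256 bits at a time

    while True:                    # now absorb and squeeze forever
        absorb(nin_bytes)
        yield b"".join(w.to_bytes(4, "little")
                       for w in state[:nout_bytes // 4])

# usage: g = sponge(); print(next(g).hex())  # 32 whitened bytes per step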

Cheers!
sg
```