[Cryptography] Fwd: [IP] 'We cannot trust' Intel and Via's chip-based crypto, FreeBSD developers say
Bill Cox
waywardgeek at gmail.com
Tue Dec 10 16:26:42 EST 2013
I think there may be weaknesses in Intel's hardware RNG. I took a good
look at Intel's hardware random number generator source. There's a paper
analyzing it here:
http://www.cryptography.com/public/pdf/Intel_TRNG_Report_20120312.pdf
The basic idea is that back-to-back inverters, when powered on, flip one
way or the other randomly, sort of like DRAM memory when our computers
power on. By powering up a single pair of back-to-back inverters over
and over, they can generate a random bit per cycle, at about 3
gigabits/second, which is amazing! Here are my concerns about the paper:
- I saw no mathematical analysis of how much noise exists in the system
and how strongly it will influence the result each cycle. There were
generalities about how the noise could cause the output to be random,
but no numbers at all.
- There is an assumption that the capacitors are charged/discharged by
10% of the standard deviation of the noise. I saw no justification for
this. It seems they simply assumed the best case.
- The paper is about as objective as a mother talking about her
children. For example: "Overall, the Ivy Bridge RNG is a robust design
with a large margin of safety that ensures good random data is generated
even if the ES is not operating as well as predicted." Based on what?
- I am not convinced they have the right model for the entropy source.
They add noise to the bias on the capacitors, and compare that to 0 to
determine the next output bit in their model. I think the main source
of noise may be the randomness in the number of electrons added/subtracted
each cycle, and that the back-to-back inverters in the absence of other
noise may be acting almost as an ideal comparator. However, if this
were the case, even if there were 10% noise in the number of electrons,
there would be considerable correlation between bits.
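To make that last concern concrete, here's a toy Monte Carlo sketch of the model as I understand it — each cycle the latch resolves to sign(bias + noise), and the feedback caps nudge the bias back toward balance by some fraction of the noise standard deviation. The parameters are my own made-up numbers, not Intel's actual feedback constants, but the effect is clear: once the feedback step is large relative to the noise, the bits alternate and show strong serial correlation.

```python
import random

def simulate(step_frac, n=200_000, seed=1):
    """Toy model of the ES feedback loop: each cycle the latch resolves to
    sign(bias + noise) with noise sigma = 1; the feedback caps then nudge
    the bias by step_frac * sigma in the opposite direction.
    Returns (mean bit, lag-1 serial correlation)."""
    rng = random.Random(seed)
    bias, bits = 0.0, []
    for _ in range(n):
        bit = 1 if bias + rng.gauss(0.0, 1.0) > 0 else 0
        bias += -step_frac if bit else step_frac  # feedback toward balance
        bits.append(bit)
    mean = sum(bits) / n
    # lag-1 autocorrelation of the bit stream
    num = sum((bits[i] - mean) * (bits[i + 1] - mean) for i in range(n - 1))
    den = sum((b - mean) ** 2 for b in bits)
    return mean, num / den

for frac in (0.1, 1.0, 5.0):
    mean, r1 = simulate(frac)
    print(f"step = {frac:>4}*sigma  mean = {mean:.3f}  lag-1 corr = {r1:+.3f}")
```

With the paper's assumed 10% step the correlation is mild, but the stream degenerates toward pure 0101 alternation as the feedback step grows relative to the noise — which is exactly the regime you'd be in if the latch behaves as a near-ideal comparator.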
I also have questions about the design itself. My main concern is that
noise on the VDD rail could easily determine the output. For example,
if the transistors are mismatched, which of course they will be, and the
bias is set exactly right on the caps so there's a 50-50 chance of a 0
or 1, and suddenly VDD drops 10% due to a rising edge of the main
system clock, then the inverter with higher gate thresholds will become
weak faster than the other one, thus determining which one wins. Since
this circuit runs asynchronously from the main system clock, I could
easily see the 3GHz system clock phase relative to the entropy generator
clock determining most of the results from the entropy source, while
looking fairly random. Any weakness in the raw random data stream is
hidden from us by the AES encryption done as a post-process.
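Here's a quick sketch of why the post-processing worries me. The clock frequencies below are made-up illustrative numbers, and I'm using SHA-256 as a stand-in for Intel's AES conditioning — the point is only that a raw stream whose every bit is determined by relative clock phase (zero real entropy) still looks fine after any decent whitener:

```python
import hashlib

# Hypothetical numbers: a ~3 GHz entropy-source clock beating against a
# slightly different system-clock frequency, with the relative phase
# deterministically deciding every "random" bit (zero real entropy).
F_ES, F_SYS = 3.0000e9, 3.0007e9

def raw_bit(i):
    phase = (i * F_SYS / F_ES) % 1.0
    return 1 if phase < 0.5 else 0

def longest_run(bits):
    best = cur = 1
    for a, b in zip(bits, bits[1:]):
        cur = cur + 1 if a == b else 1
        best = max(best, cur)
    return best

raw = [raw_bit(i) for i in range(4096)]
print("raw mean:", sum(raw) / len(raw))      # near 1/2, so a bias test passes
print("raw longest run:", longest_run(raw))  # huge runs: clearly not random

# Stand-in for the AES conditioning step (SHA-256, purely for illustration):
raw_bytes = bytes(sum(raw[8 * i + j] << j for j in range(8))
                  for i in range(len(raw) // 8))
ones = sum(bin(b).count("1") for b in hashlib.sha256(raw_bytes).digest())
print(f"whitened monobit count: {ones}/256")  # looks fine after whitening
```

The raw stream fails a runs test spectacularly, yet after conditioning the output passes a naive monobit check — so anyone downstream of the AES step has no way to notice the problem.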
I simulated back-to-back inverters in my .35u low power CMOS process in
SPICE to see if I could figure out how to make a practical circuit using
Intel's topology. If it works, it would be fantastic. I think I can
get rid of most of the supply noise issues. I had a similar problem in
my "Infinite Noise Multiplier", so I switched to powering the circuit
with nothing but large W and L constant current sources, and using the
range from 0V to Vref, rather than 0V to VDD, because Vref is stable
relative to AVSS. However, I wasn't able to get enough noise to make
Intel's circuit work, though that may be due to limitations in the SPICE
simulator.
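For what it's worth, the "ideal comparator" behavior is easy to reproduce in a toy numerical model — my own equations and constants here, not my SPICE decks. Release a cross-coupled pair from metastability with Euler integration and the initial mismatch offset wins whenever it exceeds the integrated noise; the noise only decides the outcome when the latch starts almost perfectly balanced:

```python
import math, random

def resolve(offset, sigma, seed, steps=1500, dt=0.01, gain=4.0):
    """Euler-integrate a toy cross-coupled-inverter latch released from
    metastability.  Each node pulls toward the inverted, saturated value
    of the other; `offset` is an initial differential imbalance (device
    mismatch) and `sigma` the per-step thermal-noise amplitude."""
    rng = random.Random(seed)
    va, vb = offset / 2, -offset / 2
    for _ in range(steps):
        na, nb = rng.gauss(0, sigma), rng.gauss(0, sigma)
        va += dt * (math.tanh(-gain * vb) - va) + na
        vb += dt * (math.tanh(-gain * va) - vb) + nb
    return 1 if va > vb else 0

trials = 300
biased = sum(resolve(1e-3, 1e-5, s) for s in range(trials))
fair   = sum(resolve(0.0,  1e-5, s) for s in range(trials))
print(f"offset 1e-3, noise 1e-5: {biased}/{trials} ones")  # mismatch wins
print(f"offset 0,    noise 1e-5: {fair}/{trials} ones")    # noise decides
```

A tiny mismatch swamps the noise completely, which is why I think the supply rail and device mismatch, not thermal noise, could end up determining most of the output bits unless the noise really is as large as the paper assumes.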
Has anyone else had success using Intel's RNG topology?