[Cryptography] RAM memories as one source of entropy

John Denker jsd at av8n.com
Fri Feb 7 15:40:06 EST 2014


On 02/07/2014 05:37 AM, Joachim Strömbergson wrote:

> My very loose ideas for the
> architecture of the RNG follows ....

It is good to recognize these ideas as very loose.
They are at present far too loose to be usable for
any serious purpose.

> One wild idea is to use decay effects in DRAM or initial state in
> powered up SRAM memories as source of entropy. There has been some
> research into this:
> 
> [1] http://goo.gl/25TFov

That resolves to
  Keaton Mowery, Michael Wei, David Kohlbrenner, Hovav Shacham, and Steven Swanson
  "Welcome to the Entropics: Boot-Time Entropy in Embedded Devices"
  http://cseweb.ucsd.edu/users/swanson/papers/Oakland2013EarlyEntropy.pdf

I do not consider that to be serious research.  Those who are looking 
for entropy (as mentioned in the Subject: of this thread) need to 
look elsewhere.  The key statement in the paper is:

>>   the existence of true randomness is an open philosophical 
>>   question (and therefore beyond the scope of this paper)

Then the paper proceeds to use statistical tests to measure some kind
of "unpredictability".  This whole approach -- statistical testing --
has long since been completely discredited.  To borrow a phrase from
Dijkstra:  Testing can prove the absence of entropy;  it can never prove
the presence of entropy.

The stuff Mowery et al. obtain is not entropy;  it is /squish/, by which
I mean it is neither reliably predictable nor reliably unpredictable.
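To see why statistical testing cannot certify entropy, consider the
following sketch (my illustration, not taken from the paper):  a seeded
PRNG has exactly zero entropy, yet it sails through a naive monobit
(frequency) test.

```python
# Illustration: a deterministic PRNG has zero entropy, yet a naive
# statistical test cannot distinguish it from a true random source.
# Here a monobit (frequency) test is applied to Python's Mersenne
# Twister, seeded with a constant.
import random

random.seed(42)                       # fixed seed: output is 100% predictable
bits = [random.getrandbits(1) for _ in range(100_000)]

ones = sum(bits)
bias = abs(ones / len(bits) - 0.5)
print(f"fraction of ones: {ones/len(bits):.4f}, bias: {bias:.4f}")

# The stream "passes" (the bias is tiny), yet anyone who knows the
# seed can reproduce every bit.  Passing a test proves nothing about
# the presence of entropy.
```

The test flags gross non-uniformity, i.e. the /absence/ of entropy;  a
pass tells you nothing about its presence.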


> [2] http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.164.6432

That resolves to
  Daniel E. Holcomb, Wayne P. Burleson, and Kevin Fu
  "Initial SRAM State as a Fingerprint and Source
     of True Random Numbers for RFID Tags"

The paper mentions "thermal noise".  I am willing to believe that thermal
noise plays /some/ role in SRAM initial state ... but this paper utterly
fails to calibrate or even estimate the size of the effect.  It cites
Nyquist and Johnson, but doesn't learn from them.  What is the source
impedance?  What is the gain?  What is the bandwidth?  I see nothing
resembling a lower bound on the entropy content;  instead I see non-
quantifiable statements like

>>   The parallel true random
>>   number generation of FERNS relies on the law of large numbers to ensure that
>>   entropy is harvested from within the array.
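The calibration I am asking for starts with the Johnson-Nyquist formula.
A back-of-envelope sketch follows;  the impedance, temperature, and
bandwidth figures are illustrative assumptions, not measurements of any
actual SRAM cell.

```python
# Back-of-envelope Johnson-Nyquist noise estimate.  The resistance and
# bandwidth values below are illustrative assumptions, not measured
# SRAM parameters.
import math

k_B = 1.380649e-23      # Boltzmann constant, J/K
T   = 300.0             # temperature, K
R   = 10e3              # assumed source impedance, ohms
B   = 1e9               # assumed bandwidth, Hz

# RMS thermal noise voltage:  v_rms = sqrt(4 k T R B)
v_rms = math.sqrt(4 * k_B * T * R * B)
print(f"thermal noise: {v_rms * 1e6:.0f} uV rms")

# Whether that noise actually decides the cell depends on the gain and
# on the size of the cell's deterministic mismatch (offset) -- exactly
# the quantities the paper never reports.
```

With those assumed numbers the noise is a few hundred microvolts rms.
Whether that dominates the deterministic device mismatch is the whole
question, and the paper never answers it.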

Like reference [1], this paper [2] proceeds from the ludicrous Manichaean
assumption that anything that is not reliably predictable must be reliably
unpredictable.  They ignore the third possibility, namely squish.

There are all sorts of ways the FERNS approach can go wrong.  For starters,
it is sensitive to how long the device was powered down before powering it
up.  It is also sensitive to how quickly the supply voltage ramps up.  The
entropy production, if any, will vary from manufacturer to manufacturer and
from year to year.

Last but not least, there are intrinsic limitations associated with using
a one-bit digitizer (such as an SRAM cell) as opposed to a high-resolution
linear digitizer (such as a sound card).  This has to do with /headroom/.
Headroom allows you to detect attempted interference at a point /before/
it becomes an operational problem.  It also makes it easier to do a proper
calibration.
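Here is a hypothetical sketch of what a headroom check looks like with a
linear digitizer, assuming raw sound-card samples arrive as signed 16-bit
integers;  the 20 dB alarm threshold is an assumption for illustration.
A one-bit digitizer offers no analogous warning.

```python
# Hypothetical sketch of a headroom check on a high-resolution linear
# digitizer (e.g. sound-card samples as signed 16-bit integers).  The
# alarm threshold is an illustrative assumption.
import math

FULL_SCALE = 32767        # max magnitude of a signed 16-bit sample
HEADROOM_DB = 20.0        # assumed alarm threshold: within 20 dB of clipping

def headroom_db(samples):
    """Decibels between the largest observed sample and full scale."""
    peak = max(abs(s) for s in samples)
    if peak == 0:
        return float("inf")
    return 20.0 * math.log10(FULL_SCALE / peak)

def interference_suspected(samples):
    """Flag inputs that approach the rails *before* they clip."""
    return headroom_db(samples) < HEADROOM_DB

# A quiet noise source leaves plenty of headroom; an injected tone
# near full scale trips the alarm before the digitizer saturates.
quiet = [10, -12, 8, -9, 11]
loud  = [30000, -29500, 31000, -30500, 29000]
print(interference_suspected(quiet), interference_suspected(loud))
```

The point is that the alarm fires while the signal is still being
digitized faithfully, so the interference is detected /before/ it
becomes an operational problem.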

> [3] https://eprint.iacr.org/2013/304.pdf

That resolves to
   Anthony Van Herrewege, Vincent van der Leest, André Schaller,
     Stefan Katzenbeisser, and Ingrid Verbauwhede
  "Secure PRNG Seeding on Commercial Off-the-Shelf Microcontrollers"

This work was carried out with a modicum of intelligence.  They tested two
families of chips, and noticed that one family exhibited a pattern that
made the whole approach obviously unworkable.

As for the other family:  just because there is no obviously disastrous
pattern does not mean there is no pattern.  There could easily be a
less-obvious pattern.

Approaches like this will not be taken seriously until somebody answers
the basic questions about source impedance, gain, and bandwidth.

======================================

ALSO:  From an engineering point of view, for a very wide range of
applications (albeit not all), you will be much better off using a
well-characterized PRNG, provisioned with an exogenous seed stored 
in NVRAM.

If you want to supplement this with some endogenous entropy, that's
fine ... but it can happen on a much slower timescale, using a
relatively low-rate HRNG if necessary.
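A minimal sketch of that architecture follows, assuming the seed lives
in an ordinary file standing in for NVRAM;  the file name and the
SHA-256 counter-mode construction are illustrative stand-ins, not a
vetted cryptographic design.

```python
# Minimal sketch of a PRNG provisioned from a stored exogenous seed.
# The seed file stands in for NVRAM; SHA-256 in counter mode stands in
# for a well-characterized DRBG.  This illustrates the architecture
# only; it is not a vetted cryptographic design.
import hashlib
import os

SEED_FILE = "seed.bin"      # hypothetical NVRAM location

def load_seed() -> bytes:
    with open(SEED_FILE, "rb") as f:
        return f.read()

def store_seed(seed: bytes) -> None:
    with open(SEED_FILE, "wb") as f:
        f.write(seed)

class HashDRBG:
    """SHA-256 in counter mode; fully deterministic given its seed."""
    def __init__(self, seed: bytes):
        self.key = hashlib.sha256(b"init" + seed).digest()
        self.counter = 0

    def random_bytes(self, n: int) -> bytes:
        out = b""
        while len(out) < n:
            block = self.key + self.counter.to_bytes(8, "big")
            out += hashlib.sha256(block).digest()
            self.counter += 1
        return out[:n]

    def reseed(self, extra: bytes) -> None:
        # Slow-timescale supplement from a low-rate HRNG, if available.
        self.key = hashlib.sha256(self.key + extra).digest()

# Provision once with a good exogenous seed, then roll the stored seed
# forward so a reboot never reuses the same DRBG state.
store_seed(os.urandom(32))                 # stand-in for factory provisioning
drbg = HashDRBG(load_seed())
store_seed(drbg.random_bytes(32))          # update the NVRAM seed
```

Note that writing a fresh seed back before use is what prevents two
boots from replaying the same output stream.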

Bottom line:  Security depends on attention to detail.  An ounce of
calibration is worth more than a ton of wishful thinking.  An ounce
of real entropy is worth more than a ton of squish.





On 02/06/2014 11:16 AM, Jerry Leichter wrote:

> See why a couple of weeks ago I said that I avoid using "entropy" in
> discussions of cryptography.  Yes, it has a meaning, and yes, that
> meaning is significant in cryptographic discussions - but most of the
> time, when people discussing cryptography use "entropy", they don't
> really mean much of anything.

It is true that /some/ people are unable to distinguish S from Shinola.

However, that is not a law that applies to the rest of us.  Most of 
the people on this list have progressed beyond the baby-talk stage.
I expect most people on this list to act like adults ... indeed like 
professionals.  I expect they will make some effort to think clearly 
and communicate clearly.


