[Cryptography] Secure erasure

Jerry Leichter leichter at lrw.com
Sat Sep 10 19:20:52 EDT 2016


>> The entire stack, hardware to OS to libraries and compilers and languages, is designed to be (a) general purpose; (b) efficient; (c) usable.
>> 
>> Cryptographic code represents a vanishingly small percentage of either bytes or cycles used on any system.
>> 
>> The system is not and *will not* be designed around its needs.
> 
> I disagree.  The next 50 years of computer science will revolve around not just computing, per se, but computing with secrets & computing while keeping secrets.  Since computing can be attacked at every level, every level of computing will have to be "hardened" to protect itself.  This will require an entire restructuring of computing hardware, operating systems, computer languages and compilers.
Frankly ... I don't see it happening.  The demand is simply not there.  The sophisticated attacks we talk about here are *not* how hacking is done today.  We haven't even seen evidence of government actors going that far.  There are way too many easier attacks.  If you somehow manage to build really secure traditional OSes and eliminate all the easy attacks, people may start doing the obscure stuff.  Until the attackers move there, no one will pay for the defenses.

>> No matter what you do ... you have to trust the hardware.
> 
> Not true; that's what oblivious computing is all about: you can watch the computation all you want, but you won't learn anything about what it's computing.
> 
> During the past 80 years or so, we've learned how to reliably compute with unreliable hardware.  During the next 50 years we'll learn how to securely compute with untrusted hardware and software.

But at some point the unencrypted inputs and outputs have to enter and leave the system - so in the end we're saying the same things in different words:  Assume the bulk of the system is and will forever remain untrustworthy, and only some secure core element *is* trustworthy.  Design your system so that the trustworthy part has enough "leverage" that it doesn't matter what the untrustworthy part does.  Using various forms of "oblivious computation" - if we can get it efficient enough - is certainly one approach to doing this.

This is hardly a new concept.  We've called that core the "reference monitor" (among other similar terms) since the 1960s at least.  Back then, most descriptions assumed that the hardware itself could be trusted to keep the reference monitor isolated - and fully in control of the rest of the software.  Given the hardware of the time, that was probably reasonable.  What I'm arguing is that, given today's extremely complex hardware, you have to partition off the hardware support for the reference monitor, too.
                                                        -- Jerry
