[Cryptography] Secure erasure

Jerry Leichter leichter at lrw.com
Sun Sep 11 07:23:54 EDT 2016


> 
>> If you some how manage to build really secure traditional OS's and
>> eliminate all the easy attacks, people may start doing the obscure stuff.
>> Until the attackers move there, no one will pay for the defenses.
> 
> I have difficulties following this logic. Because if you continue to keep
> traditional OSes insecure, which isn't very unlikely, easy attacks remain
> and no-one will want (or pay for) a secure system?
The argument is:  No one will pay for special compiler modes, special code in OS's, special hardware support, all to carefully route around optimizations that may, as a side effect, in special circumstances, cause data to leak - when that's not a leakage path that anyone is likely to actually attack, because much easier ones exist.

A true story on the other side of the argument - make of it what you will.  (I may have told this here before.)

The VAX was the second mainstream machine to be built to a carefully specified, published standard.  (The IBM 360 was the first, by a good long margin, though actually I'm not sure when IBM first published its standard.  In both cases, the reason for a careful standard was the same:  Both companies wanted to create a line of fully compatible computers - the IBM 360 series were the first such line - and without a standard to write to, there was no way to ensure that the different machines really were "the same".)  DEC, at least, really published its standard as a printed book.  (There was an internal-only version, but both versions were produced from the same source with different option settings - and the only things missing from the public version were (a) historical information about proposals that were abandoned during development; (b) hints to hardware developers about the best way to do some things.)

The VAX architecture had two carefully-defined terms used to describe things that were *not* fully nailed down by the architecture:  "Unspecified" and "undefined".  "Unspecified" referred to the explicit results of an operation, and said that the bit pattern that ended up in them could be anything.  "Undefined" meant the machine could *do* anything - the proverbial "cause bats to fly up your nose".  Only privileged instructions could have "undefined" outcomes.

One day, someone pointed out that "unspecified" did *not* constrain how the output value was derived.  Could it depend on values that were not otherwise accessible to the user-mode caller?  For example, could an unspecified write to a register leave in that register the value some other process had written there?  (In modern terms:  Imagine a design with register renaming, and an unspecified operation in which the output register gets assigned a new renamed register from the pool - but then nothing is written to the register and it retains its previous value, which we'll posit has been left unchanged since the last process changeover.)

Well, this led to a huge discussion in the VAX architecture and designer community.  One of the designers actually spent the following weekend pinpointing all uses of "unspecified" in the architecture, then checking all extant VAX implementations to see what each one did.  As it turned out, they all actually fell into two groups:  Either the output location was unchanged from what it had been; or the result was always zero.  So all existing and in-design hardware was safe.

The discussion then turned to how to modify the architecture specifications to ensure that all future hardware would be designed safely - without losing the freedom that the "unspecified" marking gave to implementers to do things efficiently.  The eventual conclusion was that it was impossible; we just added an internal note to implementers warning them of this issue.

However ... when the Alpha architecture was written, the writers already knew of this issue.  So they *specified "unspecified"* :-)  They enumerated the machine state that must functionally determine the value written when the result of an operation was "unspecified".

I'm not sure anyone has followed up on this in specifications for still-living architectures.  Perhaps it was done, at least internally, for the x64 architecture:  Many of the guys involved in its design were ex-Alpha designers.  But they had to build on a many-year-old legacy design going back to the 8080, and retrofitting that would probably have been impossible.
                                                        -- Jerry
