NCipher Takes Hardware Security To Network Level

Jerrold Leichter jerrold.leichter at smarts.com
Mon Oct 13 12:46:52 EDT 2003


| ... An extreme example of this is the use of formal methods for
| high-assurance systems, as required by FIPS 140-2 level 4.  Why is it in
| there?  Because ... the Orange Book ('85) had it in there at the highest
| levels.  Why was it in there?  Because the proto-Orange Book ('83) had it in
| there at the highest levels.  Why was it in there?  Because in the 1970s
| some mathematicians hypothesised that it might be possible to prove
| properties of complex programs/systems in the same way that they proved
| basic mathematical theorems....
|
| To continue: At what point in that progression did people realise that this
| wasn't a very practical way to build a secure system?  Some time in the late
| 1970s to early 1980s, when they actually tried to reduce the theory into
| practice.... So why is this stuff still present in the very latest
| certification requirements?  Because we're measuring what we know how to
| measure, whether it makes sense to evaluate security in that way or not....
I think this is a bit of an exaggeration, even if the net effect is the same.

Very few real efforts were made to actually produce a "provably correct" OS.
The only serious commercial effort I know of was at DEC, which actually did
a provably-correct virtual machine monitor for VAXes.  I knew some of the
guys who worked on this.  It was difficult, expensive, and required a level
of integration that only a company like DEC - which did both the OS and the
hardware, and the hardware had to be modified to make the VMM run decently -
was in a position to provide.  But it was quite possible.

The whole thing died because by the time they finished it, the VAX was being
replaced by the Alpha, and you basically would have had to start over.

There was also an effort in England that produced a verified chip.  Quite
impressive, actually - but I don't know if anyone actually wanted the chip
they (designed and) verified.

The technical obstacles to actually doing a verified OS appear to be less
of an issue than is commonly supposed.  Unfortunately, as is often the case,
the economic factors are what kill you.  The VAX VMM designers were willing
to accept a factor of 10 slowdown to get provable security - and believed
their customers (mainly military) would too.  However, this makes for such
a small market that it's just not worth the investment.  Further, maintaining
your provable security is also difficult, expensive and time-consuming -
slowing you down in a market where everyone else is just speeding up.  And
you're certainly going to lose any time people start counting features -
which in turn means you're always going to be "non-standard", since you'll
never be able to keep up even with those standards that don't have insecurity
built in.

Could we have, say, provably-secure systems to run our water supplies and
nuclear power plants and aircraft carriers?  Technically, the answer is
probably yes.  (The question of what classes of attack you are provably
secure against is, of course, always present.  The Orange Book implicitly
defines the kinds of attacks of interest.  It's rather limited by today's
standards - a great deal has changed over the years - but it at least gives
you some solid ground to stand on.  About all you can say about today's
systems is that they are provably *in*secure against tons of well-known
attacks.)  Economically and practically, however, the answer is probably no.
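For anyone who has never seen what "proving properties of programs the way you
prove mathematical theorems" actually looks like, here is a minimal,
purely illustrative sketch in the Lean proof assistant.  (Lean is a modern
tool; nothing like it existed in the 1970s work described above, so this is an
analogy for the style of reasoning, not a reconstruction of those efforts.)

```lean
-- A tiny "program": sum a list of natural numbers.
def sum : List Nat → Nat
  | []      => 0
  | x :: xs => x + sum xs

-- A machine-checked property of that program: summing the
-- concatenation of two lists equals summing each separately.
theorem sum_append (xs ys : List Nat) :
    sum (xs ++ ys) = sum xs + sum ys := by
  induction xs with
  | nil => simp [sum]
  | cons x xs ih => simp [sum, ih, Nat.add_assoc]
```

Scaling this style of proof from a ten-line function to an entire VMM or OS is
exactly where the cost and effort discussed above come in.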

							-- Jerry

---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to majordomo at metzdowd.com


