[Cryptography] Formal Verification (was Re: Trust & randomness in computer systems)

Ron Garret ron at flownet.com
Sat Mar 19 17:05:06 EDT 2016


On Mar 19, 2016, at 1:35 PM, Perry E. Metzger <perry at piermont.com> wrote:

> On Sat, 19 Mar 2016 12:56:29 -0700 Ron Garret <ron at flownet.com> wrote:
>> On Mar 18, 2016, at 12:55 AM, Viktor Dukhovni
>> <cryptography at dukhovni.org> wrote:
>>>> On Mar 18, 2016, at 2:51 AM, Ron Garret <ron at flownet.com> wrote:
>>>> 
>>>>> Formal verification is not a panacea
>>>> 
>>>> Indeed not.
>>>> 
>>>> http://spinroot.com/spin/Doc/rax.pdf
>>>> 
>>>> TL;DR: In the 1990s we formally verified a portion of the
>>>> flight software for an unmanned spacecraft.  Despite this, and
>>>> despite extensive ground testing, the software nonetheless
>>>> failed in flight.  (Happily, the failure was not catastrophic.)
>>>> 
>>>> There's no getting around the GIGO problem.
>>> 
>>> I'm confused.  Are you claiming that formal methods failed here?
>> 
>> No, it was the application of formal methods that failed.  But it
>> wasn't just a random bit of stupidity; there was an interesting
>> structure to this particular failure.  It was never practical to
>> verify the entire software stack; there was just too much of it.
>> So we tried to structure the software with a secure, formally
>> verified "core" that was supposed to maintain certain invariants
>> (like no deadlocks).  The theory was that anything built on top of
>> that core would maintain those invariants.
> 
> How could that possibly be when there were unverified components that
> could clearly violate the deadlock invariant?

The RAX executive was designed like an operating system (indeed, it arguably *was* an operating system, though it ran as an application on top of a “real” operating system) with a formally verified kernel and non-verified application code running on top of it.  The kernel exposed high-level operations with semantics like, “Do the following tasks in parallel until either one task succeeds (in which case terminate all the others and report success) or they all fail (in which case report failure).”  Application coders were supposed to use these high-level operations exclusively, and if they had, it probably would have worked.  But the only enforcement mechanisms we had to ensure that nothing but the high-level operations was being invoked were coder discipline and code audits.  Basically, we looked at the code to make sure that none of the low-level primitives (like mutexes) were being invoked.
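
To make that concrete, here is a rough sketch in Python of the kind of operation the kernel exposed.  The names and structure are invented for illustration; the actual executive was not written in Python and this is not its API.

    # Illustrative sketch only: invented names, not the actual RAX API.
    # Semantics: run the tasks in parallel until one succeeds (tell the
    # rest to stop and report success) or all fail (report failure).
    import threading

    def run_parallel_until_success(tasks):
        one_succeeded = threading.Event()   # set by the first task to succeed

        def run(task):
            # Each task is a callable taking a cancellation event, which
            # it is expected to poll so it can stop once a sibling wins.
            if task(cancel=one_succeeded):
                one_succeeded.set()

        threads = [threading.Thread(target=run, args=(t,)) for t in tasks]
        for th in threads:
            th.start()
        for th in threads:
            th.join()
        return one_succeeded.is_set()       # True: success; False: all failed

The point of such an interface is that callers never touch threads, locks, or condition variables directly.  But nothing in the language itself stops application code from reaching past the interface to the raw primitives underneath; that is what the audits were for.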

Well, what happened (to the best of my recollection — this was 20 years ago) was that the application programmer had a task for which he needed a mutex, and though he knew he wasn’t allowed to use one, he ended up (inadvertently) inventing his own.
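
For illustration only (this is not the flight code, just the textbook shape of a hand-rolled synchronization bug), here is how improvising your own signalling out of a bare flag can deadlock:

    # Purely illustrative; not the flight code.  Hand-rolled signalling
    # from a bare flag plus a condition variable, where the check and
    # the wait are not performed atomically, so the wakeup can be lost.
    import threading

    flag = False
    cond = threading.Condition()

    def waiter():
        # BUG: the flag is tested outside the critical section.  If
        # setter() runs between this test and cond.wait(), the notify
        # fires before anyone is waiting, and the waiter blocks
        # forever: a deadlock.
        if not flag:
            with cond:
                cond.wait()

    def setter():
        global flag
        flag = True
        with cond:
            cond.notify()

    # The safe form keeps the test and the wait in one critical section:
    #     with cond:
    #         while not flag:
    #             cond.wait()

A verified kernel’s primitives rule this pattern out by construction; the trouble is that application code can quietly rebuild it out of raw shared state, outside anything the verification ever looked at.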

rg


