[Cryptography] defaults, black boxes, APIs, and other engineering thoughts

Jerry Leichter leichter at lrw.com
Mon Jan 6 07:46:37 EST 2014


On Jan 5, 2014, at 7:55 PM, Albert Lunde wrote:
>> They looked to the promised land of HTML5 as a way to do Flash-like
>> stuff without a plug-in and without Flash. But just what reason does
>> anyone have to think that HTML5 implementations will be any better than
>> Flash?
> 
> HTML5 plus CSS is said to be Turing Complete
> 
> An example of the effect of creeping featurism... :-(
A tale for the newbies:  Algol 68 had an extensible type system, which implied an extensible syntax.  The Algol 68 report described the language using W-grammars (W for their developer, Adriaan van Wijngaarden).  W-grammars were intended to provide just enough of a step beyond context-free grammars to represent the context sensitivity implied by Algol 68's type system.

It was a great embarrassment to all involved when it turned out that W-grammars were, in fact, universal.  As was pointed out at the time, this implied that you could not solve the termination problem for an Algol 68 compiler, so if it ran for a very long time, you couldn't tell if it had a bug or just really needed the time.

(I learned about this many years ago, probably from Alan Perlis.  One thing I hadn't heard - I found it just now in the Wikipedia article about W-grammars - is that Don Knuth was working on attribute grammars at about the same time, for similar purposes, and that there are some underlying similarities in the approaches.  Knuth apparently spoke to the Algol 68 committee about this.  Attribute grammars survived, while W-grammars faded away.  But most compilers to this day use ad hoc semantic techniques - i.e., reference to a symbol table - to handle the non-context free aspects of their languages during parsing - for example, the type names that typedef produces in C.)
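 
To make that last point concrete, here's a minimal sketch of the typedef problem (the function names are just illustrative labels):  the same token sequence is a declaration or an expression depending on what the symbol table says the name means at that point, which is exactly what a pure context-free parser can't decide.

    typedef long T;               // from here on, 'T' names a type

    void declares_a_pointer() {
        T * x = 0;                // declaration: x is a pointer to T
        (void)x;
    }

    void multiplies_instead() {
        int T = 3, x = 4, y;      // this T hides the typedef in this scope
        y = T * x;                // expression: plain multiplication
        (void)y;
    }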

Years later, we've learned to embrace the inner universality in our languages.  For example, the C++ template language is universal.  (This kind of thing is most often shown these days by implementing the lambda calculus, which tends to be straightforward in anything that has recursion plus macro-like substitution.  TeX is easily seen to be universal through that mechanism - in fact, there are implementations of the lambda calculus that are used to build useful TeX macros.)  We seem to have forgotten about the downsides.
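 
A minimal sketch of the mechanism - nothing here is specific to any particular compiler, and the Factorial template is just an illustration:  instantiation gives you recursion, specialization gives you the base case, and the compiler ends up doing the computation before the program ever runs.  The full universality argument encodes the lambda calculus or a Turing machine the same way, subject only to the implementation's instantiation-depth limit.

    #include <iostream>

    // Recursive case: instantiating Factorial<N> forces the compiler to
    // instantiate Factorial<N-1>, and so on down the chain.
    template <unsigned N>
    struct Factorial {
        static const unsigned long value = N * Factorial<N - 1>::value;
    };

    // Base case: a full specialization stops the recursion.
    template <>
    struct Factorial<0> {
        static const unsigned long value = 1;
    };

    int main() {
        // The value is computed entirely at compile time; the running
        // program just prints a constant.
        std::cout << Factorial<10>::value << std::endl;   // 3628800
        return 0;
    }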

But ... bringing this back to security:  There's some recent work that makes the point that our network protocols are too complex for us to prove much about them - and it formalizes that claim in terms of language complexity theory.  Nice work that I need to find more time to read - see http://www.cs.dartmouth.edu/~sergey/langsec/.  (I *may* have found this a while back through a message on this list.  If you posted it previously - thanks.)
                                                        -- Jerry
