[Cryptography] Ada vs Rust vs safer C

Arnold Reinhold agr at me.com
Tue Sep 20 15:40:24 EDT 2016


> On Sep 20, 2016, at 12:29 PM, Jerry Leichter <leichter at lrw.com> wrote:
> 
>> Ab initio safety improvements would consist of clarifying undefined C behaviors to make unsafe code generation less likely; for example, assuming two's complement arithmetic.
> Actually Standard C long ago settled on "binary and either one's or two's complement" for signed integral types; it may by now say "two's complement".  But that isn't the issue!  The Standard works just fine *as long as you don't exceed the bounds of the integral type*, whether two's complement or otherwise.  As soon as you do, the results are undefined.  Unfortunately, even testing at run time whether the result of an arithmetic operation will be undefined inherently gets you into undefined behavior.
> 
> I've never understood the reluctance of the C Standards guys to pin this down further.  They agreed long ago that *unsigned* arithmetic was purely mod 2^n.  And way, way back, it may have made sense to be inclusive and allow C to be used on machines with non-binary arithmetic (IBM 1620 decimal arithmetic), or with odd-ball binary arithmetic (sign and magnitude, anyone?).  But the fact is no one has built machines like that in many years, and no one is likely to - and if they do, it'll be because they have an architecture so different that C won't work on it anyway.
> 
> At this point, it's all two's complement, and overflows either produce an easily predictable result, or trap.  C could move this out of "undefined" to "one of the following small set of possibilities; you can check which using the following macros".  And many of the issues would go away.
> 
> BTW, the relationship between "undefined semantics" and "dangerous semantics" isn't simple!  Unsigned integral values have fully specified semantics in C - but unsigned integral values *as loop indices* are dangerously easy to get wrong!  (Been there, done that.)  In fact, Google's standards for C and C++ forbid the use of unsigned integers except in very limited circumstances (e.g., bit operations).  Back when int was 16 bits, the ability to represent larger numbers in unsigned int was important.  But that need faded years ago.
> 
>                                                        -- Jerry
> 

Consider what I said shorthand for what you said. This issue and its consequences, such as deleting integer overflow checks, have been discussed many times on this list over many years. So has the problem of secure erasure, among others. My question is: what is the best way to get things changed? The literature on organizational behavior and even psychological abuse may be more relevant here than formal language design or type theory. (“You didn’t perform your overflow check in exactly the approved way, so you must be punished.” “We can’t change our standard because of computer designs from the 1950s that were obsolete before work on C began.” “Other people will suffer if we make the changes you request.” An abuse counselor would have no difficulty recognizing the constructs.)

Possible ways forward include:

* Holding a meeting
* Engaging some compiler developers in a non-hostile conversation
* Writing a group letter to FSF
* Talking to Richard Stallman (I remember when he used to wear an “Impeach God” button. That is what we seem to be asking.)
* Publicizing the issue
* Going to the standards bodies
* Starting our own standards organization
* Forking GCC
* Looking for funding (grants, Kickstarter, sugar parent, …)

If not now, when?

Arnold Reinhold
