[Cryptography] Other obvious issues being ignored?

Peter Gutmann pgut001 at cs.auckland.ac.nz
Fri Oct 23 04:58:59 EDT 2015


John Gilmore <gnu at toad.com> writes:

>I have worked on many programs, and whenever I found such a problem
>(typically called a "portability problem"), where the code was assuming
>something that the language did not guarantee, I fixed the program rather
>than bitching about the compiler.

And how do you know you found and fixed all the problems?  Since compilers
(and by "compilers" I mean gcc mostly) quietly break your code behind your
back, you have no way of telling whether you really fixed things or not.

(Also, "portability" != UB, see further down).

>I know this is too much trouble for some people. 

Just to provide some data on this, the SOSP paper that introduced the STACK UB
analyser found that code relying on undefined behaviour occurs in 40% of all
Debian Wheezy packages (that use C).  That means that gcc, the worst offender
of the lot in terms of breaking code (again from the SOSP paper), will break
nearly half of all Debian packages when it builds them (the figure of 40% is a
lower bound, since STACK doesn't do the same level of analysis that gcc does).

I'll just repeat that again, gcc will quietly break nearly half of all the
packages that it compiles.

One of the examples they give is a pointer-overflow check.  Say you're
compiling code for x64, Sparc, Power, ARM, whatever.  You include code to
check for pointer overflows.  Because the Intel 8088, which shipped in 1979,
eight years before gcc existed and for which gcc has never been able to
generate code, had a segmented architecture for which the pointer-overflow
check doesn't work, gcc feels justified in removing the pointer-overflow
check.
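
Here's a minimal sketch of the sort of check that gets deleted (my code,
modelled on the example in the SOSP paper rather than taken from it):

  #include <stddef.h>

  /* Returns nonzero if reading LEN bytes starting at BUF would wrap the
     address space.  Pointer arithmetic that overflows is UB in C, so gcc
     is entitled to assume that buf + len can never wrap, fold the
     comparison to "always false", and silently delete the check. */
  int access_would_wrap(const char *buf, size_t len)
  {
      return buf + len < buf;   /* may be optimised away to 0 */
  }

On a flat-address-space machine the comparison does exactly what it looks
like it does, which is why so much real code writes it this way.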

This is totally, utterly stupid behaviour for a compiler.

>GCC in particular has a -Wstrict-overflow option that will diagnose code that
>is being optimized with the compiler's assumption that it does not cause
>overflows.

It's pretty much useless: it doesn't warn about the situations where it'll
break code, and the small number of locations where it does warn in my code
are ones where, no matter how hard I stare at the code, I can't see how any
overflow could take place.  So in the few places where it produces a warning,
it's effectively telling me that it's going to break my code due to a false
positive.  Everywhere else it just silently breaks the code without notice.
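
To make that concrete, here's an illustrative fragment (mine, not from my
code; the exact diagnostic, if any, varies with gcc version and -O level):

  /* Compile with:  gcc -O2 -Wstrict-overflow -c overflow-check.c
     gcc assumes signed overflow can't happen, folds the test to 0, and
     may warn along the lines of "assuming signed overflow does not occur
     when assuming that (X + c) < X is always false". */
  int would_overflow(int i)
  {
      return i + 1 < i;         /* UB when i == INT_MAX */
  }

When the warning does fire, it fires on exactly the idiom that people write
deliberately as an overflow check.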

>Conscientious programmers can use this option to identify code that was
>nonportable, so they can rewrite it.

Nonportable != UB.  I can write code that assumes twos-complement behaviour
and it'll work identically on every CPU architecture I can get it to run on.
It's completely portable, but according to the C spec it's UB, so gcc feels
justified in breaking it.
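
For example (my sketch of the kind of thing meant here):

  #include <limits.h>

  /* Every mainstream CPU -- x64, Sparc, Power, ARM -- wraps signed
     arithmetic in two's complement, so this behaves identically on all
     of them.  The C standard nevertheless makes signed overflow
     undefined, so gcc may assume the wrap never happens and delete the
     recovery path. */
  int saturating_inc(int x)
  {
      int y = x + 1;                /* UB when x == INT_MAX */
      return y < x ? INT_MAX : y;   /* gcc may fold "y < x" to 0 */
  }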

Even if you look at a more architecture-specific level: if I'm compiling for
(say) x86-64 then gcc knows that there exists an instruction "multiply the
64-bit value in RAX by another 64-bit value to get a result in the 128-bit
register pair RDX:RAX", but it claims not to know that adding 10 to INT_MAX,
on the same CPU whose every intimate detail it knows, produces a negative
value.
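
Both halves of this are visible in one file (again my sketch, compiled for
x86-64):

  #include <limits.h>

  /* gcc models the x86-64 ALU precisely: this compiles down to a single
     MUL instruction whose 128-bit product lands in RDX:RAX. */
  unsigned __int128 widemul(unsigned long long a, unsigned long long b)
  {
      return (unsigned __int128)a * b;
  }

  /* ...and yet the same ALU's signed wrap-around is treated as
     unknowable: because x + 10 overflowing is UB, gcc may fold this to
     "return 0" even for x == INT_MAX, where the hardware addition
     really does produce a negative value. */
  int wrapped_negative(int x)
  {
      return x + 10 < x;
  }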

This isn't "compliance with the C standard", it's just the gcc developers
being idiots.

Peter.

