[Cryptography] "NSA-linked Cisco exploit poses bigger threat than previously thought"

Henry Baker hbaker1 at pipeline.com
Sat Aug 27 00:05:10 EDT 2016


At 04:28 PM 8/26/2016, Tom Mitchell wrote:
>Performance... inside an application programmers try to run fast by removing
>bounds checks and "assert(all-data-is-good-here)".  Most have no quality idea
>how to check that the assertion is true and test it in a correct and safe way.
>The difficulty in making a quality assert() is evident in the all-too-common
>bogus assert-related bugs, i.e. these are often hard to get right.

Which is precisely why I always try to make the point that compiler optimizations
should *never* be "unsafe"; pointer checks and bounds checks *cannot be removed*
until the compiler can mathematically *prove* that these checks are redundant.

Part of your job as a programmer is to help convince the compiler that certain
checks -- ones the compiler couldn't prove redundant on its own -- can be
removed, by supplying "assert" statements/expressions.  The "assert"
statements must themselves be proven redundant before their run-time tests
are elided, but by sprinkling these "assert" statements liberally throughout
the code, an assertion checked outside a loop (and thus checked only once)
can often allow the compiler to deduce that the checks *inside the loop* are
redundant, so they can be removed without sacrificing safety.

Of course, in these days of multi-gigahertz CPUs, in which a run-time check
often costs *precisely nothing* because it is performed while the CPU waits
tens or hundreds of cycles for the result of a main-memory read, the
silliness of removing bounds checks is even more glaring.

Although the aptitude tests that Microsoft used to hire programmers in the
1980s came too early to be distributed on the Internet (besides, who at
Microsoft in the 1980s knew anything about networking?), these tests were
famous for bit-twiddling and non-bounds-checked loops.  You literally
couldn't get hired at Microsoft in the 1980s if you wrote high-quality
code.

Microsoft (and its customers) got what they deserved: incredibly bug-laden
code whose bugs later cost multiple billions of dollars to find and fix
one by one.

While hardware engineers have long utilized increased "margins" and
error-correcting codes to reduce the chances of failure in the field,
software "engineers" never embraced such quality-enhancing techniques
until *forced* to by hackers who could turn nearly every software bug
into an exploit.

It is ironic to see that the NSA TAO hackers aren't much better at
software engineering than the Microsofties; some recently released
"Equation" code has bounds-overflow problems as well as facepalm
encryption failures.


