[Cryptography] Ada vs Rust vs safer C

Arnold Reinhold agr at me.com
Wed Sep 21 21:55:03 EDT 2016


> On Sep 21, 2016, at 6:22 PM, John Denker <jsd at av8n.com> wrote:
> 
> On 09/21/2016 12:56 PM, Florian Weimer wrote:
> 
>> We need (reasonably) hard evidence which of the many open issues
>> actually matter, and based on that, determine the direction to move
>> in.
> 
> Agreed.

Disagree. Cyber security is a major problem, a major threat to national security, even arguably to civilization as we know it. The head of the CIA, John Brennan, says it "really is the thing that keeps me up at night." We also know that unreported security flaws, "zero days," are the underpinning of the most dangerous cyber attacks; they are avidly stockpiled by cyber warfare groups and, likely, criminal enterprises, and there is a thriving black market for them. We are not going to see most of them until it is too late. Computer security must be proactive, not just reactive. We can make reasonable judgments about compiler behaviors that weaken security without waiting for a disaster post mortem.

> 
>> Currently, no one has that kind of data (and is willing to share).
> 
> Well, there is "some" data.  For example:
> 
>  Baishakhi Ray, Daryl Posnett, Vladimir Filkov, Premkumar T Devanbu
>  "A Large Scale Study of Programming Languages and Code Quality in Github"
>  Proceedings of the 22Nd ACM SIGSOFT International Symposium
>    on Foundations of Software Engineering (2014)
>  http://macbeth.cs.ucdavis.edu/lang_study.pdf
> 
> 
...
> The observed effect is very modest indeed.
> 
> [snip]
> 
>>> these modest effects arising from language design are overwhelm-
>>> ingly dominated by the process factors such as project size, team
>>> size, and commit size.

But the article also says that the effect is larger for specific issues, such as memory bugs and security; see Table 8 and Result 4. Security bugs were only 2% of the total bugs studied, however.
...
> 
> Presumably there are other parts of the software development process
> that are more important than the choice of language.

No doubt, but the issues we are discussing affect them as well; see below.
> 
> ----------
> 
> Also NASA has coding rules for life-critical and mission-critical
> software.
>  http://lars-lab.jpl.nasa.gov/JPL_Coding_Standard_C.pdf
> 
> Most "modern" language architects would consider this to be
> the Luddite approach:  No recursion, no dynamic allocation
> of variables, et cetera.  However the NASA guys seem to think
> this is the "only" way they can "guarantee" correct execution.
> It's not some passing fad;  they've been doing it this way
> for years.

I wouldn't be surprised if NSA had similarly restrictive rules.  

> 
> Probably the biggest issue is this:  Most developers don't care
> very much about reliability or security.  The motto of Silicon
> Valley is "ready, fire, aim".  That is, get /something/ out the
> door fast, and (maybe) tune it up later.

I agree, but that’s something of an oversimplification. There have been many efforts to get programmers to follow better security practices, and the issues with C that we are discussing sabotage those efforts. It is one thing to ask programmers to take the extra effort to check for overflows or to zero out sensitive data once it is no longer needed. It is quite another to teach them the exacting incantations that C requires to accomplish those tasks, if they are even possible. Consider Rule 16 of the JPL guidelines cited above:
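Zeroing sensitive data is a good illustration of those incantations. A plain memset() just before a buffer goes out of use is a dead store that the optimizer is free to delete. A minimal sketch of the usual workaround, calling memset through a volatile function pointer so the compiler cannot prove the call away (the helper names here are my own):

```c
#include <stddef.h>
#include <string.h>

/* Calling memset through a volatile function pointer keeps the compiler
 * from proving the store is dead and eliminating it. (C11 does define
 * memset_s for this, but only in the optional Annex K, which most
 * implementations do not ship.) */
static void *(*const volatile secure_memset)(void *, int, size_t) = memset;

/* Zero a buffer holding sensitive data before it is released. */
void wipe(void *buf, size_t len) {
    secure_memset(buf, 0, len);
}
```

The point is not that the trick is hard to copy, but that nothing in the language tells a working programmer that the obvious memset is unreliable in the first place.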

"Rule 16 (use of assertions) Assertions shall be used to perform basic sanity checks throughout the code. All functions of more than 10 lines should have at least one assertion."

The C optimizer can and does remove those sanity checks whenever it can show that the condition being tested could only be true if undefined behavior had already occurred.
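The overflow case shows how this happens in practice. A check written the "obvious" way relies on signed wraparound, which is undefined, so the compiler may assume the sum never overflows and delete the test outright; the well-defined version must compare against the limits before adding. A sketch, with function names of my own choosing:

```c
#include <limits.h>
#include <stdbool.h>

/* The "obvious" check: tests for wraparound after the fact. Signed
 * overflow is undefined behavior, so a compiler is entitled to assume
 * a + b never overflows and reduce this function to "return false". */
bool will_overflow_naive(int a, int b) {
    return a + b < a;
}

/* The well-defined incantation: compare against INT_MAX / INT_MIN
 * *before* performing the addition, so no undefined operation occurs. */
bool will_overflow(int a, int b) {
    if (b > 0)
        return a > INT_MAX - b;
    return a < INT_MIN - b;
}
```

An assertion written like the naive version is exactly the kind of sanity check Rule 16 asks for, and exactly the kind the optimizer may silently discard.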

> 
> This has got to change.  Maybe it will.  Sony says the 2011 PSN
> hack cost them 175 million dollars.  At some point shareholders
> are going to demand that people clean up their act.

Large companies like Sony are more likely to get their act together than the smaller firms in the IoT space. 

> 
> The existence of botnets that could easily DDoS the entire internet
> is just intolerable.  At some point somebody is going to impose a
> duty of care on the vendors, to require them to make sure this
> cannot happen.  By way of analogy:  To operate a car on public
> roads, it must meet safety standards.  You also need insurance
> and a licensed driver.  Now imagine enforcing similar standards 
> on anything that connects to the public network.

Imagine how much more progress we could make if the tool makers were on our side.

Arnold Reinhold



