Cryptography and the Open Source Security Debate

J Harper jsec at peersec.com
Wed Jul 21 14:35:43 EDT 2004


There doesn't appear to be a discussion forum related to the Web post, so
I'll reply here.

We've gone through a similar thought process at my company.  We have a
commercial security product (MatrixSSL), but provide an open source version
for many of the good points Daniel makes.  There are, however, a few
additional reasons why the general open-source conclusion isn't as clear
cut.

One of the benefits of having security algorithms open source is that the
code can be better trusted not to contain back doors.  So while Daniel may
trust the NSA, others may not.  Standard software typically isn't treated
with the same "positive paranoia" and therefore doesn't face the same
requirement.  Back doors in closed source networked software, for example,
can be tested for to some extent through network protocol analysis and
contained with firewalls.  Cryptography must be trusted at the algorithm
level, since analysis of the resulting data is (or should be) very
difficult.
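
To make that concrete, here's a rough sketch (my own illustration, nothing
from MatrixSSL) of the kind of byte-frequency test an analyst might run on
captured traffic.  A clean cipher and a backdoored one that hides leaked
key bits inside its ciphertext both produce output that looks uniformly
random to a test like this, which is why you can't certify the absence of a
crypto back door from the wire:

/*
 * Illustration only (not MatrixSSL code): a naive byte-frequency
 * chi-square test of the sort an analyst might run on captured traffic.
 * Well-encrypted data passes it -- and so would the output of a
 * backdoored cipher that smuggles key bits inside the ciphertext.
 */
#include <stdio.h>
#include <stdlib.h>

/*
 * Chi-square statistic of byte frequencies against a uniform
 * distribution; for random-looking data this hovers around 255 (the
 * degrees of freedom), while structured plaintext scores far higher.
 */
static double chi_square_bytes(const unsigned char *buf, size_t len)
{
    unsigned long counts[256] = {0};
    double expected = (double)len / 256.0;
    double chi = 0.0;
    size_t i;

    for (i = 0; i < len; i++)
        counts[buf[i]]++;
    for (i = 0; i < 256; i++) {
        double d = (double)counts[i] - expected;
        chi += (d * d) / expected;
    }
    return chi;
}

int main(void)
{
    /* Stand-in for captured "ciphertext"; the C library PRNG is already
     * uniform enough to pass this crude test. */
    unsigned char buf[65536];
    size_t i;

    srand(1);
    for (i = 0; i < sizeof(buf); i++)
        buf[i] = (unsigned char)(rand() & 0xff);

    printf("chi-square over %lu bytes: %.1f (roughly 255 expected)\n",
           (unsigned long)sizeof(buf), chi_square_bytes(buf, sizeof(buf)));
    return 0;
}

Firewalls and protocol analysis will catch an application that phones home;
they won't catch a cipher that quietly weakens or leaks within its own
output, which is why the algorithm itself has to be inspectable.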

A larger problem with extending the argument from cryptographic algorithms
to open source in general is one of sheer code size.  Both Windows and
Linux distributions run to 30-40 million lines of code.  The number of
developers who 1. bother reading the code, 2. can actually understand it,
and 3. have the expertise and motivation to report or correct problems is a
very small subset of developers for either proprietary or open source
solutions.  Developers who understand the complex interactions between
software components are fewer still.  When it comes down to it, a very
small set of people becomes trusted in whatever code area they specialize
in.  Crypto algorithms, by contrast, are typically on the order of 1000
lines of code, with a large portion consisting of static key blocks.
Reviewing an open crypto algorithm still demands a great deal of expertise,
but the analysis can be far more focused.
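
For a sense of scale, here's the core of TEA, a real (if dated) block
cipher -- again my own illustration, not MatrixSSL code.  The entire
encrypt routine is a couple of dozen lines; larger ciphers like AES or DES
are mostly static lookup tables wrapped around a similarly small core:

/*
 * Illustration only (not MatrixSSL code): standard TEA encryption of one
 * 64-bit block under a 128-bit key, 32 cycles.
 */
#include <stdio.h>
#include <stdint.h>

static void tea_encrypt(uint32_t v[2], const uint32_t k[4])
{
    uint32_t v0 = v[0], v1 = v[1], sum = 0;
    const uint32_t delta = 0x9e3779b9;
    int i;

    for (i = 0; i < 32; i++) {
        sum += delta;
        v0  += ((v1 << 4) + k[0]) ^ (v1 + sum) ^ ((v1 >> 5) + k[1]);
        v1  += ((v0 << 4) + k[2]) ^ (v0 + sum) ^ ((v0 >> 5) + k[3]);
    }
    v[0] = v0;
    v[1] = v1;
}

int main(void)
{
    uint32_t block[2] = { 0x01234567, 0x89abcdef };
    const uint32_t key[4] = { 1, 2, 3, 4 };

    tea_encrypt(block, key);
    printf("%08lx %08lx\n", (unsigned long)block[0], (unsigned long)block[1]);
    return 0;
}

A competent reviewer can read every line of something that size in an
afternoon; nobody can do the same for a 30-40 million line operating
system.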

> "Do you trust both the talent and moral integrity of the company you are
> getting your closed source software from so much that you think the
> products they are supplying you are superior (from a security standpoint)
> to those that can be created via peer review and attack by tens of
> thousands?"
This question is less related to the crypto-specific conclusion, so I'll
keep it short.  The power of capitalism to produce competition and better
products is underestimated here.  Market pressure on closed source to
become more secure is growing, and when money is at stake, products
improve.  Peer review by tens of thousands of largely unpaid, untrained
developers versus a corporation's all-important shareholder value in an
increasingly security-conscious market is a much more even fight than one
might expect.  That is why we have combined both sides of the debate and
produced a dual-licensed product that has the security benefits of open
source along with the market accountability of a commercial product.

J Harper
PeerSec Networks
http://www.peersec.com

----- Original Message ----- 
From: "R. A. Hettinga" <rah at shipwright.com>
To: <cryptography at metzdowd.com>
Sent: Tuesday, July 20, 2004 9:46 AM
Subject: Cryptography and the Open Source Security Debate


> <http://www.osopinion.com/print.php?sid=1792>
>
>
> osViews | osOpinion
>
> Cryptography and the Open Source Security Debate
>
> Articles / Security
> Date: Jul 20, 2004 - 01:03 AM
>
>
> Contributed by: Daniel R. Miessler
>  :: Open Content
>
> If you follow technology trends, you're probably aware of the two schools
> of thought with regard to security and/or cryptography. Do cryptography
> and security solutions become more secure as the number of eyes poring
> over their source code increases, or does a private solution that
> leverages security through obscurity provide a more secure environment?
>
>  Daniel R. Miessler submitted the following editorial to
> osOpinion/osViews, which offers some compelling arguments for both
> scenarios. In the end, his well-thought-out opinion comes to a universal
> conclusion.
>  --
>
>  I've been reading Bruce Schneier's book on cryptography for the last
> couple of days, and one of the main concepts in the text struck me as
> interesting.
>
>  One of the points of discussion when looking at the security of a given
> algorithm is its exposure to scrutiny. Bruce explicitly states that no one
> should ever trust a proprietary algorithm. He states that, with few
> exceptions, the only relatively secure algorithms are those that have
> stood the test of time while being pored over by thousands of
> cryptanalysts.
>
> Similar Situations
>
>  What struck me is the similarity between this mode of thought and that of
> the open source community on the topic of security. In that debate there
> is much disagreement about which is better - open or closed - while in
> the crypto world it's considered common knowledge that open is better.
> According to the crypto paradigm, having any measure of an algorithm's
> security rest on the fact that it's a secret is generally a bad thing.
> There, the keys are what make the system secure - not the algorithm being
> a secret.
>
>  I realize there are some differences in these two models, but they are
> small enough, in my opinion, to say that those participating in the
> open/closed source debate could learn something by tapping into the body
> of knowledge held by this related field.
>
>  Can we, for example, take the analogy at face value and compare "Joe's
> Nifty Algorithm" and DES, with the source code for the Windows kernel +
> IIS and the Linux kernel + Apache?  (Windows to Linux isn't quite fair,
> since Linux is a kernel alone.)  If we can, you realize, then it's fairly
> easy to see which is more secure - Linux. Why? Well, to put it simply,
> because there are all those people looking for ways to attack it while
> having full access to the source code (algorithm). For Windows and IIS,
> it's a relatively paltry number in comparison.
>
>  In the crypto world, an experienced analyst would never even consider
> encrypting sensitive data with "Sally's Uber-Secure Algorithm" simply
> because of one concept:  "Who the hell is Sally? Why is she qualified to
> create algorithms that are secure? Why should I trust her?"
>
> Take This Code... Please
>
>  Well, to take the analogy to its conclusion, Sally's algorithm is very
> much like Windows, Sun, or any other proprietary company's source code.
> What you have is something like this:
>
>  Proprietary Company: "Trust us, this stuff is pretty solid."
>  Analyst/Skeptic: "Show me."
>  Proprietary Company: "I can't...it's secret. But trust me, it's secure."
>
>  If that were an algorithm up for review, the conversation would look more
> like this:
>
>  Algorithm Designer: "This is secure."
>  Cryptanalyst: "Show me all the source code, let the world attack it for
> years, and if it holds up, I'll believe you - a little bit."
>
> Who Can You Trust?
>
>  Call me crazy, but the second paradigm is the only system that I can
> place any significant amount of trust in.
>
>  Now, there is a trump card up the sleeve when it comes to secrets and
> algorithms. When the organization or person creating the secret is trusted
> to do a good job simply because of who they are, i.e. the NSA or Bruce
> Schneier himself,  then, and only then, can a secret possibly be an asset.
> The NSA doesn't, for example, give out its algorithms so that they can be
> scrutinized even though they know this could potentially lead to the
> discovery of weaknesses.
>
>  To them, it's more important that it stays secret. I would be relatively
> confident that their algorithms were stout, despite the fact that they
> haven't been torn apart by the world's best. Why? Because they have many
> of the world's best working for them, while Bruce... well, he's just the
> man.
>
>  I would also place a fair amount of trust in the security of a closed
> source web server package if that package were known to come from Wietse
> Venema, for example. Realize, though, that Wietse himself would most
> likely demand that the project be open source precisely so that the world
> could verify its relative security. He's not likely to even trust himself
> to produce code that's as secure as the world could make it, and that
> should say something right there.
>
> Final Thoughts
>
>  So, now the question becomes very simple:
>
> "Do you trust both the talent and moral integrity of the company you are
> getting your closed source software from so much that you think the
> products they are supplying you are superior (from a security standpoint)
> to those that can be created via peer review and attack by tens of
> thousands?"
>
>  For me, in the case of Microsoft, Sun, and countless others, the answer
> is no.
>
>  Go Linux; go BSD. ::
>
>  This article comes from osViews | osOpinion
> http://www.osopinion.com/
>
>  The URL for this story is:
>
> http://www.osopinion.com/modules.php?op=modload&name=News&file=article&sid=1792
>
> -- 
> -----------------
> R. A. Hettinga <mailto: rah at ibuc.com>
> The Internet Bearer Underwriting Corporation <http://www.ibuc.com/>
> 44 Farquhar Street, Boston, MA 02131 USA
> "... however it may deserve respect for its usefulness and antiquity,
> [predicting the end of the world] has not been found agreeable to
> experience." -- Edward Gibbon, 'Decline and Fall of the Roman Empire'
>

---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to majordomo at metzdowd.com


