[Cryptography] best practices considered bad term

ianG iang at iang.org
Sun Feb 8 06:57:15 EST 2015


On 8/02/2015 01:48 am, Ralph Holz wrote:
> Hi Ian,
>
> I find your argument intriguing, but it is not supported yet by
> evidence, and there may be different interpretations of the same
> statistics. I am not saying you are wrong, but having actual data to
> support that would be great - this is probably an area for qualitative
> analysis followed by quantitative:


Yeah, data would be good.  But how to do that?  I'm not sure; the 
experiment has already been run, and it isn't repeatable.

One thing that might be measurable is the switch-over from monoculture 
to duoculture, if that's the term.  People have speculated that you need 
10% or more of the market to make attacks on a single platform 
worthwhile.  So a prediction is that we should start seeing more trouble 
at the user level for Macs.
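
Just as a back-of-the-envelope sketch of that threshold argument (every 
number below is hypothetical, chosen only to show the shape of it): an 
attack on one platform pays off once that platform's share of the 
installed base pushes the expected take above the cost of building and 
running the attack.

# Toy attacker economics; all figures are hypothetical, only meant to
# illustrate why ~10% market share might be the tipping point.
def attack_worthwhile(market_share, total_users, infection_rate,
                      value_per_compromise, campaign_cost):
    # Expected take scales linearly with the target platform's share.
    expected_take = (market_share * total_users *
                     infection_rate * value_per_compromise)
    return expected_take > campaign_cost

# 100M reachable users, 0.1% infection rate, $50 per compromise,
# $500k to build and run the campaign: break-even sits right at 10%.
for share in (0.05, 0.10, 0.20):
    print(share, attack_worthwhile(share, 100_000_000, 0.001,
                                   50, 500_000))

With those made-up inputs the 5% platform isn't worth attacking, 10% is 
break-even, and 20% clearly is; the point is only that the calculus 
flips somewhere around that kind of share.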


>> I'm not so sure.  If you look at the 2000s, Apple shipped gear that was
>> remarkably free from bugs and attacks.  Their security bug list was in
>> the 3 figures whereas Microsoft was in the 5 figures.  I suspect that is
>> still the case, although I don't track it.
>
> This may also be an artefact of inaccurate reporting of bugs, e.g. due
> to a security team that was too small. It may also be an artefact of not
> enough people outside poking the software.


Oh yes - the business about bug reports not matching actual bugginess or 
even damages is well known.  I'm just using this as a proxy for "trouble."

What I do know is that throughout the 2000s people on Macs had zero 
trouble with viruses, re-installs, slowdowns, etc., in comparison to the 
Microsoft crowd.

Now, you can "explain" this by monoculture, versions, user idiocy, etc. 
My point is that wherever it came from, it happened, and that was a 
significant factor in this:


>> Now, here's the sell:  Over the 2000s, people drained out of the
>> Microsoft world to the Apple Mac OSX world pretty consistently.  At the
>> start, Apple was tiny.  At the end, the biggest.
>>
>> And -- my hypothesis -- they did that in significant part because the
>> Mac OSX product was more secure.  By this I mean, no requirement to run
>> virus scanners, and until the last few years, very little update and change
>> requirement.  Which meant more time and more $$$ in users' pockets.
>
> I cannot recall any colleague who gave security as the first argument
> for a switch to OS X. It was almost always the convenience of the OS,
> plus the coupling to other devices. Sure, this is not a representative
> sample - but it makes me ask for some real data.


No, it wasn't security for the users; it was reliability and usability. 
And it was only apparent if you took a longer view, like a year or 
three, in comparison.


>> I'd say, *in the long run*, Apple beat Microsoft on software security.
>> It helped that their hardware was good too, and that they had the sense
>> to aim for the premium price range.  By that, I mean Jobs took the long
>> view, a decade.  Wouldn't fly in other circumstances of course.
>
> Given that Microsoft has a good development lifecycle in place, this
> makes me ask for data even more so. :)


Sure.  But their development lifecycle failed to kick in.  Remember 
Vista?  That was the product of the lifecycle.  It had to be replaced by 
W7, which was good, but *then* not as many users upgraded; that's the 
legacy bit.

It's complicated.  I think in the long run, Microsoft dug themselves 
into a security hole that they had a lot of trouble digging themselves 
out of.  Remember, they'd been digging that hole for 20 years ... unlike 
the spin-on-a-dime Blackbird thing in 1997, security turned out to be an 
oil tanker, and Bill Gates' famous memo wasn't enough to turn the thing 
around.



iang

