Papers about "Algorithm hiding" ?

Ian G iang at systemics.com
Tue Jun 7 12:41:12 EDT 2005


On Tuesday 07 June 2005 14:52, John Kelsey wrote:
> >From: Ian G <iang at systemics.com>
> >Sent: Jun 7, 2005 7:43 AM
> >To: John Kelsey <kelsey.j at ix.netcom.com>
> >Cc: Steve Furlong <demonfighter at gmail.com>, cryptography at metzdowd.com
> >Subject: Re: Papers about "Algorithm hiding" ?
>
> [My comment was that better crypto would never have prevented the
> Choicepoint data leakage. --JMK]

OK, yes, you are right, we are talking about two
different things.

The difficulty here is that there is what we might call
the Choicepoint syndrome, and then there are the
specific facts of the actual Choicepoint heist.
When I say Choicepoint I mean the former, and the
great long list of similar failures as posted last week.
I.e., it is a syndrome that might be characterised as
"companies are not protecting data" or in other words
"the threat is on the node not the wire."

Whereas in the specific Choicepoint heist, there is
the precise issue that they are selling their data to
someone.  That's much more complex, and crypto
won't easily change that.


> >Sure it would.  The reason they are not using the tools is because
> >they are too hard to use.  If the tools were so easy to use that it
> >was harder to not use them, then they'd be used.  Consider Citigroup
> >posted today by Bob.  They didn't encrypt the tapes because the tools
> >don't work easily enough for them.
>
> So, this argument might make sense for some small business, but
> Citigroup uses a *lot* of advanced technology in lots of areas, right?
> I agree crypto programs could be made simpler, but this is really not
> rocket science.  Here's my guess: encrypting the data would have
> required that someone make a policy decision that the data be
> encrypted, and would have required some coordination with the credit
> agency that was receiving the tapes.  After that, there would have
> been some implementation costs, but not all *that* many costs.
> Someone has to think through key management for the tapes, and
> that's potentially a pain, but it's not intractable.  Is this really
> more complicated than, say, maintaining security on their publicly
> accessible servers, or on their internal network?

No, it's not rocket science - it's economic science.
It makes no difference whether the business is
small or large - it is simply a question of costs.  If
it costs money to do it, then it has to deliver a
reward.

In the case of the backup tapes there was no reward
to be enjoyed.  So they could never justify encrypting
them if it were to cost any money.  Now, in an unusual
exception to the rule that laws cause costs without
delivering useful rewards, the California law SBxxxx
changed all that by adding a new cost:  disclosure.
(Considering that banks probably each lose a set of
backups every year and have been doing so since whenever,
it's not the cost of the tapes or the potential for ID theft
that we care about...)

Now consider what happens when we change the
cost structure of crypto such that it is easier to do it
than not.  This is a *hypothetical* discussion of course.

Take tar(1) and change it such that every archive is
created as an encrypted archive to many public keys.
Remove the mode where it puts the data in the clear.
Then encrypt to a big set of public keys such that
anyone who could conceivably want the data can decrypt
it (this covers the biggest headache, which is that when
you want the data, it is no longer readable).

So, now it becomes trivial to make an encrypted
backup.  In fact, it is harder to make an unencrypted
backup.  What are companies going to do?  Encrypt,
of course - because it costs money to do anything else.
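
To make the hypothetical concrete, here is a rough sketch of
what such a tool might look like.  It's in Python with the
pyca/cryptography library purely for illustration - the library
choice, the RSA-OAEP key wrapping and the file layout are my own
assumptions, not anything tar(1) actually does - but it shows the
one idea that matters: wrap a fresh symmetric key to every
recipient public key, so any one of them alone can read the
backup later, and never offer a cleartext path at all.

# Hypothetical "always-encrypted tar": archive a directory, then
# hybrid-encrypt it to MANY recipient public keys, so that anyone
# who might later need the data can decrypt it on their own.
# Sketch only; assumes the recipient keys are RSA, stored as PEM files.

import io
import os
import json
import secrets
import tarfile

from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM


def load_recipients(pem_dir):
    # One PEM file per recipient public key; the filename is the label.
    keys = {}
    for name in os.listdir(pem_dir):
        with open(os.path.join(pem_dir, name), "rb") as f:
            keys[name] = serialization.load_pem_public_key(f.read())
    return keys


def encrypted_backup(src_dir, out_path, recipients):
    # 1. Build the tar.gz in memory - there is no cleartext-on-disk mode.
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w:gz") as tar:
        tar.add(src_dir, arcname=os.path.basename(src_dir))

    # 2. Encrypt the whole archive once under a fresh symmetric key.
    session_key = AESGCM.generate_key(bit_length=256)
    nonce = secrets.token_bytes(12)
    ciphertext = AESGCM(session_key).encrypt(nonce, buf.getvalue(), None)

    # 3. Wrap that session key to EVERY recipient public key, so any
    #    one of them alone can recover the backup years from now.
    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)
    wrapped = {name: pub.encrypt(session_key, oaep).hex()
               for name, pub in recipients.items()}

    # 4. Write a small header (nonce + wrapped keys), then the ciphertext.
    header = json.dumps({"nonce": nonce.hex(), "keys": wrapped}).encode()
    with open(out_path, "wb") as f:
        f.write(len(header).to_bytes(4, "big"))
        f.write(header)
        f.write(ciphertext)


# e.g. every key in ./recipients/ can independently decrypt backup.enc:
# encrypted_backup("/var/data", "backup.enc", load_recipients("recipients"))

The format is beside the point; the point is that the cleartext
mode is simply gone, and the pile of wrapped keys is what makes
that tolerable to the people who will one day need to restore.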



> >The other way of looking at Choicepoint - change the incentives - is
> >a disaster.  It will make for a compliance trap.  Compliance *may*
> >protect the data or it may have completely the opposite effect, the
> >situation with 'unintended consequences' in such a case is likely to
> >be completely unpredictable.  The only thing we can guarantee is that
> >costs will go up.
>
> Well, Choicepoint is a bit different, right?  I mean, as I understand
> it the big disclosure happened because they sold peoples' data to
> criminals, but they were in the business of selling peoples' data.
> They just intended to sell it only to people of good intention, as far
> as I can tell.  (Perhaps they should have demanded X.509 certificates
> from the businesses buying the data and checked the "evil" bit.)  I
> just can't see how cryptography could have helped prevent that attack,
> other than by making the data that Choicepoint depends on harder to
> get in the first place.

Yes, you are right, I was thinking "Choicepoint syndrome"
here.  In order to address "Choicepoint-actual" with crypto
we'd have to look at rights systems - nyms, capabilities,
Brands-style credentials - or address it at the business level.

> >It's much cheaper and much more secure to simply
> >improve the tools.
>
> But this does no good whatsoever if there's not some reason for the
> people holding the data to use those tools.

Yes, that's why I'm saying that the tools should actually
make it easier to use the crypto than to do the alternative.
If we need to explain the reason, we've already lost.

> Everyone with a network 
> presence and any kind of high profile does, in fact, use moderately
> complicated computer security tools like routers, firewalls, VPNs,
> virus scanners, and spyware detectors.  Everyone has to deal with
> keeping their boxes up to date on patches.  However imperfectly, it
> seems like Citigroup and Choicepoint and the rest can actually do
> those things.  So when you excuse their failures to secure customer
> data with "the tools aren't there," this sounds absolutely implausible
> to me.

The presence of some tools doesn't affect the argument
that other tools should be easier.  All this says is that
they - the bigger businesses - are capable of fielding
and paying for some tools.

> I'm not crazy about a HIPAA-style mandate for encryption and shredders
> either, but we have this basic problem:
>
> a.  It's basically easy to buy or find some amount of data about many
> people.
>
> b.  It's basically easy to use that amount of data to get credit in
> their name.
>
> I suspect a better solution than trying to regulate data brokers is to
> make it more expensive to give credit to Alice under Bob's name.  The
> thing that imposes the cost on me isn't when someone finds my SSN,
> it's when someone takes out a bunch of loans which I'm then expected
> to pay back.  Then it becomes my problem to resolve the disputes
> created by the lender's desire to extend credit at minimal cost.  (The
> lender also loses money, of course.  But much of the cost is shifted
> to the identity theft victim.)

Yes, I'd agree with that.

iang
-- 
Advances in Financial Cryptography:
   https://www.financialcryptography.com/mt/archives/000458.html

