[Cryptography] Best internet crypto clock

Jerry Leichter leichter at lrw.com
Sun Oct 5 08:06:38 EDT 2014


On Oct 4, 2014, at 7:14 PM, Henry Baker <hbaker1 at pipeline.com> wrote:
>> The first isn't hard to solve, in the traditional way of producing trustworthy random number generators:  Simply have NIST, the NSA, the EFF, the Russian and Chinese governments - whoever is willing - implement beacons.  To produce a beacon you trust, choose any subset, combine the "random" numbers, and sign the result in the usual way.  The subset and the method of combination are all public and committed to; all the inputs are public.  Since the individual beacons can only be corrupted by entirely stopping them, or by producing predictable (to the attacker) values, unless someone corrupts *all* the sources, the combination is unpredictable.
> 
> Yes, one could do as you say, but _checking_ this calculation isn't going to be easy unless a large number of places on the Internet _store the appropriate sequences_.  You then have the problem of _checking that all (or most) of these sequences are the same_.
I'm not sure what it is you would want to check.  The protocol for each beacon would be, at each time T, to send the triple <Tp, T, R>, where T is the current time, Tp < T is the value that T had in the last emitted triple, and R is the random value.  The triple is signed using the beacon's signature.  Any collection of such values can be merged to create a new triple.  The Tp/T values are re-computed relative to the new triple.  A careful "combining beacon" will forward the triples that went into its computation.  (There are some obvious requirements for how the various input Tp/T values can be combined to produce the output Tp/T values.)
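
To make the shape of this concrete, here is a minimal sketch in Python of a beacon emitting signed <Tp, T, R> triples and a combiner hashing several R values together.  It assumes the third-party pyca/cryptography package for Ed25519 signatures; the message encoding and the combining rule are illustrative choices, not part of any proposal here.

    import os, time, hashlib
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    class Beacon:
        def __init__(self):
            self._key = Ed25519PrivateKey.generate()
            self.public_key = self._key.public_key()
            self._last_t = 0              # becomes Tp in the next triple

        def emit(self):
            """Emit a signed triple <Tp, T, R>."""
            tp, t = self._last_t, int(time.time())
            r = os.urandom(32)            # the "random" value R
            sig = self._key.sign(b"%d|%d|" % (tp, t) + r)
            self._last_t = t
            return (tp, t, r, sig)

    def combine(triples):
        """A combining beacon: hash the input R values together.
        The output is unpredictable unless *all* inputs were predictable
        to the attacker.  A careful combiner also forwards `triples` so
        its own output can be checked."""
        h = hashlib.sha256()
        for (tp, t, r, sig) in sorted(triples, key=lambda x: x[1]):
            h.update(r)
        return h.digest()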

A signed triple from a beacon is self-identifying - there is no need for anyone who doesn't intend to use it as part of an assertion to store it, and there's no need for a history.  Chaining the values together makes it harder for a beacon to go back and "revise history", though how much that adds isn't clear - anyone who *uses* a beacon value will present a signed triple, and if the beacon ever lies about a past value that someone has used, it will get caught.  (If it wishes to lie about past values that no one used ... why should we care?)
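
Following on the sketch above, catching such a lie needs nothing beyond the two signed triples themselves: if both verify under the beacon's key, claim the same T, and yet differ, that is self-contained proof of equivocation.  (The helper below is hypothetical and reuses the encoding assumed earlier.)

    from cryptography.exceptions import InvalidSignature

    def proves_equivocation(public_key, triple_a, triple_b):
        """True iff both triples verify, claim the same T, and differ --
        i.e. the beacon lied about a past value someone had saved."""
        (tp_a, t_a, r_a, sig_a) = triple_a
        (tp_b, t_b, r_b, sig_b) = triple_b
        try:
            public_key.verify(sig_a, b"%d|%d|" % (tp_a, t_a) + r_a)
            public_key.verify(sig_b, b"%d|%d|" % (tp_b, t_b) + r_b)
        except InvalidSignature:
            return False                  # at least one triple is forged
        return t_a == t_b and (tp_a, r_a) != (tp_b, r_b)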

(The reason for including Tp is that it makes the question "what was the triple emitted by a given beacon that had the smallest value T >= t?" unambiguously answerable.)
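
In code terms (same illustrative triple layout as above), a single triple settles that query on its own, because Tp certifies that nothing was emitted in between:

    def answers_query(triple, t):
        """Does this triple answer "smallest emitted T >= t"?
        Tp < t <= T means nothing was emitted between Tp and T,
        so this T is unambiguously the first one at or after t."""
        (tp, T, r, sig) = triple
        return tp < t <= T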

> In the case of the Fortune 500 or the Dow Jones share prices, a large number of sources _already publish_ these numbers, so you only have to go check a sufficient number (for your purposes) of probes to these public databases to convince yourself that they are all consistent, and then perform your checking calculations on the published share prices.
Someone here a couple of months back discussed an actual, real-world attempt to compute a value this way.  It failed.  (I searched around for it but was unable to find it....)  The numbers get corrected and changed after the fact.  The changes may be trivial, and they may be infrequent, but they are frequent enough to make the process fail.

> Other sources might be sec.gov, which stores all the submissions by public companies of their quarterly reports, changes in ownership, etc.  While the current sec.gov makes no attempt (that I'm aware of) to blockchain these submissions, it would be pretty easy to change sec.gov to require a "previous" hash for incorporation into any SEC report submission, and this would have the effect of partially ordering all the submissions in such a way that it would be essentially impossible for _anyone_ to change the ordering (including the SEC itself), without everyone else being able to notice that someone was trying to make a change.
While this wasn't part of the previous posting, I think the lesson to be learned is that public sources like this *make no claim that what they've published will never change*.  There's no reason why they should - such a claim isn't relevant to the reason the sources exist.  Errors get corrected, stuff gets reformatted to match some new standard.  The nominal semantics of what's in the database is supposed to never change, but that's based on human understanding, not something you could readily create a hash from.  And, in fact ... even *that's* probably not true.  Documents get screwed up in production - someone leaves out a paragraph, or includes some material in both old and new forms by mistake, or does something else that doesn't get noticed until later.  Then the document gets fixed and the database updated.  *Maybe* the new one gets a "Revised" marker.  Almost certainly, however, the old document gets deleted:  Saving it doesn't add to - and likely detracts from - the value of the database.
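
The mechanics make the problem plain.  In the chaining Baker describes, each submission quotes the hash of the previous one, so any edit to a past document changes every hash downstream; a database that routinely corrects and deletes old documents cannot keep such a chain intact.  A minimal sketch (Python; the field layout is made up for illustration):

    import hashlib

    def submission_hash(prev_hash, document):
        """Hash a submission together with the hash it quotes.  Editing
        any earlier document, or reordering, changes every later hash."""
        return hashlib.sha256(prev_hash + document).digest()

    # An illustrative chain of three filings:
    h = b"\x00" * 32                      # genesis value for the first filing
    for doc in [b"10-K ...", b"8-K ...", b"13-D ..."]:
        h = submission_hash(h, doc)
    # Publishing each intermediate h pins the partial order of all filings.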

> It would be nice to have an in-the-clear/public Internet database with the following properties [list of 7 properties to provide a "perfect" record of documents]
Ask yourself:  To whom would this be valuable?  Would the value exceed the cost of maintaining such a thing?  You cite the SEC as an example of a potential user, but there is, as far as I can tell, nothing in any SEC regulation that would require such a thing.  It would supposedly be for protection against someone producing a faked version of a document from the past.  While such issues aren't common, they do occur - consider Paul Ceglia's claim that he owns half of Facebook.  We already have plenty of ways of investigating the validity of such a claim.  Unless you *require* that all documents be added to this database, anyone creating a fake will simply say "Oh, we didn't think it was important at the time so we didn't send it in."

                                                        -- Jerry


