[Cryptography] ideas for (long) Nothing up my sleeve numbers
decoy at iki.fi
Tue Apr 1 21:02:26 EDT 2014
On 2014-04-01, Albert Lunde wrote:
> In this kind of construction, the "something up my sleeve" may appear
> to be how you picked the starting point and other details.
That is why I put in Elias's omega code: to my knowledge it is the first
asymptotically optimal construction of a universal code after
Levenstein's, and it always starts from one, so that you can't
repeatedly extract zero bits and thereby get around the requirement that
you shouldn't be able to choose precisely where your extraction starts.
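As a sketch of why the omega code can't yield an empty extraction: the code is only defined for integers from one up, and every codeword is self-delimiting. A minimal toy implementation (mine, not anybody's reference code):

```python
def omega_encode(n):
    """Elias omega codeword for an integer n >= 1, as a bit string.

    The code is undefined below one, so there is no zero-length
    extraction to abuse."""
    assert n >= 1
    code = "0"                 # terminating zero bit
    while n > 1:
        b = bin(n)[2:]         # binary of n, always starting with '1'
        code = b + code        # prepend this group; recurse on its length
        n = len(b) - 1
    return code

def omega_decode(bits):
    """Read one omega codeword off the front of a bit string;
    return (value, remaining bits)."""
    n, i = 1, 0
    while bits[i] == "1":
        length = n + 1
        n = int(bits[i:i + length], 2)
        i += length
    return n, bits[i + 1:]
```

Note that the value decoded and the number of bits consumed come out of the same pass; neither can be chosen independently of the other.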
Second, since the set of extant cryptographers is distributed, and a
novel algorithm needing low-sleeveness numbers might plausibly come from
outside the known mainstream, I can't think of any lower-complexity
protocol than this one which just lets you extract arbitrary amounts of
nice bits from a commonly trusted source, combined with some kind of
plausible-enough notarization to establish precedence in case of dispute
over who owns the currently-considered-to-be-spent bits.
And third, the whole point is that you don't get to choose the length of
the extraction and the bits extracted separately: they come as an
integral whole. If you want a certain number (which in a good primitive
means you have as many bits of evil design freedom), that translates
into a linear search, starting from the current status quo, whose
expected length is as large as the total range of values your extracted
number might eventually take. Beyond that, we're also using a reasonably
efficient coding, at least asymptotically speaking: the omega code is
pretty well distributed for this sort of thing, and if you consider the
possibility of overlap, I'd assert -- without formal proof -- that the
overlap more or less asymptotically compensates for the efficiency loss
of prepending a length prefix in the first place.
Of course, the computational complexity of finding any reasonable amount
of low-sleeveness bits isn't the real issue here, even if it might help
disincentivize a naive crypto designer from building an optimization
farm for the purpose. The real point is that the construction forces --
I hope, fingers crossed -- the search for any specific bit string to
cost roughly the same as either just taking the next n bits, without any
choice in what n is, or alternatively finding some deterministic
structure in the binary expansion which is subexponential in sequence
length -- and that would amount to proving pi not to be normal.
Or if my intuition is broken here, then maybe the basic reasoning helps
some real cryptographer strengthen the construction to something you can
actually prove something nasty about; as far as the original question
goes, it's a win any which way.
> Also note there are algorithms to compute the Nth digit of pi in
> various bases without computing the previous digits, so there is no
> particular reason to stick with digits in published literature.
Those are precisely the spigot algorithms I mentioned. The requirement
that you start with something which wasn't already spent comes from two
reasons. First, obviously if you want your number to be low-sleeveness
and you have to ask about it on-list, typically you'd want it to be
something which wasn't already utilized and analyzed to death. For
instance, the community might have noticed that despite the conjectured
semi-randomness of the first bits of pi, statistical fluctuation still
made them fall too far in the tail of the distribution to satisfy the
finite, truncated probability assumptions you sometimes have to make in
order to make your cryptosystem secure.
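For the record, the best-known such spigot is the Bailey-Borwein-Plouffe formula, which yields the n-th hexadecimal digit of pi without computing the earlier ones. A rough float-precision sketch (fine for small n; a serious implementation would track error bounds explicitly):

```python
def pi_hex_digit(n):
    """n-th hex digit of pi after the point (n = 1 gives 2, since
    pi = 3.243F6A88... in hex), via the Bailey-Borwein-Plouffe formula.
    Plain float arithmetic, so only trustworthy for modest n."""
    def frac_sum(j):
        # fractional part of the sum over k of 16^(n-1-k) / (8k + j)
        s = 0.0
        for k in range(n):                       # head: modular exponentiation
            s = (s + pow(16, n - 1 - k, 8 * k + j) / (8 * k + j)) % 1.0
        k = n                                    # tail: rapidly vanishing terms
        while True:
            term = 16.0 ** (n - 1 - k) / (8 * k + j)
            if term < 1e-17:
                return s % 1.0
            s = (s + term) % 1.0
            k += 1
    x = (4 * frac_sum(1) - 2 * frac_sum(4) - frac_sum(5) - frac_sum(6)) % 1.0
    return int(x * 16)
```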
And second, you really do get to choose the number of bits you use for
your constants. That is something we can't easily stop, so it represents
a degree of possible gaming, leading to sleeveness. We can just hope
it's somehow evident from the proposed structure that no extracted bits
were lost in the process (i.e. all of them behave much like ideal key
bits in the proposed construction, or we even have some sort of
objective proof of how many bits you need for any future circuit). But
what we really, *really* don't want anybody to have is a second,
independent optimand they can choose freely, because that would at the
very least lead to a birthday-attack sort of sleeveness in the hands of
a rationally malevolent designer. So make them fix at least the starting
point of the search at the status quo -- which is required in any case
if we need new, non-overlapping constants -- and then tie the length and
the actual bits found irrevocably together.
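A toy sketch of that combined rule, under my reading of the construction: start the decode at the current status-quo offset in pi's binary expansion, and let the omega codeword found there fix both the value and the number of bits consumed. The 64 bits below are just the expansion of hex 243F6A8885A308D3, i.e. pi's fractional part:

```python
# First 64 fractional bits of pi (hex 243F6A8885A308D3).
PI_BITS = "0010010000111111011010101000100010000101101000110000100011010011"

def extract(offset):
    """Decode one Elias omega codeword from PI_BITS starting at offset.

    Returns (value, next_offset): the extractor picks neither the value
    nor the length independently, and the next extraction must start at
    next_offset, the new status quo."""
    n, i = 1, offset
    while PI_BITS[i] == "1":
        length = n + 1
        n = int(PI_BITS[i:i + length], 2)
        i += length
    return n, i + 1
```

Starting at offset 10, for instance, the codeword found decodes to 15 and advances the status quo to offset 17; a designer wanting some other value can only keep searching forward from there.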
> But I'd argue you might as well start at the first non-zero binary
> digit of pi, and reduce the degrees of freedom in the choice.
That doesn't take care of the length-selection problem. What my
construction does is force each length to start at a different offset as
well, so that you can't easily exploit series equivalences and the like
to give structure to your bits, even long after the last cryptographer
used the construction to extract anything. And of course precedence in
publication serves as a natural, already necessary, external source of
randomness for the process: if you wait around, running your ostensibly
exponential optimization algorithms to do your evil deed, then as soon
as somebody else publishes something using the construction, you're
forced to start again -- on pain of being considered to have something
up your sleeve if you don't.
Sampo Syreeni, aka decoy - decoy at iki.fi, http://decoy.iki.fi/front
+358-40-3255353, 025E D175 ABE5 027C 9494 EEB0 E090 8BA9 0509 85C2