no possible brute force Was: Cruising the stacks and finding stuff
Leichter, Jerry
leichter_jerrold at emc.com
Wed Apr 23 12:34:56 EDT 2008
On Wed, 23 Apr 2008, Alexander Klimov wrote:
 Date: Wed, 23 Apr 2008 12:53:56 +0300 (IDT)
 From: Alexander Klimov <alserkli at inbox.ru>
 To: Cryptography <cryptography at metzdowd.com>
 Subject: no possible brute force Was: Cruising the stacks and finding stuff

 On Tue, 22 Apr 2008, Leichter, Jerry wrote:
 > Interestingly, if you add physics to the picture, you can convert
 > "no practical brute force attack" into "no possible brute force
 > attack given known physics". Current physical theories all place a
 > granularity on space and time: There is a smallest unit of space
 > beyond which you can't subdivide things, and a smallest unit of
 > time.

 I guess you are talking about Planck units, so let's make the
 calculations:

 Planck length is L = \sqrt{hbar G / c^3} ~ 1.6E-35 m,

 Planck time is T = L/c ~ 5.4E-44 s,

 so a cubic-meter computer can have

 N = (1/L)^3 ~ 2.4E104 elements

 and thus

 N/T ~ 4.5E147 operations per second

 that is approximately 2^{490} operations per second in a cubic meter
 of "Planck computer". Given a year (3.15E7 s) on an Earth-sized
 (1.08E21 m^3) Planck computer we can make approximately 2^{585}
 operations.
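The quoted arithmetic is easy to reproduce. A short script (a sketch; the rounded constants are my own, not from the original post):

```python
import math

hbar = 1.055e-34   # reduced Planck constant, J*s
G = 6.674e-11      # Newton's gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s

L = math.sqrt(hbar * G / c**3)   # Planck length, ~1.6e-35 m
T = L / c                        # Planck time, ~5.4e-44 s

N = (1 / L)**3           # "elements" in a cubic meter, ~2.4e104
ops_per_sec = N / T      # ~4.5e147 operations per second

earth_volume = 1.08e21   # m^3
year = 3.15e7            # s
total_ops = ops_per_sec * earth_volume * year
print(math.log2(total_ops))   # ~585, matching the 2^{585} above
```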
Except that this doesn't quite work. You can't actually have anywhere
near that many distinct states in that volume of space. The physics
does indeed get bizarre here: The maximum number of bits you can
store in a given volume of space is determined by that space's
*surface area*, not its volume. So you actually "only" get around
1E70 elements and 4.5E112 operations. :)
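The "surface area, not volume" figure can be checked the same way; a sketch assuming the holographic bound of one quarter of a Planck area per unit of entropy, applied to the 6 m^2 surface of a unit cube:

```python
import math

L_p = 1.616e-35               # Planck length, m
A = 6.0                       # surface area of a 1 m^3 cube, m^2
elements = A / (4 * L_p**2)   # holographic bound: A / 4 in Planck areas
T = 5.39e-44                  # Planck time, s
print(f"{elements:.1e}")      # ~5.7e69, i.e. "around 1E70"
print(f"{elements / T:.1e}")  # ~1e113 operations per second
```

The exact prefactor depends on conventions (sphere vs. cube, nats vs. bits), which is presumably why the figures in the post are only order-of-magnitude.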
 OK, it was amusing to do these calculations, but does it mean
 anything? I think it is more like a garbage-in-garbage-out process.

 If I understand correctly, L and T are not indivisible units of
 spacetime, but rather the boundary below which we know our
 current theories do not work, because that scale requires a
 unification of general relativity and quantum mechanics.
Kind of.
 Even more bizarre is to think about the above "processing element" as
 representing a single bit: according to quantum mechanics, states of a
 system are represented by unit vectors in associated Hilbert space of
 the system and each observable is represented by a linear operator
 acting on the state space. Even if we have only two qubits they are
 not described by two complex numbers, but rather by 4:

 a_0|00> + a_1|01> + a_2|10> + a_3|11>

 and thus the description of the state of a quantum register requires
 an exponential number of complex numbers a_i, and each operation
 simultaneously processes all of those a_i. Since we cannot measure
 them all, it is an open question whether quantum computations
 provide an exponential speedup; all we know right now is that
 they give at least a quadratic one.
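The exponential bookkeeping is easy to see in code. A minimal sketch (plain Python, no quantum library), storing a 2-qubit register as 2^2 = 4 complex amplitudes and applying one Hadamard gate, which touches every amplitude:

```python
import math

# A 2-qubit state |psi> = a_0|00> + a_1|01> + a_2|10> + a_3|11>,
# stored as 2^2 = 4 complex amplitudes. An n-qubit register needs 2^n.
state = [1.0 + 0j, 0j, 0j, 0j]   # the state |00>

def hadamard_on_qubit0(amps):
    """Apply H to the first qubit: one gate updates every amplitude,
    pairing indices that differ only in that qubit."""
    h = 1 / math.sqrt(2)
    out = list(amps)
    for i in (0, 1):                  # i indexes the second qubit
        a, b = amps[i], amps[i + 2]   # pair differing in qubit 0
        out[i], out[i + 2] = h * (a + b), h * (a - b)
    return out

state = hadamard_on_qubit0(state)
print(state)   # |00> and |10> now each have amplitude 1/sqrt(2)
```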
For simple search problems, where there are no shortcuts and you really
can only do brute force, quadratic is the best you can do.
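The quadratic speedup is Grover's algorithm; a sketch of the query-count arithmetic, using the standard (pi/4)*sqrt(N) optimal count for unstructured search over N items:

```python
import math

def grover_queries(n_bits):
    """Oracle queries Grover's algorithm needs to search
    N = 2**n_bits items: about (pi/4) * sqrt(N)."""
    N = 2 ** n_bits
    return math.ceil((math.pi / 4) * math.sqrt(N))

# Brute-forcing a 128-bit key classically takes ~2^128 trials;
# Grover needs on the order of 2^64 quantum oracle calls.
print(math.log2(grover_queries(128)))   # just under 64
```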
 By the way, if you have a computer with the processing power larger
 than that of all the cryptanalysts in the world, it makes sense to use
 your computer to "think" to find a better attack than a brute-force
 search :)
Of course. But the mice are already running that computation.
 > One place this shows up, as an example: It turns out, given a volume
 > of space, the configuration with the maximum entropy for that volume
 > is exactly a black hole with that volume

 [OT] This is a strange thesis because all black hole solutions in
 general relativity can be completely characterized by only three
 externally observable parameters: mass, electric charge, and angular
 momentum (the no-hair theorem), and thus in some sense they have zero
 entropy (it is not necessarily a contradiction with anything, because
 it takes an infinite time for matter to reach the event horizon).
You're looking at this backwards. Suppose I want to store a large
amount of data in a given volume of space. To be specific, I have a
unit diameter sphere in which to build my memory device. Initially, I
fill it with magnetic domains, and see how many distinct bits I can
store. That gets me some huge number. I replace that with
racetrack-style memory, with information on the edges of the domains.
Then with single electrons, with bits stored in multiple quantum states.
Can I keep going indefinitely? The answer is no: There is in fact a
limit, and it turns out to be the number of bits equivalent to the
entropy of a black hole filling that unit-diameter sphere. The entropy
isn't given by the number of measurements I can make on my black
hole -- it's given by the number of possible precursor states that
can produce the given black hole. In fact, if you try to shove that
much information into your initial volume, you will inevitably
produce a black hole -- not a good idea if what you want is a storage
device, since the data will be there in some sense, but will be
inaccessible!
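Putting a number on the thought experiment: a sketch assuming the black-hole (holographic) entropy of the unit-diameter sphere, A/4 in Planck areas, converted to bits:

```python
import math

L_p = 1.616e-35           # Planck length, m
d = 1.0                   # sphere diameter, m
A = math.pi * d**2        # surface area of the sphere, m^2
nats = A / (4 * L_p**2)   # black-hole entropy in natural units
bits = nats / math.log(2) # convert nats to bits
print(f"{bits:.1e}")      # ~4.3e69 bits for a 1 m diameter sphere
```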
 > and its entropy turns out to be the area of the black hole, in units
 > of square Planck lengths.

 [OT] Although Hawking showed that the total area of the event horizons
 of a collection of black holes can never decrease, which is *similar*
 to the second law of thermodynamics (with entropy replaced by area),
 it is not clear that it allows one to conclude that area *is* entropy.
The actual computation requires using either string theory or loop
quantum gravity or, presumably, some other theory that combines GR
and QM. The entropy is proportional to the area by very general
arguments, but to figure out the units, you have to make some more
assumptions. According to the "Black Hole Thermodynamics" article in
Wikipedia, string theory implies that the units are exactly the Planck
length squared, while loop quantum gravity makes it proportional to
the single free parameter in that theory.
No one really understands what this means in any deep sense. It
certainly doesn't say that a Planck-length x Planck-length square
is "a minimal bit" in any physical sense. That's just a suggestive
interpretation. If it were even approximately true ... that's
a 2d "object". Why can't I just stack a whole bunch of them in
an arbitrary volume?
What is important is the profound general principle that's emerged from
70 years or so of work on QM, which physicists now take for granted but
hasn't really made it into the general consciousness: The universe is a
fundamentally finite thing. The amount of information that can be
"embedded" into any finite volume of space and time is bounded. (Note
that in Newtonian mechanics, or in GR, even the value of the single
gravitational field is given by a real number at every point in space
and time, so an arbitrarily small volume of spacetime can "hold" an
arbitrary amount of information. This is not the case in QM; in fact,
it's inherently incompatible with it. See the Bekenstein Bound.) Given
this and a limit on the speed at which information can propagate, that
there are fundamental physical limits on any computation follows. "All
the rest is just detail." :)
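The Bekenstein Bound mentioned above can also be evaluated directly; a sketch, assuming its usual form S <= 2*pi*k*R*E / (hbar*c) with E = m*c^2, expressed in bits:

```python
import math

hbar = 1.055e-34   # reduced Planck constant, J*s
c = 2.998e8        # speed of light, m/s

def bekenstein_bits(radius_m, mass_kg):
    """Bekenstein bound: maximum information, in bits, in a sphere
    of the given radius enclosing energy E = m c^2."""
    E = mass_kg * c**2
    return 2 * math.pi * radius_m * E / (hbar * c * math.log(2))

# A 1 kg system inside a 1 m radius can hold at most ~2.6e43 bits.
print(f"{bekenstein_bits(1.0, 1.0):.1e}")
```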
 Jerry

The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to majordomo at metzdowd.com