[Cryptography] Post Quantum Crypto

Ron Garret ron at flownet.com
Fri Nov 13 04:09:38 EST 2015


On Nov 12, 2015, at 2:35 PM, Krisztián Pintér <pinterkr at gmail.com> wrote:

> 
> i would like to hear the opinion of someone who actually knows some
> details.
> 
> to my knowledge, most quantum experiments require many, many repeated
> attempts, simply because the detection rate is poor. the more stuff we
> need to measure, the less chance that a reading will be the actual
> result and not a blank or noise. for example, if detecting photon
> polarization has a detection rate of 1%, then a simultaneous pair
> detection will have a rate of 0.01%, and a triple detection 0.0001%.
> 
> 1, is this the case with d-wave and quantum computers in general?
> 
> 2, if so, does the detection rate / number of repeats change
> exponentially with the number of qubits, like in the photon example above?
> 
> because if these are true, the detection probability might be the main
> issue with QC. it can compute its thing in O(1), but we need O(2^n)
> measurements, which is no improvement over classical.
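
To put the quoted arithmetic in concrete terms, here is a minimal Python
sketch (my own illustration, not from either post): it assumes independent
detectors that each fire with probability p, so an n-fold coincidence
succeeds with probability p**n and needs on the order of 1/p**n repeated
attempts. The 1% rate is the one from the example above; the function name
and everything else is illustrative.

    def expected_attempts(p: float, n: int) -> float:
        """Expected repetitions before one n-fold coincidence at per-detector rate p."""
        return 1.0 / p ** n

    p = 0.01  # the 1% single-detection rate from the example above
    for n in (1, 2, 3):
        # prints: 1-fold rate 1.0000% (~100 attempts), 2-fold 0.0100%
        # (~10,000 attempts), 3-fold 0.0001% (~1,000,000 attempts)
        print(f"{n}-fold: rate {p ** n:.4%}, ~{expected_attempts(p, n):,.0f} attempts")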

Disclaimer: I am not an expert.  All this is AFAIK/CT:

The number of measurements required is not the limiting factor.  The limiting factor is the implementation of quantum error correction (QEC), and *combining* that with quantum gates.  Both quantum gates and QEC have been demonstrated in the lab, but combining them is really hard, because QEC works by encoding a single qubit across its own set of mutually entangled particles.  The more you split it up, the more robust your qubit becomes, but the harder it becomes to use that qubit to control a Hadamard gate.  It’s far from clear whether the “sweet spot” of qubit robustness and Hadamard implementability can actually be achieved, and D-Wave sheds no light on this, because they aren’t implementing Hadamard gates.
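
As a rough illustration of the redundancy trade-off described above, here
is a classical analogue in Python (a sketch of the repetition-code idea
only, not real QEC; actual quantum codes must also correct phase errors,
and the 5% error rate is an assumption of mine): spreading one bit across
n noisy copies and majority-voting suppresses the logical error rate, and
the suppression improves as n grows, as long as the per-copy error rate
stays below 1/2.

    from math import comb

    def logical_error_rate(p: float, n: int) -> float:
        """Probability that a majority of n copies flip, given per-copy flip rate p."""
        return sum(comb(n, k) * p**k * (1 - p)**(n - k)
                   for k in range(n // 2 + 1, n + 1))

    p = 0.05  # assumed per-copy error rate, purely illustrative
    for n in (1, 3, 5, 7):
        # error rate drops from 0.05 at n=1 to about 0.00725 at n=3, and keeps falling
        print(f"n={n}: logical error rate {logical_error_rate(p, n):.6f}")

The catch is exactly the one above: a logical gate now has to act
coherently on all n physical carriers at once, which is what makes
combining QEC with Hadamard (and other) gates so hard.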

rg


