Friday, June 21, 2013

On The Computational Power of Linear Optics II

"Classical Computers Very Likely Can Not Efficiently Simulate Multimode Linear Optical Interferometers with Arbitrary Fock-State Inputs-An Elementary Argument," Version 3.0.

With thanks to Scott Aaronson and a referee for suggestions for improvement, we have expanded our argument, originally based on the exponential blowup of the Hilbert space for a bosonic, Fock-state interferometer, to analyze the similar exponential blowup for a fermionic, Fock-state interferometer. We then examine both interferometers using an input-output formalism that allows us to treat each as a boundary-value problem. In the limit where the particle coherence length is much larger than the interferometer, we argue, sticking very close to the physics, that in order to access these exponentially large Hilbert spaces the particles must arrive at the beamsplitters in such a way that they are indistinguishable there, and hence their multi-particle wavefunctions must be properly symmetrized everywhere in space.
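For a sense of scale (this is just the standard mode-counting, not a result particular to the paper): for $n$ particles distributed over $m$ modes the state-space dimensions are

\[
\dim \mathcal{H}_{\mathrm{bosons}} = \binom{n+m-1}{n}, \qquad
\dim \mathcal{H}_{\mathrm{fermions}} = \binom{m}{n},
\]

both of which grow exponentially in $n$ once the number of modes scales up with the number of particles.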

In this model, then, to compute an arbitrary output state (our goal) we must simply prepare an arbitrary input state and then use the efficient transfer-matrix formalism to transform the input into the output. Therein lies the rub. For fermions there is an efficient protocol for constructing an arbitrary input state, but for bosons there is not. In other words, for a large interferometer we cannot even set up the initial conditions in the general case, much less compute the corresponding output.
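For readers who want to see the "efficient transfer" written out explicitly (this is standard quantum optics, stated here in my own notation rather than quoted from the paper): an $m$-mode linear interferometer described by an $m \times m$ unitary $U$ maps the creation operators as

\[
\hat{b}_j^{\dagger} \;=\; \sum_{k=1}^{m} U_{jk}\, \hat{a}_k^{\dagger},
\]

so pushing a given input state through the device involves only the polynomially sized matrix $U$. The hard part, as argued above, is preparing the properly symmetrized bosonic input state in the first place.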

For fermions the appropriate antisymmetrization of the total input wavefunction (the product of the spatial and spin wavefunctions) may be carried out efficiently (in polynomial time) by constructing and evaluating the Slater determinant of the basis functions. For bosons the equivalent procedure would be to symmetrize the total input wavefunction by using the 'Slater' permanent. (Slater never wrote down any such thing.) For bosons the most efficient known algorithm for computing the permanent is Ryser's algorithm, discovered 50 years ago, which scales exponentially in the size of the matrix.
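To make the contrast concrete, here is a minimal Python sketch (my own illustration, not code from the paper) of the two symmetrization costs: the determinant of an n-by-n matrix comes out in polynomial time, while Ryser's inclusion-exclusion formula for the permanent still takes on the order of 2^n * n^2 steps in this straightforward version.

```python
import itertools
import numpy as np

def permanent_ryser(A):
    """Permanent of an n x n matrix via Ryser's inclusion-exclusion formula.
    Runs in O(2^n * n^2) time -- exponential in n."""
    n = A.shape[0]
    total = 0.0
    for r in range(1, n + 1):
        for cols in itertools.combinations(range(n), r):
            row_sums = A[:, list(cols)].sum(axis=1)  # sum_{j in S} a_ij for each row i
            total += (-1) ** r * np.prod(row_sums)   # (-1)^{|S|} * prod_i (row sum)
    return (-1) ** n * total

if __name__ == "__main__":
    A = np.random.rand(10, 10)
    print("determinant (fermions, polynomial time):", np.linalg.det(A))
    print("permanent   (bosons, exponential time): ", permanent_ryser(A))
```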

Thus we conclude our elementary argument that linear optical interferometers likely cannot be efficiently simulated on a classical computer. We do this using simple arguments from quantum optics and undergraduate quantum physics, and without resorting to complexity-theoretic notions such as the 'collapse of the polynomial hierarchy.' The small amount of quantum computational complexity that remains in our argument can now be discussed in language that can be understood by the ordinary quantum physicist on the street (or in the gutter).

An amusing and informative lecture on these results, given this past week at the Quantum Information and Measurement Conference in Rochester, NY, may be found here [PDF, PPT].

Friday, June 14, 2013

Entangled-Photon Holes!

Entangled-photon-hole-boosted quantum teleportation of Schrödinger-cat states

We study the teleportation of non-Gaussian, non-classical Schrödinger-cat states of light using two-mode squeezed vacuum light embedded with Franson's entangled photon holes. We discuss two figures of merit: (a) the fidelity of teleportation and (b) the maximum negativity of the Wigner function at the output. We show that the addition of entangled photon holes to two-mode squeezed vacuum light lowers the requirements on the amount of squeezing necessary to achieve any given fidelity of teleportation, or to achieve negative values of the Wigner function at the output.
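As an aside for readers who want to play with the second figure of merit: a few lines of QuTiP (my own toy example, not the calculation in the paper) display the negativity of the Wigner function of an even cat state, which is the non-classical feature the teleportation channel has to preserve at the output.

```python
import numpy as np
from qutip import coherent, wigner

N = 30        # Fock-space truncation
alpha = 2.0   # coherent amplitude of the cat state

# Even Schrödinger-cat state |alpha> + |-alpha>, normalized
cat = (coherent(N, alpha) + coherent(N, -alpha)).unit()

# Wigner function on a phase-space grid; negative values signal non-classicality
xvec = np.linspace(-5, 5, 200)
W = wigner(cat, xvec, xvec)

print("most negative value of the Wigner function:", W.min())
```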
arXiv:1306.3168

Sunday, June 2, 2013

What IS a Quantum Computer?

Having a delightful time in Beijing visiting the Computational Science Research Center, which last week hosted the Fifth International Workshop on Quantum Optics and New Materials — this time to celebrate the 60th birthday of my old friend and collaborator, M. Suhail Zubairy.

On the last day, Wednesday, I sat at lunch at the vegetarian table with Zubairy, Barry Sanders, Selim Shahriar, Wolfgang Schleich, Girish Agarwal, and Jörg Evers. The discussion turned to the computational complexity of linear optical interferometers (the topic of my talk that morning), then more broadly to D-Wave's machine, and then somebody (it might have been me, but I had eaten so many vegetables, who can tell?) asked each person in turn to define a quantum computer.

To us quantum opticians the fact that there was no agreement at all came as a bit of a surprise. Shahriar held out for a universal machine capable of running Shor's algorithm. I pointed out that, as per a 1998 article by Seth Lloyd, coherence and strong projective measurements are necessary and sufficient to run Grover's algorithm with a quadratic speedup — shouldn't such a machine be called a type of quantum computer?

Shahriar rejected that as it was not universal. Then came the D-Wave discussion. If the D-Wave machine can do something useful, also exploiting quantum coherence, should it not also be classified as a type of quantum computer? Sanders seemed to side against this. I was more positive.

And so the discussion continued as the vegetables arrived in a never-ending sea of serving platters. In the end Zubairy thought it quite surprising that we could not all agree on the definition of a quantum computer. I rejoined that people still argue over what was the first classical computer. The Babbage difference engine? The ENIAC? The Atanasoff–Berry Computer? How about that ancient Greek computer, the Antikythera Mechanism? If historians of computer science can't even agree on what is a classical computer, what hope do we have of agreeing now on what is a quantum computer?

My own definition? Any computer that exploits some of the weirder features of quantum mechanics to attain a computational advantage over the best classical machine, no matter how slight, should be at least entertained as a type of quantum computer.