For years I have puzzled over what some have called the
‘tension’ between non-relativistic quantum theory and non-quantum relativity theory. Standard quantum
information theory is of the non-relativistic variety based on ordinary quantum
mechanics without any appeal to, say, relativistic quantum field theory. In run
of the mill quantum information theory there is nary a *c*, the speed of light, in sight. The term ‘relativity’ shows up only four times in the Nielsen and Chuang book, *Quantum Computation and Quantum Information* (a book we all call ‘Mike and Ike’ after the first names of the authors). In all four of these places it appears only as a foil against which quantum information theory is proffered. The authors point out that, for example, you cannot use quantum teleportation to send signals faster than the speed of light — but why not?
I would naively expect non-relativistic quantum
theory to make predictions that outright conflicted with the predictions of
non-quantum relativity theory. Non-relativistic Newtonian physics, for example, makes just such
conflicting (and wrong) predictions: it holds that an object’s mass is conserved,
that said object may always be accelerated up to arbitrarily high velocities,
and that information-bearing
signals may travel faster than light. This directly contradicts the confirmed
predictions of special relativity but nobody cares because nobody expects
non-relativistic Newtonian mechanics to be consistent with relativity.

And yet, over time, we have come to expect — in many venues
— non-relativistic quantum information theory (and the non-relativistic quantum
theory upon which it is based) to be consistent with non-quantum relativity. One example is
the caveat in Mike and Ike that quantum teleportation cannot be used to send
signals faster than light. The more accurate statement is that quantum
teleportation cannot be used to send signals instantaneously. According to the
non-relativistic version of quantum teleportation, in some sense the state to
be teleported is transferred instantaneously, but it can only be extracted with
help from a luminal-speed communication from Alice to Bob through some
classical channel.
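The role of the classical channel is easy to see in a toy simulation. Here is a minimal sketch (my own, in NumPy, following the standard textbook protocol; the variable names are mine) of one round of teleportation: Bob recovers the input state only after applying a correction indexed by Alice’s two classical measurement bits, and those bits can travel no faster than light.

```python
import numpy as np

zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def bell(m1, m2):
    """Bell state indexed by Alice's two classical measurement bits."""
    e = [zero, one]
    return (np.kron(e[0], e[m2]) + (-1) ** m1 * np.kron(e[1], e[1 - m2])) / np.sqrt(2)

def teleport(psi, rng):
    """One run of teleportation; returns Bob's state after correction."""
    epr = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)
    state = np.kron(psi, epr).reshape(4, 2)  # rows: Alice's pair, cols: Bob
    outcomes = [(m1, m2) for m1 in (0, 1) for m2 in (0, 1)]
    branches = [bell(m1, m2).conj() @ state for m1, m2 in outcomes]
    probs = np.array([np.vdot(b, b).real for b in branches])
    k = rng.choice(4, p=probs / probs.sum())
    m1, m2 = outcomes[k]
    # The correction Z^m1 X^m2 needs the two classical bits (m1, m2);
    # without them Bob's average state is I/2, carrying no signal.
    correction = np.linalg.matrix_power(Z, m1) @ np.linalg.matrix_power(X, m2)
    return correction @ (branches[k] / np.sqrt(probs[k]))

rng = np.random.default_rng(0)
psi = np.array([0.6, 0.8j])          # an arbitrary normalized input state
out = teleport(psi, rng)
print(abs(np.vdot(psi, out)) ** 2)   # fidelity ≈ 1.0: state recovered
```

Averaging Bob’s *uncorrected* state over Alice’s four equally likely outcomes gives the maximally mixed state, which is exactly why the entangled pair by itself carries no superluminal signal.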

Another example of this ‘tension’ comes to us from Nick
Herbert’s publication years ago on a superluminal-signaling scheme he called
FLASH. This scheme is discussed in my book, *Schrödinger’s Killer App*, in David Kaiser’s book, *How the Hippies Saved Physics*, and in an arXiv posting by Asher Peres, “How the No-Cloning Theorem Got Its Name.” The FLASH scheme, like teleportation, required two parties, Alice and Bob, to have set up shared entangled states, but, unlike teleportation, also a noiseless state-amplification machine. The communication scheme was not only superluminal; it was instantaneous. Many physicists, including Peres and Glauber, knew the scheme had to be wrong, and the downfall of FLASH came with the invention of the no-cloning theorem and its closely related cousin, the no-noiseless-amplification theorem. Cloning and noiseless-amplification devices would violate the unitary and linear nature of quantum theory and so cannot exist. Everybody heaved a sigh of relief, but I was puzzled. Why were Glauber and Peres so sure that non-relativistic quantum theory should not contradict non-quantum relativity theory? If it had, I would have said this is of no more concern than noting that Newtonian mechanics contradicts relativity. One would not expect any non-relativistic theory to be consistent with a relativistic one and would in fact expect them to make inconsistent predictions.
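The linearity argument can be made concrete in a few lines. The sketch below (mine, not drawn from any of the books cited) applies a hypothetical perfect cloner to a superposition and, separately, to its components; the two outputs disagree, so no linear (and hence no unitary) evolution can implement cloning.

```python
import numpy as np

zero = np.array([1.0, 0.0])
one = np.array([0.0, 1.0])

def clone(psi):
    # Hypothetical perfect cloner: |psi> -> |psi> ⊗ |psi>.
    return np.kron(psi, psi)

a = b = 1 / np.sqrt(2)
psi = a * zero + b * one

cloned = clone(psi)                        # (|00> + |01> + |10> + |11>) / 2
linear = a * clone(zero) + b * clone(one)  # (|00> + |11>) / sqrt(2)

# If the cloner were linear these would agree; they do not, so no unitary
# (necessarily linear) evolution can clone an unknown state.
print(np.allclose(cloned, linear))         # False
```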
Another example along these lines is Bohr’s resolution
of Einstein’s photon-in-a-box paradox from the Sixth Solvay Conference in 1930.
Einstein cooked up a thought experiment about weighing a box before and after a
photon was allowed to escape from it. He showed that this scheme
apparently violated the Heisenberg energy-time uncertainty relationship. Bohr
resolved the paradox and saved the uncertainty principle by invoking the
gravitational red shift, a general relativistic effect. Why on earth should
general relativity have anything to do with the non-relativistic Heisenberg
uncertainty principle? And yet the consistency of the latter requires the former.
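Bohr’s counterargument can be summarized in the standard textbook form (a reconstruction, not a quotation from Bohr): weighing the box to a mass precision Δm over a time T leaves the pointer position uncertain by Δx, and the gravitational redshift then smears the clock reading by just enough to restore the uncertainty relation.

```latex
% Weighing: over time T the gravitational impulse must exceed the momentum
% uncertainty forced by the pointer-position uncertainty \Delta x:
T g \, \Delta m \gtrsim \Delta p \gtrsim \frac{\hbar}{\Delta x}
\quad \Longrightarrow \quad
\Delta E = c^{2} \Delta m \gtrsim \frac{\hbar c^{2}}{g T \Delta x} .
% Redshift: a clock whose height is uncertain by \Delta x accumulates a
% timing uncertainty
\Delta T = \frac{g \Delta x}{c^{2}} \, T
\quad \Longrightarrow \quad
\Delta E \, \Delta T \gtrsim \hbar .
```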

A final example, which I lift from Scott Aaronson’s book, *Quantum Computing Since Democritus*, involves black holes. Here the idea is to resolve the black hole information paradox. If you throw information-bearing quantum states into a black hole and they disappear forever, then this would violate unitarity. One proposed resolution is that, near the event horizon, the quantum state is somehow duplicated — in apparent violation of the no-cloning theorem — and one copy vanishes into the singularity while the other is re-emitted as Hawking radiation. (This is a concept close to a resolution proposed by Chris Adami and Greg Ver Steeg in their 2004 arXiv posting, “Classical Information Capacity of Quantum Black Holes.”)
To
violate the no-cloning theorem you could grab the copy that comes out and then
rocket into the black hole and grab the other copy and thus have two copies of
the unknown quantum state. However, if you try to do this, apparently it takes
so long for the first copy to be emitted that by the time you grab on to it the
second copy has already been gobbled up by the singularity, and the no-cloning theorem
is saved. Why on earth should the non-relativistic version of the no-cloning
theorem be consistent with the relativistic theory of black holes? To quote Aaronson,
“So it’s funny that there are these little things that seem like they might
cause a conflict with quantum mechanics … but when you really examine them, it
no longer seems like they do.”

It’s not funny — it’s downright bizarre.

What this tension between non-relativistic quantum theory
and non-quantum relativity theory suggests to me is that there is some
ur-theory, likely a phenomenological one, which unifies non-relativistic
quantum theory and non-quantum relativity theory. Now I know what you are all going
to say, “Sure — it's a quantum theory of gravity!” Indeed, if we had that, I
expect all this tension would go away. But for quantum theories of gravity —
string theory and loop quantum gravity — the tension between relativity and
quantum mechanics is down at the Planck length or up at the Planck energy. In
the examples I discuss above, this tension is at ordinary length and energy
scales. I don’t need physics at the Planck scale to talk about superluminal
signaling, photons in boxes, or black hole information paradoxes.

Hence I suggest that there is some intermediate unified
theory between quantum gravity and what we have now and that this theory in certain limits produces non-relativistic quantum theory and non-quantum relativity theory. The best lame analogy I
can come up with is the unification of electricity and magnetism within Maxwell’s
equations, which are phenomenological equations derived from close observations
of experiments. We know now that the electromagnetic field must be quantized — à la quantum electrodynamics — when wavelengths are short and energies large and
photons are needed. But the unquantized Maxwell equations served us quite well
for 100 years before we hit that point. In this lame analogy, electricity and
magnetism are the analog of non-relativistic quantum theory and non-quantum
relativity theory; Maxwell’s equations are the analog of the unifying
(but yet unknown) ur-theory, and quantum electrodynamics is the analog of a full theory of quantum gravity.

So how to find this phenomenological ur-theory that unifies
non-relativistic quantum theory with non-quantum relativity theory? Continue to
explore these tensions between the two, both in theory and in experiment. Gisin
and his group have done experiments measuring the speed of the collapse of the
wave function of entangled photon pairs over large distances with detectors in different moving frames. Work along these
lines should be encouraged and not disparaged.

(THU 26 SEP 2013)

Just found this comment while reading the book by Haroche and Raimond:

"The consistency between the EPR correlations and relativity is by itself also strange, in some way. Quantum laws do ultimately satisfy relativity in their field theory version, but the measurement description we have implicitly invoked to compute the EPR correlations is non-relativistic. If it had violated causality, we could have invoked the necessity to use a relativistic version of measurement theory, dealing with the proper time of detection events. We do not have to do this. *Non-relativistic quantum physics is non-local in a way subtle enough not to contradict the inherently relativistic causality principle.*" (Italics mine.)

Haroche, Serge; Raimond, Jean-Michel (2006-08-10). Exploring the Quantum: Atoms, Cavities, and Photons (Oxford Graduate Texts) (Page 65). OUP Oxford. Kindle Edition.

There is another example.

Often, it is said that quantum operations (maps on density matrices) should be linear. One can justify this operationally from convexity as is done in Section 6.5 (page 11) of Hardy's http://arxiv.org/abs/quant-ph/0101012 . That is, Alice could prepare an unknown quantum state as a mixture of two individual states. A map then gets applied to the state and one then observes probabilities for measurement outcomes. They could do this many times and Alice could then reveal later which state she chose at which run of the experiment and Bob could check for consistency of the data (the data should in fact be consistent). For this reason, maps acting on quantum states should be convex linear and one can extrapolate linearity from there (as done in the appendix of Hardy's paper).

However, linearity is argued by appealing to instantaneous signaling as well. That is, if a map were nonlinear, then Alice and Bob would be able to signal instantaneously (and thus superluminally as well). The idea is similar to Herbert's FLASH idea and discussed on page 21 of Wolf's notes:

http://www-m5.ma.tum.de/foswiki/pub/M5/Allgemeines/MichaelWolf/QChannelLecture.pdf

Alice and Bob share an entangled state rho_AB before communication begins. Bob has access to a nonlinear map N, i.e., one for which

N(sigma) is not equal to sum_x p(x) N(sigma_x)

where sigma = sum_x p(x) sigma_x

If Alice were to do nothing, then Bob could apply the nonlinear map N and Bob's local state would be described as

N(rho_B)

just by taking the partial trace of rho_AB and then applying the map N. However, Alice could perform a measurement on her side first, "collapsing" the state to an ensemble

{p(x), rho^x_AB}

If Bob then applies the nonlinear map N, then we would describe the density operator on his end as

sum_x p(x) N(rho^x_B)

Since the map is nonlinear, it follows that these two cases have some distinguishability better than random guessing, so that Alice can signal to Bob instantaneously if they share many copies of the initial entangled state and do this protocol many times.
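As a concrete toy version of this argument (my own construction, with a deliberately unphysical map N; it is not the specific example in Wolf's notes), the sketch below compares Bob's description when Alice does nothing with the branch-by-branch ensemble average when she measures. For any linear map the two descriptions coincide; for this nonlinear N they differ, which is exactly the instantaneous signal.

```python
import numpy as np

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])
proj = lambda v: np.outer(v, v.conj())

def N(rho):
    # Deliberately unphysical *nonlinear* map: snap the state to |0><0| or
    # |1><1|, whichever population is larger (ties go to |0><0|).
    return proj(ket0) if rho[0, 0].real >= 0.5 else proj(ket1)

# Bob's half of the shared Bell state when Alice does nothing: rho_B = I/2.
no_measurement = N(np.eye(2) / 2)                        # = |0><0|

# If Alice measures {|0>,|1>}, Bob holds the ensemble {|0><0|, |1><1|},
# each with p = 1/2; apply N branch by branch and average.
after_measurement = (N(proj(ket0)) + N(proj(ket1))) / 2  # = I/2

# Both scenarios start from the same average input I/2, so a linear map
# could never tell them apart; the nonlinear N does, and Alice signals.
print(np.allclose(no_measurement, after_measurement))    # False
```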

So this is often invoked to justify that maps acting on quantum states should be linear. However, both reasons are good in my opinion. The first follows from operational considerations while the second is part of "evidence gathering" that maps should be linear (by evidence gathering, I mean consistency with other areas of physics and this only adds to the strength of the argument that maps should be linear).

So, I always vaguely thought about it like this: if you are willing to accept the postulates of non-relativistic QM, then it is clear that it is a non-signalling theory, i.e. the probability of any measurement outcome of Alice cannot depend on Bob's choice of measurement. Therefore, it must admit a finite velocity of information propagation.

And relativity can be obtained entirely from (a) the principle of relativity and (b) finiteness of speed of information propagation. So, isn't the ur-theory you're looking for simply QM+principle of relativity? Of course, I've never seen this done rigorously. Or am I missing something conceptually, or misinterpreting your question entirely?

Siddarth: I have never gone through this carefully, but I believe if we simply take the axioms of non-relativistic quantum theory (NRQT) and add to those the axioms of non-quantum relativity theory (NQRT), we arrive at a theory that is internally inconsistent. For example, states in NRQT are invariant under Galilean transformations but not Lorentz transformations.

Also, I recently realized that while the states in NRQT are non-signalling, you can still do instantaneous signalling using a long chain of local interactions. Any perturbation at one end will instantaneously result in an exponentially small (due to Lieb-Robinson) change in the local density matrix at the other end. But this change can be used to transmit a signal.
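This claim is easy to check by exact diagonalization on a toy chain. The sketch below (my own; the chain length, couplings, and evolution time are arbitrary choices) kicks one end of a six-site transverse-field Ising chain and shows that the far end's reduced density matrix changes immediately, but only by a tiny Lieb-Robinson-suppressed amount.

```python
import numpy as np
from scipy.linalg import expm

n = 6  # chain length; Hilbert space dimension 2^6 = 64
I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])

def op(single, site):
    """Embed a single-qubit operator at `site` in the n-qubit chain."""
    mats = [single if k == site else I2 for k in range(n)]
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

# Transverse-field Ising chain with only nearest-neighbour couplings.
H = (-sum(op(Z, k) @ op(Z, k + 1) for k in range(n - 1))
     - sum(op(X, k) for k in range(n)))

psi0 = np.zeros(2 ** n)
psi0[0] = 1.0                   # all-spins-up product state
kicked = op(X, 0) @ psi0        # local perturbation at site 0

t = 0.1                         # short time, well inside the light cone
U = expm(-1j * H * t)

def rho_last(psi):
    """Reduced density matrix of the last site (trace out the rest)."""
    amp = psi.reshape(2 ** (n - 1), 2)
    return amp.T @ amp.conj()

diff = np.linalg.norm(rho_last(U @ psi0) - rho_last(U @ kicked))
print(diff)  # nonzero but tiny: an instantaneous, exponentially small influence
```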

But in NQRT, pushing a long rod at one end doesn't cause an instantaneous movement at the other end.