A new entry in the best title ever contest, arXiv:0907.4152:
Born Again
Authors: Don N. Page
Abstract: A simple proof is given that the probabilities of observations in a large universe are not given directly by Born's rule as the expectation values of projection operators in a global quantum state of the entire universe. An alternative procedure is proposed for constructing an averaged density matrix for a random small region of the universe and then calculating observational probabilities indirectly by Born's rule as conditional probabilities, conditioned upon the existence of an observation.
WWJD? Not quantum Born's rule, apparently.
This needs to be taken in the context of his earlier paper, "The Born Rule Dies", which could also be a candidate for best paper title. In fact, the two work better as a pair.
Despite the cute titles, I must now become a nattering nabob of negativity, because I think that these papers are fairly misleading. Firstly, we already know that the Born rule <\psi|P|\psi>, as Page states it, has to be replaced by the "generalized Born rule" Tr(E \rho), where E is a POVM element instead of a projector and \rho is a density matrix instead of a state vector. This can happen in quite mundane situations in the lab and has nothing to do with cosmology. All Don Page is saying is that, in a multiverse scenario, if you are not exactly sure which universe you are in, then you have to mix the projection operators corresponding to different universes according to some probability distribution. Unsurprisingly, this gives you a POVM rather than a PVM, which is then supposed to be some sort of big crisis for the formalism of QM. It isn't. If it were, then all quantum information theorists ought to be going around proclaiming the death of the Born rule in dramatic-sounding papers as well.
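To make that concrete, here is a minimal NumPy toy of my own (a sketch, not Page's construction): averaging projectors over "which universe am I in?" uncertainty yields an operator E that is a perfectly good POVM element but generally not a projector, and Tr(E rho) is just the weighted average of ordinary Born probabilities.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_projector(dim):
    """Rank-1 projector onto a random pure state."""
    psi = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    psi /= np.linalg.norm(psi)
    return np.outer(psi, psi.conj())

dim = 4
p = np.array([0.3, 0.7])                   # chance of being in universe 0 or universe 1
P = [random_projector(dim) for _ in p]     # the relevant projector in each universe
E = sum(pi * Pi for pi, Pi in zip(p, P))   # the averaged operator: a POVM element

rho = random_projector(dim)                # some (here pure) global state, as a density matrix

print(np.allclose(E @ E, E))               # False: E is not a projector...
print(np.allclose(np.trace(E @ rho),       # ...but Tr(E rho) is just the weighted
      sum(pi * np.trace(Pi @ rho) for pi, Pi in zip(p, P))))   # average of Born probabilities: True
```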
On the other hand, if you like, you can instead put all the uncertainty into the density operator, by taking a mixture of all the different permutations of the universes in the state vector. Then you can calculate with projectors as usual. In fact, this solution seems preferable to me, because you can trace out all the extraneous unobservable components of the multiverse and just work with a density operator for the local universe. However, for some rather vaguely specified reason, Don Page thinks that it would not be more elegant to do things this way. Perhaps it is because Page is some sort of Everettian, and so he is wedded to the idea of a pure state vector of the entire multiverse that is evolving unitarily. Well, in any case, both ways of calculating things are equivalent, and neither of them is a radical change to any part of the quantum formalism.
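Here is the same toy redone the second way (again my own two-qubit caricature, with the two qubits standing in for two "universes" and SWAP permuting them): mix the state instead of the projectors, check that the probability comes out the same, and then trace out the unobservable slot to get a local density operator.

```python
import numpy as np

rng = np.random.default_rng(1)
psi = rng.normal(size=4) + 1j * rng.normal(size=4)
psi /= np.linalg.norm(psi)
rho = np.outer(psi, psi.conj())                 # pure "multiverse" state of two universes

SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]], dtype=complex)  # permutes the two universes
P0 = np.array([[1, 0], [0, 0]], dtype=complex)  # the projector measured in "my" universe...
P = np.kron(P0, np.eye(2))                      # ...which I take to be slot 0
p = [0.4, 0.6]                                  # chance that my universe is slot 0 / slot 1

# Version 1: mix the projectors into a POVM element, keep the pure state.
prob_povm = p[0] * np.trace(P @ rho) + p[1] * np.trace(SWAP @ P @ SWAP @ rho)

# Version 2: keep the single projector, mix the state over permutations.
rho_mix = p[0] * rho + p[1] * SWAP @ rho @ SWAP
prob_mix = np.trace(P @ rho_mix)
print(np.allclose(prob_povm, prob_mix))         # True: the two accountings agree

# Since P acts only on slot 0, the unobservable slot can be traced out entirely:
rho_local = np.trace(rho_mix.reshape(2, 2, 2, 2), axis1=1, axis2=3)
print(np.allclose(prob_mix, np.trace(P0 @ rho_local)))   # True: a local density operator suffices
```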
The rest of the paper is just a discussion of what the probability weights should be for the different regions. This is really just the standard measure question that multiverse cosmologists are grappling with. In fact, it really has nothing to do with QM, because the types of observations that cosmologists are interested in, e.g. the temperature and distribution of the CMB, have outcomes that almost certainly correspond to branches of the state-vector that have already strongly decohered. Thus, the whole discussion could just be done with classical probability, which is what everyone except Don Page actually does. Bringing QM into the picture just muddies the waters.
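A few lines of code make the decoherence point explicit (a sketch of my own, not anything from the paper): once the density matrix is diagonal in the relevant pointer basis, the Born rule just reads off classical branch weights, so the measure-weighting can be done with ordinary probability theory.

```python
import numpy as np

rng = np.random.default_rng(2)
w = rng.random(5)
w /= w.sum()                       # a classical measure over five decohered branches
rho = np.diag(w)                   # density matrix diagonal in the pointer basis

for k in range(5):
    P_k = np.zeros((5, 5))
    P_k[k, k] = 1.0                # projector onto branch k
    assert np.isclose(np.trace(P_k @ rho), w[k])   # Born probability = classical weight
print("Born rule reduces to classical probability on decohered branches")
```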
Still, if there is one positive thing to be said about all this confusion it is that it nicely illustrates some of the more subtle points raised in this controversial Bayesian manifesto, particularly the idea that the assignment of operations to measurement devices is subjective.
We really don't know what Jesus would do, but we know what Don Page would do. Thus I think WWDNPD? is more appropriate.
I'm not sure what it says about me when, after seeing the line "Born Again" in the blog entry, my very first thought was "well of course, this will be about the Born rule..."
Domenic: be afraid, be very very afraid :)
Hey, Matt... thanks for the nice comment! I skimmed Page's paper, but not carefully enough to extract a lot from it, so your explication was very helpful.
Nothin' else to say; just wanted to give credit where credit is due.
I too thought Matt Leifer's comment was good ... and I will expand upon it as follows.
Page's article starts off along a generic path that many articles on quantum foundations embrace: Assuming that a complete physical theory of the universe is quantum, [a generic article on quantum foundations] would argue that it should contain at least the following elements: [generic attribute list follows]
We begin by noticing that these attribute lists generically lack mathematical depth (IMHO) ... typically the items in the attribute list embody mathematical concepts that were well-known a full century ago.
Aren't there some new mathematical features we could include in the attribute list?
To help free ourselves from cultural preconceptions, let's shift the focus by asking the isomorphic question: Assuming that a computational simulation is quantum, it should contain at least the following elements: (list follows)
Modern computational simulations typically include: (1) topological structure, (2) symplectic structure, (3) metric structure, (4) thermodynamic structure, (5) informatic structure .... it is notable that both classical and quantum simulation codes typically have all these structures ... in fact the boundary between classical and quantum simulations has become indistinct (in practice if not in the literature).
A mathematical feature that is conspicuously absent from modern simulation codes is a global vector structure ... this is typically replaced by a (nonlinear) tensor network structure ... this saves enormously on computational effort, while not notably degrading the accuracy of the simulation.
Since dropping the global vector structure of Hilbert space works so well in practical computations, maybe the foundations-of-physics folks should try it too?
Thus, my main response to many foundations articles is that they regrettably cling to the least interesting---and least useful mathematically---aspect of quantum mechanics, namely its linear structure, instead of focussing on the most interesting (and most useful in practical calculations) mathematical aspects of quantum mechanics, namely, its symplectic, Riemannian, thermodynamic, and informatic structures.
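To put some rough numbers on the tensor-network point above, here is a crude matrix-product-state sketch of my own (not taken from any particular simulation code): the parameter count grows linearly in the number of spins N instead of exponentially, and for small N the global vector can still be recovered by contracting the local tensors.

```python
import numpy as np

def mps_to_vector(tensors):
    """Contract a chain of (D_left, 2, D_right) tensors into the full 2**N state vector."""
    v = tensors[0]                                  # shape (1, 2, D)
    for A in tensors[1:]:
        v = np.tensordot(v, A, axes=([-1], [0]))    # contract neighbouring bond indices
    return v.reshape(-1)                            # outer bonds are trivial (size 1)

rng = np.random.default_rng(3)
N, D = 8, 4                                         # 8 spins, bond dimension 4
dims = [1] + [D] * (N - 1) + [1]
mps = [rng.normal(size=(dims[i], 2, dims[i + 1])) for i in range(N)]

full = mps_to_vector(mps)
print(full.shape)                                   # (256,): the global vector of 2**8 amplitudes
print(sum(A.size for A in mps), "MPS parameters vs", 2**N, "amplitudes")

# For 100 spins the comparison is no longer even close:
N_big = 100
print(N_big * 2 * D * D, "MPS parameters vs roughly %.1e amplitudes" % 2.0**N_big)
```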
... to continue the above thread, this non-vector approach to the foundations-of-physics has a very practical motivation.
At next month's Kavli Conference at Cornell, Molecular Imaging 2009: Routes to Three-Dimensional Imaging of Single Molecules, I'll be presenting a tutorial course on computational algorithms for large-scale spin simulations.
Here "large-scale" means 10^2 to 10^5 interacting spins ... so we have to grapple very seriously with the dilemma that (1) the spin dynamical interactions, and the measurement/observation of these spins in microscopy, are undeniably quantum processes, and yet (2) the dimensionality of a (linear) Hilbert space is infeasibly large for practical computations.
The tutorial will resolve this dilemma along lines modeled upon Chapter 4 of Frenkel and Smit's outstanding (IMHO) textbook Understanding Molecular Simulation: From Algorithms to Applications.
Where Frenkel and Smit simulate liquid systems, we will simulate spin systems (in effect regarding spins as a kind of condensed matter).
Of course, spin systems are quantum ... we know that there are many equivalent ways to describe quantum systems ... and so we are free to pick a way that maps as closely as possible onto existing simulation formalisms. And this turns out to be a formalism that is careful to respect the symplectic, Riemannian, causal, thermodynamic, and informatic aspects of "quantum goodness" --- necessarily sacrificing the linear Hilbert structure to do so.
It's pretty straightforward to transcribe the resulting quantum mechanical formalism into the mathematical language of bio-simulation science ... and straightforward to code-up the resulting algorithms ... and pretty easy to apply the code to simulate processes like dynamic spin polarization.
The effect is to blur the boundary between classical and quantum simulation. For example, it's perfectly feasible to pull back the quantum equations of motion onto low-dimensional tensor-network manifolds ... in the resulting formalism the dynamics and measurement are fully quantum and yet the state-space is classical or semi-classical. It's highly enjoyable mathematically---and very efficient computationally---to factor the classical-quantum transition by these pullback methods!
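As a warm-up illustration of the pullback idea (my own minimal example, not the tutorial's code, and deliberately the exactly solvable single-spin case rather than an interacting tensor-network one): restrict a spin-1/2 to the Bloch sphere, a two-dimensional classical-looking phase space, and evolve there. For H = (1/2) B.sigma the pulled-back equation of motion is dS/dt = B x S, and integrating that classical ODE reproduces the unitary evolution of the spin expectation values; for interacting spins on a product-state or tensor-network manifold the same construction is an approximation rather than an identity.

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)

B = np.array([0.3, 0.0, 1.0])                       # magnetic field (units with hbar = 1)
H = 0.5 * (B[0] * sx + B[1] * sy + B[2] * sz)
psi0 = np.array([1.0, 1.0], dtype=complex) / np.sqrt(2)

def bloch(psi):
    """Expectation values of the Pauli operators: a point on the Bloch sphere."""
    return np.real([psi.conj() @ s @ psi for s in (sx, sy, sz)])

# Exact quantum evolution to time T, via the eigendecomposition of H
T = 5.0
evals, evecs = np.linalg.eigh(H)
U = evecs @ np.diag(np.exp(-1j * evals * T)) @ evecs.conj().T
S_quantum = bloch(U @ psi0)

# "Pulled-back" classical evolution on the sphere: dS/dt = B x S, integrated with RK4
def rhs(S):
    return np.cross(B, S)

steps = 500
dt = T / steps
S = bloch(psi0)
for _ in range(steps):
    k1 = rhs(S)
    k2 = rhs(S + 0.5 * dt * k1)
    k3 = rhs(S + 0.5 * dt * k2)
    k4 = rhs(S + dt * k3)
    S = S + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

print(np.allclose(S_quantum, S, atol=1e-6))         # True: same trajectory endpoint
```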
For me, the main lesson-learned has been not to underestimate the mathematical sophistication of the biological simulation community ... because these biological folks have quietly begun to match, or even exceed, the physicists in the depth and sophistication of their understanding of physical processes.
... and finally (although Dave's filter may not pass the link given), here's what the quantum physics of Chapters 2 and 8 of Nielsen and Chuang looks like after being (1) transcribed into the symplectic language of Frenkel and Smit, and (2) optimized for efficient large-scale computation.
The resulting quantum formalism fits on a single page ... and looks very much like a classical symplectic formalism ... iff we agree to speak the mathematical language of modern simulation science.
For this reason, it turns out to be very much easier to learn/teach quantum physics as an add-on to simulation science, than it is to learn simulation science as an add-on to quantum physics.
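For readers wondering what "looks very much like a classical symplectic formalism" means concretely, here is my own rough paraphrase of the standard geometric/variational (Dirac-Frenkel) formulation, not the contents of the linked notes: restrict the states to a parameterized manifold |\psi(x)> with real coordinates x^a, and the quantum dynamics pulls back to Hamilton's equations for those coordinates.

```latex
% Sketch of the pulled-back (Dirac-Frenkel / geometric) formulation, with hbar = 1.
% Sign conventions for the symplectic form vary in the literature.
\begin{align}
  h(x)        &= \langle \psi(x) | \hat{H} | \psi(x) \rangle
                 && \text{(Hamiltonian function on the manifold)} \\
  \omega_{ab} &= 2\,\mathrm{Im}\,\langle \partial_b \psi | \partial_a \psi \rangle
                 && \text{(pulled-back symplectic form)} \\
  g_{ab}      &= 2\,\mathrm{Re}\,\langle \partial_a \psi | \partial_b \psi \rangle
                 && \text{(pulled-back Riemannian metric)} \\
  \omega_{ab}\,\dot{x}^b &= \partial_a h
                 && \text{(Hamilton's equations; the exact Schr\"odinger flow when the manifold is all of Hilbert space)}
\end{align}
```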