Hoisted from the comments, Robin asks:
So, with that in mind, here's a question. What do you think about teaching quantum mechanics as noncommutative probability theory? In other words, by starting with probability theory and alluding to probabilistic mechanics (e.g., distributions on phase space), and then introducing quantum theory as a generalization of probability. This is how I think of quantum theory all the time now -- and it makes tremendous sense to me. I think it's how I want to teach it. And I'm curious what y'all think.
This is roughly how I like to introduce quantum theory, although I'm not sure that non-commutative probability theory is the right description. I mean, classical probability theory is not commutative: the order of dynamics certainly matters in both deterministic and classical probabilistic theories. Pressing the accelerator and turning the wheel is different from turning the wheel and pressing the accelerator. In one you hit grandma, and in the other you miss her. But this got me thinking: what are the proper words to describe this approach, where one begins by introducing probabilistic theories and then moves to quantum theory? Probability theory with the wrong norm?
There's a theory of "quantum random walks" that might be useful to look into here. It started out due to some questions in quantum computing, but it's of some mathematical interest in its own right.
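For a feel of what those walks do, here is a minimal numpy sketch (a generic discrete-time Hadamard walk on a line, my own toy, not taken from any particular paper): the quantum walk's spread grows linearly in the number of steps, while the classical random walk's spread grows only as the square root.

```python
import numpy as np

# Discrete-time quantum walk on a line: Hadamard coin, then a coin-conditioned
# shift. Compare its spread with the classical random-walk benchmark.
steps = 100
pos = 2 * steps + 1
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# psi[x, c]: amplitude at position x with coin state c (0 = move left, 1 = move right)
psi = np.zeros((pos, 2), dtype=complex)
psi[steps, 0] = 1.0                      # start at the origin with coin |0>

for _ in range(steps):
    psi = psi @ H.T                      # apply the Hadamard coin at every site
    shifted = np.zeros_like(psi)
    shifted[:-1, 0] = psi[1:, 0]         # coin 0 component moves one site left
    shifted[1:, 1] = psi[:-1, 1]         # coin 1 component moves one site right
    psi = shifted

prob = (abs(psi) ** 2).sum(axis=1)       # position distribution after `steps` steps
x = np.arange(pos) - steps
sigma_quantum = np.sqrt((prob * x ** 2).sum())   # grows ~ linearly with steps
sigma_classical = np.sqrt(steps)                 # classical walk grows ~ sqrt(steps)
print(sigma_quantum, sigma_classical)
```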
This gives only part of quantum theory, I would say. It misses hbar. And it misses the fact that outcomes x_1 and x_2 of a position measurement are "close" (not in Hilbert space, but in real space) if |x_1-x_2| is small; similarly, it misses that a measurement of spin in the z-direction on a spin-17 particle with outcome m_z=17 is "close" (again, not in Hilbert space) to the outcome m_z=16.
In short, you get the information part of QM, but not the physics (which is kind of important, too, I vaguely believe).
[From talking to C.A.F. I know that C.M.C. (I don't want to display other people's names in public, but I think you know whom I'm referring to!) has said similar things.]
S.
It certainly doesn't give you the physics, but does it make more or less sense to introduce the foundation first?
Isn't hbar just the universe's way of making sure everything doesn't happen at the same time?
"It certainly doesn't give you the physics, but does it make more or less sense to introduce the foundation first?"
That depends on whether you're teaching QM to physicists or to computer scientists.
"Isn't hbar just the universe's way of making sure everything doesn't happen at the same time?"
You mean, if hbar were 0 nothing much would happen? Even that is not true though.
S.
If hbar were smaller, then for a fixed energy, to get the same evolution, you evolve for a shorter time. So not "nothing much would happen" but "everything would happen" right?
Also I've always wondered why cranks don't insist that there isn't one universal hbar, but different hbars for different interactions.
Why would I fix the energy?? That's an arbitrary choice: I prefer to fix energy/hbar...
Am I keeping you from serious work, or you me?
:-)
Dude I've got to write a final and homework solution sets!
I fix energy because I don't understand it, so I figure I'd better keep it fixed. :)
Now that you mention it, I have to prepare a final too!
Let's agree to keep the number of replies fixed to exactly what it is now :-)
hbar = 1.
Okay, this is getting interesting. Does Steven's last post ("Let's agree to keep the number of replies...") mean that I have to write a bunch of posts to keep up with him and Dave?
I don't have a final to write, but I do have a 9 AM train to Montreal, so... briefly and to the point.
1. "Noncommutative probability theory" is, I confess, a little vague. What I'm trying to get at is this: you get probability theory if you start with a sample space, define a sigma algebra over it, and apply Kolmogorov's axioms to the sigma algebra. Elements of the sigma algebra are propositions about Nature, they obey classical logic, and [at least for finite sample spaces] they can be represented by indicator functions so that the logical operations can be defined in terms of commutative function multiplication.
Quantum theory is what you get when you apply Kolmogorov's axioms, not to a sample space, but to the set of rank-1 projectors on a Hilbert space. Instead of a sigma algebra, you get the set of POVM effects -- which obey quantum logic, and can be represented by positive operators whose combination is not commutative.
This is not the introduction I'd give to... well, anybody. It does define the notion of noncommutative probability, though -- the things that are noncommutative are the propositions about the system. It seems a very powerful idea, to me, that you can get the structure of quantum theory by just changing the sample space, but keeping the rules the same! (See the little sketch after point 2 below.)
2. Steven points out that you don't get the physics this way. I agree! But... that's exactly the point. The physics of spacetime (e.g., x_1 and x_2 are close, etc.) is the same whether you stick it on top of deterministic mechanics, probabilistic mechanics, or quantum mechanics. To me, the fact that the physics (fields, particles, angular momentum) is kinda decoupled from the underlying kinematical theory (quantum, classical, whatever) is an important and cool thing.
And... I'm too tired to think about this rigorously, but I'm not convinced that you don't get hbar (not its value, but the existence of a quantum of action) from the kinematical structure... maybe with one axiom about unitary dynamics or something.
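To make the contrast in point 1 concrete, here is a minimal numpy sketch (my own toy example, not Robin's): indicator functions on a finite sample space commute under multiplication, rank-1 projectors on a Hilbert space generally do not, and yet the Kolmogorov-style rule (nonnegative probabilities summing to one) carries over via the Born rule.

```python
import numpy as np

# Classical side: propositions as indicator functions on a finite sample space.
# Indicator multiplication (logical AND) is commutative.
omega = np.arange(6)                    # sample space {0,...,5}, e.g. a die
A = (omega % 2 == 0).astype(float)      # "outcome is even"
B = (omega < 3).astype(float)           # "outcome is less than 3"
assert np.allclose(A * B, B * A)        # classical propositions commute

# Quantum side: propositions as rank-1 projectors on a Hilbert space.
# Products of projectors are generally noncommutative.
ket0 = np.array([1, 0], dtype=complex)
ketplus = np.array([1, 1], dtype=complex) / np.sqrt(2)
P = np.outer(ket0, ket0.conj())         # projector onto |0>
Q = np.outer(ketplus, ketplus.conj())   # projector onto |+>
print(np.allclose(P @ Q, Q @ P))        # False: quantum propositions don't commute

# The Kolmogorov-style rule survives: for a state rho and the projective
# measurement {P, I - P}, the Born probabilities are nonnegative and sum to 1.
rho = Q                                 # the state |+><+|
probs = [np.trace(rho @ P).real, np.trace(rho @ (np.eye(2) - P)).real]
print(probs, sum(probs))                # [0.5, 0.5] 1.0
```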
Sweet. I have incited serious comment :)
Okay, so the real question, it seems to me, is what the heck noncommutative propositions mean. Is there some easy way to understand why we might think noncommutative propositions make more sense than commutative ones?
Well, non-commutative propositions don't make more sense (to us humans) than commutative ones, because we evolved in regimes where commutative propositions worked well enough and were much less resource intensive. The only reason we want to use them is that they turn out to be a reasonable formalism for describing aspects of nature that aren't captured by commutative propositions. The question is whether it's a better formalism than the standard physics language, which more or less works.
I'd also argue that (at least for some systems) there is physics hidden in the way these propositions fail to commute.
We can model classical probability by a set of mutually commuting operators on a Hilbert space. If we introduce noncommuting operators on the Hilbert space we have used to model classical probability, we cannot in general construct a joint probability density over some combinations of those mutually noncommuting operators (stated carefully, this is an if-and-only-if). There is a tradition, starting around 1980, in which this is the content of the violation of Bell inequalities, although it has to be understood carefully in the context of the prevailing view that the content of the violation of the Bell inequalities is nonlocality.
"Cannot Construct Joint Probabilities" is not as hard to get a grasp of as "There Is Noncommutativity". CCJP also relates quite closely to Bohr's idea that measurements affect the possibility of making other measurements.
An elementary argument can be found in W. M. de Muynck, Phys. Lett. A 114, 65 (1985); an argument in terms of C*-algebras can be found in L. J. Landau, Phys. Lett. A 120, 54 (1987); both are short. The de Muynck argument is probably not so insightful, but the mathematics is much more accessible. I strongly recommend R. F. Streater, "Classical and Quantum Probability", J. Math. Phys. 41, 3556 (2000) [48 pages, particularly section III]. Sorry, these are all journals.
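To see the "cannot construct joint probabilities" point in the simplest case, here is a toy numpy sketch (mine, not from the references above): the natural candidate joint distribution Tr[rho P_i Q_j] is a genuine joint probability when the two projector families commute, but for noncommuting families its entries come out complex, even though the marginals remain the correct Born probabilities. The rigorous no-go statement is in the references; this is just one symptom of it.

```python
import numpy as np

# Candidate "joint probability" p(i,j) = Tr[rho P_i Q_j].
# For commuting projector families this is a genuine joint distribution;
# for noncommuting families it is generally complex (a quasi-probability).
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
ketx = (ket0 + ket1) / np.sqrt(2)                         # |+x>

proj = lambda v: np.outer(v, v.conj())
Pz = [proj(ket0), proj(ket1)]                             # z-basis propositions
Qy = [proj((ket0 + 1j * ket1) / np.sqrt(2)),              # y-basis propositions
      proj((ket0 - 1j * ket1) / np.sqrt(2))]
rho = proj(ketx)                                          # state |+x><+x|

joint = np.array([[np.trace(rho @ P @ Q) for Q in Qy] for P in Pz])
print(np.round(joint, 3))                                 # complex entries

# The marginals are the correct Born probabilities, but the entries themselves
# are complex, so they cannot be read as a classical joint distribution.
print(np.round(joint.sum(axis=1).real, 3))  # z-basis marginal: [0.5, 0.5]
print(np.round(joint.sum(axis=0).real, 3))  # y-basis marginal: [0.5, 0.5]
```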
I vote for Edward Nelson's "stochastic mechanics", presented nicely in his book "Quantum Fluctuations." He describes QM as classical particles interacting with a "background field" and develops a stochastic principle of least action. He shows that particles undergoing "conservative diffusions" generate the same probability distributions as wavefunctions do. Don't mind the nonlocality of the theory, it is unimportant ;).
Another possibility is to teach quantum mechanics from the physics of sequential Stern-Gerlach experiments. This is how Schwinger did it and is my favorite. And, a friend just pointed out to me a recent paper on arXiv talking about the method.
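For flavor, here is a tiny numerical sketch (a generic toy of my own, not Schwinger's presentation or the arXiv paper mentioned) of the textbook three-filter sequence: select spin-up along z, filter along x, then measure along z again.

```python
import numpy as np

# Sequential Stern-Gerlach filters on a spin-1/2 particle.
up_z = np.array([1, 0], dtype=complex)
up_x = np.array([1, 1], dtype=complex) / np.sqrt(2)

state = up_z                                 # beam selected as spin-up along z
p_pass_x = abs(np.vdot(up_x, state)) ** 2    # chance of passing the x-filter
state = up_x                                 # post-selection state after the x-filter
p_up_z_again = abs(np.vdot(up_z, state)) ** 2

print(p_pass_x, p_up_z_again)
# 0.5 0.5 -- the intermediate x-measurement scrambles the earlier z-result,
# which is exactly the noncommutativity being discussed above.
```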
For quantum system engineers, there is IMHO no better starting textbook than Nielsen and Chuang -- especially Chapters 2 and 8. Applying these principles to problems in practical engineering and control is reasonably straightforward, except that many essential details are not explicitly given in "Mike and Ike" ... our QSE Group has distilled these details in a write-up that is (colloquially) known within our group as Mike and Ike: the Missing Manual.
The overall picture that emerges (that we teach young engineers) is that (1) noisy quantum systems are compressible objects, and (2) quantum compressible objects can be accurately simulated with polynomial resources, and (3) the resulting quantum system engineering toolset is mathematically and algorithmically very similar to the toolsets of fluid dynamics, combustion theory, and VLSI design.
Our QSE Group therefore foresees that quantum system engineering is destined to emerge as one of the most intellectually vigorous, strategically important, and economically dynamic engineering disciplines of the 21st century. Good! :)
I think you will like the article Consistency, Amplitudes and Probabilities in Quantum Theory.
Hi Dave,
I agree that it is sensible to think about QM as a generalization of probabilistic mechanics. In my opinion, the only generalization one needs is from real to complex numbers for probabilities. If we introduce h as the parameter of a universal action reservoir (just as T is introduced as the parameter of a canonical energy reservoir in stat mech), and allow a complex probability distribution over paths of a system, then we get the path integral formulation of quantum mechanics from stat mech,
p(path) = (1/Z) exp(-S(path) / (i h))
And whenever we use this to calculate the probability for something we might measure, the imaginary parts cancel to give a real probability.
The only new thing for students to consider with QM is the weird fact that probabilities can now destructively interfere.
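Here is a bare-bones numerical illustration of that destructive interference (my own toy, with only schematic normalization): add the two path amplitudes exp(-S/(i h)) and then square, versus adding the two probabilities directly.

```python
import numpy as np

# Two-path toy model: each path carries an amplitude exp(-S/(i*hbar)); the
# detection probability comes from the squared modulus of their sum.
hbar = 1.0
S1, S2 = 0.0, np.pi * hbar                 # actions differing by pi*hbar

amp1 = np.exp(-S1 / (1j * hbar))
amp2 = np.exp(-S2 / (1j * hbar))

p_interfering = abs(amp1 + amp2) ** 2 / 2             # amplitudes added, then squared
p_classical = (abs(amp1) ** 2 + abs(amp2) ** 2) / 2   # probabilities added directly

print(p_interfering)  # 0.0 -- the two paths cancel (destructive interference)
print(p_classical)    # 1.0 -- no cancellation if probabilities simply add
```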
I was wondering: could one interpret the standard model fields as sort of fictitious ghost fields, and interpret the path integral as solving some sort of combinatorial problem in which configurations are assigned only positive weights?