Some confusing headlines:

Quantum Theory's Release Date Now 'TBA'

Quantum Theory Faces Delay Due to Quality Reasons

Quantum Theory Delayed


My carefully crafted Google Alerts feed has been polluted with this stuff for at least a year. I can't wait for the game to be released and bomb already.

Shows you how out of touch I am. I had no idea it was a game.

Well, it's a "game" and also: real quantum theory doesn't have all the bugs ironed out. We really don't understand what happens in it, despite IMHO fallacious attempts to dodge and double-talk the problems.

To be able to say that, Neil, you have to give one example of an experiment where quantum theory fails. Just because it is inconceivable doesn't mean it's not the best stuff we've got.

Well, Neil B., a lot has happened since Niels B., and now the understanding is much less nebulous.

People often say that quantum computing can contribute to quantum physics by making it easier to teach. But I think it can also make quantum physics easier to accept. Once you realize that there's only a square-root speedup for Grover, that random-access coding doesn't get any easier, etc., the theory seems less like it's been held together with duct tape.
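As an aside, the "only a square-root speedup" point about Grover is easy to check numerically. The following is a toy sketch (my own illustration, not from the comment; the size `n` and the index `marked` are arbitrary choices) that simulates Grover's algorithm by tracking all N real amplitudes directly:

```python
import numpy as np

# Grover search over N = 2^n items, one marked item.
# Classically, finding it takes ~N/2 queries on average;
# Grover needs only about (pi/4)*sqrt(N) iterations.

n = 8                       # qubits
N = 2**n                    # search-space size
marked = 123                # arbitrary marked index

state = np.full(N, 1/np.sqrt(N))         # uniform superposition
iters = int(round(np.pi/4*np.sqrt(N)))   # optimal iteration count

for _ in range(iters):
    state[marked] *= -1                  # oracle: flip marked amplitude
    state = 2*state.mean() - state       # diffusion: inversion about the mean

p_success = state[marked]**2
print(iters, p_success)
```

With N = 256 this reaches near-unit success probability in about 13 iterations, versus roughly N/2 = 128 expected classical queries: a real advantage, but only quadratic, as the comment says.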

Dave:

1) Define fail.

2) It is the best stuff we have... for a specific purpose.

As far as I can tell, for the problem of landing a man on the moon, quantum theory not only fails, but is not even close to the best stuff we have. I think you'll have a hard time swaying Neil A. that we should have waited another thousand years for the "quantum" solution.

Well, we don't yet have a self-consistent theory of quantum gravity. And it is true that there is a large class of problems (stuff that generally goes by the name "classical physics") for which quantum theory would be overkill. But the basic quantum theory of systems in flat spacetimes is pretty well worked out. Some of the results may be counterintuitive, and some surprises may still lurk in its applications, but all results so far are explainable in terms of the theory.

Dave, folks - (now that we're talking the science, not just the game, but I mean no disparagement of it as "game" either) I know quantum theory gives correct predictions about outcomes so far. I meant the familiar "interpretation" issues of why and how a spread-out wave function, or superposition of such, "collapses" into some selection and localization that makes "measurement results." This is not a case of results being "explainable" but rather "given." For example, finding the inverse square law did not explain it, and was not equivalent to "understanding" in terms of curved space-time etc. (I don't mean that directly as an issue about QG either, but that too is problematical. Then also consider renormalization and the mess of that.) This being "a problem" of QM in that sense is a venerable perspective.

And when I said fallacious talk about the conceptual problem being solved, I meant my old bugbear the decoherence interpretation and claims that Schrodinger's cat is no longer a puzzle, etc. A set of superpositions with phase changed around from instance to instance is not "converted into a mixture (even FAPP, 'for all practical purposes')" by decoherence: a mixture means sometimes one thing and sometimes another, and cannot be derived from fiddling the phases between superpositions without something intruding to yank away the unrealized possibility. (It's suspect to compare phase across instances anyway.) It is "collapse" - God knows what that is or what really causes it - that turns superpositions made "no longer coherent" by decoherence into mixtures, not decoherence itself. (Penrose made similar complaints.) Hence collapse is still really a "problem."
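For concreteness, here is the density-matrix calculation the decoherence argument rests on, whatever one makes of its interpretive force (an editorial sketch, not anything from the linked page): averaging the qubit state (|0> + e^{i*phi}|1>)/sqrt(2) over random phases phi kills the off-diagonal terms, leaving a matrix numerically identical to the 50/50 mixture I/2, which is precisely the step whose meaning is being disputed above.

```python
import numpy as np

# Average the pure-state density matrix of (|0> + e^{i*phi}|1>)/sqrt(2)
# over uniformly random phases phi. The off-diagonal "coherence" terms
# average to zero, leaving the same matrix as a 50/50 classical mixture.

rng = np.random.default_rng(0)
M = 100_000                              # number of phase samples
rho = np.zeros((2, 2), dtype=complex)

for phi in rng.uniform(0, 2*np.pi, M):
    psi = np.array([1, np.exp(1j*phi)])/np.sqrt(2)
    rho += np.outer(psi, psi.conj())     # pure-state projector |psi><psi|

rho /= M
print(np.round(rho, 3))   # ~ [[0.5, 0], [0, 0.5]], i.e. I/2
```

The debated question, of course, is whether this phase-averaged rho may be read as "sometimes |0>, sometimes |1>" for a single run, or only as a statement about the ensemble.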

Click my link to see some critique of DI, as well as a multiple-interaction problem that may indeed indicate internal conflict in what quantum theory should predict (it's a rough outline so far; more, and hopefully better, later). Briefly though, in the thought experiment it isn't easy to get a serially weak-interacting wave function of one photon to evolve into an eigenstate, as would be needed to satisfy the projection postulate (which says we can only pick out from a binary choice of states, e.g. RH or LH, and then collapse to that, but not find values along an intermediate range, such as how elliptical or linear, etc.)

For fun, I note that there is "Quantum Theory (video game)" on Wikipedia, and also "Quantum game theory."

http://en.wikipedia.org/wiki/Quantum_game_theory

Cute! Maybe there can be a video game about QGT. Does anyone make "games" for nerds that actually deal with real science issues in a sufficiently interesting way?

Neil B: an ACTUAL experiment. Without that you're just not going to get anyone's attention, sorry.

Chris: I can use quantum theory to calculate classical trajectories for a spacecraft. So for all practical purposes, Neil Armstrong would be happy with me. Maybe I misunderstood you.

And no, I'm not claiming quantum theory is the end-all; I'm just saying that this slow rehashing by Neil B of "whah whah decoherence doesn't explain the measurement problem" is (a) obvious, and (b) not useful or interesting at this point.

Dave: so do it - you'd be the first. All I ever hear is that it *can* be done, followed by some excuse not to actually do it. You are really putting the "fundy" into fundamental physics.

Ah Chris, you are such a nice guy I want to give you a big hug! I hope you enjoy yourself as much as I do; I mean, it must be fun living with such a nice person all the time.

What would you like me to do? Show how classical equations can arise from a suitable coarse graining and decoherence of quantum theory? http://prd.aps.org/abstract/PRD/v47/i8/p3345_1

My wrath got, unintentionally, directed at you. My apologies; that was rude. I was trying to make the point that, from an outsider's perspective, quantum theory looks a lot like a religion.

Students are told that it is mystical and magical *and* this is somehow its virtue. And when we question it, as I just did, all we ever get is another argument from authority, as you just gave me.

What I want you to do is model each of the subatomic particles in a spaceship exactly with its quantum mechanical description, *not* just use the classical equations and point me to a paper written by some Nobel laureates, full of big words with scare quotes around them.

I want one quantum aficionado to admit they are taking an enormous logical leap of faith in claiming how fundamental their theory is.

Chris: First off, pointing you to a valid scientific paper is different from argument from authority. Yes, you will need big math to understand it... that's the way things work, sorry. I sent the one by Gell-Mann because he writes well, not because I agree with him or because he is a Nobel prize winner. I could point you to hundreds of similar papers by non-big wigs.

You are also making a big leap when you go from "no one knows how classical arises from quantum" to "I want one quantum aficionado to admit they are taking an enormous logical leap of faith in claiming how fundamental their theory is." These are not necessarily related. I'm happy to say (as I said above) that quantum may not be the be-all and end-all. But this doesn't mean that we don't know how to explain the quantum-to-classical transition. Perhaps there is confusion here because the classical here is not "interpretation based". It's just explaining why there are classical equations describing spaceships and quantum equations describing other setups.
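A minimal numerical illustration of that quantum-to-classical point (an editorial sketch in units where hbar = m = 1; the grid sizes and the potential strength g are arbitrary choices): by Ehrenfest's theorem, the mean position of a wavepacket evolving under a linear potential follows the classical trajectory exactly.

```python
import numpy as np

# Evolve a Gaussian wavepacket under a linear potential V(x) = m*g*x
# with the split-operator (Strang) method. Ehrenfest's theorem says
# <x>(t) obeys the classical equation m*x'' = -dV/dx = -m*g, so the
# quantum mean position should track x(t) = x0 - 0.5*g*t^2.

hbar = m = 1.0
g = 2.0                                  # linear-potential strength
N, L = 2048, 200.0                       # grid points, box size
x = np.linspace(-L/2, L/2, N, endpoint=False)
dx = x[1] - x[0]
k = 2*np.pi*np.fft.fftfreq(N, d=dx)      # angular wavenumbers

x0, sigma = 0.0, 2.0                     # initial center and width
psi = np.exp(-(x - x0)**2/(4*sigma**2)).astype(complex)
psi /= np.sqrt(np.sum(np.abs(psi)**2)*dx)

V = m*g*x
dt, steps = 0.001, 2000                  # total time T = 2.0
expV = np.exp(-0.5j*V*dt/hbar)           # half-step potential factor
expK = np.exp(-0.5j*hbar*k**2*dt/m)      # full kinetic step (k-space)

for _ in range(steps):
    psi = expV*psi
    psi = np.fft.ifft(expK*np.fft.fft(psi))
    psi = expV*psi

T = steps*dt
x_mean = np.sum(x*np.abs(psi)**2)*dx     # quantum <x>
x_classical = x0 - 0.5*g*T**2            # classical trajectory
print(x_mean, x_classical)
```

The split-operator method alternates half-steps in the potential with full kinetic steps in momentum space; for a linear potential, Ehrenfest's theorem makes the agreement exact up to the discretization error.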

Well thanks Dave for acknowledging that DI doesn't explain the measurement problem, but so many people out there claim it does - FWIW. It sure isn't obvious to them (see the poor deluded souls at http://chaos.swarthmore.edu/courses/phys6_2004/QM/24_CatDecoherence.pdf etc.) I will quit complaining as soon as they quit claiming! But I'm glad no one here, at least so far, needs convincing.

As for real experiments versus thought experiments: first, someone proposing a doable experiment may not be able to do it; someone else can pick up the ball (the same person need not carry a concept through all stages). And even impractical TEs (like the most recent of mine; the previous one is eminently doable) are useful tools for studying physical problems. Not as impressive as hard results, and without the same implications, but they have a distinguished history of supporting points, suggesting problems, etc. It's still good to ask how to predict what would happen *if* we could do such-and-such. BTW, although mentioned in the same breath, that multi-measurement proposal of mine is not about decoherence; hooray!

My thesis advisor (Lowell Brown) used to intimidate his grad students and post-docs by quoting *his* thesis advisor (Julian Schwinger) as follows: "Everyone should reformulate quantum mechanics at least once in their career."

At the time, I was pretty intimidated by this Lowell/Schwinger maxim, but nowadays (thanks to advances in QIT) reformulating quantum mechanics is becoming a pretty realistic goal for almost any young quantum theorist.

The basic strategy is to exploit the numerous invariances of quantum mechanics to construct a quantum framework that fits the problems *you* are interested in solving. Such constructions are enabled by (in Feynman's words) "the very large number of different physical viewpoints and widely different mathematical formulations that are all equivalent to one another ... the fundamental laws of physics, when discovered, can appear in so many different forms that are not apparently identical at first ... there is always another way to say the same thing that doesn't look at all like the way you said it before."

One thing that this means IMHO is that the 21st century will be a century in which this QIT freedom is exploited for purposes of constructing never-before-imagined narratives.

For example, the quantum narrative that our QSE Group constructs is designed to mesh as closely as feasible with the narratives that the synthetic biologists and the nanotechnologists construct. To achieve this, we use the forms-and-flow methods of Arnol'd as a shared framework for integrating both the molecular dynamics of Frenkel and Smit, and the quantum dynamics of Nielsen and Chuang; the resulting mathematical unity serves to link-up our nanotechnological spin imaging goals with the broad goals of synthetic and regenerative biology.

The point here is that there need not be just *one* mathematical narrative about quantum dynamics, any more than there is just *one* mathematical narrative about classical dynamics.

As Mao said: "Let a hundred flowers blossom and a hundred schools of thought contend!" In this sense, an unstoppable revolution in modern conceptions of quantum mechanics is well underway.

Neil, decoherence is not an interpretation of quantum mechanics. It is a physical process. This is what experimentalists are observing when they talk about T2 and T1.
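For readers unfamiliar with those symbols: T1 is the timescale on which populations relax, and T2 the (generally shorter, bounded by T2 <= 2*T1) timescale on which off-diagonal coherences decay. A toy free-decay sketch, with hypothetical numbers chosen only to respect that bound:

```python
import numpy as np

# Textbook free-decay model for a single qubit density matrix:
# the excited-state population decays with timescale T1, while the
# off-diagonal coherence decays with timescale T2 (here T2 <= 2*T1).

T1, T2 = 50.0, 30.0                      # hypothetical relaxation times

rho0 = np.array([[0.5, 0.5],
                 [0.5, 0.5]], dtype=complex)   # |+><+|: maximal coherence

def rho_at(t, rho0=rho0, T1=T1, T2=T2):
    """Density matrix after free decay toward the ground state."""
    p1 = rho0[1, 1]*np.exp(-t/T1)        # excited-state population
    c = rho0[0, 1]*np.exp(-t/T2)         # coherence (off-diagonal term)
    return np.array([[1 - p1, c],
                     [c.conjugate(), p1]])

for t in (0.0, 30.0, 150.0):
    r = rho_at(t)
    print(t, r[1, 1].real, abs(r[0, 1]))  # both quantities decay with t
```

This decay of the off-diagonal terms is what experimentalists measure when they quote a T2; it is a physical process regardless of what interpretive weight one attaches to it.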

Joe, decoherence is both. There is the process, sure. But what can it make happen by itself, without something unusual prompted by measurement? That is the basis of a now faddish "interpretation." Some people claim decoherence somehow dodges the collapse problem, by a sort of IMHO double-talk argument. They say (roughly summarized and simplified) it makes the original superposition like a mixture (i.e., sometimes one thing and sometimes another) by making phases differ from instance to instance. (! - how is that relevant to what happens any one time?) Reread my comments here and at my name link to see why that is fallacious (hint: it's a circular argument.) But quantum interpretation is a subtle and tricky thing and no one really "understands" what's going on, so I don't want to be too proud or sure about the physical issue per se - I just know how to analyze arguments for flaws like circular reasoning!

Note difficulty re ensembles - they have a "grouping" problem. Which bunch/es of instances "belong together" to form eg a density matrix, if the variability etc. changes over time, or different blocks of instances can be grouped different ways for inconsistent "average" values or variance levels, etc?

BTW your link didn't work. I suppose you meant http://jfitzsimons.org/, http://quthoughts.blogspot.com/?

What Joe says is exactly right, and moreover, decoherence is the engineer's best friend (algorithmically speaking). The reason is that decoherent processes, viewed as state-space flows, are generically concentrative.

This concentration dynamically tames what is often called "the curse of dimensionality", and makes feasible the computational simulation (by pullback) of quantum dynamical systems, that otherwise would be infeasible to simulate.

From this point-of-view, quantum dynamics in the 2010s looks increasingly like fluid dynamics did in the 1950s, with decoherence reprising the role of viscosity. The more viscosity (or decoherence) is present, the easier a fluid (or quantum) dynamical system is to simulate.

I must have had too many links last comment, and I only mention this again since I was referred to re this subject!

(Hence no need to repost the comment later):

Joe, John: decoherence is both. There is the process, sure. But what can it make happen by itself, without something special prompted by measurement? Claiming it can is the basis of an "interpretation." Some people claim decoherence somehow dodges the collapse problem, by a sort of IMHO double-talk argument. I have seen it written (roughly summarized and simplified), it makes the original superposition like a mixture (i.e., sometimes one thing and sometimes another) by making phases differ from instance to instance. (! - how is that relevant to what happens any one time?) Reread my comments here and at my name link to see why that is fallacious (hint: it's a circular argument.) Ask yourself: if no collapse intervenes, what would a system under decoherence do? Act differently than without decoherence - OK - but no removal of some part of the superposition (how?)

I know that quantum interpretation is a subtle and tricky thing and no one really "understands" what's going on, so I don't want to be too proud or sure about the physical issue per se - I just know how to analyze arguments for flaws like circular reasoning! And again, Penrose and others have made the same complaints.

BTW Joe your link didn't work. I found your other links through Google, however.

Neil, it has often happened in science that if one embraces mathematically simple ideas, one is led to physically dubious conclusions.

For example, simple Euclidean geometry impels us to the dubious conclusion of an infinite universe. Similarly, simple classical mechanics impels us to the dubious conclusion that matter is unstable.

So perhaps we ought to consider that the mathematically simple assumption of linear Hilbert state-space geometry may be impelling us to dubious conclusions regarding the physics of measurement processes?

No amount of philosophizing fixes problems like these: it's necessary instead to steadily raise the level of mathematical abstraction.

Scientists and mathematicians have been jointly traveling this path of discovery for centuries ... and together have come a remarkably long way ... yet the end is not in sight. Engineers tend to follow a few years behind; their job is to glue the validated aspects of math-and-science together into narratives and enterprises.

There have been a handful of people who are simultaneously good at the math, the science, *and* the engineering ... Gauss, Poincare, and (of course) von Neumann come to mind.

The bottom line is that there is plenty of quantum work to do, for every kind of talent and interest. None of this work is easy, though.

OK John, I'll humbly leave it there for now as far as collapse/decoherence goes (do note, it is controversial). OTOH, how about Aharonov-style "weak measurements" involving multiple interactions? Could that lead to finding out more (like about polarization state) than we're "supposed to"? If you look at the TE summarized in the name link, you will at least have a challenge to figure out how nature would handle that multi-interaction situation.

PS: Because of dark energy and the rapid expansion of the universe, it seems now to be "open" and thus infinite - as creepy as that sounds (aleph null copies of yourself; it's almost as bad as MWI.)

My EM professor made this point over and over: quantum dynamics in the 2010s looks increasingly like fluid dynamics did in the 1950s. At the time, her husband's lab looked like a backwater. Now I wish that I had gotten involved.

Dave, when you say no one knows *how* classical arises from quantum I assume you are implying that, whether or not it can, it certainly *should*.

But even if everything that is "classical" is made up of things that are "quantum", it doesn't mean that the theories we create for them need to match up in some transition region.

Classical mechanics gives us a simple idealized picture of macroscopic phenomena by ignoring microscopic phenomena and vice versa for quantum mechanics. So there is no transition region - no overlap in the aspects of nature which either theory models.

Here is an analogy to analogies. We often compare two things by selecting a feature of both and ignoring the rest. Only when we simplify in this way do we make the analogy by saying "X is similar to Y because this small part of X is `like' this small part of Y". But the analogy always breaks down when we try to extend that 'likeness' beyond those small regions.

We can extend Chris' comments on uniting macroscopic physics with microscopic physics, by applying Wigner's principle of "the unreasonable effectiveness of mathematics in the natural sciences".

Specifically, let's elevate to a Wignerian Principle the requirement that mathematics be "unreasonably effective" in uniting all aspects of natural science and engineering.

This leads us to ask ourselves two dovetailed questions: "What kind of mathematics might that be?" and "What kind of natural science might that be?"

This is like asking oneself the two dovetailed questions, "If there were a discipline called algebraic geometry, what might its algebra look like? What might its geometry look like?" ... a fruitful avenue that has led to concrete extensions of our concepts of algebra and geometry.

It was this line of thinking that led to the predictions on Dick Lipton's Blog that FOCS2019 would include two dovetailed findings:

FOCS 2019-A: Simulating noisy quantum systems is generically in the same computational complexity class as simulating classical systems

FOCS 2019-B: M-Theory is the unique causally separable, relativistically invariant quantum field theory that can be simulated with classical computational resources

These findings are predicted by what we may call the Strong Wignerian Principle of math/science/engineering ... that advances in mathematical abstraction in the twenty-first century will be "unreasonably effective" at strengthening the shared foundations of all three disciplines.

Even if these predictions of the Strong Wignerian Principle prove not to be *exactly* true, it will be very good news for the twenty-first century should they prove to be even *approximately* true.

This confuses me:

But even if everything that is "classical" is made up of things that are "quantum", it doesn't mean that the theories we create for them need to match up in some transition region.

You would probably object if I said, "You can have a near-field theory and a far-field theory but that doesn't mean that as you move downrange there has to be a point where they are compatible."

Simulating ... systems ... advances ...: it is still true, I am sure, that we can't just imagine "the world" as evolving wave functions or whatever all by itself; we have to think about "looking at it" and the intrusiveness, relativeness, and disjunctural nature of that.

Not knowing about the game, I had some fun physical interpretations of the headlines:

Quantum Theory's Release Date Now 'TBA'

Can't have a complete theory *and* a release date.

Quantum Theory Faces Delay Due to Quality Reasons

Likewise: quality, timeliness, pick one.

Quantum Theory Delayed

Maybe trapped in a Bose-Einstein condensate. Might have interesting applications. "First the system is set in motion and then the laws of physics are put into effect."