Does Quantum Uncertainty Come From the Foundations of Math?

Over at Asymptotia, Len Adleman (the A in RSA, a founder of DNA computation (but not the A in DNA!), and a co-discoverer of the APR primality testing algorithm) has a guest post about the foundations of quantum theory. Len suggests, if I understand him correctly, that one should attempt to understand the uncertainty arising in quantum theory as being of the same nature as the fact that there exist statements which cannot be proven true or false within a fixed set of sufficiently powerful axioms.

First of all, I know I've heard a similar argument before, but I can't seem to find the reference! If any foundations (or other) people can supply it, I'm sure it would be welcomed in the comment section of Asymptotia. Second, I find it interesting that Len seems most troubled by the uncertainty arising in quantum theory and not by, for example, Bell inequalities. I'm not so sure many of us are troubled by this aspect of quantum theory (that it is probabilistic rather than deterministic) in and of itself. That is to say, if the world were described by a probabilistic local hidden variable theory, would we be arguing about the foundations of quantum theory? Third, this of course brings to mind the Kochen-Specker theorem, which shows that there is no non-contextual hidden variable theory of quantum mechanics. Indeed, contextuality is strongly reminiscent of a "choice of axiomatic system." It would be neat if one could turn this into a more established result, but one would first need to argue why Hilbert space best captures the idea of a set of axioms. Finally, because I think Bell inequalities are essential for understanding what makes quantum theory truly unique (yes, I'm biased), I'm curious whether mathematicians have ever considered the notion of "local axioms", i.e. axioms which live in spacetime?

YES!!! I've been saying this for years - literally - but no one listens to me. Anyway, both the uncertainty principle and Bell's inequalities (along with a host of other things) can be derived from the Cauchy-Schwarz inequality (see The Cauchy-Schwarz Master Class by J. Michael Steele, published by CUP; Patrick Hayden pointed this amazing book out to me). That's why I sometimes call it the "God equation." I suspect it is possible to trace some aspect of this back to some form of Gödel's incompleteness theorem, but I haven't reached that point yet. Nonetheless, there clearly seems to be some sort of mathematical basis to this.
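For anyone who wants the two-line version (this is just the standard Robertson derivation, nothing original): for a state $|\psi\rangle$ and observables $A$, $B$, set $|f\rangle = (A - \langle A\rangle)|\psi\rangle$ and $|g\rangle = (B - \langle B\rangle)|\psi\rangle$. Then Cauchy-Schwarz gives

$$
\Delta A \,\Delta B \;=\; \sqrt{\langle f|f\rangle\,\langle g|g\rangle} \;\geq\; |\langle f|g\rangle| \;\geq\; |\operatorname{Im}\langle f|g\rangle| \;=\; \tfrac{1}{2}\,\bigl|\langle [A,B]\rangle\bigr|,
$$

and taking $A = x$, $B = p$ with $[x,p] = i\hbar$ recovers $\Delta x\,\Delta p \geq \hbar/2$.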

In fact, at its core, this is exactly what my infamous Cerf-Adami paper was about, but over three years of battling referees and other commenters, it got watered down and still hasn't been published.

If anyone out there is truly interested in pursuing this further, contact me since I could probably save you a lot of background work.

It's well-known that the Heisenberg Uncertainty Principle is mathematically equivalent to the bandwidth theorem. But that doesn't seem to be the argument Adleman is making.
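(For readers who haven't met it: the "bandwidth theorem" is the Fourier-analytic inequality

$$
\Delta t \,\Delta\omega \;\geq\; \tfrac{1}{2},
$$

where $\Delta t$ and $\Delta\omega$ are the standard deviations of $|f(t)|^2$ and of $|\hat f(\omega)|^2$. The same inequality for a spatial signal reads $\Delta x\,\Delta k \geq 1/2$, and substituting $p = \hbar k$ and $E = \hbar\omega$ turns these into $\Delta x\,\Delta p \geq \hbar/2$ and $\Delta E\,\Delta t \geq \hbar/2$.)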

By Eric Lund (not verified) on 23 Sep 2009 #permalink

>> I know I've heard a similar argument before, but can't seem to find the reference

this arxiv.org/abs/0811.4542 ?

I remember that Scott wrote about it (after I explicitly asked 8-), but was not exactly thrilled...

Well, with due respect to Ian, I have to say that I think this idea is a load of old twaddle. For starters, the idea has nothing to say about probabilities, because, in formal systems, indeterminate means literally indeterminate, rather than that one thing or the other will happen with a certain probability. Also, as usual, Bohmian mechanics provides a glaring counterexample that shows how boring the world can be whilst still being compatible with quantum mechanics. Thus, it is certainly true that you could never *prove* that quantum uncertainty is due to some sort of axiomatic indeterminateness. What we are left with after this just seems to be a very vague analogy.

In any case, as you say, this idea has come up repeatedly in the foundations of quantum theory. It used to be far easier to get away with making vague comments about this sort of thing in the days before there were a lot of quantum physicists and computer scientists who were familiar with each other's subjects, so I will skip the pre-history.

Karl Svozil has written a whole book on the subject:

http://www.amazon.com/exec/obidos/ASIN/981020809X

More recently, Caslav Brukner has been talking about similar things. See http://arxiv.org/abs/0901.3327 for example.

My own contention is that these sorts of ideas are mainly suitable for consumption by those on a content-free diet. However, I could well be wrong, so I wouldn't want to discourage anyone from trying to make serious progress on them.

Dave,

Is the reference you're thinking of "Mathematical undecidability and quantum randomness" by Tomasz Paterek et al? It was a hot discussion topic around here for a little while last year, and while I don't really remember the analysis, I'm pretty sure I ended up a bit disappointed.

As far as I can tell (YMMV), the paper seems to put forth an analogy between undecidability and quantum randomness, rather than a causal or logical connection. At the risk of being long-winded, I'll provide a quote from page 3:

When a physical system prepared in an eigenstate of a Pauli operator is measured along complementary directions, the measurement outcomes are random. Correspondingly, the proposition identified with a complementary observable is undecidable within the one-bit axiom encoded in the measured state. For example, the measurement of X on an eigenstate of Z gives random outcomes, and accordingly proposition (B) is undecidable within the one-bit axiom (A). This links mathematical undecidability and quantum randomness in complementary measurements. We propose that it is therefore possible to experimentally find out whether a proposition is decidable or not, as summarized in Figure 2.
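(To pin down the physics in that quote, here is a quick numpy sanity check of the Born-rule statement; this is textbook quantum mechanics, not code from the paper.)

```python
import numpy as np

# Measuring X on an eigenstate of Z gives 50/50 random outcomes --
# the "undecidable within the one-bit axiom" case quoted above.
zero = np.array([1.0, 0.0])              # |0>, the +1 eigenstate of Pauli Z
X = np.array([[0.0, 1.0], [1.0, 0.0]])   # Pauli X
evals, evecs = np.linalg.eigh(X)         # X eigenbasis: columns are |->, |+>
probs = np.abs(evecs.conj().T @ zero) ** 2   # Born rule: |<x_i|0>|^2
print(dict(zip(evals.round(), probs)))   # {-1.0: 0.5, 1.0: 0.5}
```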

However, I think this is one of those papers that shouldn't be discussed without reading it first, so I'll shut up now.

By Robin Blume-Kohout (not verified) on 23 Sep 2009 #permalink

I'm not sure what you mean by "I'm curious as to whether mathematicians have ever considered the notion of "local axioms", i.e. axioms which live in spacetime?" I guess you don't mean the Wightman or Haag-Kastler axioms?

As far as the derivation of Bell inequalities in the mathematical context of C*-algebras is concerned, try L. J. Landau, Phys. Lett. A 120, 54-56 (1987) or J. Baez, Letters in Mathematical Physics 13 (1987) 135-136. A more elementary derivation of something similar, but not for C*-algebras, can be found in W. M. de Muynck, Phys. Lett. A 114, 65 (1986). These likely come out somewhat similar to the treatment in the book mentioned in Ian Durham's comment, but some kind of precise mathematical context of linear spaces and inner products is needed to derive a Cauchy-Schwarz inequality.
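(As a concrete anchor for those references, here is my own quick numerical illustration, not anything from those papers: the two-qubit singlet state pushes the CHSH combination to Tsirelson's bound $2\sqrt{2}$, beyond the local-hidden-variable bound of 2.)

```python
import numpy as np

# Pauli matrices and the singlet state |psi> = (|01> - |10>)/sqrt(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def corr(a, b):
    """Correlation <psi| a (tensor) b |psi>."""
    return np.real(singlet.conj() @ np.kron(a, b) @ singlet)

# CHSH-optimal measurement settings (all in the X-Z plane)
A0, A1 = Z, X
B0, B1 = (Z + X) / np.sqrt(2), (Z - X) / np.sqrt(2)

chsh = corr(A0, B0) + corr(A0, B1) + corr(A1, B0) - corr(A1, B1)
print(abs(chsh), 2 * np.sqrt(2))   # both ~2.828, above the classical bound 2
```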

Contextuality requires only that the whole experimental apparatus is included in the classical model of the experiment. That has to include the measurement apparatus and its interactions with all other parts of the experimental apparatus. Then empirical equivalence of a classical model with a QM model is not ruled out either by the Bell inequalities or by the Kochen-Specker theorem. The hard question, however, is whether the classical model gets you anything that the QM model misses, or is mathematically enough nicer to be preferred for that reason.

For a precise (enough) mathematical construction of a random field model that is empirically equivalent to a quantum field model, see EPL 87 (2009) 31002. This approach, however, glosses over the difference between a random field and a quantum field. Random fields generally do not allow Bell inequalities to be derived; see J. Phys. A 39 (2006) 7441-7455.

If you mean by "quantum uncertainty" the abstract mathematical equations which some dogmatically assert are telling them something about "reality", then yes.

If, on the other hand, you mean the fact that we can't seem to predict the outcomes of some experiments, then I think the most plausible source of "quantum uncertainty" is some combination of human stupidity and laziness.

Key foundational question: are there coincidences in Mathematics? Toby Bartels wrote on the n-Category Cafe: "I would define a 'coincidence' as a situation that has no simpler explanation than the bare facts (although our intuition may look for one). Thus coincidences can occur in objective circumstances like mathematics. (Ironically, I came to this definition reading the rantings of a paranoid schizophrenic explaining why nothing is a coincidence.)"

I was at a Math tea at Caltech in May 2007, joking about the 26 sporadic groups being a "kabbalistic coincidence", or perhaps examples of some incompletely glimpsed set of more complicated mathematical objects which are forced to be groups for reasons not yet clear. Some people deny that there are "coincidences" in Mathematics. Gregory Chaitin insists that Mathematics is filled with coincidences, so that most truths are true for no reason. That sounds like the beginning of a human-guided deep theorem-proving project to me. Humans supply gestalt intuition that we don't know how to axiomatize or algorithmize. Humans did not stop playing Chess when Deep Blue became able to beat a world champion. The computer crunches fast; the human looks deep. The human has the edge in Go, which requires deeper search.

"House" last year correctly quoted:

"Coincidence is God's way of remaining anonymous."

-- Albert Einstein [The World As I See It].

> It's well-known that the Heisenberg Uncertainty Principle is
> mathematically equivalent to the bandwidth theorem. But that
> doesn't seem to be the argument Adleman is making.

I will admit to not knowing quite what this (the bandwidth theorem) is (perhaps I know it by another name?), but I would guess it comes down to the behavior of waves in general (see any "old-fashioned" quantum book for a discussion of this, for example Bohm's Quantum Theory).

"as to whether mathematicians have ever considered the notion of 'local axioms', i.e. axioms which live in spacetime?"

There's a more problematic consideration. How can we tell if the SAME axioms apply in widely separated PARTS of the cosmos?

Cf.: "Luminous" by Greg Egan -- region of space where Arithmetic is subtly different.

Cf.: "Imaginary Logic" as first proposed by N.A. Vasiliev. Maybe OUR universe is too small for difference between P and NP to make a difference. But in much bigger, much more complex universes within the Multiverse, the difference is essential to the very nature of meta-life, meta-thought, and meta-consciousness. We DO have a consistent logical metalanguage for beings in universe A with Logic A to discuss beings in universe B with Logic B, or beings in universe B who falsely claim to use Logic B, or even fictional beings in other universes. Google Scholar "Imaginary Logic" and the more recent publications on N.A. Vasiliev.

We are trying to describe the structure of the set of modern scientific theories, each of which requires at the very least a semantics -- an instrumentalist description which relates the mathematical formalism (syntax) to an experimental practice and prediction (pragmatics). In Quantum Mechanics, the standard instrumentalist description asserts some statistical regularities between state preparation processes and measurement processes. If a measurement of a real-valued quantity (we return to this assumption later) is performed many times, each time beginning from the same initial conditions, then the outcomes follow a well-defined probability distribution over the real numbers. Probabilities conventionally lie in the range (0,1), but, as I suggested to Feynman circa 1969 (the idea is not original to me), probability could in principle be negative (details outside the scope of this paper). Each theory of Quantum Mechanics provides a computational process to determine statistical properties of this distribution, including its expectation value.

We shall return to the question: what is the topology of the ensemble of Quantum Mechanics theories? Do we best describe it by sheaf theory, stacks, gerbes? Does the ensemble of Quantum Mechanics theories form a manifold? A complex manifold? A differentiable manifold? A smooth manifold? Or are there singularities, which may hint at a unique "best" or "most natural" subset of Quantum Mechanics theories?

Can we tell which universe we are in, and formally describe other universes with different foundations of Quantum Mechanics, such as [Mlodinow, 1981]? Mlodinow wrote a PhD dissertation on how QM would work in a universe with an infinite number of spatial dimensions. This dazzled Murray Gell-Mann, who had him appointed to a major faculty position at Caltech, with the office immediately next to Feynman's.

[excerpt from "Nearly Physical Deformations of Quantum Mechanics" by Jonathan Vos Post, 33-page, 13,900-word Draft 7.0 of 28 July 2009; Abstract: I give here some very preliminary thoughts on a continuous set of deformations of Quantum Mechanics in which Planck's constant is not precisely a real number. Both the history of Quantum Mechanics and certain foundational problems lead us to ask these strange questions in strange ways. There are both experimental reasons (conflicting measurements) and theoretical reasons (no independent method to assure unlimited accuracy with the Quantum Hall Effect and Josephson Effect) to be skeptical of the official CODATA value for Planck's constant. If Planck's constant has a value other than the accepted value, then so do several fundamental physical constants: the rest mass of the electron, the Avogadro constant, the elementary charge (charge of the electron), the Bohr magneton, and the nuclear magneton. Is it possible (though outside of conventional theory) for these likewise to have a non-real component of their values?]

> It's well-known that the Heisenberg Uncertainty Principle is
> mathematically equivalent to the bandwidth theorem. But that
> doesn't seem to be the argument Adleman is making.

There are mathematical similarities between deterministic signal processing and quantum theory, however they ultimately cannot be taken very far. See, for example, Leon Cohen, "Time-Frequency Distributions-A Review", Proc. IEEE 77, 941 (1989). If the discussion is not limited to deterministic signals, there are considerably closer similarities.

I couldn't prevent myself from posting a link to Terry Tao on "Hardy's version" of the uncertainty principle, which (in a long and winding way) states that a function and its Fourier transform cannot both have compact support [now it sounds highly nontrivial].
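For reference, the sharp statement of Hardy's theorem (with the convention $\hat f(\xi) = \int f(x)\,e^{-2\pi i x \xi}\,dx$) is:

$$
|f(x)| \leq C e^{-\pi a x^2} \ \text{ and } \ |\hat f(\xi)| \leq C e^{-\pi b \xi^2} \ \text{ with } \ ab > 1 \quad\Longrightarrow\quad f \equiv 0.
$$

Since a bounded, compactly supported function satisfies such a Gaussian bound for every $a$, the compact-support statement is a special case.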

[For me, the fact that we like to write uncharged, spinless particle Hamiltonians as p^2 + V(x) is more of a mystery.]

By Kaveh Khodjasteh (not verified) on 23 Sep 2009 #permalink

Speculation as to the "meaning" of quantum mechanics can now cease ... because a truly fundamental metaphor for QM has now been conceived ... skiing! :)

Yes Dave, those skiing diagrams sketched on your office board are now PDF diagrams ... via GIMP, Inkscape, PSTricks, and TeX.

Mysteriously though, the room number of the "free-beer" seminar is too pixelated to read ... too bad!

Gosh golly ... now we can ski Snoqualmie and contemplate QM's spukhafte Fernwirkung simultaneously. :)

------

"http://faculty.washington.edu/sidles/QSEPACK/Kavli/bacon_ski_diagram_or…"

"http://faculty.washington.edu/sidles/QSEPACK/Kavli/bacon_ski_diagram_dr…"

I'm not a deep mathematician or physicist, but I love the philosophy behind the idea of a micro-probabilistic reality underneath us all. But since higher dimensions can't be tested for, only mathematically calculated, is it possible that matter and energy are somehow linked in a dimension beyond our ability to experiment? Just asking some ponderings and musings, rhetorical or answerable.

Could it be that 'entangled' electrons or photons are just the 'same' entity, split in our dimension, even though they can be sent around and observed like discrete individual quanta? That there is no communication or outside detection method and signal, but the same thing resulting in the same state at the same time, while we can only measure the protruding instances separately, so that it seems like a non-causal action on the part of one of the entangled objects? I thought the multi-universe theory was an attempt to start considering the reality of higher physical dimensions, based on mathematical abstractions, to help explain some of the quantum mechanical uncertainties.

I read a book called 'Chaos: Making a New Science' and also got into the Flatland novel's concept that beings of one dimension see phenomena from the next higher dimension as magic or ghosts; in physics, 'entangled' quanta. A Flatland 'Line-1D' individual would look like two of themselves, split, if pulled up into the third dimension by others of their kind or by a 2D being.

Quantum theory including the uncertainty issue may be put with atomic field force and energy isomerism by new physics functions of the GT integral atomic mapping function. This system of quantized functions includes the solution to the correlation function for mapping the set of virtual fundamental force quanta onto the picoyoctometrically defined manifold of heat capacity energy intermedons by plain figures, for an atom's volume.
The atom's RQT (relative quantum topological) data point imaging function is built by combination of the relativistic Einstein-Lorenz transform functions for time, mass, and energy with the workon quantized electromagnetic wave equations for frequency and wavelength. The atom labeled psi (Z) pulsates at the frequency {Nhu=e/h} by cycles of {e=m(c^2)} transformation of nuclear surface mass to forcons with joule values, followed by nuclear force absorption. This radiation process is limited only by spacetime boundaries of {Gravity-Time}, where gravity is the force binding space to psi, forming the GT integral atomic wavefunction. The expression is defined as the series expansion differential of nuclear output rates with quantum symmetry numbers assigned along the progression to give topology to the solutions.
Next, the correlation function for the manifold of internal heat capacity particle 3D functions condensed due to radial force dilution is extracted; by rearranging the total internal momentum function to the photon gain rule and integrating it for GT limits. This produces a series of 26 topological waveparticle functions of five classes; {+Positron, Workon, Thermon, -Electromagneton, Magnemedon}, each the 3D data image of a type of energy intermedon of the 5/2 kT J internal energy cloud, accounting for all of them.
Those values intersect the sizes of the fundamental physical constants: h, h-bar, delta, nuclear magneton, beta magneton, k (series). They quantize nuclear dynamics by acting as fulcrum particles. The result is the picoyoctometric, 3D, interactive video atomic model data imaging function, responsive to keyboard input of virtual photon gain events by relativistic, quantized shifts of electron, force, and energy field states and positions.
Now the uncertainty issue has vanished, and the atomic mapping function for all quanta is solved in plain numerical calculations.
Images of the h-bar magnetic energy waveparticle of ~175 picoyoctometers are available online at http://www.symmecon.com with the complete RQT atomic modeling guide titled The Crystalon Door, copyright TXu1-266-788. TCD conforms to the unopposed motion of disclosure in U.S. District (NM) Court of 04/02/2001 titled The Solution to the Equation of Schrodinger.

(C) 2009, Dale B. Ritter, B.A.

"...uncertainty arising in quantum theory as being of the same nature as the fact that there exists statements which cannot be proven true or false within a fixed set of powerful enough axioms. "

If true, why does uncertainty not arise in all other physical theories? What makes the quantum axioms (if there are any) different from those of other theories?

This is an interesting idea, but something to remember: math has to produce definite results itself (logical necessity!), and can't generate actual random outcomes. (Not to be confused with "probability math," which tells us the various proportions of outcomes.) For that reason and others I reject the MUH (mathematical universe hypothesis), it-from-bit, modal realism, etc.

Jonathan, the idea that math (math, not physics) could be different from place to place is titillating but hardly plausible. Even if geometry is different, that is really just "a different shape" for space to have. We don't really understand what math is (! - see the battles between constructivists and intuitionists), but most results hold of logical necessity. You rearrange a quadratic equation, you get what you get when you solve it - it won't be different in another galaxy.

We also still have the problem of what happens to the various elements of superpositions, why the photon "hits" one place and not another, etc. Also, the popular treatment of decoherence is a dodge (see http://plato.stanford.edu/entries/qm-decoherence/ and my own link), and without "observers" we have evolving Schrodinger waves with "no place to go."

By Neil B (not verified) on 24 Sep 2009 #permalink

Dave, I finalized that "quantum skiing" figure. It began as a light-hearted exercise to help put students at ease ... but then it actually turned out pretty well ... this helps me appreciate why you like skiing so much.

So heck, who needs philosophers to explain QM?

What students *really* need is a good ski instructor!

If anyone is interested, I continued this topic on Dick Lipton's thread "It's All Algorithms, Algorithms and Algorithms", per this comment:

"rjlipton.wordpress.com/2009/09/22/its-all-algorithms-algorithms-and-algorithms/#comment-1505"

For me, it seems preferable to regard these issues as revolving not around quantum spukhafte Fernwirkung, but rather around algorithm-centric versus category-centric descriptions of QM ... the advantage of the latter approach being that it suggests mathematical questions that are both reasonably interesting (IMHO) and reasonably well-posed.

Quantum Mechanics is a semantic theory. When written as a logical theory by embedding the formalism into Mathematical Logic, mathematical undecidability is revealed. Certain formulae of the quantum formalism are members of an excluded middle under the Field Axioms: they are neither logically valid nor logically invalid. I have written articles on this and am presently writing up the full story. My blog explains.

Does this new arXiv paper, more or less on-topic, work for you?

arXiv:0909.4623
Title: Quantum Simulation of Markov Chains
Authors: X.F.Liu
Comments: 5 pages
Subjects: Mathematical Physics (math-ph); Quantum Physics (quant-ph)

The possibility of simulating a stochastic process by the intrinsic randomness of quantum system is investigated. Two simulations of Markov Chains by the measurements of quantum systems are proposed.

"... In this paper, we study the possibility of simulating a stochastic process by the intrinsic randomness of quantum system. We believe that, theoretically speaking, a quantum system may simulate a stochastic process better than a classical system due to its intrinsic
randomness...."

Or maybe this one:

arXiv:0909.4606
Title: A Stepwise Planned Approach to the Solution of Hilbert's Sixth Problem. I : Noncommutative Symplectic Geometry and Hamiltonian Mechanics
Authors: Tulsi Dass
Comments: 51 pages
Subjects: Mathematical Physics (math-ph)

This is an open-ended project aimed at the solution of Hilbert's sixth problem (concerning joint axiomatization of physics and probability theory) proposed to be constructed in the framework of an `all-embracing' mechanics. In this first paper, the bare skeleton of such a mechanics is constructed in the form of noncommutative Hamiltonian mechanics (NHM) combining elements of noncommutative symplectic geometry and noncommutative probability in a super-algebraic setting; the treatment includes symplectic actions of Lie groups and noncommutative analogues of the momentum map, Poincare-Cartan form and the symplectic version of Noether's theorem. Canonically induced symplectic structure on the (skew) tensor product of two symplectic superalgebras (needed in the description of interaction between systems) exists if and only if either both system superalgebras are supercommutative or both non-supercommutative with a `quantum symplectic structure' characterized by a universal Planck type constant; the presence of such a universal constant is, therefore, dictated by the formalism.

Has anybody asked Len Adleman what the superposition of two contradictory axioms is? For example, in non-Euclidean geometry (am I being too simple-minded here?), what is the superposition of Euclid's fifth postulate and Lobachevsky's parallel postulate? And how does it differ from the orthogonal superposition, which has a - sign rather than a + sign?

By Peter Shor (not verified) on 29 Sep 2009 #permalink

"what [is] the superposition of two contradictory axioms?"

"Paraconsistent Logic
First published Tue Sep 24, 1996; substantive revision Fri Mar 20, 2009
http://plato.stanford.edu/entries/logic-paraconsistent/

The contemporary logical orthodoxy has it that, from contradictory premises, anything can be inferred. To be more precise, let ⊨ be a relation of logical consequence, defined either semantically or proof-theoretically. Call ⊨ explosive if it validates {A, ¬A} ⊨ B for every A and B (ex contradictione quodlibet (ECQ)). The contemporary orthodoxy, i.e., classical logic, is explosive, but so are some "non-classical" logics such as intuitionist logic and most other standard logics."

"The major motivation behind paraconsistent logic is to challenge this orthodoxy. A logical consequence relation, â¨, is said to be paraconsistent if it is not explosive. Thus, if ⨠is paraconsistent, then even if we are in certain circumstances where the available information is inconsistent, the inference relation does not explode into triviality. Thus, paraconsistent logic accommodates inconsistency in a sensible manner that treats inconsistent information as informative."

Even though this topic is winding to a halt, I will offer a few more remarks.

Adleman's essay talks about "What the physicist does" and "What the mathematician does," as though they were two different things. But in one crucial respect, these two frameworks are alike: they both describe real-world experiments in terms of projection and superposition ... operations that only make sense on state-spaces that are linear vector spaces.

It is perfectly feasible, however, to describe quantum mechanical processes in purely differential terms, as stochastic flow processes that are governed by symplectic and metric structures.

From an engineering point of view, such differential processes are the way that "real" quantum measurement processes work; the notions of projection and superposition are properly regarded as convenient, coarse-grained, low-dimension approximations to the "real" fine-grained differential processes of the physical world.

This differential point of view has a natural pullback onto lower-dimension algebraic state-spaces that, computationally, are far more tractable than high-dimension Hilbert spaces.

Especially for the simulation of large-dimension noisy systems---which is by far the most common case that engineers encounter---this formalism offers compelling practical advantages in terms of computational efficiency.
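To make the "stochastic flow" picture concrete, here is a minimal single-qubit sketch (my own toy illustration using the standard diffusive stochastic Schrodinger equation, not the symplectic/metric formalism described above): under continuous monitoring of Z, each trajectory flows smoothly toward a Z eigenstate, and the textbook projection postulate emerges only as the coarse-grained endpoint of that flow.

```python
import numpy as np

rng = np.random.default_rng(1)
I2 = np.eye(2, dtype=complex)
Z = np.array([[1.0, 0.0], [0.0, -1.0]], dtype=complex)

def trajectory(psi, k=1.0, dt=1e-3, steps=5000):
    """One diffusive trajectory for continuous measurement of Z:
    Euler-Maruyama steps of
    d|psi> = [-(k/2)(Z - <Z>)^2 dt + sqrt(k)(Z - <Z>) dW] |psi>."""
    for _ in range(steps):
        z = np.real(psi.conj() @ Z @ psi)   # current <Z>
        D = Z - z * I2                      # innovation operator Z - <Z>
        dW = rng.normal(0.0, np.sqrt(dt))   # Wiener increment
        psi = psi + (-0.5 * k * (D @ D @ psi) * dt + np.sqrt(k) * (D @ psi) * dW)
        psi /= np.linalg.norm(psi)          # tame integration error
    return np.real(psi.conj() @ Z @ psi)    # final <Z>, near +1 or -1

# Start in |+> (an X eigenstate, <Z> = 0): trajectories collapse to
# <Z> = +1 or -1 with roughly equal frequency, matching the Born rule.
plus = np.array([1.0, 1.0], dtype=complex) / np.sqrt(2)
finals = np.array([trajectory(plus.copy()) for _ in range(200)])
print(np.mean(finals > 0.0))   # ~0.5
```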

Could it be the case that Nature, too, takes advantage of this computational efficiency? Could it be that QM is *not* a vector-space theory? Certainly there are researchers---Ashtekar and Schilling, for example---who have argued for this point of view. And on the basis of experimental evidence, this possibility cannot presently be excluded.

Engineers are agnostic on this issue. It's not our job to discover new physics *or* new mathematics. Rather, it is our job to apply known physics and known mathematics to the efficient solution of practical problems. And for this purpose, it is increasingly evident that non-Hilbert quantum frameworks sometimes offer compelling practical advantages.

The solution to Hilbert's 6th Problem holds the keys to understanding quantum mechanics. It was proposed over one hundred years ago, and all the great "mathematical physicists" like Ed Witten have not tried to solve it. It is the one problem of his 23 that mathematical physicists are so afraid of that they say it is not a problem but a proposal, and that it has already been solved. It was not solved until now, and it was tested. If it had been solved by someone from the great universities, you would know by now. The fact is, it was solved by a mathematics and physics teacher. Hilbert said, "We must know. We will know." I agree.

By Mr. Jamahl A. Peavey (not verified) on 01 Oct 2009 #permalink

Jonathan asks: " ..., are there coincidences in Mathematics?"

How about sum(i=0:n) i^3 = (sum(i=0:n) i)^2 ?

On first noticing this, how many of you thought "that's interesting. Wonder if it means anything?"

It does not appear to be a simple case of anything, so I rate it a coincidence.
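(For what it's worth, the identity is sometimes called Nicomachus's theorem, and a quick script at least confirms it numerically:)

```python
# Check: the sum of cubes equals the square of the sum, for the first few n.
for n in range(50):
    assert sum(i**3 for i in range(n + 1)) == sum(range(n + 1)) ** 2
print("identity holds for n = 0..49")
```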

By Raoul Ohio (not verified) on 17 Oct 2009 #permalink