When I wrote up the giant interferometer experiment at Stanford, I noted that they've managed to create a situation where the wavefunction of the atoms passing through their interferometer contains two peaks separated by almost a centimeter and a half. This isn't two clouds of atoms each definitely in a particular position, mind; this is a wavefunction representing a bunch of atoms that are each partly in two places at the same time, separated by 1.4 centimeters.
I emailed Mark a link to the post, and in his reply he said that they've increased that to about 4cm (which is just a matter of improving the "beamsplitter" pulses they use to separate the atoms). I joked that they need to insert a fast shutter near the top, so they can physically separate the atoms for an instant-- it wouldn't have any particular purpose, physics-wise, but it would be kind of cool. He also added the comment that it's kind of amazing that quantum physics allows that kind of macroscopic separation.
Which got me wondering-- is there any kind of fundamental limit on the distance scale of a quantum superposition? I know there are people-- most famously Roger Penrose-- who argue for versions of quantum gravity that place a limit on the mass of particles that can be in a superposition state. The idea is that above some mass scale, the interaction with gravity prevents superposition states by rapidly causing decoherence of the wavefunction (or collapsing the superposition into a definite state, if you prefer more Copenhagen-ish phrasing). But I don't think I've heard of any suggestion regarding a distance scale-- that is, a physical separation of the component states of the superposition above which things ought to break down.
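For what it's worth, my rough understanding of the Penrose proposal is that the collapse time goes like hbar divided by the gravitational self-energy of the difference between the two superposed mass distributions, and for separations much bigger than the object itself that self-energy stops depending on the separation at all. Here's a crude back-of-the-envelope version in Python-- the G*m^2/R form and the atomic-radius choice for R are my own cartoon, not anything taken from the actual papers:

```python
from scipy.constants import hbar, G

def penrose_collapse_time(mass_kg, size_m):
    """Rough Diosi-Penrose estimate: tau ~ hbar / E_G, where E_G is the
    gravitational self-energy of the difference between the two superposed
    mass distributions. For separations much bigger than the object itself,
    E_G saturates at roughly G*m^2/R, so the separation drops out entirely."""
    E_G = G * mass_kg**2 / size_m
    return hbar / E_G

# A single Rb atom (mass ~1.4e-25 kg), taking R to be an atomic radius
# (~2.5e-10 m, a guess at the relevant size scale):
print(penrose_collapse_time(87 * 1.66e-27, 2.5e-10))   # ~1e16 s
```

The point being that, in this kind of scheme, cranking up the separation doesn't buy you anything once the two peaks are well apart-- it's the mass that matters.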
Now, my not having heard of one is obviously mostly because I don't follow this question all that closely, so this post is, in part, a not-all-that-subtle request for a pointer to anything relevant. It could also be, though, that there's actually no reason to expect such a limit-- quantum mechanics is, after all, famously non-local, and people have done Bell inequality tests with particles separated by meters. Those were either photons (which are massless) or entangled atoms created indirectly through interactions with entangled photons, so maybe it's not the same game. On the other hand, it does sort of seem like scale ought to be relevant in some way.
I did idly wonder whether you could do something via dimensional analysis-- if there's a mass limit, that's also an energy limit, and energy is related to frequency, which gets you time, which multiplied by the speed of light is a distance. But my vague recollection is that Penrose's mass limit is around the Planck mass, in which case dimensional puttering around leads you to the Planck length, which is manifestly not any kind of limit for this. So that goes nowhere.
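Just to make that chain of conversions concrete, here's the dimensional bookkeeping in Python (scipy.constants for the values; the choice of hbar versus h is a factor-of-2-pi quibble that doesn't change the conclusion):

```python
from scipy.constants import hbar, c, G

m_planck = (hbar * c / G) ** 0.5    # Planck mass, ~2.2e-8 kg
energy   = m_planck * c**2          # mass limit expressed as an energy
time     = hbar / energy            # energy -> frequency -> time
length   = c * time                 # multiply by c to get a distance

print(length)   # ~1.6e-35 m, i.e. the Planck length
```

Which, as advertised, just hands you back the Planck length, some thirty-three orders of magnitude short of being relevant to centimeter-scale superpositions.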
Anyway, I thought it was sort of interesting to think about, though I don't know enough to do much. Probably I need to go back and look at this old summary paper about what's believed to be possible, and see who's citing it these days in hopes of an update. And as long as I'm idly poking at that, I might as well throw it out here to see if anybody has any useful thoughts...
I don't think there's any fundamental limit. Practically there's decoherence, of course. Forget about atoms for a moment: don't you create superpositions every time you do interference with a beam splitter? Bell inequality tests have meanwhile been done over distances of a hundred km, for all I know. Grav wave interferometers would also bring you into the ballpark of 10-100 km. Then there's radio astronomy. Don't have any references, but those are the first things that came to my mind.
I wondered the same thing on the web a few years ago. (Call me prescience-t. =D) Never got a response.
So again: Maybe cosmology sets a practical limit. I have to assume we need decoherence to have an observable time within a light cone. And as it happens we do have a CMB cosmological clock for the universe age.
I note that this should be a practical limit. As we approach the particle horizon the redshift would attenuate any remaining coherence discrepancies out of observability.
Oops. For "practical" in my previous comment, substitute "fundamental".
I agree that there are kilometer-scale superpositions (at least) with photons, but then, photons are massless. I don't think even Penrose-type theories limit the superpositions possible with photons. My very vague impression is that those rely on some coupling to the mass of the particle to induce decoherence/"collapse."
The question, then, is what's the exact mechanism of that decoherence? If it's a sudden, sharp threshold then it might not imply any kind of meaningful distance scale. On the other hand, if it's simply an exponential decay kind of process, with the decay rate increasing with mass, then that implies a characteristic time, which can be converted into a distance if you know the velocity (or an absolute distance limit, if you just use the speed of light...).
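To put that second possibility in symbols: an exponential decoherence rate plus a separation speed gives you a characteristic distance (speed divided by rate) before the superposition is effectively gone. A toy sketch-- the numbers plugged in at the end are pure inventions for illustration, not anything from a real model:

```python
from scipy.constants import c

def coherence_length(decay_rate_hz, separation_speed_m_s=c):
    """Characteristic separation reachable before an exponential
    decoherence process at the given rate washes out the superposition.
    Using c as the speed turns this into an absolute distance limit."""
    return separation_speed_m_s / decay_rate_hz

# A made-up 1 kHz decoherence rate with atoms separating at 1 cm/s
# would cap the superposition at about 10 microns:
print(coherence_length(1e3, separation_speed_m_s=0.01))
```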
I guess I need to (try to) read those papers. In my copious free time.