white noise

New calculations suggest that whether the universe is holographic is a testable question, and recent data are consistent with the model, and hence with the universe actually being holographic.

h/t Jake at Pure Pedantry

Prof Craig Hogan, the new director of the Center for Particle Astrophysics at Fermilab, has written a series of very interesting papers suggesting that space-time quantization at the Planck scale ought to show up as white noise in the transverse displacement measured by laser interferometers, with a spectral density of just √(t_p/2), where t_p is the Planck time, independent of frequency, given some conditions.

What makes this interesting is that the predicted noise is within the range of current interferometers, specifically GEO600, and is in fact consistent with current measurements, which have shown a persistent and unexplained noise floor.

This would imply the universe is actually holographic.

Holography, in this context, is a slightly non-trivial property, but for our purposes an approximate statement will do.
The essential property of a holographic universe is that its full internal quantum state, everything about all things within (some volume of) the universe, can be uniquely read off any arbitrary boundary surface of that volume.
This implies the universe is very simple, much simpler than it could be, in some sense.

The driving argument for holography is the need to accommodate gravity and to evolve quantum mechanical information consistently with gravity. As a side effect, and as a prediction of this property, we find that black holes encode information on the surface of their event horizon, and that this is a maximum entropy state.

Now, consider the universe and tile it with Planck-size elements of size l_p ~ 10^-33 cm. The current observable radius of the universe is about 4 Gpc, or ~10^28 cm, so the universe contains about 10^183 Planck volumes, yet only ~10^123 Planck-size surface elements suffice to encode its internal state completely.
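
For a sanity check on those numbers, here is the counting done explicitly (a rough sketch in Python; the radius and Planck length are the approximate values quoted above, and order-unity factors, such as the 1/4 in the Bekenstein-Hawking entropy, are ignored):

```python
import math

l_p = 1.6e-33   # Planck length in cm (approximate)
R   = 1.2e28    # observable radius, ~4 Gpc, in cm (approximate)

volume  = (4.0 / 3.0) * math.pi * R**3    # volume of the observable universe, cm^3
surface = 4.0 * math.pi * R**2            # area of its boundary, cm^2

n_planck_volumes = volume / l_p**3        # number of Planck-size cells inside
n_surface_cells  = surface / l_p**2       # number of Planck-size cells on the boundary

print("Planck volumes inside the universe: ~10^%.0f" % math.log10(n_planck_volumes))
print("Planck areas on the boundary:       ~10^%.0f" % math.log10(n_surface_cells))
```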

Amelino-Camelia argued some years ago that the associated graininess ought to be measurable; there are several possible implications, including violations of Lorentz invariance and dispersion for light in vacuum (the speed of light would be frequency dependent). However, while the scaling arguments were plausible, the numerical coefficient describing the size of the effect is unknown and could be small; the leading order terms might be suppressed by some unsuspected symmetry, or even completely suppressed to all orders. That is, the effect might be zero, or much smaller than the largest effect allowed by the scaling arguments.
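
To give a feel for the size of effect such searches target (a sketch only: the linear-in-energy form and an order-unity coefficient are assumptions, and as just noted the coefficient could be much smaller or zero):

```python
# Rough arrival-time delay for a high-energy photon if the vacuum were dispersive
# at first order in (E / E_Planck). The coefficient xi is the unknown factor
# discussed above; it could be of order unity, much smaller, or exactly zero.
E_photon = 1.0e9      # photon energy in eV (a ~GeV gamma-ray burst photon)
E_planck = 1.2e28     # Planck energy in eV (approximate)
T_travel = 1.0e17     # light travel time over ~1 Gpc, in seconds (approximate)
xi       = 1.0        # unknown dimensionless coefficient

delta_t = xi * (E_photon / E_planck) * T_travel
print("arrival-time delay ~ %.0e s" % delta_t)   # ~10^-2 s for these numbers
```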

What Hogan did is two things: first, he considered the projection of some small classical object, like a black hole, onto a distant surface. Since area scales more slowly than volume, the information encoded on the compact object "fuzzes out" when it is projected onto a distant surface, and by holography this fuzziness is irreducible (you could squeeze it, I suppose, but you lose as much in one direction as you gain in the other). The fuzz is surprisingly large: the error is σ ~ √(R·l_p), where R is the macroscopic projection radius.
Note that this implies a Planck element projected onto the current Hubble radius is fuzzed out to a scale of about 30 microns (hmm, that is comparable to compactification scales in theories of Large Extra Dimensions - interesting...).
So quantum-gravity-scale physics projects to classical scales on the surface of the currently observable universe - we are trivially computable, given a large enough computing surface.
The second thing Hogan did was to require that the holographic fuzz be precisely consistent with the holographic bound on black hole entropy. This sets the dimensionless numerical factor, and it turns out to be about unity: there are no small numbers suppressing the effect, which is mildly surprising.
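
Putting a number on that fuzz, with the same approximate radius as before (a sketch; order-unity factors are dropped, which is why this only matches the ~30 micron figure above to within such factors):

```python
import math

l_p = 1.6e-33   # Planck length, cm (approximate)
R   = 1.2e28    # current Hubble radius, cm (approximate, ~4 Gpc)

sigma = math.sqrt(R * l_p)   # transverse fuzz of a Planck element projected out to radius R
print("holographic fuzz ~ %.0f microns" % (sigma * 1.0e4))   # convert cm to microns
```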

So, how large is the effect? Well, it is just about at the sensitivity of GEO600. The characteristic strain is h ~ √(t_p/2) ~ 10^-22, which is just where GEO600 is now.
To recover the predicted white noise - Gaussian fluctuations independent of frequency - we need a detector regime where the photon residency time is long, so photons make many up-and-down trips, but which is not limited by photon discreteness - that is, no Poisson noise overwhelming the predicted white noise.
This ought to be true near the peak of GEO600's sensitivity, between ~500 and 1200 Hz.
GEO600 ought to be seeing a dip in the noise there, but they are not - the noise curve is too flat. That is consistent with white noise as predicted.
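
And the corresponding strain level, for comparison with GEO600's published noise curves (a sketch: the √(t_p/2) form is as quoted above, interpreted as a flat amplitude spectral density; the actual GEO600 sensitivity should be read off their plots, linked below):

```python
import math

t_p = 5.4e-44   # Planck time in seconds (approximate)

h_holo = math.sqrt(t_p / 2.0)   # predicted flat (white) holographic noise level
print("predicted holographic noise: h ~ %.1e per sqrt(Hz)" % h_holo)
# ~1.6e-22 per sqrt(Hz), right around GEO600's best sensitivity near 500-1200 Hz
```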

This is kinda funky.
There are holes in Hogan's argument, which I am sure several people will now drive trucks through, but the size of the effect is interesting and just at the edge of current capabilities.
The effect could be tested and cross-correlated with a high-power, short-arm-length interferometer, so it is falsifiable.

The implications, if true, are very profound. Having experimental data showing that our universe actually is holographic, as conjectured, would be extraordinary at the philosophical level, but the margins of this blog are too narrow to fill in the details.



GEO600 sensitivity

cf theoretical sensitivity

Hogan Article I

Hogan Article II

Hogan Article III

Holographic Principle (wiki)

Amelino-Camelia Nature paper (1998)

New Scientist discussion


Doesn't this imply, Max-Tegmark-like, that if the universe is very large (many orders of magnitude larger than we can see), as predicted in some inflation models, and uniform, then there is a high probability that there are exact copies of us out there? By the Holographic Principle there are a finite number of states for a given volume of spacetime. Thus, if there are enough volumes, then by the pigeonhole principle there will be duplicates. Of course, most of the volumes could be "degenerate" in some way, or perhaps there just aren't that many of them.

it is possible, but not necessary, for a holographic universe to be a multiverse, if it is large enough

if the volume is large enough, and the number of states is finite, then a generalised recurrence theorem suggests strongly that states are duplicated - including slightly variant states - unless there is some deep symmetry or initial condition constraint that prohibits such duplication, though that seems unlikely a priori, as it'd require most of a large universe to be trivial in some sense, with states of a certain complexity being rarer than they ought to be statistically
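
for scale, here is the crude pigeonhole counting (a sketch only: it uses the ~10^123 boundary-cell number from the post as a stand-in for the maximum entropy of a Hubble volume, which glosses over plenty of subtleties)

```python
import math

S_max = 1.0e123                       # rough maximum entropy of a Hubble volume, in nats
log10_states = S_max / math.log(10)   # log10 of the number of distinct states, ~4e122

# pigeonhole: any collection of more than ~10^(4e122) Hubble volumes must contain
# at least two in exactly the same state (assuming the state count really is finite)
print("distinct states per Hubble volume: at most ~10^(%.0e)" % log10_states)
```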

Question from an extremely ignorant (but curious) reader:

As something that is unique by definition has no duplicate, why doesn't the word "unique" in this sentence

The essential property of a holographic universe is that its full internal quantum state, everything about all things within (some volume of) the universe, can be uniquely read off any arbitrary boundary surface of that volume.

contradict the word "duplicated" in this sentence

if the volume is large enough, and the number of states is finite, then a generalised recurrence theorem suggests strongly that states are duplicated

If the question is so ignorant as to be idiotic, feel free to say so, and I apologize in advance.

not at all ignorant

the idea is that if you take some particular volume, then if the holographic principle holds, all the information in the volume can be exactly read off the surface surrounding the volume, and only exactly that information

as a separate question, you can ask whether in a very very large universe any such volume is unique in its existence, or whether necessarily a duplicate exists. You can also ask the subtler question of whether an exact duplicate vs "close enough" copy must exist.

so we can ask whether any particular piece of the universe is uniquely recordable, and separately whether the piece is unique in the way it is. The two questions are related, but there are subtleties.

Astronomers seem to be making a habit of picking up "unexplained" noise which happens to be due to fundamental properties of the Universe. First the CMB, now this. ;)

This is a fascinating discovery, with some pretty radical implications. I'll await the inexorable ensuing debate about it with interest...

Well, we sometimes pick up unexplained noise that winds up just being noise. Sometimes your detector is not doing what you think it is. (Not that I have any particular reason to expect that's the case here.)