Real Clock Tutorial: Fountains

In the previous clock tutorial post, I described the basic workings of a cesium atomic clock, which looks sort of like this:

[Figure: schematic of a cesium atomic beam clock]

It works by sending a beam of cesium atoms through two microwave cavities. The first cavity synchronizes the "clock" in the atoms with the microwaves, and the second cavity checks whether the two are still in synch. If they are, the microwaves are at the right frequency; if they aren't, the frequency is corrected.

The key feature that determines the performance of this clock is the time between cavities. The longer the atoms spend in between the two, the better the resolution will be, and the better the clock. The problem is, thermal cesium atoms are moving at speeds of a few hundred meters per second, meaning that for any reasonable separation between the cavities, the atoms only spend milliseconds in flight between them. And even if you were willing and able to make an extremely long clock, the atomic beam would fall under the influence of gravity, which greatly complicates matters.
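As a quick back-of-the-envelope check of that "milliseconds" claim, here is a sketch of the transit-time arithmetic; the 250 m/s speed and 1 m cavity separation are illustrative assumptions, not figures from any particular clock:

```python
# Rough transit-time estimate for a thermal cesium beam clock.
# Both numbers below are illustrative assumptions, not specs of a real device.
beam_speed = 250.0       # m/s, in the "few hundred m/s" thermal range
cavity_separation = 1.0  # m, assumed distance between the two microwave cavities

transit_time = cavity_separation / beam_speed
print(f"Transit time: {transit_time * 1e3:.0f} ms")  # prints "Transit time: 4 ms"
```

Even stretching the separation to several meters only buys you tens of milliseconds, which is why the beam geometry runs out of room.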

The key to achieving really long interaction times is to make gravity work for you, rather than against you, by standing the clock on end.

This idea traces back to Jerrold Zacharias, who proposed a "fountain clock" in the 1950's. To make a fountain clock, you shoot the cesium atoms up into the air, and let them fall back down. They pass through a cavity on the way up, and back through the same cavity on the way back down, to get you two interactions with the microwave field (and, as a bonus, make everybody's life easier by only requiring the fabrication of one high-quality microwave cavity). And in between, they can spend a long time in flight-- if you launch a ball upward with a speed of only about 5 m/s, it will spend a full second in the air before returning to its original height. That's potentially a huge improvement.
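The one-second figure comes straight from freshman kinematics: an object launched upward at speed v returns to its starting height after t = 2v/g. A minimal check:

```python
# Time aloft for an atom launched straight up: t = 2*v/g from basic kinematics.
g = 9.8             # m/s^2, gravitational acceleration
launch_speed = 5.0  # m/s, the launch speed quoted in the text

time_in_flight = 2 * launch_speed / g
print(f"Time in flight: {time_in_flight:.2f} s")  # prints "Time in flight: 1.02 s"
```

That's roughly a thousand times longer than the beam-clock transit time, which is the whole appeal of the fountain geometry.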

Of course, there are a few problems with the idea that prevented Zacharias from actually building such a clock in the 1950's. Chief among them is the high temperature of the cesium atoms. High temperature means a high average thermal velocity for the atoms, and also a large spread in velocities. While the atoms will, in fact, spend quite a long time in flight, they'll also spray out all over the place, because they have large horizontal components of velocity. Most of the atoms falling back down will just flat-out miss the cavity.

If you want to make the fountain clock idea work, you need to have cold cesium atoms. But at low temperatures, cesium is usually a solid, and tossing a lump of metal up through the clock just doesn't work. So the fountain clock is pretty much a lost cause until the 1990's.

The key technological development that made fountain clocks possible was laser cooling. By hitting a gas of atoms with properly tuned lasers, you can lower the average velocity of the atoms to centimeters per second, if not better. That's exactly the sort of thermal velocity you need to make the fountain idea work-- the atoms are moving so slowly that they hardly spread out at all, and can fall back down through the microwave cavity with very small losses, but they're still a vapor, and can give you a nice clock signal.
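To put numbers on that, here is a sketch comparing the rms thermal speed of cesium gas at room temperature with a laser-cooled sample, using the ideal-gas relation v = sqrt(3 k_B T / m). The 2 microkelvin figure is an assumed temperature in the range typical of laser cooling, not a value from the post:

```python
import math

# rms speed of an ideal-gas atom at temperature T: v = sqrt(3 * k_B * T / m)
k_B = 1.380649e-23            # J/K, Boltzmann constant
m_Cs = 132.905 * 1.66054e-27  # kg, mass of a cesium-133 atom

def rms_speed(temperature_K):
    return math.sqrt(3 * k_B * temperature_K / m_Cs)

print(f"Room temperature (300 K): {rms_speed(300):.0f} m/s")            # a couple hundred m/s
print(f"Laser-cooled (2 microkelvin, assumed): {rms_speed(2e-6):.3f} m/s")  # ~2 cm/s
```

A four-orders-of-magnitude drop in speed means the horizontal spread over a one-second flight shrinks from hundreds of meters to a couple of centimeters, so the atoms actually come back through the cavity.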

NIST has a nice schematic showing how their fountain clock works. Using the fountain technique gets you a good long time between interactions with the microwave field, which is what lets them achieve the really extraordinary precision they manage-- the frequency uncertainty quoted on that page is on the order of 10^-16, which amounts to a loss of one second in sixty million years. That's getting down to the level where the effects of general relativity start to become large enough to notice-- clocks at different elevations will run at very slightly different rates, due to the difference in gravitational potential.
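For concreteness, here's how those two claims connect numerically. A fractional uncertainty of 5x10^-16 (an assumed value in the "order of 10^-16" range, chosen because it reproduces the sixty-million-year figure) converts directly into a drift time, and the gravitational shift between clocks at different heights follows from g*h/c^2:

```python
# How long it takes to accumulate one second of error at a given fractional
# frequency uncertainty. The 5e-16 figure is an assumption consistent with
# "order of 10^-16", not an exact spec from NIST.
fractional_uncertainty = 5e-16
seconds_per_year = 365.25 * 24 * 3600

years_to_drift_one_second = 1.0 / fractional_uncertainty / seconds_per_year
print(f"One second of drift takes ~{years_to_drift_one_second / 1e6:.0f} million years")

# Fractional frequency shift between clocks separated by height h is g*h/c^2.
g = 9.8        # m/s^2
c = 2.998e8    # m/s, speed of light
shift_per_meter = g * 1.0 / c**2
print(f"Gravitational shift per meter of elevation: {shift_per_meter:.1e}")
```

The second number comes out around 10^-16 per meter, which is exactly why a clock at this precision starts caring about which floor of the building it sits on.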

Why does anybody need a clock that good? That's a subject for yet another post...


Now I'm just a stupid Chemistry student so forgive me if this is a stupid question, but if you laser cool the atoms to move slower, wouldn't that also affect the time it takes for them to traverse the distance? Or is it possible to know exactly how much you're slowing them by so it doesn't matter?

If you know how much you're cooling them, you ipso facto know how much you're slowing them.

Even dumber question from a neurosciences type: Can't you just filter the atoms directly? I mean, you have a vapor - a lot of atoms bouncing around with different velocities and directions. Let them go up through the cavity. You lose a lot of them with a sideways trajectory, and you'll lose even more of them with a too-high velocity (they'll hit the ceiling, basically). The ones that come back down will behave nicely, though, so restrict yourself to using those?

You can do that, but you throw away most of your signal. You wind up with so few atoms that the signal from the atoms is almost impossible to detect, and you end up getting big effects from statistical fluctuations and electronic noise.

There's always a trade-off between accuracy and signal in these things, and it's really bad at room temperature. Cooling the sample gets you around that, to some degree.