Somebody asked a question at the Physics Stack Exchange site about the speed of light and the definition of the meter that touches on an issue I think is interesting enough to expand on a little here.

The questioner notes that the speed of light is defined to be 299,792,458 m/s and the meter is defined to be the distance traveled by light in 1/299,792,458 seconds, and asks if that doesn’t seem a little circular. There are actually three relevant quantities here, though, the third being the second, which is defined as the duration of 9,192,631,770 oscillations of the radiation associated with the transition between the two hyperfine levels of the ground state of cesium-133. As long as you have one of these quantities nailed down to a physical reference, you are free to define the other two in terms of it.
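To see that this isn’t actually circular, here’s a quick numerical sketch (Python, variable names my own): once the second is pinned to cesium and the speed of light is fixed by fiat, other quantities, like the wavelength of the cesium radiation itself, become derived values.

```python
C = 299_792_458            # m/s, exact by definition since 1983
CS_CYCLES = 9_192_631_770  # cesium hyperfine cycles that define one second

# With both numbers fixed by definition, the cesium transition frequency is
# exactly 9,192,631,770 Hz, and the wavelength of that microwave radiation
# is a derived quantity rather than something measured independently:
wavelength_m = C / CS_CYCLES
print(f"{wavelength_m * 100:.2f} cm")  # prints "3.26 cm"
```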

This strikes a lot of people as odd. The image most people have of physical standards would have both the meter and the second tied to some sort of physical reference, with the speed of light determined in terms of those two standards. And, in fact, that’s how things used to be– if you look at the history of the definition of the meter, you see that it was tied to a physical standard until 1983. So why the change?

The reason for the change is basically that we can do a much better job of measuring time than length, thanks to spectroscopic techniques developed in the 1940’s. This wasn’t always the case– the second was originally defined in terms of astronomical observations, with the value set at 1/31,556,925.9747 of the duration of the (tropical) year 1900. This ended up being a little problematic, though, because the motion of astronomical objects changes measurably over time.
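For what it’s worth, that astronomical number unpacks to the familiar length of the year, as a quick sanity check shows:

```python
SECONDS_PER_YEAR_1900 = 31_556_925.9747  # seconds in the year 1900, per the old definition
days = SECONDS_PER_YEAR_1900 / 86_400    # 86,400 seconds per day
print(f"{days:.4f} days")                # prints "365.2422 days"
```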

With the ultra-precise spectroscopic techniques developed by Rabi and Ramsey in the 1940’s and 50’s, it became possible to make clocks using atoms as the time reference. This offers a big advantage over astronomical motion, in that the quantum mechanical rules that determine the energy levels of an atom do not (as far as we can tell) change in time. Thus, every cesium atom in the universe has exactly the same energy level structure as every other cesium atom in the universe, making them ideal reference sources.

By the 1980’s when the definition of the meter was changed, atomic clocks were accurate to about one part in 10^{13}. That’s vastly better than the then-current physical standard for the meter, namely “1,650,763.73 vacuum wavelengths of light resulting from unperturbed atomic energy level transition 2p10 5d5 of the krypton isotope having an atomic weight of 86.” At that point, our ability to measure time exceeded our ability to measure length by enough that it was worth changing standards to take advantage of the precision of atomic clocks. Since relativity tells us that all observers will measure exactly the same speed of light, that gives us a very nice way of connecting time and distance measurements. Thus, the speed of light was defined to have a fixed value, and the definition of length was tied to the definition of time.
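That krypton standard also boils down to a single number: inverting the wavelength count gives the wavelength of the orange-red krypton line (my own quick check):

```python
KR_WAVELENGTHS_PER_METER = 1_650_763.73  # from the pre-1983 meter definition

# One meter was defined as that many vacuum wavelengths, so one wavelength is:
wavelength_nm = 1e9 / KR_WAVELENGTHS_PER_METER
print(f"{wavelength_nm:.2f} nm")  # prints "605.78 nm"
```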

Of course, atomic clocks have only gotten better since the 1980’s. The current best atomic clocks, based on laser-cooled atoms, are good to several parts in 10^{16}, or, in the way such things are usually quoted, about one second in sixty million years. We can’t come close to measuring lengths with that sort of precision, except by converting them to time measurements.
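The “one second in sixty million years” conversion is easy to verify; here’s a rough check, using a 365.25-day year:

```python
SECONDS_PER_YEAR = 365.25 * 86_400            # about 3.156e7 seconds
total_seconds = 60_000_000 * SECONDS_PER_YEAR  # sixty million years, in seconds

# Being off by one second over that span is a fractional error of:
fractional_error = 1 / total_seconds
print(f"{fractional_error:.1e}")  # prints "5.3e-16", i.e. several parts in 10^16
```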

The other obvious question to come from this subject is “Why are all these things defined with such screwy values? If the speed of light is going to be a defined quantity, why not make it exactly 3×10^{8} m/s, and spare us having to type ‘299,792,458’ all the time?” The answer is, basically, inertia. The physical standards for length and time had been in use for so long, and were built into so many precision measuring instruments, that the expense of changing the base units would’ve been ridiculously huge. So, the new definitions were chosen to match the previously existing values as closely as possible.

In an ideal world, it would’ve been better to define the speed of light as exactly 300,000,000 m/s, and the second as 10,000,000,000 oscillations of the light associated with the cesium ground state splitting, and end up with a meter that is about 9% longer than the one we currently use. In addition to making life simpler for physics students all over the world, it would be an elegant solution to the problem of the US using a different system of units than the rest of the world. If *everybody* had to change all their units over to a new system, it wouldn’t be such an imposition on us… Sadly, ideal worlds only exist in physics problems and economics textbooks, so we’re stuck with the current screwy definitions for the foreseeable future.
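The “about 9% longer” figure follows from the two rescalings; here’s the arithmetic spelled out (my own check, using hypothetical round numbers, not part of any official proposal):

```python
C_OLD = 299_792_458         # m/s, the actual defined value
CS_OLD = 9_192_631_770      # cesium cycles per actual second
C_NEW = 300_000_000         # hypothetical round speed of light
CS_NEW = 10_000_000_000     # hypothetical round cycle count

# A second of 10^10 cesium cycles would be longer than the current second,
# and the meter would stretch by the combined effect of both changes:
new_second = CS_NEW / CS_OLD            # new second, measured in old seconds
new_meter = C_OLD * new_second / C_NEW  # new meter, measured in old meters
print(f"{(new_meter - 1) * 100:.1f}% longer")  # prints "8.7% longer"
```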