As I said yesterday, I’m going to blow through another entire subfield of physics in a single equation, as our march toward Newton’s Birthday continues. Today, it’s statistical mechanics, a very rich field of study that we’re boiling down to a single equation:

*S* = *k_B* ln *N*

This is Boltzmann’s formula for the entropy of a macroscopic system of particles, which says that the entropy *S* is proportional to the logarithm of the number *N* of microscopic states consistent with that macroscopic state. The constant *k_B* (Boltzmann’s constant) is there to get the units right.
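To put numbers to that logarithm, here’s a minimal Python sketch (the function name and the example microstate counts are my own, purely for illustration):

```python
import math

# Boltzmann's constant in joules per kelvin (exact SI value since 2019).
K_B = 1.380649e-23

def boltzmann_entropy(num_microstates):
    """S = k_B ln(N) for a macrostate with N consistent microstates."""
    return K_B * math.log(num_microstates)

# A single two-state particle (N = 2) has a tiny entropy:
print(boltzmann_entropy(2))  # ~9.6e-24 J/K

# A macroscopic system might have N = 10**(10**23) microstates, far too
# large for a float, so use the identity ln(10**x) = x * ln(10):
print(K_B * 1e23 * math.log(10))  # ~3.2 J/K, an everyday-sized entropy
```

The logarithm is what tames those absurd microstate counts: multiplying the number of microstates only adds to the entropy, which is also what makes entropy additive for independent systems.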

Why does this get to stand in for the whole field of statistical mechanics?

When thinking about this, I was tempted to go with the Boltzmann distribution instead, which my angry quantum prof in grad school declared the only useful thing to come out of stat mech, saying there was no reason to spend a whole semester on the subject. Boltzmann’s entropy formula gets more to the heart of the matter, though.

The field of statistical physics is basically dedicated to explaining how the world we see around us arises from the interactions of uncountable numbers of microscopic particles (atoms, molecules, electrons, whatever). The key realization that makes it possible to extract predictions without needing to know the state of all 10^{27} atoms making up some object is that such huge systems can be described statistically: there are many possible equivalent states for all the particles making up a system, and which precise one you find yourself in is just a matter of probability.

This equation, then, really captures the central notion of the entire field. The macroscopic property known as entropy is just a measure of how many microscopic arrangements of particles and states could give you the same collection of properties for the macroscopic system. If there are lots of ways to get basically the same situation, the entropy is high; if there are only a few ways of getting a particular set of properties, the entropy is low.
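A standard toy model makes the counting concrete: treat a row of coins as microscopic two-state particles and the number of heads as the macroscopic property. A quick sketch (the specific numbers here are just my illustration):

```python
import math

N = 100  # coins, each a microscopic two-state "particle"

# Multiplicity: how many of the 2**N microstates give exactly n heads?
for n in (0, 25, 50):
    omega = math.comb(N, n)
    # Entropy in units of k_B is just ln(multiplicity).
    print(f"{n:2d} heads: {omega:.3g} microstates, S/k_B = {math.log(omega):.1f}")

# Output:
#  0 heads: 1 microstates, S/k_B = 0.0
# 25 heads: 2.43e+23 microstates, S/k_B = 53.8
# 50 heads: 1.01e+29 microstates, S/k_B = 66.8
```

The all-tails macrostate can happen exactly one way, while the 50/50 split can happen in about 10^{29} ways, so the even split carries far more entropy even though every individual microstate is equally likely.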

This provides a very simple foundation for thermodynamics: the observed fact that entropy always increases (the infamous Second Law of Thermodynamics) is really just a consequence of probability. A system that starts in a low-entropy state has lots of ways to move to a state of higher entropy, but only a few ways to move to a state of the same or lower entropy. Thus, you are far more likely to see a system move from low to high entropy than vice versa, and when you’re talking about macroscopic objects involving 10^{20}-odd atoms, the probability of seeing entropy spontaneously *decrease* quickly moves into monkeys-writing-Shakespeare territory.
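If you want to watch that probabilistic arrow emerge, here’s a minimal simulation sketch of the classic two-box (Ehrenfest urn) toy model; the parameters and setup are my own assumptions, not anything from the original argument:

```python
import math
import random

N = 1000        # particles shared between two boxes
n_left = N      # low-entropy start: every particle in the left box
random.seed(0)  # reproducible run

for step in range(5001):
    if step % 1000 == 0:
        # Entropy of the current macrostate, in units of k_B.
        s = math.log(math.comb(N, n_left))
        print(f"step {step:4d}: n_left = {n_left:4d}, S/k_B = {s:6.1f}")
    # Each step, one particle chosen uniformly at random hops to the
    # other box; it comes from the left box with probability n_left / N.
    if random.random() < n_left / N:
        n_left -= 1
    else:
        n_left += 1
```

Run it and the entropy climbs quickly toward its maximum near the 50/50 split (S/k_B ≈ 689 for N = 1000), then just fluctuates there; a spontaneous return to the all-left state never shows up, for exactly the probabilistic reason above.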

This doesn’t have as many practical consequences as some of the other equations we’ve dealt with, but it’s a tremendously powerful idea in a philosophical sort of way. It provides a framework for talking about the origin and history of the entire universe, including its eventual fate, and you don’t get any deeper than that.

So, take a moment to admire the simplicity of the idea at the core of statistical mechanics, and come back tomorrow to see the next equation of the season.