Volume and Entropy

Thermodynamics and statistical mechanics hold a bit of an odd place in the heart and mind of a physics student. On one hand, it's one of the few subjects with truly universal applicability: whether you work in galaxy clusters, nuclear theory, experimental solid state, or anything else, its concepts are going to assert themselves. Energy and entropy are everywhere. On the other hand, it tends to be a tremendous pain in the neck to learn.

I'm in a stat mech class this semester, as you might guess. We do about two days' worth of thermodynamics to keep up appearances, but by the time you get to the graduate physics level, thermo is essentially an engineering problem. Statistical mechanics is the heart of the issue.

Now as today is day 1 and we haven't yet done all that much, I'm not going to go into any detail here either. I just want to give you a brief taste of the weirdness. Trust me, I'm going to take advantage of this class to write about plenty of weirdness on this site as the semester goes into full swing.

Take pressure, for instance. We know it's force per unit area. Slightly more esoterically, we're probably also aware that it's minus the derivative of energy with respect to volume, with entropy held constant. That has more thermodynamic significance, but then perhaps more relevant is this weird creature:

P = T (∂S/∂V)_{U,N}

That's more than a little bizarre, in my opinion. Pressure is equal to the temperature times the derivative of entropy with respect to volume, with number of particles and total energy held constant. I wouldn't find that intuitively obvious at all. Of course it works out dimensionally: the entropy S has units of energy per temperature, so you end up with overall units of energy per volume, which matches the more common-sense definition above.

Things get weirder. Without too much trouble you can extract a similar-looking equation for volume. Volume is an even more concrete thing to measure than pressure, and so to see it written in terms of thermodynamic variables like entropy is a little disconcerting. True nonetheless, however.
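To see the formula do something concrete, here's a quick numerical sanity check (a Python sketch of my own, not anything from the course): take the Sackur-Tetrode entropy of a monatomic ideal gas, difference it with respect to volume at fixed energy and particle number, multiply by T, and compare against the plain ideal gas law.

```python
# Numerical check of P = T * (dS/dV) at constant U, N for a
# monatomic ideal gas, using the Sackur-Tetrode entropy.
# (A sketch; SI units, with helium at room conditions as a toy case.)
import math

k = 1.380649e-23      # Boltzmann constant, J/K
h = 6.62607015e-34    # Planck constant, J s
m = 6.6464731e-27     # mass of a helium atom, kg

def entropy(U, V, N):
    """Sackur-Tetrode entropy S(U, V, N) of a monatomic ideal gas."""
    return N * k * (math.log((V / N) * (4 * math.pi * m * U / (3 * N * h**2))**1.5) + 2.5)

N = 6.022e23          # one mole of atoms
T = 300.0             # kelvin
U = 1.5 * N * k * T   # U = (3/2) N k T for a monatomic ideal gas
V = 0.0248            # cubic meters, roughly the molar volume at 300 K and 1 atm

# Finite-difference derivative of S with respect to V, holding U and N fixed
dV = V * 1e-6
dSdV = (entropy(U, V + dV, N) - entropy(U, V - dV, N)) / (2 * dV)

P_from_entropy = T * dSdV        # the "weird" formula
P_ideal_gas = N * k * T / V      # plain old ideal-gas law

print(P_from_entropy, P_ideal_gas)  # both come out near 1.0e5 Pa, about one atmosphere
```

The energy dependence of S drops out of the derivative entirely; only the N k ln V piece survives, which is why the answer lands exactly on NkT/V.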

And people say quantum mechanics is weird.


i always got annoyed by all the partial derivatives. and don't even start with the Maxwell relations equating the partial derivatives.

the math is pretty straightforward--it is the physical intuition that is hard.

you are always taking the partial derivative of one variable with respect to another holding volume, entropy and the phase of the moon constant.

then there are the thermodynamic potentials: energy, enthalpy, Helmholtz free energy and Gibbs free energy.

problem sets went like this:

"hmmmm, we are given pressure, volume, the population of Philadelphia, 3 cloves of garlic and my neighbor's dog's favorite chew toy. that means we use the enthalpy--or is it Gibbs free energy? aargh, i shoulda been a crash test dummy!"

I'm glad to know that I am not the only physics student who finds thermo to be completely unintuitive.

By Max Fagin (not verified) on 22 Jan 2009 #permalink

The obvious solution is to teach primary grades as mathematical physics instead of touchie-feelie jury-rigs or flat out politicized crap. Achieved "common sense" then models reality not Official Truth. This would require teacher educators like DR. Mrs. Vice-President to possess IQs in excess of ambient temperature. (Europe would obviously require a different metric. Brutal air conditioning to be installed in all domestic instructional facilities as a Federal program, kickstarting the American economy.)

I used classical thermodynamics a lot in my thesis work. I've always found it interesting that the math involved is trivial compared to what you find in most other areas of physics (including stat. mech.), and yet thermo. seems to be haunted by basic conceptual errors (by people with Ph.D.'s who should know better) more frequently than other areas.

By BiophysicsMonkey (not verified) on 22 Jan 2009 #permalink

That derivative becomes much less weird if you think of entropy not as a measure of degeneracy (although it is), but instead as the magical function of the extensive properties of a system (energy, volume, particle numbers,...) which attains a maximum at the equilibrium values of those extensive properties. As such, entropy is akin to the Lagrangian, except for thermodynamics.

So, S = S(U, V, N1, ..., Nr), and at equilibrium dS = 0.

Now consider the standard fictional box with a movable wall that divides the box into two sections, 1 and 2:

[  1   <-|-> 2 ]

S = S1 + S2, or dS = dS1 + dS2 = 0 at equilibrium. Now we may not know what the partial derivative of S with respect to V actually is, physically (let's just call it X for now), but at equilibrium we know that X1 dV1 + X2 dV2 = 0. Since dV1 = -dV2 (because V = V1 + V2), then (X1-X2)dV1 = 0, or X1 = X2 at equilibrium.

So, what physical quantity should be the same on both sides of the movable wall at equilibrium? Well, the pressure should be the same. Thus, it shouldn't be surprising that your derivative is proportional to the pressure.
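The movable-wall argument above is easy to check by brute force. Here's a toy sketch (Python, with k = 1 and made-up numbers, so an assumption-laden illustration rather than anything general): give each side only the volume-dependent part of an ideal-gas entropy, slide the wall to maximize the total, and see that the pressures come out equal.

```python
# Sketch of the movable-wall argument: two ideal-gas compartments at the
# same temperature T share a total volume V_tot. Slide the wall (i.e. vary
# V1) to maximize the total entropy, then compare pressures on each side.
# Only the volume-dependent part of S matters here: S_i = N_i k ln V_i.
import math

k = 1.0
T = 1.0
N1, N2 = 2.0, 3.0     # particle numbers on each side of the wall
V_tot = 10.0

def total_entropy(V1):
    V2 = V_tot - V1
    return N1 * k * math.log(V1) + N2 * k * math.log(V2)

# Brute-force scan over wall positions for the entropy maximum
best_V1 = max((0.001 * i for i in range(1, 10000)), key=total_entropy)

P1 = N1 * k * T / best_V1            # ideal-gas pressure, side 1
P2 = N2 * k * T / (V_tot - best_V1)  # ideal-gas pressure, side 2
print(best_V1, P1, P2)               # wall settles near V1 = 4, where P1 = P2
```

The maximum lands where N1/V1 = N2/V2, which for an ideal gas at common temperature is exactly the equal-pressure condition the comment derives.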

I highly recommend "Thermodynamics and an introduction to thermostatistics" by H. B. Callen (http://www.amazon.com/Thermodynamics-Introduction-Thermostatistics-Herb…). Looking for extrema is natural for physicists steeped in Lagrangians and Hamiltonians, so this way of defining the entropy is fairly natural, if less than intuitive from a microscopic standpoint. (Looking for extrema is not at all natural for beginning physicists, which is one reason why thermodynamic definitions of entropy seem remarkably arcane in freshman physics.) It's up to statistical mechanics to provide a microscopic definition of entropy.

By Grant Goodyear (not verified) on 22 Jan 2009 #permalink

Please discuss that equation with reference to entropy change in black holes.

@#1:
It's easy to memorise all those Maxwell relations. Write down a square like this (it may not come out perfectly in this font, but I hope you can read it):

S U V
H   F
p G T

From this you read off all of them, like (dS/dp)_T = -(dV/dT)_p. You can also read off that you have to look at H as depending on S and p, and so on.

Memorise with the sentence "Good physicists have studied under very fine teachers".
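For anyone who'd rather check one of those relations than memorise it, here's a quick finite-difference sketch (Python, an ideal gas with N = k = 1 as an assumed toy case, with an arbitrary constant dropped from S since only the derivatives matter):

```python
# Numerical spot-check of the Maxwell relation read off the square:
# (dS/dp)_T = -(dV/dT)_p, for an ideal gas with N = k = 1.
import math

def S(T, p):
    # Ideal-gas entropy in the (T, p) variables, up to an additive constant
    return 2.5 * math.log(T) - math.log(p)

def V(T, p):
    # Ideal-gas equation of state, V = N k T / p with N = k = 1
    return T / p

T0, p0, eps = 300.0, 2.0, 1e-6
dS_dp = (S(T0, p0 + eps) - S(T0, p0 - eps)) / (2 * eps)  # holding T fixed
dV_dT = (V(T0 + eps, p0) - V(T0 - eps, p0)) / (2 * eps)  # holding p fixed

print(dS_dp, -dV_dT)  # both equal -1/p0 = -0.5
```

Note the minus sign: it's the part of the square mnemonic that's easiest to drop, and a two-line numerical check like this catches it immediately.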

Thermo is like the original programmer's manual for the Mac: 80 chapters, each of which assumes you already understand the other 79 chapters.

I look at it a lot like comment #5, as analogous to generalized potentials in class. mech. Generalized potentials and thermo aren't meant to be intuitive. They provide systematic methods for attacking problems that are too complicated for your intuition.

By Bob Hawkins (not verified) on 23 Jan 2009 #permalink

Only mysterious because many physicists unfortunately seem to have confused mathematical proof with physical argument. While the proof is ultimately important, the physical intuition that motivates such derivations and arguments is usually forgotten. Think of all the solutions you get (in QM this is especially common) that are ``lucky guesses that work'' because we know the answer in advance, but are rarely ever presented as reasonable solutions in their own right.

Huh - the entropy one makes fine intuitive sense to me, and more quickly than the energy one. But then, I'm weird, and lots of this stuff makes more sense to me than it should.