What Entropy Is

The Second Law of Thermodynamics probably produces more confusion in the general public than most other physical laws that have percolated into the collective consciousness. Not the least of the reasons is the number of seemingly disconnected ways of stating it, which vary in accuracy. Disorder increases. Absolute zero is impossible. Engines can't turn heat into work with perfect efficiency. Things fall apart, the center cannot hold, mere anarchy is loosed upon the world...

Well maybe not that last one.

Probably the version that both makes the most sense and is the most accurate is one that's rarely stated outside of physics books. Before I state it, I want to build up an example to illustrate the point. Consider the flipping of four fair coins. You take them out of your pocket and fling them into the air. You can take my word for it that there are sixteen ways that they can land, and here they all are:

HHHH, HHHT, HHTH, HHTT, HTHH, HTHT, HTTH, HTTT, THHH, THHT, THTH, THTT, TTHH, TTHT, TTTH, TTTT.
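
If you'd rather not take my word for it, a few lines of Python (a quick sketch, nothing fancy) will enumerate the outcomes and tally them by head count:

    # Enumerate all 2^4 outcomes of four fair coins and tally them
    # by the number of heads in each outcome.
    from itertools import product
    from collections import Counter

    outcomes = ["".join(seq) for seq in product("HT", repeat=4)]
    tally = Counter(seq.count("H") for seq in outcomes)

    print(len(outcomes), "equally likely outcomes")   # 16
    for heads in range(5):
        print(f"{heads} heads: {tally[heads]} ways")  # 1, 4, 6, 4, 1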

All of them are equally probable, with a 1 in 16 chance each. On the other hand, we're not keeping track of which coin does what. We're only interested in how many heads and how many tails there are, and not all of those counts are equally probable. For instance, there's exactly one way to get all heads and one way to get all tails; each is a 1 in 16 chance. But to get two heads and two tails there are six possibilities, making the chance of that 6 in 16, or 3 in 8, which is considerably more likely. That is the heart of the second law:

A system in equilibrium will probably be found in the state that can be formed in the greatest number of ways.

It's still a little informal, but it gets the point across. It may also be objected that physical laws ought not have words like "probably" in them. Fair enough. I wouldn't worry about it much, though. Entropy is mainly useful in describing large collections of particles, such as the molecules in a roomful of air. The number of ways to scatter the molecules more or less evenly about the room is unbelievably, colossally, enormously larger than the number of ways they might all be crowded onto one side. And so you're not going to suddenly find yourself in a room where all the molecules just happen to be somewhere else. My very rough back-of-the-envelope calculation suggests that even for a quite small quantity of gas (say, one mole's worth), the relative imbalance between the two halves of a room will essentially never be above one part per trillion.
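
To see where that number comes from: each molecule independently ends up in one half of the room or the other, so this is just the coin flip game with N around Avogadro's number, and the typical relative imbalance scales as 1/sqrt(N). A sketch of the arithmetic:

    # Typical relative imbalance between the two halves of the room for
    # N independent molecules: the count on one side fluctuates with a
    # standard deviation of sqrt(N)/2 about the mean N/2, so the
    # fractional imbalance is about 1/sqrt(N).
    import math

    N = 6.022e23                        # one mole of molecules
    relative_imbalance = 1 / math.sqrt(N)
    print(f"{relative_imbalance:.1e}")  # ~1.3e-12, about a part per trillion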

Even this tiny "violation" of the second law is perfectly kosher though, because the law is formally a statistical statement. The details of those statistics get hairy, but the gist of the very interesting but headache-inducing statistical mechanics class I'm taking this semester is just the words in the blockquote.


It may also be objected that physical laws ought not have words like "probably" in them. Fair enough.

If that's "fair enough," you've just tossed out quantum mechanics.

By D. C. Sessions (not verified) on 29 Jan 2009 #permalink

This is exactly the way I have always thought of entropy. I think you explained it well.

Hmm. I think I could have said that better then, D.C. What I mean is that QM is an explicitly probabilistic theory, while the second law of thermodynamics is not generally stated as "entropy will probably always increase with time" even if the probability is overwhelming in macroscopic cases. Usually it's just a blanket "entropy increases" statement.

I always preferred the "gamer's" version of the laws of thermodynamics:

1. You can't win
2. You can't break even
3. You can't get out of the game

But then, I went to an odd sort of college...

By featheredfrog (not verified) on 29 Jan 2009 #permalink

Take a handful of coins (uncirculated nickels and pennies are easiest). Balance them on their edges, aligned parallel, with heads and tails in the same relative alignment for each type of coin. Give the table an impulse parallel to the coin faces to randomly topple the coins. Sum the outcomes of many runs.

Theory tells us a binomial distribution should describe the frequency of each heads/tails composition. Theory tells us the random binary noise should scale as sqrt(number of events). Run the experiment to your summed statistical satisfaction and then tell us why the plotted curve is inarguably asymmetric - why theory fails.

Empiricism? "Oh mamma, that's where the fun is!"

Good post - I find that trying to explain entropy in terms of disorder confuses people, since a lot of arrangements that are more or less ordered in the everyday sense aren't in the thermodynamic sense. For example, most people would say a deck of cards sorted by suit and value is more ordered than a deck in any other arbitrary order, but since each card is unique within the deck, all specific sequences are equally likely.
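
(For scale: a 52-card deck has 52! ≈ 8 × 10^67 distinct orderings, so the sorted-by-suit-and-value arrangement is exactly as probable as any one particular shuffled mess: about 1 chance in 10^68.)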

Uncle Al:
Because the coins aren't perfectly symmetric, amongst many other factors. But could you tell me what the purpose of that was anyway?
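
In any case, here's a quick simulation sketch (my own toy version with perfectly fair coins, not your tabletop setup) of what the unbiased binomial theory predicts; a real asymmetry in the coins would show up as a systematic skew away from this:

    # Topple n ideal fair coins per trial, repeat many times, and compare
    # the observed head-count frequencies with the binomial prediction.
    # Per-bin scatter between runs should be on the order of sqrt(count).
    import random
    from collections import Counter
    from math import comb

    n_coins, n_trials = 10, 100_000
    tally = Counter(sum(random.random() < 0.5 for _ in range(n_coins))
                    for _ in range(n_trials))

    for k in range(n_coins + 1):
        expected = n_trials * comb(n_coins, k) / 2 ** n_coins
        print(f"{k:2d} heads: observed {tally[k]:6d}, expected {expected:8.1f}")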

"Absolute zero is impossible."

Isn't that the third law?

By Anonymous (not verified) on 29 Jan 2009 #permalink

I think the important thing to get across to students is that entropy is most definitely NOT a vague statement about a vague concept like "order" or "disorder", but rather a statement about a definite quantity. It is a very specific, quantitative measure of disorder (and must be applied to a well-defined system), found by counting the number of ways you can assign variables like velocity to the individual atoms and end up with the same macroscopic state.
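
Concretely, that count of ways is the W in Boltzmann's formula S = k log W, where k is Boltzmann's constant; the macrostate with the most microstates is the one with the highest entropy.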

It is also extremely important to point out that entropy can, and does, decrease. A perfect example is water turning to ice in a freezer. The entropy outside that system has to increase to compensate (due to physics we have usually covered by that point), but the point gets made that blanket statements about entropy (such as the ones many of our students have learned in church) cannot be applied to open systems such as that freezing water or, say, a biological organism.
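
To put rough numbers on the freezer example (my own round figures, just to illustrate the bookkeeping):

    # Freezing 1 kg of water at 0 C removes Q = 334 kJ from the water.
    # The freezer must dump Q plus its compressor work W into the kitchen,
    # and the second law (total entropy change >= 0) sets a minimum W.
    Q = 334e3        # J, latent heat of fusion for 1 kg of water
    T_cold = 273.15  # K, the freezing water
    T_hot = 293.15   # K, the kitchen

    dS_water = -Q / T_cold                 # about -1223 J/K: a real decrease
    W_min = Q * (T_hot / T_cold - 1)       # about 24.5 kJ of compressor work
    dS_room = (Q + W_min) / T_hot          # about +1223 J/K: compensates
    print(dS_water, dS_room, dS_water + dS_room)  # total is ~0 at best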

The second law is not violated when a cell turns into a baby or a baby into an adult.

The second law talks about what entropy is and its connection to irreversibility: basically, there is the entropy change of a system and the entropy generated by the system during a process, and the second law relates the two.
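
In symbols, the entropy balance for a process reads ΔS_system = ∫ δQ/T + S_gen, with S_gen ≥ 0; the generated term is what makes real processes irreversible.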

You explained this well. Thank you tho, got an A on my quiz :)

By Scientist (not verified) on 17 Mar 2009 #permalink