*This post was copied and slightly edited from a post I made a year or so ago at my blog’s former location.*

> More bullshit has been written about entropy than about any other physical quantity.
>
> —Prof. Dave Beeman, 1988

There is a popular-level understanding of entropy as “randomness” or “disorder”. This is not a bad way of looking at it, but it brings along some associated concepts that are misleading. Creationists exploit this ambiguity by turning the argument around to information, where, even though ultimately we’re talking about the same physical quantity, the implications are much less obvious, precisely because “information” has a common, colloquial meaning in regular conversation that is different from the entropy-connected (and therefore second-law-of-thermodynamics-connected) definition of information. Much as the term “theory” is misunderstood when talking about science, so is “information”.

So, if we are going to strive to be accurate: **what is entropy?**

Consider a very simple physical system: a row of eight switches. Each switch can either be “on” or “off”. (Old-school computer nerds will recognize this as a “byte”, but if you don’t, don’t worry; all you need is the concept of the switches.)

What is the entropy of this system? Well, each switch has two different states it can be in. There are eight switches, so the system as a whole has 2×2×2×2×2×2×2×2 = 256 different states that it can be in. That’s a measure of how much randomness there is in the system; each time you come across a set of eight randomly oriented switches, it could be in any one of 256 different configurations. It’s also a measure of how much information there is. You could, for instance, assign serial numbers to 256 different objects using this system of switches. You need an alphabet or language that can provide 256 different answers to fully specify the state of this system.
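If you like to see the counting spelled out, here’s a minimal Python sketch (my own illustration, not part of the original argument) that simply enumerates every configuration of eight unconstrained switches:

```python
from itertools import product

# Each of the eight switches is either off (0) or on (1);
# itertools.product enumerates every possible combination.
configurations = list(product([0, 1], repeat=8))

print(len(configurations))  # 256, i.e. 2**8 possible states
```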

Now let’s take the same system, but impose some order, some structure, upon it. Let’s say that the switches must alternate: if one is on, then the next one must be off, and vice versa.

Now how many different states are there that the system could have? Only 2. If the first switch is on, then the second is off, the third is on, the fourth is off, and so forth. If the first switch is off, the second is on, and so forth. You can completely specify the state of the system with an alphabet or language that can provide only two different answers; all you need to do is specify the position of the first (or, really, of any) switch, and you know the state of all the rest.
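The same kind of sketch, with the alternating rule imposed, finds only two allowed configurations:

```python
from itertools import product

def alternates(config):
    """True if no two neighboring switches share the same position."""
    return all(a != b for a, b in zip(config, config[1:]))

# Keep only the configurations that obey the alternating rule.
allowed = [c for c in product([0, 1], repeat=8) if alternates(c)]

print(len(allowed))  # 2
print(allowed)       # [(0, 1, 0, 1, 0, 1, 0, 1), (1, 0, 1, 0, 1, 0, 1, 0)]
```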

Up to multiplying by a constant, here’s the mathematical definition of the physical quantity of entropy, that thing that the second law of thermodynamics talks about: the number of switches whose settings you need to specify to completely describe the identity, position, and velocity of every particle in a physical system. (For these switches, it’s only identity we’re worried about: on or off? Position and velocity would require additional information, effectively additional switches or computer memory to keep track of where everything is, but let’s keep the situation simple for now by saying that position is fixed and velocity is zero.)
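If you want the textbook version of that statement (this notation isn’t in the original post, but it’s the standard form): the “number of switches” is the base-2 logarithm of the number of possible states, and the constant is Boltzmann’s constant,

```latex
S = k_B \ln \Omega = (k_B \ln 2)\,\log_2 \Omega
```

where \(\Omega\) is the number of states the system could be in, and \(\log_2 \Omega\) is the number of on/off switches (bits) you need to single one of them out.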

In the case of eight switches which can be set any way, the entropy is (a constant times) 8. In the case of the eight switches that must alternate, where setting a single switch determines all the rest, the entropy is (a constant times) only 1. The entropy of the system which has some structure imposed upon it is lower than the entropy of the system which has less structure imposed upon it.

Here we can see that more “randomness” or “disorder” does mean more entropy. Note, however, that when we talk about this, we talk about the possibilities of where the switches can be; we don’t talk about any particular switch alignment. The system of eight uncoupled switches has 256 different configurations it can be in, and thus has a lot more “disorder” or entropy than the system of switches whose positions must alternate.

In this way, a star and planetary system has less entropy than the big randomly mixed gas cloud from which the star and planetary system formed. But how can this be, if the second law of thermodynamics tells us that entropy is always supposed to increase? Well, when that gas cloud collapsed and formed a planetary system, it also released a lot of light: some visible light, but even more infrared light, which sometimes we (not entirely correctly) call “heat”. All of those photons carry away a lot of entropy with them; it takes a lot of switches to specify where all of those photons are, and where they are going. The entropy of one system can locally decrease, as long as it is coupled to another system whose entropy can increase. In this case, the stuff that condenses to make the star and planetary system is one system, and the photons released by that stuff are another system. The entropy of the first system can decrease as long as the overall entropy of everything considered together stays the same or increases.
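Written as a balance sheet (standard second-law bookkeeping, not spelled out in the original post):

```latex
\Delta S_{\text{total}} = \Delta S_{\text{collapsing cloud}} + \Delta S_{\text{radiation}} \ge 0
```

The first term can be negative, so long as the photons streaming away make the second term at least as large and positive.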