The second law of thermodynamics says that entropy always increases. But what is entropy?

I’ll admit, it took me a long time to understand entropy. I read lay explanations of entropy that said vague things like “entropy is a measure of disorder”, and it didn’t click. It wasn’t until I watched this video, which gave a precise definition of entropy, that I felt like I finally got it.

Microstates and Macrostates

To understand entropy, you must first understand the difference between microstates and macrostates. Let’s define these concepts in the context of an example: a silverware drawer. This particular silverware drawer has three sections: one section for forks, one for spoons, and one for knives. In addition, there are 5 forks, 5 spoons, and 5 knives in the silverware drawer.

A microstate is a full description of the system. In our example, it tells you exactly where every single fork, spoon, and knife is in the drawer. Here’s an example microstate: there are 5 forks and 3 spoons in the fork section, 3 knives in the spoon section, and 2 spoons and 2 knives in the knife section. Here’s another microstate: all the utensils are in the spoon section.

How many microstates are possible in our silverware drawer? We have 15 utensils, each of which can go in any one of the three sections, so there are \(3^{15}\) (about 14.3 million) possible microstates.

A macrostate is a higher-level description of a system. For example, the drawer could be “messy” or “organized”. The key is that more than one microstate can correspond to a given macrostate. Let’s consider the macrostate of the silverware drawer being “messy”, and let’s be precise about what “messy” means: say the drawer is messy if more than one utensil is in the wrong section. How many microstates would be classified as “messy”? Since this isn’t a post about combinatorics, I’ll just tell you: all but \(31\) (there’s \(1\) perfectly sorted arrangement, plus \(15 \times 2 = 30\) ways to put exactly one utensil in one of its two wrong sections). That means \(3^{15} - 31\) microstates would be considered “messy” (according to my totally arbitrary definition).
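If you want to check that \(31\) yourself, here’s a quick Python sketch of the counting argument (just for illustration; the numbers follow directly from the setup above):

```python
n_utensils = 15   # 5 forks + 5 spoons + 5 knives
n_sections = 3    # fork section, spoon section, knife section

# Every utensil can sit in any of the 3 sections.
total = n_sections ** n_utensils

# "Not messy" = at most one utensil in the wrong section:
#   1 way to have everything in place, plus
#   15 choices of which utensil is misplaced * 2 wrong sections it could land in.
not_messy = 1 + n_utensils * (n_sections - 1)
messy = total - not_messy

print(total)      # 14348907  (= 3**15)
print(not_messy)  # 31
print(messy)      # 14348876
```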

Entropy

With that background, we can introduce the formula for the entropy of a given macrostate:

\[S = k_B \ln W\]

where \(S\) is the entropy of the macrostate, \(k_B\) is the Boltzmann constant, and \(W\) is the number of microstates that have that particular macrostate. I don’t particularly care about the constant, just the fact that entropy is proportional to the log of \(W\).

So which has higher entropy, the macrostate of the silverware drawer being “messy” or “not messy”? Clearly messy. There are way more microstates that are messy than not. The entropy of the “messy” state is about \(\ln(3^{15}-31) \approx 16.5\), while the entropy of the “not messy” state is about \(\ln(31) \approx 3.4\) (ignoring the constant).
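Here’s that same arithmetic in Python, dropping \(k_B\) so the printed values are just \(\ln W\):

```python
import math

W_messy = 3**15 - 31   # microstates in the "messy" macrostate
W_not_messy = 31       # microstates in the "not messy" macrostate

print(math.log(W_messy))      # ~16.5
print(math.log(W_not_messy))  # ~3.4
```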

So when lay explanations colloquially say that “entropy is a measure of disorder”, it’s because – more often than not – there are more ways for a system to be “disorderly” or “messy” than “organized” or “neat”.

Gas particles in a box

Let’s work through another example that shows up in nearly every explanation of entropy: a box full of gas particles. The question is: which macrostate has higher entropy, the gas particles being all on one side of the box, or the gas particles being spread roughly evenly throughout the box?

As usual, let’s try to make this precise (and tractable). Let’s say there are 100 gas particles in the box and we are going to keep track of how many particles are on the left and right halves of the box. Let’s say that the particles are “well mixed” if there are at least 40 particles on each side.

(Figures: an example “well mixed” configuration and an example “not well mixed” configuration of the box.)

First off, how many microstates are there? We have 100 particles, and for each one we’re keeping track of which side it’s on, so that’s \(2^{100}\) microstates.

How many microstates are “well mixed”? Again, not a combinatorics lesson, but using the binomial distribution we can figure out that the probability that either side has fewer than 40 particles is about 3.5%. So the macrostate of “well mixed” has about 27 times as many microstates as the macrostate of “not well mixed”. That means that “well mixed” has higher entropy – in fact, it has \(\ln(27) \approx 3.3\) more entropy than “not well mixed”. Honestly, the numbers don’t really matter. It’s more important to understand that more microstates mean higher entropy.
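If you’d like to verify the 3.5% figure, here’s a short Python sketch that counts the microstates exactly with binomial coefficients (no approximations, just the setup described above):

```python
from math import comb, log

n = 100  # particles, each on either the left or right half of the box

total = 2 ** n  # all microstates

# "Well mixed": at least 40 particles on each side,
# i.e. the count on the left is between 40 and 60 inclusive.
well_mixed = sum(comb(n, k) for k in range(40, 61))
not_well_mixed = total - well_mixed

print(not_well_mixed / total)                 # ~0.035  (about 3.5%)
print(well_mixed / not_well_mixed)            # ~27     (ratio of microstate counts)
print(log(well_mixed) - log(not_well_mixed))  # ~3.3 extra entropy (ignoring k_B)
```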

Second Law of Thermodynamics, revisited

Now that we understand what entropy is, let’s revisit the second law of thermodynamics:

  • Entropy almost always increases.
  • i.e., systems tend toward macrostates with more microstates rather than fewer.
  • i.e., systems tend toward higher-probability macrostates rather than lower-probability ones.

Entropy increases because there are more ways to be in a high-entropy state than a low-entropy state. Put that way, it sounds so obvious that it’s almost a tautology. Higher-probability things happen with… higher probability? Yeah… sure, I guess.

I’m probably glossing over some important details. And I don’t claim that we’ve actually fully explained the second law of thermodynamics. In particular, a system must be able to evolve over time in order for entropy to increase. And the way a system evolves is usually constrained.

For example, consider the gas particles in a box. If the system evolved by picking a random microstate from one moment to the next, then entropy would increase by construction: you’d necessarily jump to higher-probability macrostates with… higher probability. But that’s not how the system evolves. The gas particles are constrained to move to locations close to their current positions. However, if you wait long enough, modeling each particle as jumping to a randomly chosen location within the box is probably a decent approximation, since gases are so chaotic. So, given enough time, entropy will very likely increase.
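To see what “given enough time” looks like, here’s a toy simulation of my own (in the spirit of the Ehrenfest urn model, not anything from the video): start with all 100 particles on the left, and at each step let one randomly chosen particle hop to the other side. The left-side count drifts toward 50 and then hovers near it.

```python
import random

random.seed(0)

n = 100
on_left = n  # start in a low-entropy macrostate: every particle on the left

# Toy constrained dynamics: each step, one randomly chosen particle hops to
# the other side. The system only moves between "adjacent" microstates, yet
# the left-side count still drifts toward the balanced, high-entropy macrostate.
for step in range(1, 2001):
    if random.random() < on_left / n:
        on_left -= 1  # the chosen particle was on the left; it hops right
    else:
        on_left += 1  # it was on the right; it hops left
    if step % 500 == 0:
        print(step, on_left)  # counts settle near 50
```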

Oh, and that leads me to my last bone to pick with lay explanations of entropy. Saying that entropy always increases is misleading! It implies, at least to me, that it’s some physical law that is always obeyed, like gravity. Instead, we should say that entropy very (very) likely increases, which conveys the fact that this is a probabilistic statement that relies on probabilistic arguments.

Anyways, I hope that clears up any confusion you had about what entropy is. And if not, I highly recommend watching this video.