The Statistical Interpretation of Entropy
Part I: Entropy and Probability
Microstates and Macrostates
Two common ways of stating the Second Law of Thermodynamics are: heat always moves from hot to cold, and
heat engines are always less than 100% efficient. In this activity we will examine another, equivalent, way of stating
the Second Law:
The total entropy of an isolated system can increase, but it cannot decrease.
In 1877 Ludwig Boltzmann showed that entropy can be understood in terms of the motion of the molecules that
compose a physical system. To understand Boltzmann’s ideas we must first understand what is meant by entropy,
and to understand entropy we must first discuss microstates and macrostates.
A microstate is a state of a physical system described at the finest level of detail. A macrostate is a state of a
physical system that is described in terms of the system’s overall or average properties at a macroscopic level. A
macrostate will generally consist of many different microstates. In defining a macrostate we ignore what is going on
at the microscopic (atomic/molecular) level.
- Suppose 24 students are enrolled in a class. For simplicity let’s assume that any student who is present is
sitting completely still at his or her assigned seat, facing forward (so we don’t have to worry about students
being in different locations, having different motions, etc.). Which of the following would constitute a
macrostate description of the class attendance, and which would constitute a microstate description? Write
“micro” or “macro” next to each item.
- A list of the names of each student present today.
- The number of students in attendance.
- Now suppose you had a class roster showing the names of all 24 students enrolled in this class. If I specified
the macrostate of the class by saying that there are 24 students present, could you tell me what the microstate
is? Why or why not?
- If I specified the macrostate of the class by saying that there are 15 students present, could you tell me what
the microstate is? Why or why not?
- This example illustrates the fact that some macrostates provide a great deal of information about the
microstate of the system. This is the case when the macrostate contains only one microstate, or a small
number of them. A system in one of these macrostates is said to have a low entropy. Other
macrostates tell us very little about the microstate of the system. This is the case when the macrostate
contains a large number of different microstates. A system in one of these macrostates is said to
have a high entropy. Which of the following macrostates of the class would have the highest
entropy?
- All 24 students are present.
- 12 students are present.
- 1 student is present.
- No students are present.
- Which of the above choices would have the lowest entropy?
The remainder of this activity will help develop your understanding of microstates and macrostates, how these
relate to the entropy of a system, and how random behavior leads to the Second Law. To illustrate these ideas we
will focus on a simple example of a row of coins.
A Row of Coins
To get a better understanding of microstates and macrostates we will begin by working with a very
simple model system: a row of coins. This model system consists of some number of identical coins
laid out in a straight row. The position of each coin in the row is fixed, so we don’t have to worry
about the coins moving around. Each coin can display either its heads side or its tails side. We want to
explore the various possible microstates of this system and the macrostates we can form from them.
Note that if we use n coins then there should be 2^n possible microstates. This is because each coin
can be in one of two states, so when we add another coin we multiply the number of microstates by
2.
- If there are only two coins, then there are four microstates: HH, HT, TH, and TT. But notice that there are
only three macrostates: (2H,0T), (1H,1T), and (0H,2T). In the spaces below indicate how many microstates
are in each macrostate. [Note: the number of microstates in a given macrostate is called the multiplicity of that
macrostate and is denoted by a capital Greek omega (Ω). So, for example, we denote the multiplicity of the
state with two heads and zero tails as Ω(2H,0T). We denote the combined multiplicity of all macrostates as
Ω(all).]
- Ω(2H,0T) =
- Ω(1H,1T) =
- Ω(0H,2T) =
- Ω(all) =
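If you would like to check your multiplicity counts (here and in the later questions), the following minimal Python sketch, not part of the original activity, enumerates every microstate of an n-coin row and tallies how many microstates fall into each macrostate:

```python
from itertools import product
from collections import Counter

def multiplicities(n):
    """Tally the number of microstates in each macrostate (labeled by head count)."""
    # Each microstate is a tuple such as ('H', 'T'); its macrostate is the head count.
    return Counter(state.count('H') for state in product('HT', repeat=n))

# Two coins: Counter({1: 2, 2: 1, 0: 1}), i.e. Ω(1H,1T) = 2 and Ω(2H,0T) = Ω(0H,2T) = 1.
print(multiplicities(2))
```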
- Now we are ready to provide a technical definition of entropy. The entropy of a particular macrostate is
just the (natural) logarithm of that macrostate’s multiplicity. If we denote the entropy by S,
then
S = ln Ω.
The ln denotes the natural logarithm, which is a function that is built into any decent scientific
calculator. Hopefully you are already familiar with this function, but if not you should plot a graph
of ln x vs. x on a graphing calculator or computer plotting program (ask your instructor for
help if needed). Which of the graphs below could be a plot of ln x vs. x? (Circle the correct
answer.)
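If you do not have a graphing calculator handy, a short matplotlib sketch (my own suggestion, not part of the activity) will produce the plot:

```python
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(0.1, 10, 200)   # ln x is undefined at x = 0, so start just above it
plt.plot(x, np.log(x))
plt.xlabel("x")
plt.ylabel("ln x")
plt.title("ln x vs. x")
plt.show()
```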
- Compute the entropy of the three macrostates for two coins and record your results below. Note that it only
makes sense to talk about the entropy of a macrostate, not a microstate, since the multiplicity of a microstate
would always be 1 (and so the entropy would always be zero).
- S(2H,0T) = ln[Ω(2H,0T)] =
- S(1H,1T) = ln[Ω(1H,1T)] =
- S(0H,2T) = ln[Ω(0H,2T)] =
- Suppose you chose one of the microstates of the two-coin system at random. What is the probability that you
would get a microstate that corresponds to the macrostate (1H,1T)? To calculate this probability just divide
the multiplicity of the macrostate by the total number of possible microstates (Ω(all) = 4, in
this case). Determine the probabilities for each of the three macrostates and record your results
below.
- P(2H,0T) = Ω(2H,0T)∕Ω(all) =
- P(1H,1T) = Ω(1H,1T)∕Ω(all) =
- P(0H,2T) = Ω(0H,2T)∕Ω(all) =
- Now add up the three probabilities you found in the previous question. What do you get? Is this what you
expected? Why?
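To check your entropies and probabilities for any number of coins, here is a minimal sketch (again, not part of the activity) that uses the binomial coefficient C(n, k), which is exactly what Pascal's triangle, introduced below, tabulates:

```python
import math

def macrostate_table(n):
    """Print Ω, S = ln Ω, and P = Ω/Ω(all) for every macrostate of n coins."""
    total = 2 ** n                       # Ω(all): each extra coin doubles the count
    for heads in range(n, -1, -1):
        omega = math.comb(n, heads)      # multiplicity of the (heads H, n-heads T) macrostate
        print(f"{heads}H,{n - heads}T: Ω = {omega}, "
              f"S = {math.log(omega):.3f}, P = {omega / total:.4f}")

macrostate_table(2)   # reproduces the two-coin results above; try 3, 4, and 10 as well
```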
- Let’s keep going. For the case of 3 coins write all of the possible microstates below.
- Record the multiplicities, entropies, and probabilities of these macrostates below. In the last row record the
multiplicity of all macrostates and the sum of all the probabilities.
Macrostate | Multiplicity (Ω) | Entropy (S = ln Ω) | Probability (P = Ω∕Ω(all))
3H, 0T     |                  |                    |
2H, 1T     |                  |                    |
1H, 2T     |                  |                    |
0H, 3T     |                  |                    |
Totals     |                  | NA                 |
- One more round. For the case of 4 coins write all of the possible microstates below.
- Record the multiplicities, entropies, and probabilities of these macrostates below.
Macrostate | Multiplicity (Ω) | Entropy (S = ln Ω) | Probability (P = Ω∕Ω(all))
4H, 0T     |                  |                    |
3H, 1T     |                  |                    |
2H, 2T     |                  |                    |
1H, 3T     |                  |                    |
0H, 4T     |                  |                    |
Totals     |                  | NA                 |
- Notice how the probability tends to be higher for certain macrostates. The probability of getting one of the
three macrostates with 1, 2, or 3 heads is ________, while the probability of getting one of the two
macrostates with 0 or 4 heads is ________.
- Now let’s see if we can spot a pattern in the multiplicities. In the space below I’ve written the numbers you
should have gotten for the various multiplicities produced by each number of coins. Note that I’ve inserted the
results for a single coin (two macrostates each with a multiplicity of one, which should make sense if
you think about it). The numbers form a pattern known as Pascal’s triangle. See if you can
identify the pattern. Once you have figured out the pattern complete the triangle down to row
10.
n = 1:                     1   1
n = 2:                   1   2   1
n = 3:                 1   3   3   1
n = 4:               1   4   6   4   1
n = 5:             1   _   _   _   _   1
n = 6:           1   _   _   _   _   _   1
n = 7:         1   _   _   _   _   _   _   1
n = 8:       1   _   _   _   _   _   _   _   1
n = 9:     1   _   _   _   _   _   _   _   _   1
n = 10:  1   _   _   _   _   _   _   _   _   _   1
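The pattern you should have spotted (each interior entry is the sum of the two entries directly above it) is easy to turn into code; a minimal sketch, using the convention above that row n gives the multiplicities for n coins:

```python
def pascal_rows(n_max):
    """Yield (n, row) pairs; row n lists the multiplicities for n coins."""
    row = [1, 1]                                      # row for a single coin
    for n in range(1, n_max + 1):
        yield n, row
        # Each interior entry is the sum of the two entries above it.
        row = [1] + [a + b for a, b in zip(row, row[1:])] + [1]

for n, row in pascal_rows(10):
    print(n, row)
```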
- For the case of 10 coins, record the multiplicities, entropies, and probabilities below. (Hint: you should not
need to add up all the numbers to fill in the last row.)
Macrostate | Multiplicity (Ω) | Entropy (S = ln Ω) | Probability (P = Ω∕Ω(all))
10H, 0T    |                  |                    |
9H, 1T     |                  |                    |
8H, 2T     |                  |                    |
7H, 3T     |                  |                    |
6H, 4T     |                  |                    |
5H, 5T     |                  |                    |
4H, 6T     |                  |                    |
3H, 7T     |                  |                    |
2H, 8T     |                  |                    |
1H, 9T     |                  |                    |
0H, 10T    |                  |                    |
Totals     |                  | NA                 |
- Which statement below best describes the relationship between probability and entropy for the macrostates we
have considered?
- The most probable macrostates have the lowest entropy.
- The most probable macrostates have the highest entropy.
- There is no clear relationship between entropy and probability for these macrostates.
- If you put ten coins in a cup and then dump them out onto the table, the probability that you would get
somewhere between 3 and 7 tails is ________. The probability that you would get fewer than 3
tails or more than 7 tails is ________.
- In question 13 you should have found that roughly 90% of the probability was concentrated into 3 of the 5
macrostates. In question 18 you should have found that roughly 90% of the probability is concentrated into 5
of the 11 macrostates. Based on these results you can conclude that as you increase the number of coins in the
system the probability ________.
- becomes more spread out among a larger fraction of the macrostates
- becomes more concentrated into a smaller fraction of the macrostates
- remains about the same for a given fraction of the macrostates
- Look back over your results for 3, 4, and 10 coins. Now suppose we did the same calculations for a sequence of
2000 coins (don’t worry, we won’t actually do this). Which of the following macrostates would have the highest
probability?
- 2000 H, 0 T
- 1100 H, 900 T
- 1001 H, 999 T
- 1000 H, 1000 T
- If we examined the probabilities for all of the macrostates in the system of 2000 coins we would find that the
probability is ________.
- spread evenly among all of the macrostates
- spread out among a large fraction of the macrostates, but with some macrostates having slightly
higher probabilities than others
- tightly concentrated within a small fraction of the macrostates centered on the most probable
macrostate
- entirely concentrated into the single most probable macrostate
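You can verify this concentration directly with exact integer arithmetic; the sketch below computes the fraction of all microstates whose head count lies near the center for 2000 coins (the ±50-head window is an arbitrary illustrative choice of mine):

```python
import math

def central_fraction(n, half_width):
    """Fraction of microstates whose head count is within half_width of n/2."""
    center = n // 2
    window = sum(math.comb(n, k)
                 for k in range(center - half_width, center + half_width + 1))
    return window / 2 ** n

# For 2000 coins, about 98% of the probability lies within ±50 heads of 1000,
# even though that window covers only 101 of the 2001 macrostates.
print(central_fraction(2000, 50))
```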
Experimenting with Our Model
Now that you have a handle on multiplicities and entropy, let’s perform a little “experiment” with our
model. What would happen if we start with a row of 20 coins that are all showing heads and then begin
flipping coins over at random? This may seem like a strange thing to investigate, but we will
see later on that the behavior of this system provides us with a qualitative understanding of how
real systems (like gases) behave. So let’s explore the behavior of this system by laying out 20 coins (I
suggest laying them out in four rows with five coins in each row). We will start with all 20 coins heads
up and then we will choose which coin to turn over by rolling a 20-sided die. Flip over the coin that
corresponds to the result of the die roll (the groups of five should make it easy to find the correct
coin).
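If you would rather simulate the experiment than lay out physical coins (or want to check your results afterward), here is a minimal sketch; the function name and the seed parameter are my own choices, not part of the activity:

```python
import random

def run_experiment(n_coins=20, n_flips=80, seed=None):
    """Start with all heads; each step, flip one coin chosen uniformly at random."""
    rng = random.Random(seed)
    coins = ['H'] * n_coins
    head_counts = [n_coins]            # number of heads after 0 flips
    for _ in range(n_flips):
        i = rng.randrange(n_coins)     # the "20-sided die" roll (0-19 here)
        coins[i] = 'T' if coins[i] == 'H' else 'H'
        head_counts.append(coins.count('H'))
    return head_counts

print(run_experiment(seed=1))          # the "number of heads" column described below
```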
- Continue the experiment until you have completed 80 coin flips. Keep track of the results in a table
or a spreadsheet (if available). You should have columns for the number of flips (starting at 0), the
number of heads (initially 20), and the number of tails (initially 0).
- Create a plot of the number of heads after each flip. If your data is in a spreadsheet, use an X-Y plot
with the points connected by a line. Make sure to label your axes and give the plot a title.
- Describe what happens to the number of heads in your sequence of coins as you flip more and more
coins.
- Which of the following best describes the long-term behavior of the model (after many coin
flips)?
- The number of heads bounces around all over the place, from 0 to 20.
- The number of heads goes to 10 and stays there.
- The number of heads fluctuates around 10 but neither stays at 10 nor gets very far from 10.
- The number of heads goes to 0 and stays there.
- Consider a situation in which your sequence of 20 coins is in a macrostate with 15 heads and 5 tails. How
many different results from the die roll would lead to a macrostate with 16 heads and 4 tails?
- If the macrostate is 15 H, 5 T then how many different results from the die roll would lead to the macrostate
14 H, 6 T?
- Based on the answer to the previous two questions, if the current macrostate is 15 H, 5 T then what is the
most likely macrostate for the model system after the next die roll? How much more likely is this macrostate
than the alternative?
- If the current macrostate is 4 H, 16 T then what is the most likely macrostate for the model
system after the next die roll? How much more likely is this macrostate than the alternative?
- There is one macrostate for which the two alternatives that can result from the next die roll are equally likely.
What macrostate is this?
- Use the results of these last few questions to explain why the model system tends toward the macrostate 10 H,
10 T and then tends to stay close to that macrostate. This special macrostate is often referred to as
the equilibrium macrostate (because, in this case, it has the same number of heads as tails).
- Based on your entropy calculations above, would you say that the entropy of this model system mainly
increased, mainly decreased, or remained constant during the course of your experiment? Explain.
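If you recorded your head counts (by hand or with the simulation sketch above), you can compute the entropy after every flip; the head_counts list below is made-up placeholder data that you would replace with your own:

```python
import math

# Placeholder data: replace with the head counts you actually recorded.
head_counts = [20, 19, 18, 19, 18, 17, 16, 15, 14, 13]

for step, h in enumerate(head_counts):
    S = math.log(math.comb(20, h))     # S = ln Ω for the (h H, 20-h T) macrostate
    print(f"flip {step:2d}: {h:2d} heads, S = {S:.2f}")
```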