Reference: Daniel V. Schroeder, An Introduction to Thermal Physics, (Addison-Wesley, 2000) – Problems 2.28 – 2.30.
We’ve seen that, by counting the microstates for each macrostate of a system and invoking the assumption that all microstates are equally probable, we can show that any system with a macroscopic number of molecules is overwhelmingly likely to be found at or very near its most probable macrostate. For an ideal gas, this state has the molecules distributed evenly over the available volume and the total energy distributed evenly over the population. Events such as all the molecules suddenly confining themselves to some smaller subvolume are essentially impossible.
Another way of looking at such systems is to say that if we start the system off in one of its less probable macrostates and allow it some time to settle down, it will invariably end up at or very near its most probable macrostate. In other words, the system will evolve towards the macrostate with the greatest number of microstates, that is, with the highest multiplicity.
Since the multiplicities $\Omega$ of macroscopic collections of molecules are very large numbers, it’s more usual to work with their logarithms. Since even the logarithms can be large numbers (on the order of $10^{23}$), it’s conventional, in SI units, to multiply the logarithm by Boltzmann’s constant $k$ so that the final value is usually a ‘normal’ sized number. This combination is known as the entropy of a macrostate:

$$S \equiv k \ln \Omega$$
The units of $S$ are thus the same as those of $k$, which are $\mathrm{J\,K^{-1}}$.
In terms of entropy, we can state the second law of thermodynamics, which is that entropy tends to increase. It’s a ‘law’ only in the sense that it arises from the assumption that all microstates for a given macrostate are equally probable. If, for some reason (divine intervention?) that were not true, then of course systems could end up in unlikely states on a regular basis. However, the second law does seem to hold for all everyday processes, especially those involved in keeping our modern technological society running.
The second law leads to a couple of important definitions in thermodynamics. Any process which creates new entropy is said to be irreversible, since reversing the process would result in a decrease in entropy, violating the second law. Any process which leaves the total amount of entropy in the universe unchanged is therefore, in principle, reversible.
If we have a way of calculating or estimating $\Omega$, then clearly $S$ is easy to find, so we’ll give a few examples here.
Example 1 In a standard pack of playing cards there are 52 different cards, so they can be arranged in $52! \approx 8.07 \times 10^{67}$ ways. The size of this number is why it’s highly unlikely that any card game that relies on dealing cards from a shuffled deck will ever repeat itself. The entropy of a shuffled deck is therefore

$$S = k \ln 52! = \left(1.38 \times 10^{-23}\right)\left(156.4\right) \approx 2.16 \times 10^{-21}\ \mathrm{J\,K^{-1}}$$
Without Boltzmann’s constant, we have $S/k = \ln 52! \approx 156.4$.
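As a quick numerical check, the deck entropy can be computed directly. Here is a short Python sketch (using the log-gamma function to get $\ln 52!$ without forming the huge factorial as a float):

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

# Multiplicity of a shuffled deck: 52! distinct orderings,
# so ln(Omega) = ln(52!) = lgamma(53)
ln_omega = math.lgamma(53)
S = k_B * ln_omega  # entropy S = k ln(Omega)

print(f"ln(52!) = {ln_omega:.1f}")  # 156.4
print(f"S = {S:.2e} J/K")           # 2.16e-21 J/K
```

Using `math.lgamma` rather than `math.factorial` keeps everything in logarithms, which is the only practical approach once multiplicities reach thermodynamic sizes.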
Although playing cards aren’t made of an Einstein solid, the multiplicity of the macrostate in which thermal energy is exchanged among the cards will be something of a similar order. The approximate multiplicity in the high temperature case ($q \gg N$) for an Einstein solid with $N$ oscillators and $q$ energy quanta is

$$\Omega \approx \left(\frac{eq}{N}\right)^{N}$$
For $N$ on the order of $10^{23}$, $\ln\Omega \approx N\left[\ln\left(q/N\right)+1\right]$ is a very large number, so the thermal entropy of the cards is vastly greater than the entropy generated by shuffling the deck.
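The scale difference can be sketched numerically. In the snippet below the values $N = 10^{23}$ and $q = 100N$ are illustrative assumptions, chosen only to satisfy the high-temperature condition $q \gg N$:

```python
import math

N = 1e23      # oscillators (illustrative assumption)
q = 100 * N   # energy quanta, q >> N (high-temperature regime)

# ln(Omega) for Omega ~ (e q / N)^N, computed in log form to avoid overflow
ln_omega_thermal = N * (math.log(q / N) + 1)

# ln(52!) -- the entropy of shuffling, in units of k
ln_omega_shuffle = math.lgamma(53)

print(f"thermal: ln(Omega) ~ {ln_omega_thermal:.2e}")  # ~ 5.6e23
print(f"shuffle: ln(Omega) ~ {ln_omega_shuffle:.1f}")  # ~ 156.4
```

The thermal logarithm exceeds the shuffling one by more than twenty orders of magnitude, which is the point of the comparison.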
Example 2 Now consider two Einstein solids with $N_A = 300$, $N_B = 200$ and $q = q_A + q_B = 100$. The most likely macrostate will see the energy divided proportionately between the two solids, so $q_A = 60$ and $q_B = 40$. The multiplicity of this macrostate is given by

$$\Omega = \binom{q_A + N_A - 1}{q_A}\binom{q_B + N_B - 1}{q_B} = \binom{359}{60}\binom{239}{40} \approx 6.9 \times 10^{114}$$
The least likely macrostate would find all the energy in the smaller solid, so that $q_A = 0$ and $q_B = 100$. In that case

$$\Omega = \binom{q_B + N_B - 1}{q_B} = \binom{299}{100} \approx 2.8 \times 10^{81}$$
Over long time scales, the interaction between the two solids means that all microstates are accessible. In this case the multiplicity is

$$\Omega_{total} = \binom{q + N_A + N_B - 1}{q} = \binom{599}{100} \approx 9.3 \times 10^{115}$$
Thus the most probable macrostate of the divided system, with $S = k\ln\Omega \approx 264k$, has almost as much entropy as the whole system treated as a single solid with every microstate accessible, for which $S = k\ln\Omega_{total} \approx 267k$.
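These multiplicities are small enough to evaluate exactly with integer arithmetic. The sketch below uses Python's `math.comb` and the stars-and-bars formula $\Omega(N,q) = \binom{q+N-1}{q}$, and also verifies that summing over every possible division of the energy recovers the single-solid total:

```python
from math import comb

N_A, N_B, q = 300, 200, 100

def omega(N, q):
    """Multiplicity of an Einstein solid: C(q + N - 1, q)."""
    return comb(q + N - 1, q)

# Most probable macrostate: energy split in proportion to oscillator numbers
omega_max = omega(N_A, 60) * omega(N_B, 40)

# Least probable macrostate: all energy in the smaller solid
omega_min = omega(N_A, 0) * omega(N_B, 100)

# All microstates accessible: one combined solid of N_A + N_B oscillators
omega_total = omega(N_A + N_B, q)

# Sanity check: summing over every division of the energy recovers the total
assert sum(omega(N_A, qA) * omega(N_B, q - qA) for qA in range(q + 1)) == omega_total

print(f"{omega_max:.2e}")    # ~ 6.9e114
print(f"{omega_min:.2e}")    # ~ 2.8e81
print(f"{omega_total:.2e}")  # ~ 9.3e115
```

The internal consistency check is just the combinatorial identity that distributing $q$ quanta over $N_A + N_B$ oscillators can be decomposed by how many quanta land in solid A.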
Example 3 We now return to the example of two large, identical Einstein solids, each containing $N$ oscillators and sharing a total energy of $q = 2N$ quanta. For the state in which all microstates are allowed, we had

$$\Omega_{total} = \frac{2^{4N}}{\sqrt{8\pi N}}$$
If $N$ is macroscopic, say on the order of $10^{23}$, the entropy of this total state is

$$S_{total} = k\ln\Omega_{total} = k\left[4N\ln 2 - \tfrac{1}{2}\ln\left(8\pi N\right)\right] \approx 4Nk\ln 2$$

since the second term, of order $\ln N$, is utterly negligible compared with the first.
The most probable state, where the energy is equally divided between the two solids so that $q_A = q_B = N$, had a multiplicity of

$$\Omega_{max} = \frac{2^{4N}}{4\pi N}$$
so the entropy is

$$S_{max} = k\ln\Omega_{max} = k\left[4N\ln 2 - \ln\left(4\pi N\right)\right] \approx 4Nk\ln 2$$
Thus the entropy is essentially determined entirely by the $4Nk\ln 2$ factor, and it makes no difference whether the system is considered as a single system with all microstates possible or as two separate solids in their most probable configuration.
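The closed-form multiplicities can't be evaluated directly at macroscopic $N$, but the approximations can be checked against exact binomial coefficients at a modest size (the choice $N = 1000$ below is illustrative), working throughout with logarithms:

```python
import math

N = 1000  # modest illustrative size; the approximations hold for any large N

def ln_omega(n_osc, q):
    """ln of the exact Einstein-solid multiplicity C(q + n_osc - 1, q)."""
    return math.lgamma(q + n_osc) - math.lgamma(q + 1) - math.lgamma(n_osc)

# Combined solid: 2N oscillators sharing q = 2N quanta,
# versus the approximation 2^(4N) / sqrt(8 pi N)
exact_total = ln_omega(2 * N, 2 * N)
approx_total = 4 * N * math.log(2) - 0.5 * math.log(8 * math.pi * N)

# Most probable split: each solid holds N quanta,
# versus the approximation 2^(4N) / (4 pi N)
exact_max = 2 * ln_omega(N, N)
approx_max = 4 * N * math.log(2) - math.log(4 * math.pi * N)

# Both log discrepancies are tiny compared with 4N ln 2 ~ 2772.6
print(abs(exact_total - approx_total))
print(abs(exact_max - approx_max))
```

The discrepancies in the logarithms are of order $1/N$, confirming that both closed forms are accurate Stirling-type approximations.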
Over long times, the system is free to explore all the microstates included in $\Omega_{total}$, but at any given time, it is overwhelmingly likely to be at or near its most probable state. Thus over short time scales, we can say that the entropy is given by $S_{max} = k\ln\Omega_{max}$, but over long time scales it is given by $S_{total} = k\ln\Omega_{total}$, which is ever so slightly larger than $S_{max}$.

We can, however, picture a scenario in which we choose a time when the system is in its most probable macrostate (so the entropy is $S_{max}$) and then insert an impermeable barrier between the two solids so they can no longer exchange energy. In effect, this means that the system is stuck in its most probable state for all time, so its entropy will always be $S_{max}$. In practice, though, the difference between $S_{total}$ and $S_{max}$ is so small that it is utterly unmeasurable. Also, the mechanical process of inserting the barrier would probably increase the entropy of the surroundings by more than the decrease in entropy in the two solids. Thus it’s unlikely that the second law has been violated, and even if it were possible to insert the barrier with no additional entropy, the violation is so small it could never be measured.
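Just how unmeasurable the difference is can be sketched numerically. Writing $S_{total} = k[4N\ln 2 - \tfrac12\ln(8\pi N)]$ and $S_{max} = k[4N\ln 2 - \ln(4\pi N)]$ for the two entropies, their difference is $S_{total} - S_{max} = \tfrac{k}{2}\ln(2\pi N)$; taking $N = 10^{23}$ as a representative macroscopic value (an assumption for illustration):

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K
N = 1e23            # representative macroscopic oscillator count (assumption)

S_leading = 4 * N * k_B * math.log(2)  # dominant term of both entropies

# S_total - S_max = k [ ln(4 pi N) - (1/2) ln(8 pi N) ] = (k/2) ln(2 pi N)
dS = 0.5 * k_B * math.log(2 * math.pi * N)

print(f"S ~ {S_leading:.2f} J/K")          # a few J/K
print(f"S_total - S_max ~ {dS:.1e} J/K")   # ~ 4e-22 J/K
```

The difference is some twenty-two orders of magnitude below the entropy itself, which is why inserting the barrier produces no measurable violation of the second law.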