Entropy: the second law of thermodynamics and reversible & irreversible processes

Reference: Daniel V. Schroeder, An Introduction to Thermal Physics (Addison-Wesley, 2000), Problems 2.28–2.30.

We’ve seen that, by counting the number of microstates for each macrostate of a system and invoking the assumption that all microstates are equally probable, we can show that any system with a macroscopic number of molecules is overwhelmingly likely to be found at or very near its most probable macrostate. For an ideal gas, this state has the molecules distributed evenly over the available volume and the total energy distributed evenly over the population. Events such as all the molecules suddenly confining themselves to some smaller subvolume are essentially impossible.

Another way of looking at such systems is to say that if we start the system off in one of its less probable macrostates and allow it some time to settle down, it will almost invariably end up at or very near its most probable macrostate. In other words, the system will evolve towards the macrostate with the greatest number {\Omega_{max}} of microstates, that is, the highest multiplicity.

Since multiplicities of macroscopic collections of molecules are very large numbers, it’s usual to work with their logarithms. Since even the logarithms can be large (on the order of {10^{23}}), it’s conventional, in SI units, to multiply the logarithm by Boltzmann’s constant so that the final value is usually a ‘normal’ sized number. This combination is known as the entropy of a macrostate:

\displaystyle  \boxed{S\equiv k\ln\Omega} \ \ \ \ \ (1)

The units of {S} are thus the same as those of {k}, which are {\mbox{J K}^{-1}} (joules per kelvin).
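
To see why the numbers come out ‘normal’ sized, consider a multiplicity as enormous as {\Omega=10^{10^{23}}} (a value picked purely for illustration). Its entropy is an entirely ordinary number:

\displaystyle  S=k\ln10^{10^{23}}=\left(1.38\times10^{-23}\right)\left(10^{23}\right)\ln10\approx3.2\mbox{ J K}^{-1}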

In terms of entropy, we can state the second law of thermodynamics: entropy tends to increase. It’s a ‘law’ only in the sense that it arises from the assumption that all microstates of a given macrostate are equally probable. If, for some reason (divine intervention?), that were not true, then of course systems could end up in unlikely states on a regular basis. However, the second law does seem to hold for all everyday processes, especially those involved in keeping our modern technological society running.

The second law leads to a couple of important definitions in thermodynamics. Any process which creates new entropy is said to be irreversible, since reversing the process would result in a decrease in entropy, violating the second law. Any process which leaves the total amount of entropy in the universe unchanged is therefore, in principle, reversible.

If we have a way of calculating or estimating {\Omega}, then clearly {S} is easy to find, so we’ll give a few examples here.

Example 1 In a standard pack of playing cards there are 52 different cards, so they can be arranged in {52!=8.07\times10^{67}} ways. The size of this number is why it’s highly unlikely that any card game that relies on dealing cards from a shuffled deck will ever repeat itself. The entropy of a shuffled deck is therefore

\displaystyle  S=1.38\times10^{-23}\times\ln52!\approx2.16\times10^{-21}\mbox{ J K}^{-1} \ \ \ \ \ (2)

Without Boltzmann’s constant, the dimensionless entropy is {S/k=\ln52!=156.4}.
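
As a quick numerical check, here is a minimal Python sketch (the variable names are my own) that reproduces these values using exact integer arithmetic:

```python
import math

k = 1.380649e-23                          # Boltzmann's constant in J/K

ln_omega = math.log(math.factorial(52))   # ln(52!) for a shuffled deck
S = k * ln_omega

print(ln_omega)   # ~156.4
print(S)          # ~2.16e-21 J/K
```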

Although playing cards aren’t made of Einstein solids, the multiplicity of the macrostate in which thermal energy is exchanged among the molecules of the cards will be of a similar order to that of an Einstein solid. The approximate multiplicity in the high-temperature case for an Einstein solid with {N} oscillators and {q\gg N} energy quanta is

\displaystyle  \Omega\approx\left(\frac{qe}{N}\right)^{N} \ \ \ \ \ (3)

For {N} on the order of {10^{23}}, {\Omega} is a very large number, so the thermal entropy of the cards is vastly greater than the entropy generated by shuffling the deck.
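
To get a feel for how good the approximation (3) is, here’s a short Python comparison for a small illustrative solid (the values of {N} and {q} are my own choices, not from the text):

```python
import math

N, q = 50, 5000    # illustrative values with q >> N

# Exact multiplicity of an Einstein solid: Omega = C(q + N - 1, q)
ln_exact = math.log(math.comb(q + N - 1, q))

# High-temperature approximation (3): ln(Omega) ~ N * (ln(q/N) + 1)
ln_approx = N * (math.log(q / N) + 1)

print(ln_exact, ln_approx)   # ~273 vs ~280; the fractional error shrinks as N grows
```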

Example 2 Now consider two Einstein solids with {N_{A}=300}, {N_{B}=200} and {q=100}. The most likely macrostate will see the energy divided proportionately between the two solids, so {q_{A}=60} and {q_{B}=40}. The multiplicity of this macrostate is given by

\displaystyle  \begin{aligned}
\Omega &= \binom{q_{A}+N_{A}-1}{q_{A}}\binom{q_{B}+N_{B}-1}{q_{B}} \ \ \ \ \ (4)\\
&= \binom{359}{60}\binom{239}{40} \ \ \ \ \ (5)\\
&= 6.866\times10^{114} \ \ \ \ \ (6)\\
\ln\Omega &= 264.4 \ \ \ \ \ (7)
\end{aligned}

The least likely macrostate would find all the energy in the smaller solid, so that {q_{A}=0} and {q_{B}=100}. In that case

\displaystyle  \begin{aligned}
\Omega &= \binom{q_{A}+N_{A}-1}{q_{A}}\binom{q_{B}+N_{B}-1}{q_{B}} \ \ \ \ \ (8)\\
&= \binom{299}{0}\binom{299}{100} \ \ \ \ \ (9)\\
&= 2.772\times10^{81} \ \ \ \ \ (10)\\
\ln\Omega &= 187.5 \ \ \ \ \ (11)
\end{aligned}

Over long time scales, the interaction between the two solids means that all microstates are accessible. In this case the multiplicity is

\displaystyle  \begin{aligned}
\Omega &= \binom{q_{A}+q_{B}+N_{A}+N_{B}-1}{q_{A}+q_{B}} \ \ \ \ \ (12)\\
&= \binom{599}{100} \ \ \ \ \ (13)\\
&= 9.26\times10^{115} \ \ \ \ \ (14)\\
\ln\Omega &= 267.0 \ \ \ \ \ (15)
\end{aligned}

Thus the most probable macrostate of the divided system has almost as much entropy as the whole system with all microstates accessible.
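
All three multiplicities are easy to verify numerically. Here’s a minimal Python sketch (math.comb requires Python 3.8+):

```python
import math

NA, NB = 300, 200
q = 100

def omega(N, q):
    """Multiplicity of an Einstein solid with N oscillators and q quanta."""
    return math.comb(q + N - 1, q)

most_probable  = omega(NA, 60) * omega(NB, 40)   # energy split in proportion to size
least_probable = omega(NA, 0) * omega(NB, 100)   # all energy in the smaller solid
total          = omega(NA + NB, q)               # all microstates accessible

for w in (most_probable, least_probable, total):
    print(f"{w:.3e}   ln = {math.log(w):.1f}")
# ~6.866e+114 (ln 264.4), ~2.772e+81 (ln 187.5), ~9.260e+115 (ln 267.0)
```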

Example 3 Let’s return to the example of two large, identical Einstein solids, each containing {N} oscillators, with a total of {q=2N} energy quanta shared between them. For the state in which all microstates are allowed, we had

\displaystyle  \Omega_{total}\approx\frac{2^{4N}}{\sqrt{8\pi N}} \ \ \ \ \ (16)

If {N=10^{23}}, the entropy of this total state is

\displaystyle  \begin{aligned}
S_{total} &= k\ln\Omega_{total} \ \ \ \ \ (17)\\
&= \left(1.38\times10^{-23}\right)\left(4\times10^{23}\ln2-\frac{1}{2}\ln\left(8\pi\times10^{23}\right)\right) \ \ \ \ \ (18)\\
&= \left(1.38\times10^{-23}\right)\left(2.773\times10^{23}-28.1\right) \ \ \ \ \ (19)\\
&\approx \left(1.38\times10^{-23}\right)\left(2.773\times10^{23}\right) \ \ \ \ \ (20)\\
&= 3.83\mbox{ J K}^{-1} \ \ \ \ \ (21)
\end{aligned}

The most probable state, where the energy is equally divided between the two solids, had a multiplicity of

\displaystyle  \Omega_{mp}\approx\frac{2^{4N}}{4\pi N} \ \ \ \ \ (22)

so the entropy is

\displaystyle  \begin{aligned}
S_{mp} &= k\ln\Omega_{mp} \ \ \ \ \ (23)\\
&= \left(1.38\times10^{-23}\right)\left(4\times10^{23}\ln2-\ln\left(4\pi\times10^{23}\right)\right) \ \ \ \ \ (24)\\
&\approx 3.83\mbox{ J K}^{-1} \ \ \ \ \ (25)
\end{aligned}

Thus the entropy is determined almost entirely by the {2^{4N}} factor, and it makes essentially no difference whether the system is treated as a single system with all microstates possible or as two separate solids in their most probable configuration.
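
A short Python sketch (again, my own check, not from the text) evaluates both entropies from the approximations (16) and (22):

```python
import math

k = 1.380649e-23              # Boltzmann's constant in J/K
N = 1e23

# ln(Omega_total) = 4N ln 2 - (1/2) ln(8 pi N), from (16)
ln_omega_total = 4 * N * math.log(2) - 0.5 * math.log(8 * math.pi * N)
# ln(Omega_mp) = 4N ln 2 - ln(4 pi N), from (22)
ln_omega_mp    = 4 * N * math.log(2) - math.log(4 * math.pi * N)

print(k * ln_omega_total)     # ~3.83 J/K
print(k * ln_omega_mp)        # ~3.83 J/K

# The logarithmic corrections (~28 and ~55) are so small compared with
# 4N ln 2 ~ 2.77e23 that they vanish below double precision -- itself a
# nice illustration of how close S_mp is to S_total.
```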

Over long times, the system is free to explore all the microstates included in {\Omega_{total}}, but at any given time, it is overwhelmingly likely to be at or near its most probable state. Thus over short time scales, we can say that the entropy is given by {S_{mp}}, but over long time scales it is given by {S_{total}}, which is ever so slightly larger than {S_{mp}}.

We can, however, picture a scenario in which we choose a time when the system is in its most probable macrostate (so the entropy is {S_{mp}}) and then insert an impermeable barrier between the two solids so they can no longer exchange energy. In effect, this means that the system is stuck in its most probable state for all time, so its entropy will always be {S_{mp}}. In practice, though, the difference between {S_{mp}} and {S_{total}} is so small that it is utterly unmeasurable. Also, the mechanical process of inserting the barrier would probably increase the entropy of the surroundings by more than the decrease in entropy in the two solids. Thus it’s unlikely that the second law has been violated, and even if it were possible to insert the barrier with no additional entropy, the violation is so small it could never be measured.
