Third law of thermodynamics; residual entropy

Reference: Daniel V. Schroeder, An Introduction to Thermal Physics (Addison-Wesley, 2000), Problem 3.9.

The entropy is related to temperature by

\displaystyle \frac{1}{T}=\left(\frac{\partial S}{\partial U}\right)_{N,V} \ \ \ \ \ (1)

Using the chain rule, and holding {N} and {V} constant, we can express the change in entropy due to a change in temperature as

\displaystyle dS=\frac{dU}{T}=\left(\frac{\partial U}{\partial T}\right)_{N,V}\frac{dT}{T}=C_{V}\frac{dT}{T} \ \ \ \ \ (2)

where {C_{V}} is the heat capacity at constant volume:

\displaystyle C_{V}=\left(\frac{\partial U}{\partial T}\right)_{N,V} \ \ \ \ \ (3)

If we know the heat capacity {C_{V}\left(T\right)} over the relevant temperature range, we can therefore find the change in entropy for a finite change in temperature by integration:

\displaystyle \Delta S=S_{f}-S_{i}=\int_{T_{i}}^{T_{f}}\frac{C_{V}\left(T\right)}{T}dT \ \ \ \ \ (4)
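
As a concrete illustration of Eq. (4), here is a minimal numerical sketch in Python. The power-law form chosen for {C_{V}\left(T\right)} and its coefficient are made up purely for illustration, not data for any real material.

```python
from scipy.integrate import quad

def C_V(T):
    """Hypothetical heat capacity in J/K: a T^3 power law, for illustration only."""
    a = 1.0e-4  # made-up coefficient, J/K^4
    return a * T**3

T_i, T_f = 1.0, 300.0  # initial and final temperatures in kelvin

# Delta S = integral from T_i to T_f of C_V(T)/T dT, as in Eq. (4)
delta_S, _ = quad(lambda T: C_V(T) / T, T_i, T_f)
print(f"Delta S = {delta_S:.3f} J/K")
```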

The total entropy in a system at temperature {T_{f}} could theoretically be found by setting {T_{i}=0} in the integral

\displaystyle S_{f}-S\left(0\right)=\int_{0}^{T_{f}}\frac{C_{V}\left(T\right)}{T}dT \ \ \ \ \ (5)

In theory, at absolute zero any system should be in its (presumably) unique lowest-energy state, so the multiplicity of the ground state is 1, meaning that {S\left(0\right)=0}, and this integral does in fact give the actual entropy of a system at temperature {T_{f}}. For this integral to be finite (and positive), {C_{V}} must go to zero as {T\rightarrow0} fast enough that the integral doesn't diverge at its lower limit; that is, we must have {C_{V}\left(T\right)\propto T^{a}} with {a>0} as {T\rightarrow0}. Either of these conditions ({S\left(0\right)=0} or {C_{V}\rightarrow0}) is a statement of the third law of thermodynamics, which says that the entropy of any system goes to zero at absolute zero.
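
As a quick check of this convergence condition, consider a solid that obeys the Debye {T^{3}} law at low temperatures, so that {C_{V}=aT^{3}} for some constant {a}. Then

\displaystyle S\left(T\right)=\int_{0}^{T}\frac{aT^{\prime3}}{T^{\prime}}dT^{\prime}=\int_{0}^{T}aT^{\prime2}dT^{\prime}=\frac{aT^{3}}{3}

which is finite and goes to zero as {T\rightarrow0}, as the third law requires.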

In practice, as a substance is cooled, its molecular configuration can get frozen into one of several possible ground states, so that there is a residual entropy even when {T=0\mbox{ K}}.

Example Carbon monoxide molecules are linear, and in solid form they can line up in either of two orientations: CO or OC. Thus at absolute zero the collection of molecules can be considered as a frozen-in matrix of randomly oriented molecules, so for a sample of {N} molecules there are {2^{N}} possible configurations. For a mole, the residual entropy is therefore

\displaystyle S_{res}=k\ln2^{6.02\times10^{23}}=\left(1.38\times10^{-23}\right)\left(6.02\times10^{23}\right)\ln2=5.76\mbox{ J K}^{-1} \ \ \ \ \ (6)
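
As a quick arithmetic check of Eq. (6) in Python, with Boltzmann's constant and Avogadro's number hard-coded to the rounded values used above:

```python
import math

k = 1.38e-23    # Boltzmann's constant, J/K
N_A = 6.02e23   # Avogadro's number (molecules per mole)

# Residual entropy of one mole: S_res = k ln(2^N_A) = N_A k ln 2
S_res = N_A * k * math.log(2)
print(f"S_res = {S_res:.2f} J/K")  # prints 5.76 J/K, matching Eq. (6)
```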

Comments

  1. Nikos Christodoulou

    I presume in the example that the multiplicity is 2^N if one uses the number of combinations of N things taken N/2 at a time, which is N!/[(N/2)!]^2. If one goes back to Problem 2.23(a) and uses Stirling's approximation, this is roughly equal to 2^N sqrt(2/πN). But the square root factor is very small compared to the factor 2^N, and therefore Ω(N,N/2) ~ 2^N. I assume this problem is similar to having N elementary dipoles with half pointing up and half pointing down. Please let me know if this is correct.

    1. gwrowe (post author)

      I'm not sure what you mean. In this case, each CO molecule can 'freeze' into one of two orientations, so the multiplicity (the number of possible microstates) is just {2^{N}}, and the entropy is exactly what is given in Eq. (6). No need for any approximations.

  2. Nikos Christodoulou

    All I wanted to ask (apparently not clearly) is whether the example of the CO and OC molecules is similar to the examples in the book with 2 microstates, such as the three coins in Section 2.1 (e.g., Table 2.1) or the 2-state paramagnet in the same section (e.g., Fig. 2.1). I was just wondering about the justification of the multiplicity being 2^N and how it is derived.

    1. gwrowe (post author)

      For any system composed of {N} units, each of which can be in 2 states, the number of possible microstates is always just {2^{N}}. There isn't really much of a derivation: you just say that unit 1 can be in 2 states, for each of which unit 2 can be in 2 states, so there are {2^{2}=4} possible states for the first 2 units, and so on.

      What confused me about your first comment was the mention of {\frac{N}{2}}. In the example with CO molecules, we're not requiring half of them to be in one state and half in the other. Rather, we're just concerned with the total number of possible microstates for a collection of {N} molecules, and that's {2^{N}}, from which the entropy follows exactly, without any factorials or Stirling's approximation.

  3. Nikos Christodoulou

    Thanks. Yes, my example mentioning Problem 2.23(a) was not the best; I should not have used it. I appreciate your taking the time to explain. After all, I am new to these concepts.
