Third law of thermodynamics explained

The third law of thermodynamics states that the entropy of a closed system at thermodynamic equilibrium approaches a constant value when its temperature approaches absolute zero. This constant value cannot depend on any other parameters characterizing the system, such as pressure or applied magnetic field. At absolute zero (zero kelvins) the system must be in a state with the minimum possible energy.

Entropy is related to the number of accessible microstates, and there is typically one unique state (called the ground state) with minimum energy.[1] In such a case, the entropy at absolute zero will be exactly zero. If the system does not have a well-defined order (if its order is glassy, for example), then there may remain some finite entropy as the system is brought to very low temperatures, either because the system becomes locked into a configuration with non-minimal energy or because the minimum energy state is non-unique. The constant value is called the residual entropy of the system.[2]

Formulations

The third law has many formulations, some more general than others, some equivalent, and some neither more general nor equivalent.[3]

The Planck statement applies only to perfect crystalline substances:

As temperature falls to zero, the entropy of any pure crystalline substance tends to a universal constant.

That is,

$\lim_{T \to 0} S = S_0,$

where $S_0$ is a universal constant that applies for all possible crystals, of all possible sizes, in all possible external constraints. So it can be taken as zero, giving

$\lim_{T \to 0} S = 0.$

The Nernst statement concerns thermodynamic processes at a fixed, low temperature, for condensed systems, which are liquids and solids:

The entropy change associated with any condensed system undergoing a reversible isothermal process approaches zero as the temperature at which it is performed approaches 0 K.
That is,

$\lim_{T \to 0} \left[ S(T, X_1) - S(T, X_2) \right] = 0.$

Or equivalently,

At absolute zero, the entropy change becomes independent of the process path.
That is, $\forall x,\ \lim_{T \to 0} |S(T, x) - S(T, x + \Delta x)| = 0$, where $\Delta x$ represents a change in the state variable $x$.

The unattainability principle of Nernst:[4]

It is impossible for any process, no matter how idealized, to reduce the entropy of a system to its absolute-zero value in a finite number of operations.[5]
This principle implies that cooling a system to absolute zero would require an infinite number of steps or an infinite amount of time.

The statement in adiabatic accessibility:

It is impossible to start from a state of positive temperature, and adiabatically reach a state with zero temperature.

The Einstein statement:

The entropy of any substance approaches a finite value as the temperature approaches absolute zero.
That is, $\forall x,\ \lim_{T \to 0} S(T, x) = S_0(x)$, where $S$ is the entropy, the zero-point entropy $S_0(x)$ is finite-valued, $T$ is the temperature, and $x$ represents other relevant state variables.

This implies that the heat capacity $C(T, x)$ of a substance must (uniformly) vanish at absolute zero, as otherwise the entropy

$S = \int_0^{T_1} \frac{C(T, x)}{T} \, dT$

would diverge.

There is also a formulation as the impossibility of "perpetual motion machines of the third kind".

History

The third law was developed by chemist Walther Nernst during the years 1906 to 1912 and is therefore often referred to as the Nernst heat theorem, or sometimes the Nernst-Simon heat theorem[6] to include the contribution of Nernst's doctoral student Francis Simon. The third law of thermodynamics states that the entropy of a system at absolute zero is a well-defined constant. This is because a system at zero temperature exists in its ground state, so that its entropy is determined only by the degeneracy of the ground state.

In 1912 Nernst stated the law thus: "It is impossible for any procedure to lead to the isotherm $T = 0$ in a finite number of steps."[7]

An alternative version of the third law of thermodynamics was enunciated by Gilbert N. Lewis and Merle Randall in 1923:

If the entropy of each element in some (perfect) crystalline state be taken as zero at the absolute zero of temperature, every substance has a finite positive entropy; but at the absolute zero of temperature the entropy may become zero, and does so become in the case of perfect crystalline substances.

This version states not only that $\Delta S$ will reach zero at 0 K, but that $S$ itself will also reach zero, as long as the crystal has a ground state with only one configuration. Some crystals form defects which cause a residual entropy. This residual entropy disappears when the kinetic barriers to transitioning to one ground state are overcome.[8]

With the development of statistical mechanics, the third law of thermodynamics (like the other laws) changed from a fundamental law (justified by experiments) to a derived law (derived from even more basic laws). The basic law from which it is primarily derived is the statistical-mechanics definition of entropy for a large system:

$S - S_0 = k_\text{B} \ln \Omega,$

where $S$ is entropy, $k_\text{B}$ is the Boltzmann constant, and $\Omega$ is the number of microstates consistent with the macroscopic configuration. The counting of states is from the reference state of absolute zero, which corresponds to the entropy of $S_0$.

Explanation

In simple terms, the third law states that the entropy of a perfect crystal of a pure substance approaches zero as the temperature approaches zero. The alignment of a perfect crystal leaves no ambiguity as to the location and orientation of each part of the crystal. As the energy of the crystal is reduced, the vibrations of the individual atoms are reduced to nothing, and the crystal becomes the same everywhere.

The third law provides an absolute reference point for the determination of entropy at any other temperature. The entropy of a closed system, determined relative to this zero point, is then the absolute entropy of that system. Mathematically, the absolute entropy of any system at zero temperature is the natural log of the number of ground states times the Boltzmann constant $k_\text{B} = 1.38 \times 10^{-23}$ J K$^{-1}$.

The entropy of a perfect crystal lattice as defined by Nernst's theorem is zero provided that its ground state is unique, because $\ln(1) = 0$. If the system is composed of one billion atoms that are all alike and lie within the matrix of a perfect crystal, the number of combinations of one billion identical things taken one billion at a time is $\Omega = 1$. Hence:

$S - S_0 = k_\text{B} \ln \Omega = k_\text{B} \ln 1 = 0.$

The difference is zero; hence the initial entropy $S_0$ can be any selected value, so long as all other such calculations include that as the initial entropy. As a result, the initial entropy value of zero is selected for convenience.

$S - S_0 = S - 0 = 0$

$S = 0$

Example: Entropy change of a crystal lattice heated by an incoming photon

Suppose a system consists of a crystal lattice with volume $V$ of $N$ identical atoms at $T = 0$ K, and an incoming photon of wavelength $\lambda$ and energy $\varepsilon$.

Initially, there is only one accessible microstate:

$S_0 = k_\text{B} \ln \Omega = k_\text{B} \ln 1 = 0.$

Let us assume the crystal lattice absorbs the incoming photon. There is a unique atom in the lattice that interacts with and absorbs this photon. So after absorption, there are $N$ possible microstates accessible by the system, each corresponding to one excited atom while the other atoms remain at ground state.

The entropy, energy, and temperature of the closed system rise and can be calculated. The entropy change is

$\Delta S = S - S_0 = k_\text{B} \ln \Omega.$

From the second law of thermodynamics:

$\Delta S = S - S_0 = \frac{\delta Q}{T}.$

Hence

$\Delta S = S - S_0 = k_\text{B} \ln \Omega = \frac{\delta Q}{T}.$

Calculating the entropy change:

$S - 0 = k_\text{B} \ln N = 1.38 \times 10^{-23} \times \ln(3 \times 10^{22}) = 70 \times 10^{-23} \,\mathrm{J\,K^{-1}}.$

We assume $N = 3 \times 10^{22}$ and $\lambda = 1$ cm. The energy change of the system as a result of absorbing the single photon whose energy is $\varepsilon$:

$\delta Q = \varepsilon = \frac{hc}{\lambda} = \frac{6.62 \times 10^{-34}\,\mathrm{J\,s} \times 3 \times 10^{10}\,\mathrm{cm\,s^{-1}}}{1\,\mathrm{cm}} = 2 \times 10^{-23} \,\mathrm{J}.$

The temperature of the closed system rises by

$T = \frac{\varepsilon}{\Delta S} = \frac{2 \times 10^{-23}\,\mathrm{J}}{70 \times 10^{-23}\,\mathrm{J\,K^{-1}}} = 0.02857 \,\mathrm{K}.$

This can be interpreted as the average temperature of the system over the range $0 < S < 70 \times 10^{-23} \,\mathrm{J\,K^{-1}}$.[9] A single atom was assumed to absorb the photon, but the temperature and entropy change characterize the entire system.
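The arithmetic above can be checked with a few lines of Python (a quick sanity check of the quoted values, using the same rounded constants as the text):

```python
import math

k_B = 1.38e-23   # Boltzmann constant, J/K
h = 6.62e-34     # Planck constant, J s
c = 3.0e8        # speed of light, m/s

N = 3e22         # number of atoms in the lattice
lam = 1e-2       # photon wavelength: 1 cm, in metres

# Entropy change after absorption: N accessible microstates
dS = k_B * math.log(N)   # ~7.1e-22 J/K, the text's 70e-23

# Photon energy delta_Q = h*c/lambda
eps = h * c / lam        # ~2e-23 J

# Effective (average) temperature over the entropy range
T = eps / dS             # ~0.028 K
print(dS, eps, T)
```

The small discrepancy with the quoted 0.02857 K comes from the text rounding $\varepsilon$ up to $2 \times 10^{-23}$ J before dividing.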

Systems with non-zero entropy at absolute zero

An example of a system that does not have a unique ground state is one whose net spin is a half-integer, for which time-reversal symmetry gives two degenerate ground states. For such systems, the entropy at zero temperature is at least $k_\text{B} \ln 2$ (which is negligible on a macroscopic scale). Some crystalline systems exhibit geometrical frustration, where the structure of the crystal lattice prevents the emergence of a unique ground state. Ground-state helium (unless under pressure) remains liquid.

Glasses and solid solutions retain significant entropy at 0 K, because they are large collections of nearly degenerate states, in which they become trapped out of equilibrium. Another example of a solid with many nearly-degenerate ground states, trapped out of equilibrium, is ice Ih, which has "proton disorder".

For the entropy at absolute zero to be zero, the magnetic moments of a perfectly ordered crystal must themselves be perfectly ordered; from an entropic perspective, this can be considered to be part of the definition of a "perfect crystal". Only ferromagnetic, antiferromagnetic, and diamagnetic materials can satisfy this condition. However, ferromagnetic materials do not, in fact, have zero entropy at zero temperature, because the spins of the unpaired electrons are all aligned and this gives a ground-state spin degeneracy. Materials that remain paramagnetic at 0 K, by contrast, may have many nearly degenerate ground states (for example, in a spin glass), or may retain dynamic disorder (a quantum spin liquid).

Consequences

Absolute zero

The third law is equivalent to the statement that

It is impossible by any procedure, no matter how idealized, to reduce the temperature of any closed system to zero temperature in a finite number of finite operations.[10]

The reason that $T = 0$ cannot be reached according to the third law is explained as follows: Suppose that the temperature of a substance can be reduced in an isentropic process by changing the parameter $X$ from $X_2$ to $X_1$. One can think of a multistage nuclear demagnetization setup where a magnetic field is switched on and off in a controlled way.[11] If there were an entropy difference at absolute zero, $T = 0$ could be reached in a finite number of steps. However, at $T = 0$ there is no entropy difference, so an infinite number of steps would be needed. The process is illustrated in Fig. 1.

Example: magnetic refrigeration

To be concrete, we imagine that we are refrigerating magnetic material. Suppose we have a large bulk of paramagnetic salt and an adjustable external magnetic field in the vertical direction.

Let the parameter $X$ represent the external magnetic field. At the same temperature, if the external magnetic field is strong, the internal atoms in the salt strongly align with the field, so the disorder (entropy) decreases. Therefore, in Fig. 1, the curve for $X_1$ is the curve for lower magnetic field, and the curve for $X_2$ is the curve for higher magnetic field.

The refrigeration process repeats the following two steps:

  1. Isothermal magnetization. The system starts with magnetic field $X_1$ and temperature $T$. We divide the chunk into two parts: a large part playing the role of "environment", and a small part playing the role of "system". We slowly increase the magnetic field on the system to $X_2$, but keep the magnetic field constant on the environment. The atoms in the system lose directional degrees of freedom (DOF), and the energy in the directional DOF is squeezed out into the vibrational DOF. This makes the system slightly hotter; it then loses thermal energy to the environment, remaining at the same temperature $T$.
  2. Adiabatic demagnetization. The system is thermally isolated from the environment, and the magnetic field on it is slowly decreased to $X_1$. This frees up the directional DOF, which absorb some energy from the vibrational DOF. The effect is that the system has the same entropy but reaches a lower temperature $T' < T$.

At every two-step cycle of the process, the mass of the system decreases, as we discard more and more salt as the "environment". However, if the equations of state for this salt were as shown in Fig. 1 (left), with a nonzero entropy difference remaining at absolute zero, then we could start with a large but finite amount of salt and end up with a small piece of salt that has $T = 0$.
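This step-counting argument can be illustrated with a toy model (entirely hypothetical linear entropy curves, chosen only to make the geometry of Fig. 1 concrete): take $S_{X_1}(T) = s_0 + a_1 T$ and $S_{X_2}(T) = a_2 T$, so $s_0$ is the entropy gap between the two curves at absolute zero. Each cycle moves the system isothermally from curve $X_1$ to curve $X_2$, then back to curve $X_1$ at constant entropy:

```python
def cooling_steps(T0, s0, a1=2.0, a2=1.0, max_steps=1000):
    """Iterate idealized magnetization/demagnetization cycles.

    Toy entropy curves (not real salt data):
      low field:  S1(T) = s0 + a1*T
      high field: S2(T) = a2*T
    A cycle ends back on curve S1 at the temperature T' solving
      s0 + a1*T' = a2*T   (constant-entropy demagnetization),
    i.e. T' = (a2*T - s0) / a1.
    Returns (steps_taken, final_temperature).
    """
    T = T0
    for n in range(1, max_steps + 1):
        T = max((a2 * T - s0) / a1, 0.0)
        if T == 0.0:
            return n, T          # absolute zero reached
    return max_steps, T          # still above zero

# Third law holds (s0 = 0): T shrinks geometrically, never reaching 0.
print(cooling_steps(1.0, s0=0.0))
# Hypothetical violation (s0 > 0): absolute zero in finitely many steps.
print(cooling_steps(1.0, s0=0.1))
```

With $s_0 = 0$ the temperature only ever halves, which is exactly the "infinite number of steps" conclusion of the unattainability principle.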

Specific heat

A non-quantitative description of the third law that Nernst gave at the very beginning was simply that the specific heat of a material can always be made zero by cooling it down far enough.[12] A modern, quantitative analysis follows.

Suppose that the heat capacity of a sample in the low-temperature region has the form of a power law $C(T, X) = C_0 T^\alpha$ asymptotically as $T \to 0$, and we wish to find which values of $\alpha$ are compatible with the third law. We have

$\Delta S = \int_{T_0}^{T_1} \frac{C(T, X)}{T} \, dT = \frac{C_0}{\alpha} \left( T_1^\alpha - T_0^\alpha \right).$

By the discussion of the third law above, this integral must be bounded as $T_0 \to 0$, which is only possible if $\alpha > 0$. So the heat capacity must go to zero at absolute zero,

$\lim_{T \to 0} C(T, X) = 0,$

if it has the form of a power law. The same argument shows that it cannot be bounded below by a positive constant, even if we drop the power-law assumption.
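The boundedness condition can be checked numerically (a minimal sketch; the function simply evaluates the entropy integral for a power-law heat capacity in closed form, with $\alpha = 0$ handled as the logarithmic limit):

```python
import math

def delta_S(alpha, T0, T1=1.0, C0=1.0):
    """Entropy change from T0 to T1 for C(T) = C0 * T**alpha,
    i.e. the integral of C0 * T**(alpha - 1) dT in closed form."""
    if alpha == 0:
        return C0 * math.log(T1 / T0)   # divergent as T0 -> 0
    return (C0 / alpha) * (T1**alpha - T0**alpha)

# alpha > 0: the entropy change stays bounded as T0 -> 0
for T0 in (1e-3, 1e-6, 1e-12):
    print(delta_S(1.5, T0))   # converges to C0/alpha = 2/3

# alpha = 0 (constant heat capacity): unbounded growth
for T0 in (1e-3, 1e-6, 1e-12):
    print(delta_S(0, T0))     # grows like -ln(T0)
```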

On the other hand, the molar specific heat at constant volume of a monatomic classical ideal gas, such as helium at room temperature, is given by $C_V = \tfrac{3}{2} R$, with $R$ the molar ideal gas constant. But clearly a constant heat capacity does not satisfy the condition $\alpha > 0$. That is, a gas with a constant heat capacity all the way to absolute zero violates the third law of thermodynamics. We can verify this more fundamentally by substituting $C_V$ into the entropy integral, which yields

$\Delta S = \frac{3}{2} R \ln \frac{T_1}{T_0}.$

In the limit $T_0 \to 0$ this expression diverges, again contradicting the third law of thermodynamics.

The conflict is resolved as follows: At a certain temperature the quantum nature of matter starts to dominate the behavior. Fermi particles follow Fermi–Dirac statistics and Bose particles follow Bose–Einstein statistics. In both cases the heat capacity at low temperatures is no longer temperature independent, even for ideal gases. For Fermi gases,

$C_V = \frac{\pi^2}{2} R \frac{T}{T_F},$

with the Fermi temperature $T_F$ given by

$T_F = \frac{1}{8\pi^2} \left( \frac{3\pi^2 N_A}{V_m} \right)^{2/3} \frac{h^2 N_A^2}{M R}.$

Here $N_A$ is the Avogadro constant, $V_m$ the molar volume, and $M$ the molar mass.

For Bose gases,

$C_V = 1.93\, R \left( \frac{T}{T_B} \right)^{3/2},$

with $T_B$ given by

$T_B = \frac{1}{11.9} \left( \frac{N_A}{V_m} \right)^{2/3} \frac{h^2 N_A^2}{M R}.$

The specific heats given by these two expressions both satisfy the condition $\alpha > 0$; indeed, they are power laws with $\alpha = 1$ and $\alpha = 3/2$ respectively.

Even within a purely classical setting, the density of a classical ideal gas at fixed particle number and pressure becomes arbitrarily high as $T$ goes to zero, so the interparticle spacing goes to zero. The assumption of non-interacting particles presumably breaks down when they are sufficiently close together, so the value of $C_V$ gets modified away from its ideal constant value.

Vapor pressure

The only liquids near absolute zero are 3He and 4He. Their heat of evaporation has a limiting value given by

$L = L_0 + CT,$

with $L_0$ and $C$ constant. If we consider a container partly filled with liquid and partly with gas, the entropy of the liquid–gas mixture is $S(T) = S_l(T) + x\,L(T)/T$, where $S_l(T)$ is the entropy of the liquid and $x$ is the gas fraction; the evaporation term $L/T$ does not vanish as $T \to 0$, in apparent conflict with the third law. Nature resolves this paradox: below about 100 mK the vapor pressure, of order $10^{-31}$ mmHg, is so low that the gas density is lower than the best vacuum in the universe. In other words, below 100 mK there is simply no gas above the liquid.[13]

Miscibility

If liquid helium with mixed 3He and 4He were cooled to absolute zero, the liquid would have to have zero entropy. This means either that they are ordered perfectly as a mixed liquid, which is impossible for a liquid, or that they fully separate out into two layers of pure liquid. This is precisely what happens.

For example, if a solution with 3 3He atoms to every 2 4He atoms were cooled, the separation would start at 0.9 K, with each layer purifying more and more, until at absolute zero the upper layer becomes purely 3He and the lower layer purely 4He.

Surface tension

Let $\sigma$ be the surface tension of the liquid; then the entropy per unit area is $-d\sigma/dT$. So if a liquid can exist down to absolute zero, then, since its entropy is constant no matter its shape at absolute zero, its entropy per unit area must converge to zero. That is, its surface tension becomes constant at low temperatures. In particular, the surface tension of 3He is well approximated by

$\sigma = \sigma_0 - b T^2$

for some parameters $\sigma_0, b$.[14]

Latent heat of melting

The melting curves of 3He and 4He both extend down to absolute zero at finite pressure. At the melting pressure, liquid and solid are in equilibrium. The third law demands that the entropies of the solid and liquid are equal at $T = 0$. As a result, the latent heat of melting is zero, and the slope of the melting curve extrapolates to zero, as a consequence of the Clausius–Clapeyron equation.
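Written out, the Clausius–Clapeyron argument is (a short derivation sketch; $\Delta S_m$ and $\Delta V_m$ denote the molar entropy and volume differences between liquid and solid):

```latex
\frac{dp}{dT}
  = \frac{\Delta S_m}{\Delta V_m}
  = \frac{S_m^{\mathrm{liquid}} - S_m^{\mathrm{solid}}}
         {V_m^{\mathrm{liquid}} - V_m^{\mathrm{solid}}}
  \xrightarrow[T \to 0]{} 0,
```

since the third law forces $\Delta S_m \to 0$ while $\Delta V_m$ remains finite.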

Thermal expansion coefficient

The thermal expansion coefficient is defined as

$\alpha_V = \frac{1}{V_m} \left( \frac{\partial V_m}{\partial T} \right)_p.$

With the Maxwell relation

$\left( \frac{\partial V_m}{\partial T} \right)_p = - \left( \frac{\partial S_m}{\partial p} \right)_T$

and the Nernst statement with $X = p$, it is shown that

$\alpha_V = - \frac{1}{V_m} \left( \frac{\partial S_m}{\partial p} \right)_T \to 0.$

So the thermal expansion coefficient of all materials must go to zero at zero kelvin.

Notes and References

  1. Wilks, J. (1961). The Third Law of Thermodynamics. Oxford University Press.
  2. Kittel, C.; Kroemer, H. Thermal Physics (2nd ed.), p. 49.
  3. Klimenko, A. Y. (2012). "Teaching the third law of thermodynamics". The Open Thermodynamics Journal 6 (1): 1–14. doi:10.2174/1874396X01206010001. arXiv:1208.4189.
  4. Masanes, Lluís; Oppenheim, Jonathan (2017). "A general derivation and quantification of the third law of thermodynamics". Nature Communications 8: 14538. doi:10.1038/ncomms14538. arXiv:1412.3828.
  5. Wilks, J. (1971). "The Third Law of Thermodynamics", Chapter 6 in Thermodynamics, volume 1, ed. W. Jost, of H. Eyring, D. Henderson, W. Jost, Physical Chemistry: An Advanced Treatise. Academic Press, New York, p. 477.
  6. Wheeler, John C. (1991). "Nonequivalence of the Nernst–Simon and unattainability statements of the third law of thermodynamics". Physical Review A 43 (10): 5289–5295. doi:10.1103/PhysRevA.43.5289.
  7. Bailyn, M. (1994). A Survey of Thermodynamics. American Institute of Physics, New York, p. 342.
  8. Kozliak, Evguenii; Lambert, Frank L. (2008). "Residual Entropy, the Third Law and Latent Heat". Entropy 10 (3): 274–284. doi:10.3390/e10030274.
  9. Reynolds and Perkins (1977). Engineering Thermodynamics. McGraw-Hill, p. 438. ISBN 978-0-07-052046-2.
  10. Guggenheim, E. A.
  11. Pobell, Frank (2007). Matter and Methods at Low Temperatures. Springer-Verlag, Berlin. ISBN 978-3-662-08580-6.
  12. Stone, A. Douglas (2013). Einstein and the Quantum. Princeton University Press.
  13. Pippard, Alfred B. (1981). Elements of Classical Thermodynamics: For Advanced Students of Physics. Cambridge University Press. ISBN 978-0-521-09101-5.
  14. Suzuki, M.; Okuda, Y.; Ikushima, A. J.; Iino, M. (1988). "Surface Tension of Liquid 3He from 0.4 K down to 15 mK". Europhysics Letters 5 (4): 333–337. doi:10.1209/0295-5075/5/4/009.