In statistical thermodynamics, thermodynamic beta, also known as coldness, is the reciprocal of the thermodynamic temperature of a system: β = 1/(kBT) (where T is the temperature and kB is the Boltzmann constant).[1]
It was originally introduced in 1971 (as German: Kältefunktion, "coldness function") by Ingo Müller, one of the proponents of the rational thermodynamics school of thought,[2][3] based on earlier proposals for a "reciprocal temperature" function.[4][5]
Thermodynamic beta has units reciprocal to those of energy (in SI units, reciprocal joules, [β] = J⁻¹). In non-thermal units, it can also be measured in gigabytes per nanojoule, with the conversion factor 1 GB/nJ = 8 ln 2 × 10¹⁸ J⁻¹ (one byte being 8 bits, and the factor ln 2 converting bits to nats).
Thermodynamic beta is essentially the connection between the information theory and statistical mechanics interpretation of a physical system through its entropy and the thermodynamics associated with its energy. It expresses the response of entropy to an increase in energy. If a system is challenged with a small amount of energy, then β describes the amount the system will randomize.
Via the statistical definition of temperature as a function of entropy, the coldness function can be calculated in the microcanonical ensemble from the formula
\beta = \frac{1}{k_{\rm B} T} = \frac{1}{k_{\rm B}} \left( \frac{\partial S}{\partial E} \right)_V
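As a hypothetical numerical sketch (not part of the article), the partial-derivative formula above can be checked with a finite difference, assuming a toy monatomic-ideal-gas-like entropy S(E) = (3N/2) kB ln E; the particle number N and energy value chosen here are illustrative.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy(E, N=1e22):
    # Toy entropy function: S = (3N/2) k_B ln(E) + const (constant omitted,
    # since only the derivative dS/dE matters for beta).
    return 1.5 * N * K_B * math.log(E)

def beta(E, h=1e-6):
    # beta = (1/k_B) * dS/dE, estimated by a central finite difference.
    dS = entropy(E * (1 + h)) - entropy(E * (1 - h))
    return dS / (2 * E * h) / K_B

E = 1.0  # joules (toy value)
b = beta(E)
print(b)  # analytic value is 3N/(2E) = 1.5e22 reciprocal joules
```

For this entropy, β = 3N/(2E), so the numerical estimate can be compared directly against the closed form.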
Though completely equivalent in conceptual content to temperature, β is generally considered a more fundamental quantity than temperature owing to the phenomenon of negative temperature, in which β is continuous as it crosses zero whereas T has a singularity.
In addition, β has the advantage of being easier to understand causally: if a small amount of heat is added to a system, β is the increase in entropy divided by the increase in heat. Temperature is difficult to interpret in the same sense, as it is not possible to "add entropy" to a system except indirectly, by modifying other quantities such as temperature, volume, or number of particles.
From the statistical point of view, β is a numerical quantity relating two macroscopic systems in equilibrium. The exact formulation is as follows. Consider two systems, 1 and 2, in thermal contact, with respective energies E1 and E2. We assume E1 + E2 = some constant E. The number of microstates of each system will be denoted by Ω1 and Ω2. Under our assumptions Ωi depends only on Ei. We also assume that any microstate of system 1 consistent with E1 can coexist with any microstate of system 2 consistent with E2. Thus, the number of microstates for the combined system is
\Omega = \Omega_1(E_1)\,\Omega_2(E_2) = \Omega_1(E_1)\,\Omega_2(E - E_1).
We will derive β from the fundamental assumption of statistical mechanics:
When the combined system reaches equilibrium, the number Ω is maximized.
(In other words, the system naturally seeks the maximum number of microstates.) Therefore, at equilibrium,
\frac{d\Omega}{dE_1} = \Omega_2(E_2)\,\frac{d\Omega_1(E_1)}{dE_1} + \Omega_1(E_1)\,\frac{d\Omega_2(E_2)}{dE_2} \cdot \frac{dE_2}{dE_1} = 0.
But E1 + E2 = E implies
\frac{dE_2}{dE_1} = -1.
So
\Omega_2(E_2)\,\frac{d\Omega_1(E_1)}{dE_1} - \Omega_1(E_1)\,\frac{d\Omega_2(E_2)}{dE_2} = 0,
i.e.
\frac{d}{dE_1} \ln \Omega_1 = \frac{d}{dE_2} \ln \Omega_2 \quad \text{at equilibrium.}
The above relation motivates a definition of β:
\beta = \frac{d \ln \Omega}{dE}.
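The two-system argument can be illustrated with a small numerical example (the model and numbers here are assumptions of this sketch, not from the text): take two collections of two-level "spins" whose microstate counts are binomial coefficients in the number of energy quanta, and check that the energy split maximizing the combined Ω is one where the two discrete values of β = d ln Ω/dE (approximately) agree.

```python
from math import comb, log

N1, N2, E = 600, 300, 400  # spins in each system; total energy quanta shared

def omega_total(E1):
    # Combined microstate count Omega_1(E1) * Omega_2(E - E1),
    # with Omega_i(E_i) = C(N_i, E_i) for two-level systems.
    return comb(N1, E1) * comb(N2, E - E1)

# Energy split that maximizes the combined number of microstates
E1_star = max(range(max(0, E - N2), min(N1, E) + 1), key=omega_total)

def beta_discrete(N, E_i):
    # Centered discrete derivative of ln(Omega) = ln C(N, E_i)
    return 0.5 * (log(comb(N, E_i + 1)) - log(comb(N, E_i - 1)))

beta1 = beta_discrete(N1, E1_star)
beta2 = beta_discrete(N2, E - E1_star)
print(E1_star, beta1, beta2)  # the two betas nearly match at the optimum
```

The agreement is only approximate because the energy is discrete here; it sharpens as N1, N2 and E grow, which is the thermodynamic limit implicit in the derivative notation above.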
When two systems are in equilibrium, they have the same thermodynamic temperature T. Thus intuitively, one would expect β (as defined via microstates) to be related to T in some way. This link is provided by Boltzmann's fundamental assumption written as
S = k_{\rm B} \ln \Omega,
where kB is the Boltzmann constant, S is the classical thermodynamic entropy, and Ω is the number of microstates. So
d \ln \Omega = \frac{1}{k_{\rm B}}\, dS.
Substituting into the definition of β from the statistical definition above gives
\beta = \frac{1}{k_{\rm B}} \frac{dS}{dE}.
Comparing with the thermodynamic formula
\frac{dS}{dE} = \frac{1}{T},
we have
\beta = \frac{1}{k_{\rm B} T} = \frac{1}{\tau},
where \tau = k_{\rm B} T is the fundamental temperature of the system, with units of energy.
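As a sanity check on magnitudes (the temperature value is chosen here, not given in the article): at room temperature, T ≈ 300 K, β = 1/(kBT) is on the order of 10²⁰ reciprocal joules, and τ = kBT is a few times 10⁻²¹ joules.

```python
K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

T = 300.0            # assumed room temperature, kelvin
tau = K_B * T        # fundamental temperature, in joules
beta = 1.0 / tau     # thermodynamic beta, in reciprocal joules

print(beta)  # roughly 2.4e20 J^-1
print(tau)   # roughly 4.1e-21 J
```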