Boltzmann distribution explained

In statistical mechanics and mathematics, a Boltzmann distribution (also called Gibbs distribution[1]) is a probability distribution or probability measure that gives the probability that a system will be in a certain state as a function of that state's energy and the temperature of the system. The distribution is expressed in the form:

p_i \propto \exp\left(-\frac{\varepsilon_i}{kT}\right)

where p_i is the probability of the system being in state i, \exp is the exponential function, \varepsilon_i is the energy of that state, and the constant kT of the distribution is the product of the Boltzmann constant k and the thermodynamic temperature T. The symbol \propto denotes proportionality (see below for the proportionality constant).

The term system here has a wide meaning; it can range from a single atom or a collection of a 'sufficient number' of atoms to a macroscopic system such as a natural gas storage tank. The Boltzmann distribution can therefore be used to solve a wide variety of problems. The distribution shows that states with lower energy will always have a higher probability of being occupied.

The ratio of probabilities of two states is known as the Boltzmann factor and characteristically only depends on the states' energy difference:

\frac{p_i}{p_j} = \exp\left(\frac{\varepsilon_j - \varepsilon_i}{kT}\right)
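As an illustrative numerical sketch (not part of the original text; the eV energy scale and the room-temperature choice are arbitrary), the Boltzmann factor can be evaluated directly:

```python
import math

K_B_EV = 8.617333262e-5  # Boltzmann constant in eV/K

def boltzmann_factor(e_i, e_j, temperature):
    """Ratio of occupation probabilities p_i / p_j for states with
    energies e_i and e_j (in eV) at the given temperature (in K)."""
    return math.exp((e_j - e_i) / (K_B_EV * temperature))

# Two states 0.1 eV apart at room temperature: the higher-energy
# state is occupied far less often than the lower one.
print(boltzmann_factor(0.1, 0.0, 300.0))  # ratio well below 1
```

Note that only the energy difference enters: shifting both energies by the same constant leaves the factor unchanged.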

The Boltzmann distribution is named after Ludwig Boltzmann, who first formulated it in 1868 during his studies of the statistical mechanics of gases in thermal equilibrium.[2] Boltzmann's statistical work is borne out in his paper "On the Relationship between the Second Fundamental Theorem of the Mechanical Theory of Heat and Probability Calculations Regarding the Conditions for Thermal Equilibrium".[3] The distribution was later investigated extensively, in its modern generic form, by Josiah Willard Gibbs in 1902.[4]

The Boltzmann distribution should not be confused with the Maxwell–Boltzmann distribution or Maxwell–Boltzmann statistics. The Boltzmann distribution gives the probability that a system will be in a certain state as a function of that state's energy,[5] while the Maxwell–Boltzmann distributions give the probabilities of particle speeds or energies in ideal gases. The distribution of energies in a one-dimensional gas, however, does follow the Boltzmann distribution.

The distribution

The Boltzmann distribution is a probability distribution that gives the probability of a certain state as a function of that state's energy and the temperature of the system to which the distribution is applied.[6] It is given as

p_i = \frac{1}{Q} \exp\left(-\frac{\varepsilon_i}{kT}\right) = \frac{\exp\left(-\tfrac{\varepsilon_i}{kT}\right)}{\sum_{j=1}^{M} \exp\left(-\tfrac{\varepsilon_j}{kT}\right)}

where:

  • p_i is the probability of state i;
  • \varepsilon_i is the energy of state i;
  • k is the Boltzmann constant;
  • T is the absolute temperature of the system;
  • M is the number of all states accessible to the system of interest;
  • Q (denoted by some authors as Z) is the normalization denominator, which is the canonical partition function

Q = \sum_{i=1}^{M} \exp\left(-\tfrac{\varepsilon_i}{kT}\right)

It results from the constraint that the probabilities of all accessible states must add up to 1.
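The normalization can be made concrete in a short sketch (illustrative code, not from the original article; the energies and kT below are arbitrary values in the same units):

```python
import math

def boltzmann_probabilities(energies, kT):
    """Return p_i = exp(-e_i/kT) / Q for a list of state energies.

    Shifting all energies by min(energies) cancels in the ratio but
    keeps the exponentials in a numerically safe range.
    """
    e0 = min(energies)
    weights = [math.exp(-(e - e0) / kT) for e in energies]
    q = sum(weights)  # partition function (up to the constant exp(-e0/kT))
    return [w / q for w in weights]

# Three states: the probabilities sum to 1 and decrease with energy.
print(boltzmann_probabilities([0.0, 0.01, 0.02], kT=0.025))
```

The energy shift is a common numerical trick: it changes the partition function by a constant factor that cancels in every probability.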

Using Lagrange multipliers, one can prove that the Boltzmann distribution is the distribution that maximizes the entropy

S(p_1, p_2, \ldots, p_M) = -\sum_{i=1}^{M} p_i \log_2 p_i

subject to the normalization constraint \sum_{i=1}^{M} p_i = 1 and the constraint that \sum_{i=1}^{M} p_i \varepsilon_i equals a particular mean energy value, except for two special cases. (These special cases occur when the mean value is either the minimum or the maximum of the energies \varepsilon_i. In these cases, the entropy-maximizing distribution is a limit of Boltzmann distributions where T approaches zero from above or below, respectively.)
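The Lagrange-multiplier argument can be sketched as follows (using the natural logarithm, which differs from \log_2 only by a constant factor and so has the same maximizer). Introducing multipliers \alpha and \beta for the two constraints, with \langle E \rangle the prescribed mean energy:

\mathcal{L} = -\sum_{i=1}^{M} p_i \ln p_i - \alpha\left(\sum_{i=1}^{M} p_i - 1\right) - \beta\left(\sum_{i=1}^{M} p_i \varepsilon_i - \langle E \rangle\right)

Setting \partial \mathcal{L}/\partial p_i = -\ln p_i - 1 - \alpha - \beta \varepsilon_i = 0 gives p_i = e^{-1-\alpha}\, e^{-\beta \varepsilon_i} \propto \exp(-\beta \varepsilon_i), and identifying \beta = 1/(kT) recovers the Boltzmann form, with \alpha fixed by normalization.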

The partition function can be calculated if we know the energies of the states accessible to the system of interest. For atoms the partition function values can be found in the NIST Atomic Spectra Database.[7]

The distribution shows that states with lower energy will always have a higher probability of being occupied than states with higher energy. It can also give us the quantitative relationship between the probabilities of two states being occupied. The ratio of probabilities for states i and j is given as

\frac{p_i}{p_j} = \exp\left(\frac{\varepsilon_j - \varepsilon_i}{kT}\right)

where \varepsilon_i is the energy of state i and \varepsilon_j is the energy of state j.

The corresponding ratio of populations of energy levels must also take their degeneracies into account.

The Boltzmann distribution is often used to describe the distribution of particles, such as atoms or molecules, over the bound states accessible to them. If we have a system consisting of many particles, the probability of a particle being in state i is practically the probability that, if we pick a random particle from that system and check what state it is in, we will find it is in state i. This probability is equal to the number of particles in state i divided by the total number of particles in the system, that is, the fraction of particles that occupy state i.

p_i = \frac{N_i}{N}

where N_i is the number of particles in state i and N is the total number of particles in the system. We may use the Boltzmann distribution to find this probability, which is, as we have seen, equal to the fraction of particles that are in state i. So the equation that gives the fraction of particles in state i as a function of the energy of that state is[5]

\frac{N_i}{N} = \frac{\exp\left(-\tfrac{\varepsilon_i}{kT}\right)}{\sum_{j=1}^{M} \exp\left(-\tfrac{\varepsilon_j}{kT}\right)}

This equation is of great importance to spectroscopy. In spectroscopy we observe a spectral line of atoms or molecules undergoing transitions from one state to another.[5] [8] In order for this to be possible, there must be some particles in the first state to undergo the transition. We can check whether this condition is fulfilled by finding the fraction of particles in the first state. If it is negligible, the transition is very likely not observed at the temperature for which the calculation was done. In general, a larger fraction of molecules in the first state means a higher number of transitions to the second state.[9] This gives a stronger spectral line. However, there are other factors that influence the intensity of a spectral line, such as whether it is caused by an allowed or a forbidden transition.
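A rough illustration of this kind of estimate (hypothetical numbers, not from the original text; a real calculation would also account for degeneracies and all populated levels) for the upper of two non-degenerate levels:

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s

def upper_level_fraction(wavenumber_cm, temperature):
    """Fraction N1/(N0+N1) for two non-degenerate levels separated
    by a transition energy given as a wavenumber in cm^-1."""
    delta_e = H * C * wavenumber_cm * 100.0           # cm^-1 -> J
    ratio = math.exp(-delta_e / (K_B * temperature))  # N1/N0
    return ratio / (1.0 + ratio)

# A level spacing of ~2000 cm^-1 (typical of a vibrational transition)
# at room temperature leaves a negligible upper-level population, so a
# transition starting from that level would give a very weak line.
print(upper_level_fraction(2000.0, 298.0))
```

Raising the temperature or shrinking the spacing increases the upper-level fraction, consistent with the qualitative statements above.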

The softmax function commonly used in machine learning is related to the Boltzmann distribution:

(p_1, \ldots, p_M) = \operatorname{softmax}\left[-\frac{\varepsilon_1}{kT}, \ldots, -\frac{\varepsilon_M}{kT}\right]
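This correspondence can be checked numerically (an illustrative sketch, not from the original text; the energies and kT are arbitrary):

```python
import math

def softmax(xs):
    """Numerically stable softmax: exp(x_i) / sum_j exp(x_j)."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

# Softmax of the scaled negative energies reproduces the Boltzmann
# probabilities computed directly from exp(-e/kT) / Q.
energies = [0.0, 0.5, 1.0]   # arbitrary units
kT = 0.7
via_softmax = softmax([-e / kT for e in energies])
weights = [math.exp(-e / kT) for e in energies]
direct = [w / sum(weights) for w in weights]
print(via_softmax)
print(direct)   # the two lists agree
```

In machine-learning usage the scale 1/kT plays the role of an inverse "temperature" parameter controlling how sharply the probabilities concentrate on the lowest-energy (highest-score) option.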

Generalized Boltzmann distribution

A distribution of the form

\Pr\left(\omega\right) \propto \exp\left[\sum_{\eta=1}^{n} \frac{X_\eta\left(\omega\right) x_\eta}{k_B T} - \frac{E\left(\omega\right)}{k_B T}\right]

is called a generalized Boltzmann distribution by some authors.[10]

The Boltzmann distribution is a special case of the generalized Boltzmann distribution. The generalized Boltzmann distribution is used in statistical mechanics to describe the canonical ensemble, grand canonical ensemble, and isothermal–isobaric ensemble. The generalized Boltzmann distribution is usually derived from the principle of maximum entropy, but there are other derivations.[11]
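For instance (using the standard identifications, which are not spelled out in the text above): with no extra variables (n = 0) the form reduces to the canonical Boltzmann distribution \Pr(\omega) \propto \exp\left[-E(\omega)/k_B T\right]; taking X_1 = N (particle number) with x_1 = \mu (chemical potential) gives the grand canonical form \Pr(\omega) \propto \exp\left[\left(\mu N(\omega) - E(\omega)\right)/k_B T\right]; and taking X_1 = V (volume) with x_1 = -P gives the isothermal–isobaric form \Pr(\omega) \propto \exp\left[-\left(E(\omega) + P V(\omega)\right)/k_B T\right].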

The generalized Boltzmann distribution has the following properties:

In statistical mechanics

See main article: Canonical ensemble and Maxwell–Boltzmann statistics.

The Boltzmann distribution appears in statistical mechanics when considering closed systems of fixed composition that are in thermal equilibrium (equilibrium with respect to energy exchange). The most general case is the probability distribution for the canonical ensemble. Some special cases (derivable from the canonical ensemble) show the Boltzmann distribution in different aspects:

Canonical ensemble (general case)
  • The canonical ensemble gives the probabilities of the various possible states of a closed system of fixed volume, in thermal equilibrium with a heat bath. The canonical ensemble has a state probability distribution with the Boltzmann form.
Statistical frequencies of subsystems' states (in a non-interacting collection)
  • When the system of interest is a collection of many non-interacting copies of a smaller subsystem, it is sometimes useful to find the statistical frequency of a given subsystem state, among the collection. The canonical ensemble has the property of separability when applied to such a collection: as long as the non-interacting subsystems have fixed composition, then each subsystem's state is independent of the others and is also characterized by a canonical ensemble. As a result, the expected statistical frequency distribution of subsystem states has the Boltzmann form.
Maxwell–Boltzmann statistics of classical gases (systems of non-interacting particles)
  • In particle systems, many particles share the same space and regularly change places with each other; the single-particle state space they occupy is a shared space. Maxwell–Boltzmann statistics give the expected number of particles found in a given single-particle state, in a classical gas of non-interacting particles at equilibrium. This expected number distribution has the Boltzmann form.

Although these cases have strong similarities, it is helpful to distinguish them as they generalize in different ways when the crucial assumptions are changed.

In mathematics

See main article: Gibbs measure, Log-linear model and Boltzmann machine.

In economics

The Boltzmann distribution can be introduced to allocate permits in emissions trading.[13] [14] The new allocation method using the Boltzmann distribution can describe the most probable, natural, and unbiased distribution of emissions permits among multiple countries.

The Boltzmann distribution has the same form as the multinomial logit model. As a discrete choice model, this is very well known in economics since Daniel McFadden made the connection to random utility maximization.[15]

Notes and References

1. Landau, L. D. & Lifshitz, E. M. (1980) [1976]. Statistical Physics. Course of Theoretical Physics, vol. 5 (3rd ed.). Oxford: Pergamon Press. ISBN 0-7506-3372-7. Translated by J. B. Sykes and M. J. Kearsley. See section 28.
2. Boltzmann, L. (1868). "Studien über das Gleichgewicht der lebendigen Kraft zwischen bewegten materiellen Punkten" [Studies on the balance of living force between moving material points]. Wiener Berichte 58: 517–560.
3. "Archived copy" (PDF). Retrieved 2017-05-11. Archived 2021-03-05: https://web.archive.org/web/20210305005604/http://crystal.med.upenn.edu/sharp-lab-pdfs/2015SharpMatschinsky_Boltz1877_Entropy17.pdf
4. Gibbs, J. W. (1902). Elementary Principles in Statistical Mechanics. New York.
5. Atkins, P. W. (2010). Quanta. New York: W. H. Freeman and Company.
6. McQuarrie, A. (2000). Statistical Mechanics. Sausalito, CA: University Science Books. ISBN 1-891389-15-7.
7. NIST Atomic Spectra Database Levels Form: http://physics.nist.gov/PhysRefData/ASD/levels_form.html
8. Atkins, P. W. & de Paula, J. (2009). Physical Chemistry (9th ed.). Oxford: Oxford University Press. ISBN 978-0-19-954337-3.
9. Skoog, D. A., Holler, F. J. & Crouch, S. R. (2006). Principles of Instrumental Analysis. Boston, MA: Brooks/Cole. ISBN 978-0-495-12570-9.
10. Gao, X., Gallicchio, E. & Roitberg, A. (2019). "The generalized Boltzmann distribution is the only distribution in which the Gibbs-Shannon entropy equals the thermodynamic entropy". The Journal of Chemical Physics 151(3): 034113. doi:10.1063/1.5111333. arXiv:1903.02121.
11. Gao, X. (March 2022). "The Mathematics of the Ensemble Theory". Results in Physics 34: 105230. doi:10.1016/j.rinp.2022.105230. arXiv:2006.00485.
12. A classic example of this is magnetic ordering. Systems of non-interacting spins show paramagnetic behaviour that can be understood with a single-particle canonical ensemble (resulting in the Brillouin function). Systems of interacting spins can show much more complex behaviour such as ferromagnetism or antiferromagnetism.
13. Park, J.-W., Kim, C. U. & Isard, W. (2012). "Permit allocation in emissions trading using the Boltzmann distribution". Physica A 391: 4883–4890.
14. "The Thorny Problem Of Fair Allocation": http://www.technologyreview.com/view/425051/the-thorny-problem-of-fair-allocation/
15. Amemiya, T. (1985). "Multinomial Logit Model". Advanced Econometrics. Oxford: Basil Blackwell. pp. 295–299. ISBN 0-631-13345-3. https://books.google.com/books?id=0bzGQE14CwEC&pg=PA296