The Boltzmann distribution

The system plus the heat bath form an isolated system, and so we can use some of the results obtained in the previous section, in particular the fact that the sum of the two energies, $E_0$, is constant. The energy of the system can assume any of its possible values $E_1, \dots, E_r, \dots$. The probability $p_r$ for the system to be in a particular microstate $r$ with energy $E_r$ is proportional to the probability that the heat bath has energy $E = E_0 - E_r$, which in turn is proportional to the statistical weight $\Omega_2(E_0 - E_r) = \exp\{S_2(E_0-E_r)/k_{\rm B}\}$, where we have used the definition of the entropy [*]. Since $E_r$ is small compared to the energy of the heat bath 3.8, we can expand the entropy of the bath around $E_0$ 3.9:

$\displaystyle S_2(E_0 - E_r) = S_2(E_0) - E_r \left ( \frac{\partial S_2}{\partial E}\right )_{V,N} + \frac{1}{2}E_r^2 \left ( \frac{\partial^2 S_2}{\partial E^2}\right )_{V,N} + \dots.$ (3.31)

Using [*] we have:

$\displaystyle S_2(E_0 - E_r) = S_2(E_0) - \frac{E_r}{T_2}+\frac{1}{2}E_r^2 \frac{\partial}{\partial E} \left ( \frac{1}{T_2} \right ) + \dots.$ (3.32)
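
To gauge the size of the quadratic term, note that it involves the heat capacity of the bath at constant volume (we write it as $C_2$; the symbol is introduced here only for this estimate):

$\displaystyle \frac{\partial}{\partial E} \left ( \frac{1}{T_2} \right ) = -\frac{1}{T_2^2} \left ( \frac{\partial T_2}{\partial E}\right )_{V,N} = -\frac{1}{T_2^2\, C_2},$

which is negligible for a sufficiently large heat bath, since the heat capacity is an extensive quantity.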

Under the assumption that the temperature of the heat bath $T_2$ does not change due to interactions with the system, the third term on the r.h.s. is zero, and we have:

$\displaystyle p_r \propto \Omega_2(E_0 - E_r) = e^{\left ( \frac{S_2(E_0)}{k_{\rm B}} - \frac{E_r}{k_{\rm B}T_2}\right )} = e^{\frac{S_2(E_0)}{k_{\rm B}}} e^{- \frac{E_r}{k_{\rm B}T_2}}$ (3.33)

The first term on the r.h.s. is a constant, and can be absorbed into the constant of proportionality that turns [*] into an equality. That constant is fixed by the normalisation condition $\sum_r p_r = 1$, with the sum running over all microstates, and so we can rewrite [*] as

$\displaystyle p_r = \frac {e^{-\beta_2 E_r}}{\sum_r e^{-\beta_2 E_r}} = \frac{e^{-\beta_2 E_r}}{Z},$ (3.34)

where we have introduced the temperature parameter $\beta_2 = 1/k_{\rm B}T_2$ and the partition function

$\displaystyle Z = \sum_r e^{-\beta_2 E_r}.$ (3.35)
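
As a simple numerical sketch (the two-level spectrum and the temperature below are made-up illustrative values in units where $k_{\rm B} = 1$, not taken from the text), the partition function and the Boltzmann probabilities can be evaluated directly:

    import numpy as np

    # Hypothetical two-level spectrum (illustrative numbers, k_B = 1 units)
    k_B = 1.0
    T = 2.0
    beta = 1.0 / (k_B * T)
    E = np.array([0.0, 1.0])           # microstate energies E_r

    Z = np.sum(np.exp(-beta * E))      # partition function, Eq. (3.35)
    p = np.exp(-beta * E) / Z          # Boltzmann distribution, Eq. (3.34)

    print(Z)                           # ~1.607
    print(p)                           # ~[0.622, 0.378]
    print(p.sum())                     # normalisation: 1.0

The two probabilities differ by the factor $e^{\beta(E_2 - E_1)} \simeq 1.65$, and their sum is one by construction.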

Eq. [*] is known as the Boltzmann distribution. Here $T_2$ is the temperature of the heat bath but, carrying over the results of the previous section, at equilibrium it is also the temperature of the system. We can therefore drop the subscript and write $T$ (and correspondingly $\beta$). The totality of the microstates, each occupied by the system with its own probability, forms the canonical ensemble, which is characterised by the system having constant volume $V$, constant number of particles $N$ and constant temperature $T$. The partition function, which depends on $N, V, T$, is a very powerful object: as we shall see, many thermodynamic properties of the system can be obtained from it. For example, the average energy of the system can be obtained as:

$\displaystyle \bar{E} = \sum_r E_r p_r = - \frac{\partial \ln Z}{\partial \beta}.$ (3.36)
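
The last equality follows by differentiating $\ln Z$ with respect to $\beta$, keeping the energy levels fixed:

$\displaystyle -\frac{\partial \ln Z}{\partial \beta} = -\frac{1}{Z} \frac{\partial Z}{\partial \beta} = \frac{1}{Z}\sum_r E_r\, e^{-\beta E_r} = \sum_r E_r\, p_r = \bar{E}.$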

Imagine that some of the energy levels are degenerate. We can group the equal terms and rewrite the list of energies so that they are all different, but each one has a multiplicity factor attached to it. For example, suppose we have $E_1 < E_2 = E_3 = E_4 < E_5 < \dots$. We rewrite this list as $E_1 < E_2 < E_3 < \dots$, and we include the multiplicity factor $\Omega(E_1) = 1; \Omega(E_2) = 3; \Omega(E_3) = 1; \dots$. This multiplicity factor is nothing but the statistical weight $\Omega(E_r)$ of the system in a particular energy level $E_r$. We can therefore rewrite Eq. [*] as:

$\displaystyle Z = \sum_{E_r} \Omega(E_r) e^{-\beta E_r},$ (3.37)

with all the energies in the sum now different. Since there are $\Omega(E_r)$ ways of realising the energy $E_r$, the probability that the system has energy $E_r$ is proportional to $\Omega(E_r)$, and we have:

$\displaystyle p(E_r) = \frac{\Omega(E_r)}{Z} e^{-\beta E_r}.$ (3.38)
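
As a consistency check in code (a rough sketch using the made-up degenerate spectrum $E_1 < E_2 = E_3 = E_4 < E_5$ of the example above and an arbitrary $\beta$, in units where $k_{\rm B} = 1$), the sum over microstates and the sum over distinct energies weighted by $\Omega(E_r)$ give the same partition function, and the probabilities $p(E_r)$ are correctly normalised:

    import numpy as np
    from collections import Counter

    beta = 0.5                                    # illustrative value, k_B = 1
    E_micro = [1.0, 2.0, 2.0, 2.0, 3.0]           # E_1 < E_2 = E_3 = E_4 < E_5

    # Partition function as a sum over microstates, Eq. (3.35)
    Z_micro = sum(np.exp(-beta * E) for E in E_micro)

    # Group degenerate levels: multiplicities Omega(E_1)=1, Omega(E_2)=3, Omega(E_3)=1
    levels = Counter(E_micro)

    # Partition function as a sum over distinct energies, Eq. (3.37)
    Z_levels = sum(g * np.exp(-beta * E) for E, g in levels.items())

    print(np.isclose(Z_micro, Z_levels))          # True

    # Probability that the system has energy E_r, Eq. (3.38)
    p_E = {E: g * np.exp(-beta * E) / Z_levels for E, g in levels.items()}
    print(sum(p_E.values()))                      # normalisation: ~1.0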

The probability $p(E_r)$ can also be written in terms of the entropy $S(E_r) = k_{\rm B}\ln \Omega(E_r)$, which gives:

$\displaystyle p(E_r) = \frac{1}{Z} e^{-\beta [E_r - TS(E_r)]}.$ (3.39)
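
The exponent in the last expression follows from writing $\Omega(E_r) = e^{S(E_r)/k_{\rm B}}$ and using $\beta = 1/k_{\rm B}T$:

$\displaystyle \Omega(E_r)\, e^{-\beta E_r} = e^{S(E_r)/k_{\rm B}}\, e^{-\beta E_r} = e^{\beta T S(E_r)}\, e^{-\beta E_r} = e^{-\beta \left [ E_r - T S(E_r) \right ]}.$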

We now introduce the density of states $g(E)$, defined in such a way that $g(E)\Delta E$ is equal to the number of states with energy between $E$ and $E+\Delta E$, with $\Delta E$ an energy interval that is sufficiently large for the number of states it contains to be large, but also sufficiently small that the Boltzmann factor does not change much across it. Imagine for example that the separation between the states is much smaller than $k_{\rm B}T$, i.e. $E_r - E_{r-1} \ll k_{\rm B}T$. In this case the number of states with energies within an interval of width $k_{\rm B}T$ is large, and the factor $e^{-\beta E_r}$ is a slowly varying function of $E_r$. This means that, choosing $\Delta E$ to span many energy levels while still satisfying $\Delta E \ll k_{\rm B}T$, we can approximate:

$\displaystyle \sum_{E = E_r}^{E_r + \Delta E} \Omega(E)\,e^{-\beta E} \simeq \int_{E_r}^{E_r+\Delta E} g(E) e^{-\beta E} dE.$ (3.40)

Under these conditions, we can rewrite the sum in Eq. [*] as an integral 3.10:

$\displaystyle Z = \sum_{E_r} \Omega(E_r) e^{-\beta E_r} = \int_0^{E_{max}} g(E) e^{-\beta E} dE,$ (3.41)

where we have set the ground-state energy equal to zero; the maximum energy $E_{max}$ may be infinite. This also allows us to define a probability density:

$\displaystyle p(E) = \frac{g(E)}{Z} e^{-\beta E},$ (3.42)

such that $p(E)dE$ is the probability of finding the system with energy between $E$ and $E+dE$.
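
As a numerical sketch of the passage from the discrete sum to the integral (using a hypothetical, evenly spaced spectrum $E_n = n\,\epsilon$ with $\epsilon \ll k_{\rm B}T$ and one state per level, so that $g(E) = 1/\epsilon$; all numerical values are made up for illustration, with $k_{\rm B} = 1$):

    import numpy as np

    k_B, T = 1.0, 1.0
    beta = 1.0 / (k_B * T)
    eps = 1e-3 * k_B * T                    # level spacing, << k_B T
    E_max = 50.0 * k_B * T                  # cutoff, effectively infinite here

    # Partition function as a discrete sum over the levels E_n = n*eps
    n = np.arange(round(E_max / eps) + 1)
    Z_sum = np.sum(np.exp(-beta * n * eps))

    # Same quantity from Eq. (3.41) with g(E) = 1/eps, integral done analytically:
    # int_0^{E_max} (1/eps) e^{-beta E} dE = (1 - e^{-beta E_max}) / (beta * eps)
    Z_int = (1.0 - np.exp(-beta * E_max)) / (beta * eps)

    print(Z_sum, Z_int)                     # ~1000.5 vs ~1000.0, agreeing to ~0.05%

    # p(E) dE from Eq. (3.42): probability of finding the system in [E0, E0 + dE]
    E0, dE = 1.0, 1e-2
    print((1.0 / eps) * np.exp(-beta * E0) / Z_int * dE)   # ~3.7e-3

The residual discrepancy of about $0.05\%$ between the two values is the usual difference between a sum and its integral approximation, and it shrinks as $\epsilon/k_{\rm B}T \to 0$.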