The system plus the heat bath form an isolated system, and so we can use some of the results obtained in the previous section, in particular the fact that the sum of the two energies $E_r + E_B = E_{\rm tot}$ is constant. The energy of the system can assume any of its possible values $E_r$.

The probability $p_r$ for the system to be in a particular microstate $r$ with energy $E_r$ is proportional to the probability of the heat bath to have energy $E_B = E_{\rm tot} - E_r$, which is proportional to the statistical weight $\Omega_B(E_{\rm tot} - E_r) = e^{S_B(E_{\rm tot} - E_r)/k_B}$, where we have used the definition of the entropy $S = k_B \ln \Omega$.
Since $E_r$ is small compared to the energy of the heat bath, we can expand the entropy of the bath around $E_{\rm tot}$:

$$
S_B(E_{\rm tot} - E_r) = S_B(E_{\rm tot}) - E_r \frac{\partial S_B}{\partial E} + \frac{1}{2} E_r^2 \frac{\partial^2 S_B}{\partial E^2} + \dots \tag{3.31}
$$
Using $\partial S_B/\partial E = 1/T_B$ we have:

$$
S_B(E_{\rm tot} - E_r) = S_B(E_{\rm tot}) - \frac{E_r}{T_B} + \frac{1}{2} E_r^2 \frac{\partial}{\partial E}\!\left(\frac{1}{T_B}\right) + \dots \tag{3.32}
$$
Under the assumption that the temperature of the heat bath $T_B$ does not change due to interactions with the system, the third term on the r.h.s. is zero, and we have:

$$
S_B(E_{\rm tot} - E_r) = S_B(E_{\rm tot}) - \frac{E_r}{T_B}. \tag{3.33}
$$
Exponentiating, $p_r \propto e^{S_B(E_{\rm tot})/k_B}\, e^{-E_r/(k_B T_B)}$. The first term on the r.h.s. is a constant, and can be included in the constant of proportionality that makes $p_r \propto e^{-E_r/(k_B T_B)}$ an equality. This is the condition of normalisation, $\sum_r p_r = 1$, with the sum running over all microstates, and so we can re-write $p_r$ as

$$
p_r = \frac{1}{Z}\, e^{-\beta E_r}, \tag{3.34}
$$
where we have introduced the temperature parameter $\beta = 1/(k_B T_B)$ and the partition function

$$
Z = \sum_r e^{-\beta E_r}. \tag{3.35}
$$
Eq. (3.34) is known as the Boltzmann distribution. Here $T_B$ is the temperature of the heat bath, but taking over the results of the previous section, at equilibrium it is also the temperature of the system. We can therefore drop the subscript and write $T$ (and correspondingly $\beta = 1/(k_B T)$).

The totality of the microstates, now each with a different probability for the system to be found in any of them, forms the canonical ensemble, which is characterised by the system having constant volume $V$, constant number of particles $N$ and constant temperature $T$.
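As a minimal numerical sketch of Eqs. (3.34) and (3.35) — the four level energies below are invented purely for illustration — the Boltzmann probabilities follow directly from the definitions:

```python
import math

beta = 1.0                       # inverse temperature, beta = 1/(k_B T)
energies = [0.0, 1.0, 2.0, 3.0]  # hypothetical microstate energies E_r

# Partition function Z = sum_r exp(-beta * E_r), Eq. (3.35)
Z = sum(math.exp(-beta * E) for E in energies)

# Boltzmann probabilities p_r = exp(-beta * E_r) / Z, Eq. (3.34)
p = [math.exp(-beta * E) / Z for E in energies]

print(sum(p))        # normalisation: the probabilities sum to 1
print(p[0] > p[-1])  # lower-energy microstates are more probable
```

Note how the normalisation $\sum_r p_r = 1$ is automatic once we divide by $Z$.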
The partition function, which depends on $\beta$ (as well as on $V$ and $N$), is a very powerful object, as we shall see that many thermodynamic properties of the system can be obtained from it. For example, the average energy of the system can be obtained as:

$$
\langle E \rangle = \sum_r E_r\, p_r = \frac{1}{Z} \sum_r E_r\, e^{-\beta E_r} = -\frac{\partial \ln Z}{\partial \beta}. \tag{3.36}
$$
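The identity in Eq. (3.36) can be checked numerically: below, the direct ensemble average $\sum_r E_r p_r$ is compared against a finite-difference estimate of $-\partial \ln Z/\partial \beta$ (the level energies are again hypothetical):

```python
import math

def log_Z(beta, energies):
    """ln Z for a discrete set of levels, Eq. (3.35)."""
    return math.log(sum(math.exp(-beta * E) for E in energies))

energies = [0.0, 0.5, 1.3, 2.0]  # hypothetical level energies
beta = 1.0

# Direct ensemble average <E> = sum_r E_r p_r
Z = sum(math.exp(-beta * E) for E in energies)
E_avg = sum(E * math.exp(-beta * E) for E in energies) / Z

# Central-difference estimate of -d(ln Z)/d(beta), Eq. (3.36)
h = 1e-6
E_from_Z = -(log_Z(beta + h, energies) - log_Z(beta - h, energies)) / (2 * h)

print(abs(E_avg - E_from_Z) < 1e-6)  # the two computations agree
```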
Imagine that some of the energy levels are degenerate. We can group the equal terms and rewrite the list of energies so that they are all different, but each one has a multiplicity factor attached to it. For example, suppose the first three microstates share the same energy, $E_1 = E_2 = E_3$. We keep the single value $E_1$ in the list, and we include the multiplicity factor $g_1 = 3$. This multiplicity factor is nothing but the statistical weight $\Omega(E_r)$ of the system in a particular energy level $E_r$.
We can therefore rewrite Eq. (3.35) as:

$$
Z = \sum_r \Omega(E_r)\, e^{-\beta E_r}, \tag{3.37}
$$
with all the energies in the sum now different. Since there are $\Omega(E_r)$ ways of realising the energy $E_r$, the probability that the system has energy $E_r$ is proportional to $\Omega(E_r)\, e^{-\beta E_r}$, and we have:

$$
p(E_r) = \frac{1}{Z}\, \Omega(E_r)\, e^{-\beta E_r}. \tag{3.38}
$$
This can be rewritten also in terms of the entropy $S(E_r) = k_B \ln \Omega(E_r)$, which gives:

$$
p(E_r) = \frac{1}{Z}\, e^{-\beta [E_r - T S(E_r)]}. \tag{3.39}
$$
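The grouping step behind Eq. (3.37) is easy to verify numerically: summing over individual microstates and summing over distinct levels weighted by their multiplicities give the same partition function (the microstate energies here are made up for illustration):

```python
import math
from collections import Counter

beta = 1.0
# Hypothetical microstate energies: the level 0.0 is three-fold degenerate,
# the level 1.0 two-fold, the level 2.5 non-degenerate.
microstate_E = [0.0, 0.0, 0.0, 1.0, 1.0, 2.5]

# Partition function as a sum over microstates, Eq. (3.35)
Z_micro = sum(math.exp(-beta * E) for E in microstate_E)

# Group into distinct levels with multiplicity Omega(E_r), Eq. (3.37)
multiplicity = Counter(microstate_E)   # {0.0: 3, 1.0: 2, 2.5: 1}
Z_levels = sum(n * math.exp(-beta * E) for E, n in multiplicity.items())

print(abs(Z_micro - Z_levels) < 1e-12)  # the two sums agree
```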
We now introduce the density of states $g(E)$, defined in such a way that $g(E)\,\delta E$ is equal to the number of states with energy between $E$ and $E + \delta E$, with $\delta E$ a sufficiently large energy interval so that the number of states in this interval is large, but also sufficiently small that the Boltzmann factor does not change much.

Imagine for example that the separation $\Delta E$ between the states is much smaller than $k_B T$, $\Delta E \ll k_B T$. In this case the number of states with energies in the range $\delta E$ is large, and the factor $e^{-\beta E}$ is a slowly varying function of $E$. This means that with $\delta E$ being a small range of energy levels such that $\Delta E \ll \delta E \ll k_B T$ we can approximate:

$$
\sum_{E \le E_r < E + \delta E} e^{-\beta E_r} \simeq g(E)\, e^{-\beta E}\, \delta E. \tag{3.40}
$$
Under these conditions, we can rewrite the sum in Eq. (3.35) as an integral:

$$
Z = \int_0^{E_{\rm max}} g(E)\, e^{-\beta E}\, dE, \tag{3.41}
$$

where we have set the ground state energy equal to zero and the maximum energy $E_{\rm max}$ may be infinite.
This also allows us to define a probability density:

$$
p(E) = \frac{1}{Z}\, g(E)\, e^{-\beta E}, \tag{3.42}
$$

such that $p(E)\,dE$ is the probability of finding the system with energy between $E$ and $E + dE$.
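The continuum limit of Eq. (3.41) can be illustrated with a toy spectrum of equally spaced levels, $E_r = r\,\Delta E$ with $\Delta E \ll k_B T$, for which the density of states is constant, $g(E) = 1/\Delta E$; the discrete sum then reproduces the integral $\int_0^\infty g(E)\, e^{-\beta E}\, dE = 1/(\beta\,\Delta E)$ (all numbers below are illustrative):

```python
import math

beta = 1.0        # so k_B T = 1 in these units
dE_level = 1e-3   # level spacing Delta E << k_B T
E_max = 30.0      # effectively infinite: exp(-30) is negligible

# Sum over closely spaced levels E_r = r * dE_level, Eq. (3.35)
n_levels = int(E_max / dE_level)
Z_sum = sum(math.exp(-beta * r * dE_level) for r in range(n_levels))

# Integral with constant density of states g(E) = 1/dE_level, Eq. (3.41):
# Z ≈ ∫_0^∞ g(E) e^{-beta E} dE = 1 / (beta * dE_level)
Z_int = 1.0 / (beta * dE_level)

print(abs(Z_sum - Z_int) / Z_int < 1e-3)  # sum and integral agree to ~0.1%
```

The residual discrepancy is of order $\beta\,\Delta E/2$, which vanishes as the level spacing shrinks relative to $k_B T$, exactly the regime assumed in Eq. (3.40).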