we introduced the entropy of an isolated system as proportional to the logarithm of its statistical weight, which is the number of accessible microstates. Since it is isolated from the environment, the system can be in any of these microstates with equal probability, Eq. (3.11). For a system in thermal equilibrium, i.e. in contact with a heat bath, the microstates are not all equiprobable: as we have seen, their probabilities of occurring depend on the values of their energies according to the Boltzmann distribution. How do we define the entropy for such a system? We follow an approach developed by Gibbs.
Let us consider an ensemble of a very large number $\mathcal{N}$ of copies of identical systems. These systems are distinguishable, for example by a label attached to each of them. Each system can be in any of its available microstates $i$, with probabilities $p_i$. These probabilities $p_i$ are given by the Boltzmann distribution, $p_i \propto e^{-E_i/k_B T}$. The systems interact very weakly amongst themselves, in such a way that their energy levels (and probabilities) are independent of how many systems are present in the ensemble, but they can exchange energy to come into thermal equilibrium.
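For concreteness, here is a minimal numerical sketch (not from the text; the energy levels and temperature below are illustrative) of how the Boltzmann probabilities $p_i$ of a single system are obtained from its energy levels:

import numpy as np

# Minimal sketch: Boltzmann probabilities for a single system with a few
# energy levels. The level energies and temperature are illustrative only.
energies = np.array([0.0, 1.0, 2.0, 4.0])   # E_i
kT = 1.5                                    # k_B * T, in the same units as E_i

weights = np.exp(-energies / kT)            # Boltzmann factors e^{-E_i / (k_B T)}
Z = weights.sum()                           # partition function
p = weights / Z                             # probabilities p_i

print("p_i =", p)         # lower-energy microstates are more probable
print("sum  =", p.sum())  # normalised to 1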
There will be $n_1$ systems in microstate 1, $n_2$ in microstate 2, and so on. The $n_1$ systems in microstate 1 can be chosen in every possible way amongst the totality of $\mathcal{N}$ systems, and so can the $n_2$, and so on. The total number of ways we can choose these $n_i$ systems out of $\mathcal{N}$ is:
\[
\Omega_{\text{ens}} = \frac{\mathcal{N}!}{n_1!\, n_2!\, \cdots} = \frac{\mathcal{N}!}{\prod_i n_i!} .
\]
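As a quick sanity check of this counting (a sketch with made-up occupation numbers, not taken from the text), the multinomial coefficient can be evaluated directly for a small ensemble:

from math import factorial

# Sketch: number of ways to distribute N labelled systems over microstates
# with occupation numbers n_i (illustrative values; sum(n) = N).
n = [5, 3, 2]                      # n_1, n_2, n_3
N = sum(n)                         # total number of systems in the ensemble

omega_ens = factorial(N)
for n_i in n:
    omega_ens //= factorial(n_i)   # N! / (n_1! n_2! ...)

print(omega_ens)                   # 10! / (5! 3! 2!) = 2520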
Applying Boltzmann's expression for the entropy to the whole ensemble, we obtain
\[
S_{\text{ens}} = k_B \ln \Omega_{\text{ens}} = k_B \ln \frac{\mathcal{N}!}{\prod_i n_i!}
\]
for the entropy of the ensemble. Since we can make $\mathcal{N}$ as large as we like, we can use Stirling's approximation, $\ln \mathcal{N}! \simeq \mathcal{N} \ln \mathcal{N} - \mathcal{N}$, together with $n_i = p_i \mathcal{N}$ and $\sum_i n_i = \mathcal{N}$, to write
\[
S_{\text{ens}} \simeq k_B \Big( \mathcal{N} \ln \mathcal{N} - \sum_i n_i \ln n_i \Big) = -k_B \mathcal{N} \sum_i p_i \ln p_i .
\]
Since the systems in the ensemble are essentially independent, the entropy of a single system is $S = S_{\text{ens}}/\mathcal{N}$, i.e.
\[
S = -k_B \sum_i p_i \ln p_i ,
\]
which is the Gibbs entropy. It reduces to the Boltzmann entropy $S = k_B \ln \Omega$ when all the microstates in each system are equiprobable, $p_i = 1/\Omega$.
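The two statements above can be checked numerically. The sketch below (with illustrative probabilities, not from the text) shows that $\ln \Omega_{\text{ens}} / \mathcal{N}$ approaches $-\sum_i p_i \ln p_i$ as $\mathcal{N}$ grows, and that uniform probabilities $p_i = 1/\Omega$ give back $\ln \Omega$ (entropies in units of $k_B$):

import numpy as np
from math import lgamma

def ln_omega_ens(n):
    # ln of the multinomial coefficient N! / prod_i n_i!, computed via lgamma
    N = sum(n)
    return lgamma(N + 1) - sum(lgamma(n_i + 1) for n_i in n)

p = np.array([0.5, 0.3, 0.2])          # illustrative probabilities p_i
gibbs = -np.sum(p * np.log(p))         # Gibbs entropy per system, in units of k_B

for N in (10**2, 10**4, 10**6):
    n = np.round(p * N).astype(int)    # occupation numbers n_i = p_i * N
    print(N, ln_omega_ens(n) / n.sum(), gibbs)
# ln(Omega_ens)/N converges to the Gibbs entropy as N grows (Stirling).

# Equiprobable case: p_i = 1/Omega gives S/k_B = ln(Omega).
Omega = 8
p_uniform = np.full(Omega, 1.0 / Omega)
print(-np.sum(p_uniform * np.log(p_uniform)), np.log(Omega))   # both equal ln 8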