General form for the entropy

In section [*] we introduced the entropy of an isolated system as proportional to the logarithm of its statistical weight, i.e. the number of accessible microstates. Since it is isolated from the environment, the system can be in any of these microstates with equal probability (3.11). For a system in thermal equilibrium, i.e. in contact with a heat bath, the microstates are not all equiprobable: as we have seen, the probability of each microstate depends on its energy according to the Boltzmann distribution. How do we define the entropy of such a system? We follow an approach developed by Gibbs.

Let us consider an ensemble made of a very large number $\nu$ of copies of the same system. These copies are distinguishable, for example by a label attached to each of them. Each system can be in any of its available microstates $1, 2, \dots, r, \dots$ with probabilities $p_1, p_2, \dots, p_r, \dots$, given by the Boltzmann distribution [*]. The systems interact so weakly amongst themselves that their energy levels (and probabilities) are independent of how many systems are present in the ensemble, yet they can still exchange energy and come to thermal equilibrium. There will be $\nu_1 = p_1 \nu$ systems in microstate 1, $\nu_2 = p_2 \nu$ in microstate 2, and so on. The $\nu_1$ systems in microstate 1 can be chosen in every possible way amongst the totality of $\nu$ systems, and likewise for the $\nu_2$, and so on. The total number of ways of choosing these $\nu_1, \nu_2, \dots, \nu_r, \dots$ systems out of $\nu$ is:

$\displaystyle \Omega_\nu = \frac{\nu !}{\nu_1! \nu_2! \dots \nu_r! \dots}.$ (3.58)
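For concreteness, Eq. (3.58) can be evaluated exactly for a small ensemble. A minimal Python sketch, with illustrative numbers not taken from the text ($\nu = 10$ systems distributed as $\nu_1, \nu_2, \nu_3 = 5, 3, 2$):

```python
import math

def multiplicity(occupations):
    """Exact multinomial count of Eq. (3.58): nu! / (nu_1! nu_2! ...)."""
    nu = sum(occupations)
    omega = math.factorial(nu)
    for nu_r in occupations:
        omega //= math.factorial(nu_r)
    return omega

# Hypothetical small ensemble: 10 systems split 5 / 3 / 2 over three microstates.
omega = multiplicity([5, 3, 2])  # 10! / (5! 3! 2!) = 2520
```

The integer division is exact here, since the multinomial coefficient is always an integer.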

We can think of each of these $\Omega_\nu$ realisations as one of the possible microstates of the ensemble of systems. Clearly, they are all equally likely to occur, and so we can use definition [*] for the entropy of the ensemble. Since we can make $\nu$ (and therefore every $\nu_r$) as large as we want, we can use the Stirling approximation $\ln \nu! \simeq \nu \ln \nu - \nu$, and we have

$\displaystyle \frac{S_\nu}{k_{\rm B}} = \ln \Omega_\nu = \ln \nu! - \sum_r \ln \nu_r! = \nu \ln \nu - \nu - \sum_r (\nu_r \ln \nu_r - \nu_r) = \nu \ln \nu - \sum_r \nu_r \ln \nu_r,$ (3.59)

because $\sum_r \nu_r = \nu$. Since the systems in the ensemble are all statistically independent, the entropy of the ensemble $S_\nu$ is the sum of the entropies of the individual systems; and since these are all identical, each contributes the same entropy $S$. Substituting $\nu_r = p_r \nu$ into (3.59), we have

$\displaystyle \frac{S}{k_{\rm B}} = \frac{S_\nu}{\nu k_{\rm B}} = \ln \nu - \sum_r p_r \ln (p_r \nu) = -\sum_r p_r \ln p_r.$ (3.60)
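As a numerical sanity check of (3.60), one can compare $\ln \Omega_\nu / \nu$, computed exactly from (3.58), with $-\sum_r p_r \ln p_r$. A short Python sketch, where the probabilities and the ensemble size $\nu$ are illustrative choices, not values from the text:

```python
import math

# Hypothetical example: three microstates with probabilities p_r.
p = [0.5, 0.3, 0.2]
nu = 1_000_000                           # ensemble size
nu_r = [int(p_r * nu) for p_r in p]      # occupation numbers nu_r = p_r * nu

# Exact ln Omega_nu from Eq. (3.58); math.lgamma(n + 1) = ln n! avoids
# computing the astronomically large factorials themselves.
ln_omega = math.lgamma(nu + 1) - sum(math.lgamma(n + 1) for n in nu_r)

# Per-system Gibbs entropy S / k_B from Eq. (3.60).
gibbs = -sum(p_r * math.log(p_r) for p_r in p)
```

The two quantities `ln_omega / nu` and `gibbs` agree up to corrections of order $\ln \nu / \nu$, which vanish as $\nu \to \infty$, consistent with the Stirling approximation used above.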

This more general definition obviously reduces to the original one in [*] when all the microstates in each system are equiprobable, $p_r = 1/\Omega, ~\forall r$: in that case $-\sum_r p_r \ln p_r = -\Omega \, (1/\Omega) \ln (1/\Omega) = \ln \Omega$.
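To connect back to the Boltzmann distribution mentioned at the start of this section, the sketch below (a hypothetical two-level system with unit energy gap, in arbitrary units) evaluates $S/k_{\rm B} = -\sum_r p_r \ln p_r$ in the two limiting temperatures: at high temperature both states become equiprobable and $S/k_{\rm B} \to \ln 2$, while at low temperature the ground state dominates and $S \to 0$.

```python
import math

def gibbs_entropy(energies, beta):
    """S / k_B = -sum_r p_r ln p_r, with Boltzmann weights p_r ~ exp(-beta E_r)."""
    weights = [math.exp(-beta * e) for e in energies]
    z = sum(weights)                       # partition function
    probs = [w / z for w in weights]
    return -sum(p * math.log(p) for p in probs)

# Hypothetical two-level system: energies 0 and 1 (arbitrary units).
levels = [0.0, 1.0]
s_hot = gibbs_entropy(levels, beta=1e-6)   # T -> infinity: approaches ln 2
s_cold = gibbs_entropy(levels, beta=50.0)  # T -> 0: approaches 0
```

The high-temperature limit reproduces the equiprobable case $\ln \Omega$ with $\Omega = 2$, while the low-temperature limit illustrates that a single certain microstate carries zero entropy.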
