
7.3 A Statistical Definition of Entropy

The list of the $ p_i$ is a precise description of the randomness in the system, but the number of quantum states in almost any industrial system is so large that the list is not usable. We therefore look for a single quantity, a function of the $ p_i$, that gives an appropriate measure of the randomness of a system. As shown below, the entropy provides this measure.

There are several attributes that the desired function should have. The first is that the average of the function over all of the microstates should have extensive behavior; in other words, the microscopic description of the entropy of a system $ C$, composed of parts $ A$ and $ B$, should be given by

$\displaystyle S_C = S_A + S_B.$ (7.4)

The second is that the entropy should increase with randomness and, for a given energy, should be largest when all the quantum states are equiprobable.

The average of the function over all the microstates is defined by

$\displaystyle S= \langle f\rangle =\sum_i p_i f(p_i),$ (7.5)
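As a concrete illustration (not part of the original derivation), the sketch below evaluates this probability-weighted average numerically for a hypothetical three-state distribution, using the natural logarithm as a trial function in anticipation of the logarithmic form of $ f$ found below; the probability values are chosen only for illustration.

```python
import math

def average_over_microstates(p, f):
    """Compute <f> = sum_i p_i * f(p_i) for a probability distribution p."""
    return sum(p_i * f(p_i) for p_i in p)

# Hypothetical three-state distribution (values chosen only for illustration).
p = [0.5, 0.3, 0.2]

# Trial function f = ln, anticipating the result derived below.
print(average_over_microstates(p, math.log))   # ≈ -1.0297
```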

where the function $ f(p_i)$ is to be found. Suppose that system $ A$ has $ n$ microstates and system $ B$ has $ m$ microstates. The entropies of systems $ A$, $ B$, and $ C$ are defined by
$\displaystyle S_A =\sum_{i=1}^n p_i f(p_i),$ (7.6a)

$\displaystyle S_B =\sum_{j=1}^m p_j f(p_j),$ (7.6b)

$\displaystyle S_C =\sum_{i=1}^n\sum_{j=1}^m p_{ij} f(p_{ij}).$ (7.6c)

In Equation (7.6c), the term $ p_{ij}$ means the probability of a microstate in which system $ A$ is in state $ i$ and system $ B$ is in state $ j$. For Equation (7.4) to hold, given the expressions in Equations (7.6),

$\displaystyle S_C$ $\displaystyle =\sum_{i=1}^n\sum_{j=1}^m p_{ij} f(p_{ij})$    
  $\displaystyle = \sum_{i=1}^n p_i f(p_i)+\sum_{j=1}^m p_j f(p_j)=S_A+S_B.$ (7.7)

The function $ f$ must be such that this is true regardless of the values of the probabilities $ p_i$ and $ p_j$ . This will occur if $ f()=\ln()$ because $ \ln(a\cdot b)=\ln(a)+\ln(b)$ .

To verify this, make this substitution in the expression for $ S_C$ in Equation (7.6c) (assume the probabilities $ p_i$ and $ p_j$ are independent, such that $ p_{ij} = p_i p_j$, and split the log term):

$\displaystyle S_C=\sum_{i=1}^n\sum_{j=1}^m p_i p_j \ln(p_i)+ \sum_{i=1}^n\sum_{j=1}^m p_i p_j \ln(p_j).$ (7.8)

Rearranging the sums, (7.8) becomes

$\displaystyle S_C = \sum_{i=1}^n\left\{p_i\ln(p_i)\left[\sum_{j=1}^m p_j\right]\right\}+ \sum_{j=1}^m\left\{ p_j\ln(p_j)\left[\sum_{i=1}^n p_i\right]\right\}.$ (7.9)

Because

$\displaystyle \sum_{i=1}^n p_i = \sum_{j=1}^m p_j = 1,$ (7.10)

the terms in square brackets on the right-hand side of Equation (7.9) are each equal to unity, and the result can be written as

$\displaystyle S_C=\sum_{i=1}^n p_i\ln(p_i)+\sum_{j=1}^mp_j\ln(p_j).$ (7.11)

This reveals the top line of Equation (7.7) to be the same as the bottom line, for any $ p_i$ , $ p_j$ , $ n$ , $ m$ , provided that $ f()$ is a logarithmic function. Reynolds and Perkins show that the most general $ f(p_i)$ is $ f = C\ln(p_i)$ , where $ C$ is an arbitrary constant. Because the $ p_i$ are less than unity, the constant is chosen to be negative to make the entropy positive.
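As an illustration of this argument (a supplementary sketch, not part of the original text), the snippet below builds two independent systems with made-up probability distributions, forms the joint probabilities $ p_{ij} = p_i p_j$, and checks numerically that $ \sum_{ij} p_{ij}\ln(p_{ij}) = \sum_i p_i\ln(p_i) + \sum_j p_j\ln(p_j)$, i.e., that the logarithmic choice of $ f$ makes the measure additive as in Equations (7.7) through (7.11).

```python
import math

def s_of(p):
    """Sum_i p_i * ln(p_i) for a probability distribution p (f = ln)."""
    return sum(p_i * math.log(p_i) for p_i in p)

# Hypothetical independent systems A (n = 3 states) and B (m = 2 states);
# the probabilities are chosen only for illustration.
p_A = [0.5, 0.3, 0.2]
p_B = [0.6, 0.4]

# Joint probabilities p_ij = p_i * p_j for the combined system C.
p_C = [p_i * p_j for p_i in p_A for p_j in p_B]

# Additivity: the double sum over C equals the sum of the A and B terms.
print(s_of(p_C))               # ≈ -1.7027
print(s_of(p_A) + s_of(p_B))   # ≈ -1.7027
```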

Based on the above, a statistical definition of entropy can be given as:

$\displaystyle S=-k\sum_i p_i \ln(p_i).$ (7.12)

The constant $ k$ is known as the Boltzmann constant,

$\displaystyle k =1.380\times10^{-23} \frac{J}{K}.$ (7.13)

The value of $ k$ is (another wonderful result!) given by

$\displaystyle k =\frac{\mathbf{R}}{N_\textrm{Avogadro}},$ (7.14)

where $ \mathbf{R}$ is the universal gas constant, $ 8.3143\textrm{ J/(mol-K)}$, and $ N_\textrm{Avogadro}$ is Avogadro's number, $ 6.02\times10^{23}$ molecules per mol. Sometimes $ k$ is called the gas constant per molecule. With this value for $ k$ , the statistical definition of entropy is identical with the macroscopic definition of entropy.
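To tie the pieces together (again a supplementary sketch, not from the original text), the snippet below evaluates $ k = \mathbf{R}/N_\textrm{Avogadro}$ from the values quoted above and applies the definition in Equation (7.12) to two hypothetical four-state distributions, showing that the equiprobable distribution gives the larger entropy, consistent with the second attribute listed at the start of this section.

```python
import math

R = 8.3143       # universal gas constant, J/(mol K)
N_A = 6.02e23    # Avogadro's number, molecules per mol
k = R / N_A      # Boltzmann constant, J/K
print(k)         # ≈ 1.381e-23 J/K

def entropy(p, k=k):
    """Statistical entropy S = -k * sum_i p_i ln(p_i), Equation (7.12)."""
    return -k * sum(p_i * math.log(p_i) for p_i in p if p_i > 0)

# Four equiprobable states versus a skewed distribution (illustrative values).
print(entropy([0.25] * 4))              # k*ln(4) ≈ 1.915e-23 J/K (largest)
print(entropy([0.7, 0.1, 0.1, 0.1]))    # ≈ 1.299e-23 J/K (smaller)
```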
