Thermodynamics and Propulsion

7.4 Connection between the Statistical Definition of Entropy and Randomness

We now examine the behavior of the statistical definition of entropy with regard to randomness. Because a uniform probability distribution reflects the greatest randomness, a system with $ n$ allowed states will have the greatest entropy when each state is equally likely. In this situation, the probabilities become

$\displaystyle p_i = p = \frac{1}{\Omega},$ (7.15)

where $ \Omega$ is the total number of microstates. The entropy is thus

$\displaystyle S = -k \sum_{i=1}^\Omega \frac{1}{\Omega}\ln\left(\frac{1}{\Omega}\right) = -k\left[\Omega\,\frac{1}{\Omega}\ln\left(\frac{1}{\Omega}\right)\right] = -k\ln\left(\frac{1}{\Omega}\right) = k\ln\Omega.$ (7.16)

Equation (7.16) states that the larger the number of possible states, the larger the entropy. The behavior of the entropy described by Equation (7.16) can be summarized as follows:
  1. $ S$ is maximum when $ \Omega$ is maximum, which means many permitted quantum states and hence much randomness;
  2. $ S$ is minimum when $ \Omega$ is minimum. In particular, for $ \Omega=1$ , there is no randomness and $ S=0$ .
These trends are in accord with our qualitative ideas concerning randomness. Equation (7.16) is carved on Boltzmann's tombstone in Vienna (he died in 1906).
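
As a numerical check, the short sketch below (Python; the helper name entropy_over_k is hypothetical, not from the text) evaluates the general sum of Equation (7.12), $ S/k = -\sum_i p_i\ln p_i$ , for uniform distributions of increasing size and confirms that it reduces to $ \ln\Omega$ , with $ S=0$ for $ \Omega=1$ .

```python
import math

def entropy_over_k(probs):
    """Dimensionless entropy S/k = -sum(p * ln p); p = 0 terms are dropped,
    since p*ln(p) -> 0 as p -> 0."""
    return -sum(p * math.log(p) for p in probs if p > 0)

for omega in (1, 2, 10, 100):
    uniform = [1.0 / omega] * omega  # Omega equally likely states
    print(f"Omega = {omega:3d}: S/k = {entropy_over_k(uniform):.4f}, "
          f"ln(Omega) = {math.log(omega):.4f}")
```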

We can also examine the additive property of entropy with respect to probabilities. If two systems, $ A$ and $ B$ , are viewed as a single combined system, $ C$ , the quantum states of the combined system are the combinations of the quantum states of $ A$ and $ B$ . The quantum state in which $ A$ is in its state $ x$ and $ B$ is in its state $ y$ has probability $ p_{Ax}\cdot p_{By}$ because the two probabilities are independent. The number of quantum states of the combined system is thus $ \Omega_C=\Omega_A\cdot \Omega_B$ , and the entropy of the combined system is

$\displaystyle S_C =k\ln(\Omega_A\Omega_B)=k\ln\Omega_A+k\ln\Omega_B =S_A+S_B.$ (7.17)
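
As an illustration of this additivity (a sketch under the same independence assumption, not part of the original text), the fragment below forms the combined distribution from the products $ p_{Ax}\cdot p_{By}$ and checks that the entropies add; for equally likely states this is exactly $ k\ln(\Omega_A\Omega_B)=k\ln\Omega_A+k\ln\Omega_B$ .

```python
import math

def entropy_over_k(probs):
    # S/k = -sum(p * ln p), dropping p = 0 terms
    return -sum(p * math.log(p) for p in probs if p > 0)

p_A = [0.25] * 4       # system A: Omega_A = 4 equally likely states
p_B = [1.0 / 6] * 6    # system B: Omega_B = 6 equally likely states
p_C = [pa * pb for pa in p_A for pb in p_B]  # independent combined states

print(entropy_over_k(p_C))                        # ln(24) = 3.178...
print(entropy_over_k(p_A) + entropy_over_k(p_B))  # ln(4) + ln(6), identical
```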

Equation (7.16) is sometimes taken as the basic definition of entropy, but it should be remembered that it is appropriate only when each quantum state is equally likely. Equation (7.12) is more general and applies equally to equilibrium and non-equilibrium situations.

A simple numerical example shows the trends in entropy and randomness for a system that can exist in three states. Consider the following five probability distributions, in which terms of the form $ 0\ln(0)$ are taken as zero (their limiting value as $ p\to 0$ ):

$\displaystyle (i)\quad p_1=1.0,\; p_2=0,\; p_3=0; \qquad S=-k\left(1\ln(1)+0\ln(0)+0\ln(0)\right)=0$
$\displaystyle (ii)\quad p_1=0.8,\; p_2=0.2,\; p_3=0; \qquad S=-k\left(0.8\ln(0.8)+0.2\ln(0.2)+0\ln(0)\right)=0.5k$
$\displaystyle (iii)\quad p_1=0.8,\; p_2=0.1,\; p_3=0.1; \qquad S=-k\left(0.8\ln(0.8)+0.1\ln(0.1)+0.1\ln(0.1)\right)=0.6k$
$\displaystyle (iv)\quad p_1=0.5,\; p_2=0.3,\; p_3=0.2; \qquad S=-k\left(0.5\ln(0.5)+0.3\ln(0.3)+0.2\ln(0.2)\right)=1.0k$
$\displaystyle (v)\quad p_1=1/3,\; p_2=1/3,\; p_3=1/3; \qquad S=-3k\left[\frac{1}{3}\ln\left(\frac{1}{3}\right)\right]=1.1k$

The first distribution has no randomness. For the second, we know that state 3 is never found. Distributions (iii) and (iv) have progressively greater uncertainty about which state the system occupies, and thus greater randomness. Distribution (v), the uniform distribution, has the greatest randomness and uncertainty and also the largest entropy.
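
The rounded entropies quoted above follow directly from Equation (7.12); the sketch below (illustrative Python, reusing the same hypothetical entropy_over_k helper) prints the unrounded values.

```python
import math

def entropy_over_k(probs):
    # S/k = -sum(p * ln p), with 0*ln(0) taken as its limiting value, 0
    return -sum(p * math.log(p) for p in probs if p > 0)

distributions = [
    ("i",   (1.0, 0.0, 0.0)),
    ("ii",  (0.8, 0.2, 0.0)),
    ("iii", (0.8, 0.1, 0.1)),
    ("iv",  (0.5, 0.3, 0.2)),
    ("v",   (1/3, 1/3, 1/3)),
]
for label, p in distributions:
    print(f"({label}) S = {entropy_over_k(p):.2f} k")
# Prints 0.00, 0.50, 0.64, 1.03, and 1.10 -- consistent with the rounded
# values 0, 0.5k, 0.6k, 1.0k, and 1.1k above.
```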
