Thermodynamics and Propulsion
7.3 A Statistical Definition of Entropy

The list of the $p_i$ is a precise description of the randomness in the system, but the number of quantum states in almost any industrial system is so high that this list is not usable. We thus look for a single quantity, which is a function of the $p_i$, that gives an appropriate measure of the randomness of a system. As shown below, the entropy provides this measure.

There are several attributes that the desired function should have. The first is that the average of the function over all of the microstates should have an extensive behavior; in other words, the microscopic description of the entropy of a system $C$, composed of parts $A$ and $B$, should be given by

$$S_C = S_A + S_B. \tag{7.4}$$

Second is that the entropy should increase with randomness and should be largest, for a given energy, when all the quantum states are equiprobable.

The average of the function over all the microstates is defined by

$$S = \langle f \rangle = \sum_i p_i \, f(p_i), \tag{7.5}$$

where the function $f(p_i)$ is to be found. Suppose that system $A$ has $n$ microstates and system $B$ has $m$ microstates. The entropies of systems $A$, $B$, and $C$ are defined by

$$S_A = \sum_{i=1}^{n} p_i \, f(p_i), \tag{7.6a}$$

$$S_B = \sum_{j=1}^{m} p_j \, f(p_j), \tag{7.6b}$$

$$S_C = \sum_{i=1}^{n} \sum_{j=1}^{m} p_{ij} \, f(p_{ij}). \tag{7.6c}$$

In Equations (7.5) and (7.6), the term $p_{ij}$ means the probability of a microstate in which system $A$ is in state $i$ and system $B$ is in state $j$. For Equation (7.4) to hold given the expressions in Equations (7.6),

$$\sum_{i=1}^{n} \sum_{j=1}^{m} p_{ij} \, f(p_{ij}) = \sum_{i=1}^{n} p_i \, f(p_i) + \sum_{j=1}^{m} p_j \, f(p_j). \tag{7.7}$$

The function $f$ must be such that this is true regardless of the values of the probabilities $p_i$ and $p_j$. This will occur if $f(\cdot) = \ln(\cdot)$, because $\ln(p_i \, p_j) = \ln(p_i) + \ln(p_j)$. To verify this, make this substitution in the expression for $S_C$ in Equation (7.6c), assume the probabilities $p_i$ and $p_j$ are independent, such that $p_{ij} = p_i \, p_j$, and split the log term:

$$S_C = \sum_{i=1}^{n} \sum_{j=1}^{m} p_i \, p_j \ln(p_i \, p_j) = \sum_{i=1}^{n} \sum_{j=1}^{m} p_i \, p_j \ln(p_i) + \sum_{i=1}^{n} \sum_{j=1}^{m} p_i \, p_j \ln(p_j). \tag{7.8}$$

Rearranging the sums, (7.8) becomes

$$S_C = \sum_{i=1}^{n} p_i \ln(p_i) \left[ \sum_{j=1}^{m} p_j \right] + \sum_{j=1}^{m} p_j \ln(p_j) \left[ \sum_{i=1}^{n} p_i \right]. \tag{7.9}$$

Because $\sum_{i=1}^{n} p_i = 1$ and $\sum_{j=1}^{m} p_j = 1$, the square brackets on the right-hand side of Equation (7.9) can be set equal to unity, with the result written as

$$S_C = \sum_{i=1}^{n} p_i \ln(p_i) + \sum_{j=1}^{m} p_j \ln(p_j) = S_A + S_B. \tag{7.10}$$

This shows the left-hand side of Equation (7.7) to be the same as the right-hand side, for any $p_i$, $p_j$, $n$, and $m$, provided that $f$ is a logarithmic function. Reynolds and Perkins show that the most general such $f$ is $f(p_i) = \alpha \ln(p_i)$, where $\alpha$ is an arbitrary constant. Because the $p_i$ are less than unity, the constant is chosen to be negative to make the entropy positive.

Based on the above, a statistical definition of entropy can be given as

$$S = -k \sum_i p_i \ln(p_i). \tag{7.11}$$

The constant $k$ is known as the Boltzmann constant,

$$k = 1.380 \times 10^{-23} \ \mathrm{J/K}. \tag{7.12}$$

The value of $k$ is (another wonderful result!) given by

$$k = \frac{\bar{R}}{N_{\mathrm{Avogadro}}}, \tag{7.13}$$

where $\bar{R}$ is the universal gas constant, $8.3143 \ \mathrm{J/(mol \cdot K)}$, and $N_{\mathrm{Avogadro}}$ is Avogadro's number, $6.02 \times 10^{23}$ molecules per mol. Sometimes $k$ is called the gas constant per molecule. With this value for $k$, the statistical definition of entropy is identical with the macroscopic definition of entropy.
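The derivation above can be checked numerically. The following Python sketch (the helper name `entropy` and the example distributions are illustrative choices, not from the text) evaluates Equation (7.11) for two independent subsystems, confirms the extensivity property of Equation (7.4), verifies that entropy is largest when the states are equiprobable, and recovers the Boltzmann constant from Equation (7.13).

```python
import numpy as np

# Constants as quoted in the text, Eqs. (7.12)-(7.13).
R_BAR = 8.3143          # universal gas constant, J/(mol K)
N_AVOGADRO = 6.02e23    # Avogadro's number, molecules per mol
k = R_BAR / N_AVOGADRO  # Boltzmann constant, ~1.38e-23 J/K

def entropy(p):
    """S = -k * sum(p_i ln p_i), Eq. (7.11); p must sum to 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # drop zero-probability states: p ln p -> 0 as p -> 0
    return -k * np.sum(p * np.log(p))

# Subsystem A with n = 3 microstates, subsystem B with m = 4.
p_A = np.array([0.5, 0.3, 0.2])
p_B = np.array([0.4, 0.3, 0.2, 0.1])

# Independent subsystems: joint probabilities p_ij = p_i * p_j.
p_C = np.outer(p_A, p_B).ravel()

# Extensivity, Eq. (7.4): S_C = S_A + S_B.
assert np.isclose(entropy(p_C), entropy(p_A) + entropy(p_B))

# For a given number of states, S is largest when they are equiprobable.
p_uniform = np.full(p_A.size, 1.0 / p_A.size)
assert entropy(p_A) < entropy(p_uniform)

print(f"k = R/N_A       = {k:.4e} J/K")
print(f"S_A + S_B       = {entropy(p_A) + entropy(p_B):.4e} J/K")
print(f"S_C (joint)     = {entropy(p_C):.4e} J/K")
print(f"S max, 3 states = {entropy(p_uniform):.4e} J/K (= k ln 3)")
```

Note that the sketch adopts the convention $p \ln p \to 0$ as $p \to 0$, so zero-probability states contribute nothing to the sum, consistent with taking the limit in Equation (7.11).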