What is Statistical Mechanics?

Statistical mechanics is a branch of physics that attempts to describe systems with a large number of degrees of freedom. By degrees of freedom, we roughly mean things we need to keep track of, such as the positions of air molecules in a room or of atoms in a copper wire, or the directions of the magnetic moments of atoms in a magnet. By "a large number" we mean really really really large: the strength of a 5 mm diameter spherical Neodymium magnet comes from the alignment of the magnetic moments of $$4,500,000,000,000,000,000,000$$ atoms, and every cubic foot of the air around you contains around $$700,000,000,000,000,000,000,000$$ molecules.1 Trying to figure out how that many particles would interact and move around may seem impossible - in fact, even a system of three particles, interacting under their own gravity, does not have a general, closed-form solution. The strategy that statistical mechanics suggests we take is to pretend the system is (almost) completely random.

Let's start out with a simpler, familiar random system: flipping a coin. Our coins are completely fair: they come up heads exactly half of the time, tails the other half of the time, and each flip is independent of the others. To start with, which do you think is more random: 10 coin flips, or 100 coin flips?

The histograms below show the frequencies of different numbers of heads among either 10 or 100 coin flips. Each time you click, several more coin flip trials are performed.

[Interactive histogram: frequencies of head counts for repeated 10-coin and 100-coin flip trials]
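The interactive chart can't be embedded here, but the simulation behind it is easy to sketch. Here is an illustrative Python version (the function name, seed, and trial count are my own choices, not taken from the original page):

```python
import random
from collections import Counter

def flip_trials(num_coins, num_trials, seed=0):
    """Count heads in each of num_trials runs of num_coins fair coin
    flips; return a Counter mapping (number of heads) -> frequency."""
    rng = random.Random(seed)
    counts = Counter()
    for _ in range(num_trials):
        heads = sum(rng.random() < 0.5 for _ in range(num_coins))
        counts[heads] += 1
    return counts

hist_10 = flip_trials(10, 10_000)
hist_100 = flip_trials(100, 10_000)

# The most frequent outcome in each histogram sits at (or right next to)
# exactly half heads.
print(max(hist_10, key=hist_10.get))
print(max(hist_100, key=hist_100.get))
```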

What do you notice about the two histograms?3

Both have their maximum in the middle (where exactly half of the coins are heads). The 100-coin chart, however, looks more spiky. Using a Binomial random variable, we can calculate that in the 100-coin case, 66 or more of the coins - about two-thirds of them - will come up heads less than once per thousand trials, around 0.089% of the time, while the number of heads is between 45 and 55 around 73% of the time. If we flip N coins, the width of the "peak", in terms of the fraction of heads, is roughly \(\frac{2}{\sqrt{N}}\), so if we quadruple the number of coins the peak becomes half as wide. That means our "fraction of the coins that came up heads" number actually becomes more predictable if we flip more coins. This result may or may not surprise you, but in any case it gives us some hope for, say, describing the physics of the air around you. We would like to take advantage of the randomness of immense numbers of particles to find some quantity that doesn't seem random at all.
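Those binomial probabilities can be checked directly with a short, self-contained Python script (an illustrative sketch; `binom_pmf` is my own helper, not something from the text):

```python
from math import comb

def binom_pmf(n, k):
    """Probability of exactly k heads among n fair coin flips."""
    return comb(n, k) / 2**n

n = 100
# 66 or more heads: just about two-thirds of the coins.
p_tail = sum(binom_pmf(n, k) for k in range(66, n + 1))
# Between 45 and 55 heads, inclusive.
p_mid = sum(binom_pmf(n, k) for k in range(45, 56))

print(f"P(66+ heads)   = {p_tail:.3%}")  # about 0.09%
print(f"P(45-55 heads) = {p_mid:.1%}")   # about 73%
```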

These sorts of properties of physical systems, which do not fluctuate randomly, form the basis for statistical mechanics and thermodynamics. Familiar examples are temperature and pressure. Neither makes any sense for individual atoms, but both become meaningful and measurable for human-scale objects. A thermometer is constantly bombarded with particles of the substance it is immersed in, all moving at different speeds and in different directions, and yet we can expect the temperature reading to stay about the same from one moment to the next, for the same reasons that we can expect very close to half of our 100 coins to come up heads.

To convince you that this approach has some merit, look at the (rather uninteresting) graph below.

The fraction by which \(\frac{PV}{NT}\) differs from k for N2 gas at a range of temperatures and pressures. Scales are logarithmic. The red lines indicate STP (standard temperature and pressure), the air pressure at sea level and \(293.15\ \mathrm{K} = 20\,^\circ\mathrm{C} = 68\,^\circ\mathrm{F}\).4

One powerful prediction from basic statistical mechanics is that a box full of tiny balls bouncing around without bumping into each other and unaffected by gravity obeys the ideal gas law:

$$PV = NkT$$

where P is the pressure the "gas" exerts on the sides - the amount of force per unit area you would have to apply to keep the box from expanding; V is the volume of the box; N is the number of balls; k is Boltzmann's constant, a sort of conversion factor between temperature and energy; and T is the temperature of the "gas," what a thermometer would read if placed inside (but in Kelvin, an absolute temperature scale). Note that nothing in this formula has anything to do with how much the balls weigh, or how large they are.
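As a sanity check on the molecule count quoted in the opening paragraph, we can rearrange the ideal gas law to \(N = PV/kT\) and plug in one cubic foot of air at sea-level pressure and room temperature (treating air as ideal; the numerical constants below are standard values, not taken from the text):

```python
k_B = 1.380649e-23   # Boltzmann's constant in J/K (exact SI value)

P = 101325.0         # atmospheric pressure at sea level, in pascals
T = 293.15           # 20 degrees Celsius, in kelvin
V = 0.3048 ** 3      # one cubic foot, in cubic meters (1 ft = 0.3048 m)

N = P * V / (k_B * T)  # ideal gas law rearranged for the particle count
print(f"{N:.2e}")      # about 7e23 molecules
```

The result, roughly \(7 \times 10^{23}\), matches the "700,000,000,000,000,000,000,000 molecules per cubic foot" figure from the introduction.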

Real gases don't quite satisfy our assumptions: their atoms aren't infinitesimally tiny (though they are pretty small), they bump into each other, and they also interact with each other from far away, through van der Waals forces. The graph above shows how much PV/NT differs from k for nitrogen gas at various temperatures and pressures. At half of the data points plotted, encompassing all of the pressure/temperature combinations you will ever experience, nitrogen matches the ideal gas law within 5%. Only at very high pressures and low temperatures do we start to see substantial disagreement between theory and measurement. Somehow, after ignoring most of the properties of the gas and focusing on averages of random behavior, statistical mechanics allows us to make powerful, accurate predictions.
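To get a rough feel for the size of those corrections, here is a sketch using the van der Waals equation, which adds an excluded-volume term and an attraction term to the ideal gas law. The constants a and b for N2 are standard tabulated values (not from this article), and the calculation is a one-step estimate: it evaluates the van der Waals pressure at the ideal molar volume rather than solving the equation exactly.

```python
R = 8.314            # molar gas constant, J/(mol K)

# Van der Waals constants for N2 (standard tabulated values):
a = 0.137            # Pa m^6 / mol^2: long-range attraction between molecules
b = 3.87e-5          # m^3 / mol: volume excluded by the molecules themselves

P_atm, T = 101325.0, 293.15
Vm = R * T / P_atm                     # molar volume of an ideal gas at STP
P_vdw = R * T / (Vm - b) - a / Vm**2   # van der Waals pressure at that volume
Z = P_vdw * Vm / (R * T)               # compressibility factor; exactly 1 if ideal

print(f"Z = {Z:.4f}")  # within a tenth of a percent of 1 at STP
```

At STP the two correction terms nearly cancel and Z stays within a tenth of a percent of 1, consistent with the few-percent agreement seen in the graph; the terms only become important at high pressure (small Vm) or low temperature.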

To find out what's happening at the colorful boundaries, read about phase transitions.