The Ergodic Hypothesis

Ludwig Boltzmann was criticized for his 1872 attempt to prove his H-theorem (that entropy always increases) by a dynamical analysis of molecular collisions. Josef Loschmidt and others pointed out that if the molecular velocities were to be reversed at an instant, Boltzmann's work would show that the entropy should decrease. This was the reversibility objection.
Entropy as Lost Information about Molecular Positions

Entropy increase can be easily understood as the loss of information as a system moves from an initially ordered state to a final disordered state. Ludwig Boltzmann was the first to describe entropy as "missing information."
Dr. Shannon's work roots back, as von Neumann has pointed out, to Boltzmann's observation, in some of his work on statistical physics (1894), that entropy is related to "missing information," inasmuch as it is related to the number of alternatives which remain possible to a physical system after all the macroscopically observable information concerning it has been recorded. L. Szilard (Zsch. f. Phys. Vol. 53, 1925) extended this idea to a general discussion of information in physics, and von Neumann (Math. Foundation of Quantum Mechanics, Berlin, 1932, Chap. V) treated information in quantum mechanics and particle physics.

Although the physical dimensions of thermodynamic entropy (joules/K) are not the same as (dimensionless) mathematical information, apart from units they share the same famous formula.
S = −k ∑ pᵢ ln pᵢ

To see this very simply, let's consider the well-known example of a bottle of perfume in the corner of a room. We can represent the room as a grid of 64 squares. Suppose the air is filled with molecules moving randomly at room temperature (blue circles). In the lower left corner a small number of (red) perfume molecules will be released when we open the bottle (when you start the demonstration animation below). What is the quantity of information we have about the perfume molecules? At the start we know their location in the lower left square, 1/64th of the container. The quantity of information is determined by the minimum number of yes/no questions it takes to locate them. The best questions are those that split the remaining locations evenly (a binary tree). For example:
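The question-counting argument above can be sketched in a few lines of Python. This is a minimal illustration, not from the original text: it assumes 64 equally likely squares, so the minimum number of yes/no questions is log₂ 64 = 6, and a halving strategy (a binary search over the squares) achieves exactly that.

```python
import math

def questions_needed(n_locations):
    """Minimum number of yes/no questions to pin down one of
    n_locations equally likely squares: ceil(log2 n)."""
    return math.ceil(math.log2(n_locations))

def binary_search_questions(target, n_locations):
    """Locate `target` among squares 0..n-1 by repeatedly asking
    'is it in the lower half of the remaining range?' and count
    the questions asked."""
    lo, hi = 0, n_locations  # half-open range [lo, hi)
    asked = 0
    while hi - lo > 1:
        mid = (lo + hi) // 2
        asked += 1
        if target < mid:   # answer "yes": keep the lower half
            hi = mid
        else:              # answer "no": keep the upper half
            lo = mid
    return asked

print(questions_needed(64))            # 6 questions for a 64-square grid
print(binary_search_questions(0, 64))  # the lower-left square also takes 6
```

Because 64 is a power of two, every square takes exactly six questions; those six answers are the six bits of information we hold when the perfume is still in one known square.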
Entropy as Evolution to the Most Probable Macrostate

In 1877, Boltzmann simply ignored classical dynamics and instead made the assumption that all phase space cells were equally probable. Classical dynamics could not prove that the path of the system in phase space would move through all the cells, let alone spend equal time in all cells. Boltzmann described a system he called "ergode," later called the canonical ensemble by J. Willard Gibbs. Equal a priori probabilities for all the phase space cells came to be called the ergodic hypothesis. Paul and Tatiana Ehrenfest made the ergodic hypothesis the central question in statistical mechanics.

Mathematicians took up the problem of ergodicity in continuous mathematics, which has questionable relevance for problems in discrete particle physics.

In modern quantum statistical mechanics, the same ergodic hypothesis (equiprobability of phase space cells) shows up in an assumption about transition probabilities between phase space cells. The transition probability for any microstate A to jump to microstate B is assumed to be the same as the reverse quantum jump from B to A. The matrix element for the A → B transition is the complex conjugate of the reverse transition B → A. This is called Fermi's Golden Rule, although it was first derived by Paul Dirac.

We can see how any system with equal transition probabilities to and from any other state will quickly establish equilibrium populations. If 1000 systems are in state A and none in B, the early transitions will overwhelmingly be from A to B. An equal number of transitions back from B to A is not likely until the populations of A and B are about the same. That is the basic idea behind the statistical formulation of Boltzmann's H-theorem. When all phase space cells are equally populated, the number of ways this can be achieved (the number of microstates) is at its maximum.
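The two-state argument can be checked with a short simulation. This is a sketch under stated assumptions, not anything from the original text: each cycle, every system jumps with the same fixed probability p, regardless of which state it occupies, so the net flow is proportional to the population difference and the 1000-in-A start relaxes toward 500/500.

```python
import random

def relax(n_a=1000, n_b=0, p=0.1, steps=60, seed=1):
    """Each cycle, every system jumps with the same probability p,
    whichever state it is in (equal A->B and B->A transition
    probabilities). Returns the history of the A population."""
    rng = random.Random(seed)
    history = []
    for _ in range(steps):
        a_to_b = sum(rng.random() < p for _ in range(n_a))
        b_to_a = sum(rng.random() < p for _ in range(n_b))
        n_a += b_to_a - a_to_b
        n_b += a_to_b - b_to_a
        history.append(n_a)
    return history

pops = relax()
print(pops[0], pops[-1])  # early flow is overwhelmingly A->B; later ~500/500
```

The expected deviation from 500 shrinks by a factor (1 − 2p) each cycle, so after a few dozen cycles only statistical fluctuations around the 500/500 equilibrium remain.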
Although cell populations will fluctuate away from this equilibrium condition, it corresponds to the maximum entropy.

[Figure: number of systems vs. number of cycles. The initial distribution of 500 systems in the upper left corner evolves rapidly to the normal distribution function for occupation numbers.]
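The many-cell version of the same relaxation can be sketched as follows. This is an illustrative simulation, not the article's own animation: it assumes 500 systems that all start in one cell of a 64-cell grid, with each jump going to a uniformly chosen cell (equal transition probability to every cell), so the occupation numbers spread out toward the equilibrium mean of 500/64 ≈ 7.8 with fluctuations around it.

```python
import random

def occupations(n_systems=500, n_cells=64, steps=5000, seed=2):
    """All systems start in cell 0; each step one random system
    jumps to a uniformly chosen cell. Returns the final
    occupation numbers of the 64 cells."""
    rng = random.Random(seed)
    cells = [0] * n_systems                  # cell index of each system
    counts = [n_systems] + [0] * (n_cells - 1)
    for _ in range(steps):
        i = rng.randrange(n_systems)         # pick a system at random
        dest = rng.randrange(n_cells)        # equal probability for every cell
        counts[cells[i]] -= 1
        counts[dest] += 1
        cells[i] = dest
    return counts

final = occupations()
print(max(final), min(final), sum(final))  # occupations scatter around 500/64 ≈ 7.8
```

The final counts are approximately multinomial, which for these numbers is well approximated by the normal distribution of occupation numbers the figure describes.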