The Problem of Microscopic Reversibility
Max Planck long suspected that it was the interaction between matter and radiation that caused the entropy increase required by the second law of thermodynamics. We can illustrate this with a collision between matter particles that changes the quantum state of either or both particles.
Classical and Quantum Collisions
We compare two-particle collisions in classical mechanics with quantum collisions that excite internal energy levels of one atom, which then emits a photon shortly after the collision. The same analysis applies to the absorption of a photon or to a change in the internal quantum states of either particle. For example, if the particles are atoms, during the collision they should be treated as a "quasi-molecule" using molecular wave functions. Rotational quantum states are low in energy relative to electronic states, so they are easily excited. A rotational state change would add (or remove) one unit ℏ of angular momentum, randomizing the particles' future paths and erasing information about their past paths.
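To get a feel for the scale of this randomization, here is a rough numerical sketch. All values are illustrative assumptions (an argon atom at room temperature, an impact parameter of about one atomic diameter); the point is only the order of magnitude of the deflection produced when one unit ℏ of angular momentum is exchanged in a collision.

```python
import math

hbar = 1.054571817e-34  # J*s, reduced Planck constant
k_B = 1.380649e-23      # J/K, Boltzmann constant
m = 6.6335209e-26       # kg, mass of an argon atom (illustrative choice)
T = 300.0               # K, room temperature
b = 3e-10               # m, assumed impact parameter ~ one atomic diameter

# Typical thermal momentum of the atom, p = sqrt(3 m k_B T)
p = math.sqrt(3 * m * k_B * T)

# One unit hbar of angular momentum exchanged at impact parameter b
# corresponds to a transverse momentum kick of roughly hbar / b.
dp = hbar / b

# Resulting angular deflection of the outgoing path (radians)
theta = dp / p

print(f"thermal momentum p = {p:.3e} kg m/s")
print(f"momentum kick  dp  = {dp:.3e} kg m/s")
print(f"deflection angle   = {theta:.3e} rad")
```

A deflection on the order of a hundredth of a radian per collision seems small, but it compounds over successive collisions, so the classical path information needed to retrace the motion is rapidly lost.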
Loschmidt's Paradox
Loschmidt's criticism was based on the simple observation that the laws of classical dynamics are time reversible. Consequently, if we simply reversed the direction of time, the time evolution of the system should lead to decreasing entropy. Of course we cannot reverse time, but a classical dynamical system will evolve in reverse if all the particles could have their velocities exactly reversed. Apart from the practical impossibility of reversing every particle's velocity, Loschmidt had shown that in principle systems could exist for which entropy should decrease instead of increase. This is called Loschmidt's "Reversibility Objection" (Umkehreinwand) or "Loschmidt's paradox." We call it the problem of microscopic reversibility.

Microscopic time reversibility is one of the foundational assumptions of both classical mechanics and quantum mechanics. It is mistakenly thought to be the basis for the "detailed balancing" of chemical reactions in thermodynamic equilibrium. In fact, microscopic reversibility is an assumption that is only statistically valid, in the same limits as any "quantum to classical transition," namely when the number of particles is large enough that we can average over quantum effects. Quantum events also approach classical behavior in the limit of large quantum numbers, which Niels Bohr called the "correspondence principle."

What "detailed balancing" means is that in thermodynamic equilibrium, the number of forward reactions is exactly balanced by the number of reverse reactions. And this is correct. But microscopic reversibility, while still true when considering averages over time, should not be confused with the time reversibility of a specific individual collision between particles. We will examine the collision of two atoms and show that if their velocities are reversed at some time after the collision, it is highly improbable that they will retrace their paths.
This does not mean that, given enough particle collisions, there will not statistically be many collisions that are essentially the same as the "reverse collisions" needed for detailed balancing in chemical reactions, for transport processes described by the Boltzmann equation, and for the Onsager reciprocal relations in non-equilibrium conditions.
The Origin of Irreversibility
Our careful quantum analysis shows that time reversibility fails even in the most ideal conditions (the simplest case of two particles in collision), provided internal quantum structure or the quantum-mechanical interaction with radiation is taken into account. So we find that Max Planck's intuition was correct. But it was Albert Einstein who first saw how this might work, perhaps as early as his 1909 extension of his work on the photoelectric effect, but definitely in his 1916-17 work on the emission and absorption of radiation. This was the work in which Einstein showed that quantum theory implies ontological chance, which he famously disliked ("God does not play dice!").

This work is sometimes cited as a proof of detailed balancing and microscopic reversibility (by Wikipedia, for example). In fact, Einstein started with Boltzmann's assumption of detailed balancing, along with the "Boltzmann principle" that the probability of states with energy E is reduced by the exponential "Boltzmann factor," f(E) ∝ e^(−E/kT), to derive his transition probabilities for emission and absorption of radiation. In the same paper, Einstein derived Planck's radiation law (one of his four independent derivations of the radiation law) and Niels Bohr's second "quantum postulate," Em − En = hν.

But Einstein distinctly denied any symmetry in the elementary processes of emission and absorption. The isotropic symmetry of detailed balancing is the result of averaging over a large number of such processes. As early as 1909, he noted that the elementary process of emission is not "invertible." There are outgoing spherical waves of radiation, but incoming spherical waves are never seen.
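The detailed-balance step in Einstein's 1916-17 derivation can be sketched in a few lines. This is a standard reconstruction (statistical weights omitted), with A for spontaneous emission and B for stimulated emission and absorption between an upper level m and a lower level n:

```latex
% Detailed balance: emission (spontaneous + stimulated) from level m
% exactly balances absorption from level n,
N_m \bigl( A_{mn} + B_{mn}\,\rho(\nu) \bigr) = N_n\, B_{nm}\,\rho(\nu)

% The Boltzmann factor gives the equilibrium populations:
\frac{N_m}{N_n} = e^{-(E_m - E_n)/kT} = e^{-h\nu/kT}

% Solving for the radiation density:
\rho(\nu) = \frac{A_{mn}/B_{mn}}{(B_{nm}/B_{mn})\, e^{h\nu/kT} - 1}

% Agreement with the Rayleigh-Jeans law at high temperature forces
% B_{nm} = B_{mn} and A_{mn}/B_{mn} = 8\pi h\nu^3/c^3, giving Planck's law:
\rho(\nu) = \frac{8\pi h\nu^3/c^3}{e^{h\nu/kT} - 1}
```

Note that the balancing here is between *numbers* of emissions and absorptions in equilibrium; nothing in the derivation requires any individual elementary process to be reversible.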
While in the kinetic theory of molecules, for every process in which only a few elementary particles participate (e.g., molecular collisions), the inverse process also exists. But that is not the case for the elementary processes of radiation. According to our prevailing theory, an oscillating ion generates a spherical wave that propagates outwards. The inverse process does not exist as an elementary process. A converging spherical wave is mathematically possible, to be sure; but to approach its realization requires a vast number of emitting entities. The elementary process of emission is not invertible. In this, I believe, our oscillation theory does not hit the mark. Newton's emission theory of light seems to contain more truth with respect to this point than the oscillation theory since, first of all, the energy given to a light particle is not scattered over infinite space, but remains available for an elementary process of absorption...

In a deterministic universe, the path information needed to predict the future motions of all particles would be preserved. If information is a conserved quantity, the future and the past are all contained in the present. The information about future paths is precisely the same information that, if reversed, would predict microscopic reversibility of each and every collision. The introduction of ontological probabilities and statistics would deny such determinism. If the motions of particles have a chance element, such determinism cannot exist. And this is exactly what Einstein did in his papers on the emission and absorption of radiation by matter. He found that quantum theory implies ontological chance. A "weakness in the theory," he called it. What we might call Einstein's "radiation asymmetry" was introduced with these words,
When a molecule absorbs or emits the energy ε in the form of radiation during the transition between quantum theoretically possible states, then this elementary process can be viewed either as a completely or partially directed one in space, or also as a symmetrical (nondirected) one. It turns out that we arrive at a theory that is free of contradictions, only if we interpret those elementary processes as completely directed processes.

Before Einstein, the common view of light was that it is radiated in all directions (isotropically) as waves. After 1916, it was known by some, but very few took it seriously, that light is spontaneously emitted as what Einstein called "light quanta" (now known as photons), each in a single and random direction. Spontaneous emissions also occur at random times, exactly like radioactive decays, and for the same reasons. This randomness is the basis of all chance in the universe. Einstein discovered it ten years before Werner Heisenberg claimed that his "uncertainty principle" had introduced indeterminism and eliminated causality in physics. Note that Einstein's chance is ontological. Heisenberg's uncertainty is epistemological, the result of limited resolving power in the measurement apparatus.

The elementary process of the emission and absorption of radiation is asymmetric, because the process is directed, as Einstein had explicitly noted first in 1909, and we think he may have seen as early as 1905 in his analysis of the photoelectric effect. The apparent isotropy of the emission of radiation is only what Einstein called "pseudo-isotropy" (Pseudoisotropie), a consequence of time averages over large numbers of events. Einstein often substituted time averages of a single system with "space" averages, or averages over a large "ensemble" of identical systems, as had J. Willard Gibbs in his statistical mechanics.
a quantum theory free from contradictions can only be obtained if the emission process, just as absorption, is assumed to be directional. In that case, for each elementary emission process Zm → Zn a momentum of magnitude (εm − εn)/c is transferred to the molecule. If the latter is isotropic, we shall have to assume that all directions of emission are equally probable. If the molecule is not isotropic, we arrive at the same statement if the orientation changes with time in accordance with the laws of chance. Moreover, such an assumption will also have to be made about the statistical laws for absorption, (B) and (B'). Otherwise the constants Bmn and Bnm would have to depend on the direction, and this can be avoided by making the assumption of isotropy or pseudo-isotropy (using time averages).

Now the principle of microscopic reversibility is a fundamental assumption of statistical mechanics. It underlies the principle of "detailed balancing," which is critical to the understanding of chemical reactions. In thermodynamic equilibrium, the number of forward reactions is exactly balanced by the number of reverse reactions. But microscopic reversibility, while true in the sense of averages over time, should not be confused with the reversibility of individual collisions between molecules. The equations of classical dynamics are reversible in time. And the deterministic Schrödinger equation of motion in quantum mechanics is also time reversible. But the interactions of photons and material particles like electrons and atoms are distinctly not reversible!

An explanation of microscopic irreversibility in atomic and molecular collisions would provide the needed justification for Ludwig Boltzmann's assumption of "molecular disorder" and strengthen his H-Theorem. This is what we hope to do. In quantum mechanics, microscopic time reversibility is assumed true by most scientists because the deterministic Schrödinger equation itself is time reversible.
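Einstein's directed momentum (εm − εn)/c is easy to put numbers to. A minimal sketch, assuming the sodium D-line transition (about 2.105 eV, near 589 nm) as the example; the transition energy and atomic mass are assumed illustrative values, not taken from the text above:

```python
import math

h = 6.62607015e-34   # J*s, Planck constant
c = 2.99792458e8     # m/s, speed of light
eV = 1.602176634e-19 # J per electron volt

# Assumed example transition: E_m - E_n = 2.105 eV (sodium D line, ~589 nm)
dE = 2.105 * eV
nu = dE / h                 # Bohr's second postulate: E_m - E_n = h*nu
p_recoil = dE / c           # Einstein's directed momentum (eps_m - eps_n)/c

m_Na = 3.81754e-26          # kg, mass of a sodium-23 atom
v_recoil = p_recoil / m_Na  # recoil velocity of the emitting atom

print(f"frequency       = {nu:.3e} Hz")
print(f"recoil momentum = {p_recoil:.3e} kg m/s")
print(f"recoil velocity = {v_recoil:.3e} m/s")
```

The recoil of roughly 3 cm/s per emitted photon is real and measurable; directed single-photon recoil of exactly this kind is what makes laser cooling of atoms possible.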
But the Schrödinger equation only describes the deterministic time evolution of the probabilities of various quantum events, which are themselves not deterministic and not reversible. When an actual event occurs, the probabilities of multiple possible events collapse to the actual occurrence of one event. In quantum mechanics, this is the irreversible collapse of the wave function that John von Neumann called "Process 1."

Treating two atoms as a temporary molecule means we must use molecular, rather than atomic, wave functions. The quantum description of the molecule now transforms the six independent degrees of freedom into three for the molecule's center of mass and three more that describe vibrational and rotational quantum states. The possibility of quantum transitions between closely spaced vibrational and rotational energy levels in the "quasi-molecule" introduces indeterminacy in the future paths of the separate atoms. The classical path information needed to ensure the deterministic dynamical behavior has been partially erased. The memory of the past needed to predict the "determined" future has been lost.

Even setting aside the practical impossibility of a perfect classical time reversal, in which we simply turn the two particles around, quantum physics would require two measurements to locate the two particles, followed by two state preparations to send them in the opposite direction. These could only be made within the precision of Heisenberg's uncertainty principle and so could not perfectly produce microscopic reversibility, which is thus only a classical idealization, like the idea of determinism. Heisenberg indeterminacy puts calculable limits on the accuracy with which perfect reversed paths can be achieved. Let us assume this impossible task can be completed, and it sends the two particles into the reverse collision paths.
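These Heisenberg limits can be estimated numerically. A minimal sketch with assumed illustrative values (an argon atom at room temperature, a 1 nm measurement precision, a 100 nm mean free path), none of which come from the text itself:

```python
import math

hbar = 1.054571817e-34  # J*s, reduced Planck constant
k_B = 1.380649e-23      # J/K, Boltzmann constant
m = 6.6335209e-26       # kg, argon atom (illustrative)
T = 300.0               # K

p = math.sqrt(3 * m * k_B * T)   # typical thermal momentum

# Suppose the reversing measurement localizes the atom to dx = 1 nm.
dx = 1e-9
dp_min = hbar / (2 * dx)         # Heisenberg lower bound on momentum spread

# Minimum angular error in the "reversed" direction of motion
dtheta = dp_min / p

# After travelling one mean free path (assumed ~100 nm, dense-gas scale),
# the transverse miss distance relative to the original path:
mfp = 1e-7
miss = mfp * dtheta

print(f"minimum momentum spread = {dp_min:.3e} kg m/s")
print(f"angular error           = {dtheta:.3e} rad")
print(f"miss after one mean free path = {miss:.3e} m")
```

Even with nanometer localization, the reversed atom misses its original path by roughly an atomic diameter after a single mean free path, so the original collision cannot be exactly retraced.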
But on the return path, there is only a finite probability that a "sum over histories" calculation will produce the same (or exactly reversed) quantum transitions between vibrational and rotational states that occurred in the first collision. Thus a quantum description of a two-particle collision establishes the microscopic irreversibility that Boltzmann sometimes described as his assumption of "molecular disorder." In his second (1877) derivation of the H-theorem, Boltzmann used a statistical approach and the molecular disorder assumption to get away from the time-reversibility assumptions of classical dynamics.

We must develop a deep insight into Einstein's asymmetry between light and matter, one that was appreciated as early as the 1880s by Max Planck's great mentor Gustav Kirchhoff, but was not understood in quantum mechanical terms until Einstein's understanding of nonlocality and the relation between waves and particles in 1909. It is still ignored in quantum statistical mechanics by those who mistakenly think that the time reversible Schrödinger equation means microscopic interactions are reversible.

Maxwell and Boltzmann had shown that collisions between material particles, analyzed statistically, cause the distribution of positions and velocities to approach their equilibrium Maxwell-Boltzmann distribution. A bit later, Kirchhoff and Planck knew that an extreme non-equilibrium distribution of radiation, for example a monochromatic radiation field, will remain out of equilibrium indefinitely. But if that radiation interacts with even the tiniest amount of matter (a speck of carbon black was their example), all the wavelengths of the spectrum - the Kirchhoff law - soon appear. So we can say that the approach to equilibrium of a radiation field has the same origin of irreversibility as that of matter. Radiation without matter cannot equilibrate. Photons do not interact, except at the extremely high energies where they can convert to matter and anti-matter.
Our new insight is that matter without radiation also cannot equilibrate in a way that escapes the reversibility and recurrence objections, which are still rehearsed in every textbook and review article on statistical mechanics to this day. It is thus the irreversible interaction of the two, light and matter, photons and electrons, that lies behind the increase of entropy in the universe. The second law of thermodynamics could not require the increase of entropy without the microscopic irreversibility that we have demonstrated. Microscopic irreversibility not only explains the second law, it validates Boltzmann's brilliant assumption of "molecular disorder" to justify his statistical arguments.

Zermelo's paradox was a later criticism of Ludwig Boltzmann's attempt to derive the increasing entropy required by the second law of thermodynamics. It also involves time. Assuming infinite available time, a finite universe with fixed matter, energy, and information will at some point return to any given earlier state. We now know that even a finite part of the universe cannot return to exactly the same state, because the surrounding universe will have aged and be in a different information state. This is the information philosophy solution to the problem of eternal recurrence, as seen by Arthur Stanley Eddington and H. Dieter Zeh.
References
The Origin of Irreversibility (pdf)
Microscopic Irreversibility, chapter 25 of Great Problems of Philosophy and Physics Solved?
Resources
Macroscopic irreversibility and microscopic paradox: A Constructal law analysis of atoms as open systems
Microscopic Reversibility
Equipartition Theorem
Path Integral Formulation
Friedman Equations