The Physics of Information Creation
Information physics shows that the core creative process in the universe involves quantum mechanics and thermodynamics.

To understand information creation, information physics provides new insights into the puzzling "problem of measurement" and the mysterious "collapse of the wave function" in quantum mechanics. It results in a new information interpretation of quantum mechanics that disentangles the Einstein-Podolsky-Rosen paradox and explains the origins of information structures in the universe.

Information physics also probes deeply into the second law of thermodynamics to establish the irreversible increase of entropy on a quantum mechanical basis, something that could not be shown by classical statistical mechanics or even quantum statistical physics.

Although "Information physics" is a new "interpretation" of quantum mechanics, it is not an attempt to alter the standard quantum mechanics, for example, extending it to theories such as "hidden variables" to restore determinism or adding terms to the Schrödinger equation to force a collapse. Information physics investigates the quantum mechanical and thermodynamic implications of cosmic information structures, especially those that were created before the existence of human observers. It shows that no "conscious observers" are required as with the Copenhagen Interpretation or the work of John von Neumann or Eugene Wigner.

Information physics proposes to show that everything created since the origin of the universe over thirteen billion years ago has involved just two fundamental physical processes that combine to form the core of all creative processes. These two steps occur whenever even a single bit of new information is created and comes into the universe.

  • Step 1: A quantum process - the "collapse of a wave function."

    The formation of even a single bit of information that did not previously exist requires the equivalent of a "measurement." This "measurement" does not involve a "measurer," an experimenter or observer. It happens when the probabilistic wave function that describes the possible outcomes of a measurement "collapses" and a particle of matter or energy is actually found somewhere.

  • Step 2: A thermodynamic process - local reduction, but cosmic increase, in the entropy.

    The second law of thermodynamics requires that the overall cosmic entropy always increases. When new information is created locally in step 1, some energy (with positive entropy greater than the negative entropy of the new information) must be transferred away from the location of the new bits; otherwise they will be destroyed when local thermodynamic equilibrium is restored. This can only happen in a locality where flows of matter and energy with low entropy are passing through, keeping it far from equilibrium.

This two-step core creative process underlies the formation of microscopic objects like atoms and molecules, as well as macroscopic objects like galaxies, stars, and planets.

With the emergence of teleonomic (purposive) information in self-replicating systems, the same core process underlies all biological creation. But now some random changes in information structures are rejected by natural selection, while others reproduce successfully.

Finally, with the emergence of self-aware organisms and the creation of extra-biological information stored in the environment, the same information-generating core process underlies communication, consciousness, free will, and creativity.

The two physical theories behind the creative process, quantum mechanics and thermodynamics, are somewhat daunting subjects for philosophers, and even for many scientists.

Quantum Mechanics
In classical mechanics, the material universe is thought to be made up of tiny particles whose motions are completely determined by forces that act between the particles, forces such as gravitation, electrical attractions and repulsions, etc.

The equations that describe those motions, Newton's laws of motion, were for many centuries thought to be perfect and sufficient to predict the future of any mechanical system. They provided support for many philosophical ideas about determinism.

In classical electrodynamics, electromagnetic radiation (light, radio) was known to have wave properties such as interference. When the crest of one wave meets the trough of another, the two waves cancel one another.

In quantum mechanics, radiation is found to have some particle-like behavior. Energy comes in discrete physically localized packages. Max Planck in 1900 made the famous assumption that the energy was proportional to the frequency of radiation ν.

E = hν

For Planck, this assumption was just a heuristic mathematical device that allowed him to apply Ludwig Boltzmann's work on the statistical mechanics and kinetic theory of gases. Boltzmann had shown in the 1870's that the increase in entropy (the second law) could be explained if gases were made up of enormous numbers of particles.

Planck applied Boltzmann's statistics of many particles to radiation and derived the distribution of radiation at different frequencies (or wavelengths) just as James Clerk Maxwell and Boltzmann had derived the distribution of velocities (or energies) of the gas particles.

Note the mathematical similarity of Planck's radiation distribution law (photons) and the Maxwell-Boltzmann velocity distribution (molecules). Both curves have a power law increase on one side to a maximum and an exponential decrease on the other side of the maximum. The molecular velocity curves cross one another because the total number of molecules is the same. With increasing temperature T, the number of photons increases at all wavelengths.
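To make the comparison concrete, here is a minimal numerical sketch of the two distributions just described. The temperatures (a sun-like 5800 K for the radiation, 300 K nitrogen for the gas) and the grids are our illustrative choices, not values from the text; both curves show the power-law rise, peak, and exponential fall-off.

```python
# A sketch comparing the shapes of Planck's radiation law and the
# Maxwell-Boltzmann speed distribution. Parameters are illustrative.
import numpy as np

h = 6.626e-34   # Planck's constant (J s)
c = 2.998e8     # speed of light (m/s)
k = 1.381e-23   # Boltzmann's constant (J/K)

def planck(wavelength, T):
    """Spectral radiance B(lambda, T): power-law rise, exponential tail."""
    a = 2.0 * h * c**2 / wavelength**5
    return a / (np.exp(h * c / (wavelength * k * T)) - 1.0)

def maxwell_boltzmann(v, T, m):
    """Probability density of molecular speed v for particles of mass m."""
    a = (m / (2.0 * np.pi * k * T)) ** 1.5
    return 4.0 * np.pi * a * v**2 * np.exp(-m * v**2 / (2.0 * k * T))

wavelengths = np.linspace(100e-9, 3000e-9, 500)   # 100 nm .. 3 um
speeds = np.linspace(0.0, 2000.0, 500)            # m/s
m_N2 = 4.65e-26                                   # mass of an N2 molecule (kg)

B = planck(wavelengths, T=5800.0)                 # roughly solar temperature
f = maxwell_boltzmann(speeds, T=300.0, m=m_N2)    # room-temperature nitrogen
print("Planck peak near %.0f nm" % (wavelengths[np.argmax(B)] * 1e9))   # ~500 nm
print("Most probable N2 speed %.0f m/s" % speeds[np.argmax(f)])         # ~420 m/s
```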

But Planck did not actually believe that radiation came in discrete particles, at least until a dozen years later. In the meantime, Albert Einstein's 1905 paper on the photoelectric effect hypothesized that light comes in discrete particles, subsequently called "photons," analogous to electrons.

Planck was not happy about the idea of light particles, because his use of Boltzmann's statistics implied that chance was real. Boltzmann himself had qualms about the reality of chance. Although Einstein also did not like the idea of chancy statistics, he did believe that energy came in packages of discrete "quanta." It was Einstein, not Planck, who quantized mechanics and electrodynamics. Nevertheless, it was for the introduction of the quantum of action h that Planck was awarded the Nobel Prize in 1918.

Louis de Broglie argued that if photons, with their known wavelike properties, could be described as particles, then electrons as particles might show wavelike properties, with a wavelength λ inversely proportional to their momentum p = mv.

λ = h/p
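As a small worked example of this relation (our illustration, not from the text), consider an electron accelerated through 100 volts, roughly the conditions of the electron-diffraction experiments that confirmed de Broglie:

```python
# de Broglie wavelength lambda = h / p for a 100 eV electron.
h = 6.626e-34     # Planck's constant (J s)
m_e = 9.109e-31   # electron mass (kg)
e = 1.602e-19     # elementary charge (C)

E = 100.0 * e                  # kinetic energy after 100 V (J)
p = (2.0 * m_e * E) ** 0.5     # non-relativistic momentum p = mv
lam = h / p                    # de Broglie relation
print("lambda = %.3g m" % lam) # ~1.2e-10 m, comparable to atomic spacings
```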

Experiments confirmed de Broglie's assumption and led Erwin Schrödinger to derive a "wave equation" to describe the motion of de Broglie's waves. Schrödinger's equation replaces the classical Newton equations of motion.

Note that Schrödinger's equation describes the motion of only the wave aspect, not the particle aspect, and as such it implies interference. Note also that it is as fully deterministic an equation of motion as Newton's equations.

Schrödinger attempted to interpret his "wave function" for the electron as a probability density for electrical charge, but charge density would be positive everywhere and unable to interfere with itself.

Max Born shocked the world of physics by suggesting that the absolute square of the wave function ψ (|ψ|²) could be interpreted as the probability of finding the electron in various position and momentum states - if a measurement is made. This allows the probability amplitude ψ to interfere with itself, producing highly non-intuitive phenomena such as the two-slit experiment.

Despite the probability amplitude going through both slits and interfering with itself, experimenters never find parts of electrons. Electrons are always found whole.
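A toy calculation shows how the Born rule produces interference: the amplitudes from the two slits add before the squared modulus is taken. The slit separation, wavelength, and screen geometry below are assumed numbers for illustration only.

```python
# Born rule in a two-slit setup: amplitudes add, then |.|^2 gives probabilities.
import numpy as np

x = np.linspace(-10e-3, 10e-3, 9)   # positions on the screen (m)
lam = 500e-9                        # wavelength (m)
d = 50e-6                           # slit separation (m)
L = 1.0                             # distance to the screen (m)

phase = np.pi * d * x / (lam * L)   # half the path-difference phase
psi1 = np.exp(+1j * phase)          # amplitude via slit 1
psi2 = np.exp(-1j * phase)          # amplitude via slit 2

p_quantum = np.abs(psi1 + psi2) ** 2                 # 4 cos^2(phase): fringes
p_classical = np.abs(psi1) ** 2 + np.abs(psi2) ** 2  # constant 2: no fringes
print(np.round(p_quantum, 2))   # alternating bright and dark bands
print(p_classical)              # what adding probabilities would predict
```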

In 1932 John von Neumann explained that two fundamentally different processes are going on in quantum mechanics.

  1. A non-causal process, in which the measured electron winds up randomly in one of the possible physical states (eigenstates) of the measuring apparatus plus electron.

    The probability for each eigenstate is given by the squared modulus |cₙ|² of the coefficients cₙ of the expansion of the original system state (wave function ψ) in an infinite set of wave functions φₙ that represent the eigenfunctions of the measuring apparatus plus electron.

    cₙ = < φₙ | ψ >

    This is as close as we get to a description of the motion of the particle aspect of a quantum system. According to von Neumann, the particle simply shows up somewhere as a result of a measurement. Information physics says it shows up whenever a new stable information structure is created.

  2. A causal process, in which the electron wave function ψ evolves deterministically according to Schrödinger's equation of motion for the wavelike aspect. This evolution describes the motion of the probability amplitude wave ψ between measurements.

    (ih/2π) ∂ψ/∂t = Hψ

Von Neumann claimed there is another major difference between these two processes. Process 1 is thermodynamically irreversible. Process 2 is reversible. This confirms the fundamental connection between quantum mechanics and thermodynamics that is explainable by information physics.

Information physics establishes that process 1 may create information. Process 2 is information preserving.
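The contrast between the two processes can be sketched numerically. In this minimal illustration (the two-state system, rotation matrix, and random seed are our choices), process 2 is an invertible unitary evolution, while process 1 picks an eigenstate with probability |cₙ|² and discards the phases, so the superposition cannot be recovered.

```python
# von Neumann's two processes in miniature.
import numpy as np

rng = np.random.default_rng(0)
psi = np.array([0.6, 0.8j])          # a normalized two-state system

# Process 2 (causal): unitary evolution, fully reversible with U dagger.
theta = 0.7
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
assert np.allclose(U.conj().T @ (U @ psi), psi)   # no information lost

# Process 1 (non-causal): measurement in the standard basis.
probs = np.abs(psi) ** 2                          # Born rule, sums to 1
n = rng.choice(2, p=probs)                        # irreducibly random outcome
psi_after = np.zeros(2, complex)
psi_after[n] = 1.0                                # collapsed state; phases gone
print("outcome", n, "with probability", probs[n])
```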

Collapse of the Wave Function
Physicists calculate the deterministic evolution of the Schrödinger wave function in time as systems interact or collide. At some point, they make the ad hoc assumption that the wave function "collapses." This produces a set of probabilities of finding the resulting combined system in its various eigenstates.
Although the collapse appears to be a random and ad hoc addition to the deterministic formalism of the Schrödinger equation, it is very important to note that the experimental accuracy of quantum mechanical predictions is unparalleled in physics, providing the ultimate justification for this theoretical kluge.

Moreover, without wave functions collapsing, no new information can come into the universe. Nothing unpredictable would ever emerge. Determinism is "information-preserving." All the information we have today would have to have already existed in the original fireball at the universe's origin.

The "Problem" of Measurement
Quantum measurement (the irreducibly random process of wave function collapse) is not a part of the mathematical formalism of wave function time evolution (the Schrödinger equation of motion is a perfectly deterministic process). The hypothesized collapse is an ad hoc heuristic description and method of calculation that predicts the probabilities of what will happen when an observer makes a measurement.

In many standard discussions of quantum mechanics, and most popular treatments, it is said that we need the consciousness of a physicist to collapse the wave function. Eugene Wigner and John Wheeler sometimes describe the observer as making up the "mind of the universe." John Bell sardonically asked whether the observer needs a Ph.D.

Von Neumann contributed a lot to this confusion by claiming that the location of a "cut" (Schnitt) between the microscopic system and the macroscopic measurement system could be anywhere - including inside an observer's brain. Information physics locates the cut outside the brain.

Measurement requires the interaction of something macroscopic, assumed to be large and adequately determined. In physics experiments, this is the observing apparatus. But in general, measurement does not require a conscious observer. It does require information creation or there will be nothing to observe.

In our discussion of Schrödinger's Cat, the cat can be its own observer.

The Boundary between the Classical and Quantum Worlds
Some scientists (Werner Heisenberg, John von Neumann, Eugene Wigner and John Bell, for example) have argued that in the absence of a conscious observer, or some "cut" between the microscopic and macroscopic world, the evolution of the quantum system ψ and the macroscopic measuring apparatus A would be described deterministically by Schrödinger's equation of motion for the wave function
| ψ + A > with the Hamiltonian H energy operator,

(ih/2π) ∂/∂t | ψ + A > = H | ψ + A >.

Our quantum mechanical analysis of the measurement apparatus in the above case allows us to locate the "cut" or "Schnitt" between the microscopic and macroscopic world at those components of the "adequately classical and deterministic" apparatus that put the apparatus in an irreversible stable state providing new information to the observer.

John Bell drew a diagram to show the various possible locations for what he called the "shifty split." Information physics shows us that the correct location for the boundary is the first of Bell's possibilities.

Thermodynamics
The second law of thermodynamics says that the entropy (or disorder) of a closed physical system increases until it reaches a maximum, the state of thermodynamic equilibrium. It requires that the entropy of the universe is now and has always been increasing. (The first law is that energy is conserved.)
This established fact of increasing entropy has led many scientists and philosophers to assume that the universe we have is running down. They think that means the universe began in a very high state of information, since the second law requires that any organization or order is susceptible to decay. The information that remains today, in their view, has always been here. This fits nicely with the idea of a deterministic universe. There is nothing new under the sun. Physical determinism is "information-preserving."
But the universe is not a closed system. It is in a dynamic state of expansion that is moving away from thermodynamic equilibrium faster than entropic processes can keep up. The maximum possible entropy is increasing much faster than the actual increase in entropy. The difference between the maximum possible entropy and the actual entropy is potential information.

Creation of information structures means that in parts of the universe the local entropy is actually going down. Reduction of entropy locally is always accompanied by radiation of entropy away from the local structures to distant parts of the universe, into the night sky for example. Since the total entropy in the universe always increases, the amount of entropy radiated away always exceeds (often by many times) the local reduction in entropy, which mathematically equals the increase in information.
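The entropy bookkeeping in this and the preceding paragraph can be made concrete with a toy calculation. All numbers below are made up for illustration; only the inequalities matter.

```python
# Entropy bookkeeping in bits, with made-up illustrative numbers.
# Expansion raises the maximum possible entropy faster than the actual
# entropy grows, so potential information (the gap) increases, even
# though creating local information radiates extra entropy away.
s_max_before, s_actual_before = 100.0, 90.0   # assumed values, in bits
s_max_after = 110.0                           # expansion raises the ceiling

delta_local_info = 2.0   # bits of new local structure (local entropy -2)
delta_radiated   = 5.0   # entropy carried away, e.g. into the night sky
s_actual_after = s_actual_before - delta_local_info + delta_radiated

assert s_actual_after > s_actual_before       # second law: total entropy up
print("potential information before:", s_max_before - s_actual_before, "bits")
print("potential information after :", s_max_after - s_actual_after, "bits")
```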

"Ergodic" Processes

We will describe processes that create information structures, reducing the entropy locally, as "ergodic."

This is a new use for a term from statistical mechanics that describes a hypothetical property of classical mechanical gases. See the Ergodic Hypothesis.

Ergodic processes (in our new sense of the word) are those that appear to resist the second law of thermodynamics because of a local increase in information or "negative entropy" (Erwin Schrödinger's term). But any local decrease in entropy is more than compensated for by increases elsewhere, satisfying the second law. Normal entropy-increasing processes we will call "entropic".

Encoding new information requires the equivalent of a quantum measurement - each new bit of information produces a local decrease in entropy but requires that at least one bit (generally much much more) of entropy be radiated or conducted away.
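The minimum entropy cost of one bit is k ln 2, which is Landauer's bound when converted to heat at temperature T. A quick worked conversion (our illustration):

```python
# Thermodynamic entropy of one bit, and the minimum heat that must be
# carried away when one bit is recorded at temperature T.
import math

k = 1.381e-23                        # Boltzmann's constant (J/K)
T = 300.0                            # room temperature (K)

entropy_per_bit = k * math.log(2)    # ~9.57e-24 J/K
min_heat = entropy_per_bit * T       # ~2.87e-21 J per bit at 300 K
print("S per bit = %.3g J/K, Q_min = %.3g J" % (entropy_per_bit, min_heat))
```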

Without violating the inviolable second law of thermodynamics overall, ergodic processes reduce the entropy locally, producing those pockets of cosmos and negative entropy (order and information-rich structures) that are the principal objects in the universe and in life on earth.

Entropy and Classical Mechanics
Ludwig Boltzmann attempted in the 1870's to prove Rudolf Clausius' second law of thermodynamics, namely that the entropy of a closed system always increases to a maximum and then remains in thermal equilibrium. Clausius predicted that the universe would end with a "heat death" because of the second law.

Boltzmann formulated a mathematical quantity H for a system of n ideal gas particles, showing that it had the property dH/dt ≤ 0, that H always decreased with time. He identified his H as the negative of Rudolf Clausius' entropy S.

In 1850 Clausius had formulated the second law of thermodynamics. In 1857 he showed that for a typical gas like air at standard temperatures and pressures, the gas particles spend most of their time traveling in straight lines between collisions with the wall of a containing vessel or with other gas particles. He defined the "mean free path" of a particle between collisions. Clausius and essentially all physicists since have assumed that gas particles can be treated as structureless "billiard balls" undergoing "elastic" collisions. Elastic means no motion energy is lost to internal friction.

Shortly after Clausius first defined the entropy mathematically and named it in 1865, James Clerk Maxwell determined the distribution of velocities of gas particles (Clausius for simplicity had assumed that all particles moved at the average speed, with kinetic energy (1/2)mv² = (3/2)kT).

Maxwell's derivation was very simple. He assumed the velocities in the x, y, and z directions were independent.
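Maxwell's independence assumption can be checked numerically: draw each velocity component from the same Gaussian, and the resulting speeds follow the Maxwell-Boltzmann distribution. The gas (nitrogen at 300 K) and sample size are our illustrative choices.

```python
# Maxwell's assumption in code: independent Gaussian components in x, y, z
# give the Maxwell-Boltzmann speed distribution.
import numpy as np

k, T, m = 1.381e-23, 300.0, 4.65e-26   # N2 molecule at room temperature
sigma = np.sqrt(k * T / m)             # per-component velocity spread

rng = np.random.default_rng(1)
v = rng.normal(0.0, sigma, size=(1_000_000, 3))   # independent components
speed = np.linalg.norm(v, axis=1)

print("mean speed %.0f m/s" % speed.mean())                # sqrt(8kT/pi m) ~ 476
print("rms speed  %.0f m/s" % np.sqrt((speed**2).mean()))  # sqrt(3kT/m)   ~ 517
```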

Boltzmann improved on Maxwell's statistical derivation by equating the number of particles entering a given range of velocities and positions to the number leaving the same volume in 6n-dimensional phase space. This is a necessary condition for the gas to be in equilibrium. Boltzmann then used Newtonian physics to get the same result as Maxwell, which is thus called the Maxwell-Boltzmann distribution.

Boltzmann's first derivation of his H-theorem (1872) was based on the same classical mechanical analysis he had used to derive Maxwell's distribution function. It was an analytical mathematical consequence of Newton's laws of motion applied to the particles of a gas. But it ran into immediate objections. The first objection is the hypothetical and counterfactual idea of time reversibility. If time were reversed, the entropy would simply decrease. Since the fundamental Newtonian equations of motion are time reversible, this appears to be a paradox. How could the irreversible increase of the macroscopic entropy result from microscopic physical laws that are time reversible?

Lord Kelvin (William Thomson) was the first to point out the time asymmetry in macroscopic processes, but the criticism of Boltzmann's H-theorem is associated with his lifelong friend Joseph Loschmidt. Boltzmann immediately agreed with Loschmidt that the possibility of decreasing entropy could not be ruled out if the classical motion paths were reversed.

Boltzmann then reformulated his H-theorem (1877). He analyzed a gas into "microstates" of the individual gas particle positions and velocities. For any "macrostate" consistent with certain macroscopic variables like volume, pressure, and temperature, there could be many microstates corresponding to different locations and speeds for the individual particles.

Any individual microstate of the system was intrinsically as probable as any other specific microstate, he said. But the number of microstates consistent with the disorderly or uniform distribution in the equilibrium case of maximum entropy simply overwhelms the number of microstates consistent with an orderly initial distribution.
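A short counting sketch makes Boltzmann's 1877 point vivid. Place N labeled particles in two halves of a box: every specific assignment (microstate) is equally probable, but the 50/50 macrostate contains overwhelmingly more microstates than the "all in one half" macrostate. N = 100 is our illustrative choice.

```python
# Counting microstates per macrostate for N particles in two half-boxes.
from math import comb, log10

N = 100
ordered = comb(N, 0)          # all particles in the left half: 1 microstate
equilibrium = comb(N, N // 2) # half and half: ~1e29 microstates
print("ordered    :", ordered)
print("equilibrium: ~10^%.0f" % log10(equilibrium))
```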

About twenty years later, Boltzmann's revised argument that entropy statistically increased ran into another criticism, this time not so counterfactual. This is the recurrence objection. Given enough time, any system could return to its starting state, which implies that the entropy must at some point decrease. These reversibility and recurrence objections are still prominent in the physics literature.

The recurrence idea has a long intellectual history. Ancient Babylonian astronomers thought the known planets would, given enough time, return to any given position and thus begin again what they called a "great cycle," estimated by some at 36,000 years. Their belief in an astrological determinism suggested that all events in the world would also recur. Friedrich Nietzsche made this idea famous in the nineteenth century, at the same time as Boltzmann's hypothesis was being debated, as the "eternal return" in his Also Sprach Zarathustra.

The recurrence objection was first raised in the early 1890's by the French mathematician and physicist Henri Poincaré. In his study of the three-body problem he proved that the configuration of the three bodies returns arbitrarily close to its initial conditions after calculable times. Even for a handful of planets, the recurrence time is longer than the age of the universe, if the positions are specified precisely enough. Poincaré then proposed that the presumed "heat death" of the universe predicted by the second law of thermodynamics could be avoided by "a little patience." Another mathematician, Ernst Zermelo, a young colleague of Max Planck in Berlin, is more famous for this recurrence paradox.

Boltzmann accepted the recurrence criticism. He calculated the extremely small probability that entropy would decrease noticeably, even for a gas with a very small number of particles (1000). He showed the time associated with such an event was 10^(10^10) years. But the objections in principle to his work continued, especially from those who thought the atomic hypothesis was wrong.

It is very important to understand that both Maxwell's original derivation of the velocities distribution and Boltzmann's H-theorem showing an entropy increase are only statistical or probabilistic arguments. Boltzmann's work was done twenty years before atoms were established as real and fifty years before the theory of quantum mechanics established that at the microscopic level all interactions of matter and energy are fundamentally and irreducibly statistical and probabilistic.

Entropy and Quantum Mechanics
A quantum mechanical analysis of the microscopic collisions of gas particles (these are usually molecules - or atoms in a noble gas) can provide revised analyses for the two problems of reversibility and recurrence. Note this requires more than quantum statistical mechanics. It needs the quantum kinetic theory of collisions in gases.

There are great differences between Ideal, Classical, and Quantum Gases.

Boltzmann assumed that collisions would result in random distributions of velocities and positions so that all the possible configurations would be realized in proportion to their number. He called this "molecular chaos." But if the path of a system of n particles in 6n-dimensional phase space should be closed and repeat itself after a short and finite time during which the system occupies only a small fraction of the possible states, Boltzmann's assumptions would be wrong.

What is needed is for collisions to completely randomize the directions of particles after collisions, and this is just what the quantum theory of collisions can provide. Randomization of directions is the norm in some quantum phenomena, for example the absorption and re-emission of photons by atoms as well as Raman scattering of photons.

In the deterministic evolution of the Schrödinger equation, just as in the classical path evolution of the Hamiltonian equations of motion, the time can be reversed and all the coherent information in the wave function will describe a particle that goes back exactly the way it came before the collision.

But if when two particles collide the internal structure of one or both of the particles is changed, and particularly if the two particles form a temporary larger molecule (even a quasi-molecule in an unbound state), then the separating atoms or molecules lose the coherent wave functions that would be needed to allow time reversal back along the original path.

During the collision, one particle can transfer energy from one of its internal quantum states to the other particle. At room temperature, this will typically be a transition between rotational states that are populated. Another possibility is an exchange of energy with the background thermal radiation, which at room temperatures peaks at the frequencies of molecular rotational energy level differences.

Such a quantum event can be analyzed by assuming that a short-lived quasi-molecule is formed (the energy levels for such an unbound system form a continuum, so that almost any photon can cause a change of rotational state of the quasi-molecule).

A short time later, the quasi-molecule dissociates into the two original particles but in different energy states. We can describe the overall process as a quasi-measurement, because there is temporary information present about the new structure. This information is lost as the particles separate in random directions (consistent with conservation of energy, momentum, and angular momentum).

The decoherence associated with this quasi-measurement means that if the post-collision wave functions were to be time reversed, the reverse collision would be very unlikely to send the particles back along their incoming trajectories.

Boltzmann's assumption of random occupancy of possible configurations is no longer necessary. Randomness in the form of "molecular chaos" is assured by quantum mechanics.

The result is a statistical picture that shows that entropy would normally increase even if time could be reversed.
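A toy numerical sketch (our construction, with a random unitary standing in for the deterministic Schrödinger evolution) shows why decoherence spoils time reversal: evolving with U and then U† retraces the path exactly, but if a "quasi-measurement" randomizes the relative phases in between, the reversed evolution no longer returns to the initial state.

```python
# Coherent vs. decohered time reversal of a unitary evolution.
import numpy as np

rng = np.random.default_rng(2)
dim = 8
# A random unitary as a stand-in for the deterministic wave evolution.
Q, _ = np.linalg.qr(rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim)))
psi0 = np.zeros(dim, complex)
psi0[0] = 1.0

coherent = Q.conj().T @ (Q @ psi0)                # perfect retracing
scramble = np.exp(2j * np.pi * rng.random(dim))   # random phases (decoherence)
decohered = Q.conj().T @ (scramble * (Q @ psi0))  # phases lost in the collision

print("coherent overlap  %.3f" % abs(np.vdot(psi0, coherent)) ** 2)   # 1.000
print("decohered overlap %.3f" % abs(np.vdot(psi0, decohered)) ** 2)  # << 1
```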

This does not rule out the kind of departures from equilibrium that occur in small groups of particles, as in Brownian motion, fluctuations that Boltzmann anticipated long before Einstein explained them. These fluctuations can be described as forming short-lived information structures, brief and localized regions of negative entropy, that get destroyed in subsequent interactions.

Nor does it change the remote possibility of a recurrence of any particular initial microstate of the system. But it does prove that Poincaré was wrong about such a recurrence being periodic. Periodicity depends on the dynamical paths of particles being classical, deterministic, and thus time reversible. Since quantum mechanical paths are fundamentally indeterministic, recurrences are simply statistically improbable departures from equilibrium, like the fluctuations that cause Brownian motion.

Entropy is Lost Information
Entropy increase can be easily understood as the loss of information as a system moves from an initially ordered state to a final disordered state. Although the physical dimensions of thermodynamic entropy (joules per kelvin) are not the same as those of (dimensionless) mathematical information, apart from units they share the same famous formula.
S = −∑ pᵢ ln pᵢ
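The formula in code (a small sketch, using log base 2 so the answer comes out in bits, and the 64-square room of the example that follows):

```python
# Entropy in bits: uniform over 64 squares vs. known to be in one square.
import numpy as np

def entropy_bits(p):
    """S = sum p_i log2(1/p_i), ignoring zero-probability squares."""
    p = p[p > 0]
    return float((p * np.log2(1.0 / p)).sum())

uniform = np.full(64, 1 / 64)            # perfume spread over the whole room
known = np.zeros(64)
known[0] = 1.0                           # perfume known to be in one square
print(entropy_bits(uniform))             # 6.0 bits of missing information
print(entropy_bits(known))               # 0.0 bits: full information
```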
To see this very simply, let's consider the well-known example of a bottle of perfume in the corner of a room. We can represent the room as a grid of 64 squares, filled with air molecules moving randomly at room temperature. In the lower left square sits the bottle of perfume, whose molecules are released when we open it.

What is the quantity of information we have about the perfume molecules? We know their location: the lower left square, 1/64th of the container. The quantity of information is determined by the minimum number of yes/no questions it takes to locate them. The best questions are those that split the remaining locations evenly (a binary tree).

For example:

  • Are they in the upper half of the container? No.
  • Are they in the left half of the container? Yes.
  • Are they in the upper half of the lower left quadrant? No.
  • Are they in the left half of the lower left quadrant? Yes.
  • Are they in the upper half of the lower left octant? No.
  • Are they in the left half of the lower left octant? Yes.
Answers to these six optimized questions give us six bits of information for each molecule, locating it to 1/64th of the container. This is the amount of information that will be lost for each molecule if it is allowed to escape and diffuse fully into the room. The corresponding thermodynamic entropy increase is Boltzmann's constant k multiplied by ln 2 for each bit.

If the room had no air, the perfume would rapidly reach an equilibrium state, since the molecular velocity at room temperature is about 400 meters per second. Collisions with air molecules prevent the perfume from dissipating quickly, which lets us see the approach to equilibrium. When the perfume has diffused to one-sixteenth of the room, the entropy will have risen 2 bits for each molecule; at one-quarter of the room, 4 bits; and so on.
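The same arithmetic in code (a sketch of the example above): as the molecules spread from 1/64th of the room to the whole room, the missing information grows from 0 to 6 bits per molecule, and the thermodynamic entropy rises by k ln 2 per bit per molecule.

```python
# Information lost and entropy gained as the perfume diffuses.
import math

k = 1.381e-23                          # Boltzmann's constant (J/K)
for fraction in (1/64, 1/16, 1/4, 1):  # fraction of the room occupied
    bits_lost = math.log2(64) - math.log2(1 / fraction)  # per molecule
    dS = k * math.log(2) * bits_lost                     # entropy per molecule
    print("occupies 1/%-3d lost %.0f bits  dS = %.2e J/K"
          % (1 / fraction, bits_lost, dS))
```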
