The Physics of Information Creation
Information physics shows that the core creative process in the universe involves quantum mechanics and thermodynamics. It provides new insights into the puzzling "problem of measurement" and the mysterious "collapse of the wave function" in quantum mechanics, resulting in a new information interpretation of quantum mechanics that disentangles the Einstein-Podolsky-Rosen paradox and explains the origins of information structures in the universe. Information physics also probes deeply into the second law of thermodynamics to establish the irreversible increase of entropy on a quantum mechanical basis, something that could not be shown by classical statistical mechanics or even quantum statistical physics.

Although information physics is a new "interpretation" of quantum mechanics, it is not an attempt to alter standard quantum mechanics, for example by extending it to "hidden variables" theories to restore determinism, or by adding terms to the Schrödinger equation to force a collapse. Information physics investigates the quantum mechanical and thermodynamic implications of cosmic information structures, especially those that were created before the existence of human observers. It shows that no "conscious observers" are required, as they are in the Copenhagen Interpretation or in the work of John von Neumann and Eugene Wigner.

Information physics proposes to show that everything created since the origin of the universe over thirteen billion years ago has involved just two fundamental physical processes that combine to form the core of all creative processes. These two steps occur whenever even a single bit of new information comes into the universe.
In classical mechanics, the material universe is thought to be made up of tiny particles whose motions are completely determined by forces that act between the particles, forces such as gravitation, electrical attractions and repulsions, etc. The equations that describe those motions, Newton's laws of motion, were for many centuries thought to be perfect and sufficient to predict the future of any mechanical system. They provided support for many philosophical ideas about determinism. In classical electrodynamics, electromagnetic radiation (light, radio) was known to have wave properties such as interference. When the crest of one wave meets the trough of another, the two waves cancel one another. In quantum mechanics, radiation is found to have some particle-like behavior. Energy comes in discrete physically localized packages. Max Planck in 1900 made the famous assumption that the energy was proportional to the frequency of radiation ν.
E = hν

For Planck, this assumption was just a heuristic mathematical device that allowed him to apply Ludwig Boltzmann's work on the statistical mechanics and kinetic theory of gases. Boltzmann had shown in the 1870's that the increase in entropy (the second law) could be explained if gases were made up of enormous numbers of particles. Planck applied Boltzmann's statistics of many particles to radiation and derived the distribution of radiation at different frequencies (or wavelengths) just as James Clerk Maxwell and Boltzmann had derived the distribution of velocities (or energies) of the gas particles. Note the mathematical similarity of Planck's radiation distribution law (photons) and the Maxwell-Boltzmann velocity distribution (molecules). Both curves have a power law increase on one side to a maximum and an exponential decrease on the other side of the maximum. The molecular velocity curves cross one another because the total number of molecules is the same. With increasing temperature T, the number of photons increases at all wavelengths.
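Planck's law can be checked numerically. The sketch below (my own illustration, not from the text; function names are assumptions) evaluates the spectral radiance and confirms the statement above: at a higher temperature the radiance is greater at every frequency, unlike the crossing molecular-speed curves.

```python
import math

h = 6.62607015e-34  # Planck constant (J·s)
c = 2.99792458e8    # speed of light (m/s)
k = 1.380649e-23    # Boltzmann constant (J/K)

def planck_radiance(nu, T):
    """Planck's law: spectral radiance B(nu, T) of black-body radiation,
    in W per m^2 per steradian per Hz."""
    return (2.0 * h * nu**3 / c**2) / math.expm1(h * nu / (k * T))

# With increasing temperature T, the radiance (and photon number)
# increases at every frequency sampled.
freqs = [1e13 * n for n in range(1, 200)]
hotter_everywhere = all(planck_radiance(f, 6000.0) > planck_radiance(f, 3000.0)
                        for f in freqs)
print(hotter_everywhere)  # True
```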
p = h/λ

Experiments confirmed de Broglie's assumption and led Erwin Schrödinger to derive a "wave equation" to describe the motion of de Broglie's waves. Schrödinger's equation replaces the classical Newton equations of motion. Note that Schrödinger's equation describes the motion of only the wave aspect, not the particle aspect, and as such it implies interference. Note also that it is as fully deterministic an equation of motion as Newton's equations. Schrödinger attempted to interpret his "wave function" for the electron as a probability density for electrical charge, but charge density would be positive everywhere and unable to interfere with itself. Max Born shocked the world of physics by suggesting that the absolute value of the wave function ψ squared (|ψ|²) could be interpreted as the probability of finding the electron in various position and momentum states - if a measurement is made. This allows the probability amplitude ψ to interfere with itself, producing highly non-intuitive phenomena such as the two-slit experiment. Despite the probability amplitude going through two slits and interfering with itself, experimenters never find parts of electrons. They always are found whole. In 1932 John von Neumann explained that two fundamentally different processes are going on in quantum mechanics.
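Born's rule for interfering amplitudes can be made concrete with a minimal two-slit calculation. This is a sketch under assumed slit geometry (the separation, screen distance, and wavelength are made-up illustrative values): each slit contributes a unit complex amplitude, and the probability density is the squared magnitude of their sum.

```python
import cmath

def two_slit_intensity(x, slit_sep=1e-5, screen_dist=1.0, wavelength=5e-7):
    """Born rule with interfering amplitudes: each slit contributes a
    complex amplitude exp(i*k*r); probability density is |psi1 + psi2|^2."""
    k = 2.0 * cmath.pi / wavelength
    r1 = ((x - slit_sep / 2) ** 2 + screen_dist ** 2) ** 0.5
    r2 = ((x + slit_sep / 2) ** 2 + screen_dist ** 2) ** 0.5
    psi = cmath.exp(1j * k * r1) + cmath.exp(1j * k * r2)
    return abs(psi) ** 2

# Center of the screen: the two amplitudes add in phase (intensity 4, not 2).
center = two_slit_intensity(0.0)
# Where the path difference is half a wavelength (x = wavelength*L/(2*d)),
# the amplitudes nearly cancel.
first_minimum = two_slit_intensity(0.025)
print(center, first_minimum)
```

The interference minimum exists only because amplitudes, not probabilities, are added: summing |ψ₁|² + |ψ₂|² would give 2 everywhere.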
Collapse of the Wave Function
Physicists calculate the deterministic evolution of the Schrödinger wave function in time as systems interact or collide. At some point, they make the ad hoc assumption that the wave function "collapses." This produces a set of probabilities of finding the resulting combined system in its various eigenstates.
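The probabilistic character of the collapse can be mimicked numerically. This sketch (my own, with a hypothetical two-eigenstate superposition) repeatedly samples an eigenstate with the Born-rule weight |cᵢ|²; the observed frequencies approach the predicted probabilities.

```python
import random

def collapse(amplitudes, rng):
    """Simulate one measurement: select eigenstate i with probability |c_i|^2."""
    probs = [abs(c) ** 2 for c in amplitudes]
    return rng.choices(range(len(amplitudes)), weights=probs)[0]

# Hypothetical normalized superposition: c0 = 0.6, c1 = 0.8i.
amps = [0.6, 0.8j]
rng = random.Random(42)
outcomes = [collapse(amps, rng) for _ in range(100_000)]
freq = outcomes.count(1) / len(outcomes)
print(round(freq, 2))  # the statistics approach |c1|^2 = 0.64
```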
Although the collapse appears to be a random and ad hoc addition to the deterministic formalism of the Schrödinger equation, it is very important to note that the experimental accuracy of quantum mechanical predictions is unparalleled in physics, providing the ultimate justification for this theoretical kluge. Moreover, without wave functions collapsing, no new information can come into the universe. Nothing unpredictable would ever emerge. Determinism is "information-preserving." All the information we have today would have to have already existed in the original fireball at the origin of the universe.
The "Problem" of Measurement
Quantum measurement (the irreducibly random process of wave function collapse) is not a part of the mathematical formalism of wave function time evolution (the Schrödinger equation of motion is a perfectly deterministic process). The hypothesized collapse is an ad hoc heuristic description and method of calculation that predicts the probabilities of what will happen when an observer makes a measurement. In many standard discussions of quantum mechanics, and most popular treatments, it is said that we need the consciousness of a physicist to collapse the wave function. Eugene Wigner and John Wheeler sometimes describe the observer as making up the "mind of the universe." John Bell sardonically asked whether the observer needs a Ph.D. Von Neumann contributed a lot to this confusion by claiming that the location of a "cut" (Schnitt) between the microscopic system and macroscopic measurement system could be anywhere - including inside an observer's brain. Information physics will locate the cut (outside the brain). Measurement requires the interaction of something macroscopic, assumed to be large and adequately determined. In physics experiments, this is the observing apparatus. But in general, measurement does not require a conscious observer. It does require information creation or there will be nothing to observe. In our discussion of Schrödinger's Cat, the cat can be its own observer.
The Boundary between the Classical and Quantum Worlds

Some scientists (Werner Heisenberg, John von Neumann, Eugene Wigner and John Bell, for example) have argued that in the absence of a conscious observer, or some "cut" between the microscopic and macroscopic world, the evolution of the quantum system ψ and the macroscopic measuring apparatus A would be described deterministically by Schrödinger's equation of motion for the wave function
iℏ ∂/∂t | ψ + A > = H | ψ + A >, with the Hamiltonian energy operator H.
Our quantum mechanical analysis of the measurement apparatus in the above case allows us to locate the "cut" or "Schnitt" between the microscopic and macroscopic world at those components of the "adequately classical and deterministic" apparatus that put the apparatus in an irreversible stable state providing new information to the observer. John Bell drew a diagram to show the various possible locations for what he called the "shifty split." Information physics shows us that the correct location for the boundary is the first of Bell's possibilities.
The second law of thermodynamics says that the entropy (or disorder) of a closed physical system increases until it reaches a maximum, the state of thermodynamic equilibrium. It requires that the entropy of the universe is now and has always been increasing. (The first law is that energy is conserved.)
This established fact of increasing entropy has led many scientists and philosophers to assume that the universe we have is running down. They think that means the universe began in a very high state of information, since the second law requires that any organization or order is susceptible to decay. The information that remains today, in their view, has always been here. This fits nicely with the idea of a deterministic universe. There is nothing new under the sun. Physical determinism is "information-preserving."
But the universe is not a closed system. It is in a dynamic state of expansion that is moving away from thermodynamic equilibrium faster than entropic processes can keep up. The maximum possible entropy is increasing much faster than the actual increase in entropy. The difference between the maximum possible entropy and the actual entropy is potential information.
Creation of information structures means that in parts of the universe the local entropy is actually going down. Reduction of entropy locally is always accompanied by radiation of entropy away from the local structures to distant parts of the universe, into the night sky for example. Since the total entropy in the universe always increases, the amount of entropy radiated away always exceeds (often by many times) the local reduction in entropy, which mathematically equals the increase in information.
"Ergodic" Processes

We will describe processes that create information structures, reducing the entropy locally, as "ergodic." This is a new use for a term from statistical mechanics that describes a hypothetical property of classical mechanical gases. See the Ergodic Hypothesis. Ergodic processes (in our new sense of the word) are those that appear to resist the second law of thermodynamics because of a local increase in information or "negative entropy" (Erwin Schrödinger's term). But any local decrease in entropy is more than compensated for by increases elsewhere, satisfying the second law. Normal entropy-increasing processes we will call "entropic". Encoding new information requires the equivalent of a quantum measurement - each new bit of information produces a local decrease in entropy but requires that at least one bit (generally much, much more) of entropy be radiated or conducted away. Without violating the inviolable second law of thermodynamics overall, ergodic processes reduce the entropy locally, producing those pockets of cosmos and negative entropy (order and information-rich structures) that are the principal objects in the universe and in life on earth.
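The bookkeeping of "at least one bit of entropy radiated away per bit encoded" can be quantified. A minimal sketch, assuming the standard thermodynamic equivalence of k ln 2 of entropy per bit (the constant and function name are mine):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def min_entropy_exported(bits):
    """Minimum entropy (J/K) that must be radiated or conducted away
    when a local structure encodes `bits` of new information:
    k ln 2 per bit."""
    return bits * k_B * math.log(2)

T = 300.0                      # ambient temperature in kelvin
dS = min_entropy_exported(1)   # one new bit of information
dQ = T * dS                    # minimum accompanying heat flow at T
print(dS, dQ)  # about 9.57e-24 J/K and 2.87e-21 J
```

The numbers are tiny per bit, which is why macroscopic ergodic processes can build large information structures while the entropy radiated into the night sky still dominates overwhelmingly.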
Entropy and Classical Mechanics

Ludwig Boltzmann attempted in the 1870's to prove Rudolf Clausius' second law of thermodynamics, namely that the entropy of a closed system always increases to a maximum and then remains in thermal equilibrium. Clausius predicted that the universe would end with a "heat death" because of the second law. Boltzmann formulated a mathematical quantity H for a system of n ideal gas particles, showing that it had the property dH/dt ≤ 0, that H always decreased with time. He identified his H as the opposite of Rudolf Clausius' entropy S. In 1850 Clausius had formulated the second law of thermodynamics. In 1857 he showed that for a typical gas like air at standard temperatures and pressures, the gas particles spend most of their time traveling in straight lines between collisions with the wall of a containing vessel or with other gas particles. He defined the "mean free path" of a particle between collisions. Clausius and essentially all physicists since have assumed that gas particles can be treated as structureless "billiard balls" undergoing "elastic" collisions. Elastic means no motion energy is lost to internal friction. Shortly after Clausius first defined the entropy mathematically and named it in 1865, James Clerk Maxwell determined the distribution of velocities of gas particles (Clausius for simplicity had assumed that all particles moved at the average speed given by (1/2)mv² = (3/2)kT). Maxwell's derivation was very simple. He assumed the velocities in the x, y, and z directions were independent. Boltzmann improved on Maxwell's statistical derivation by equating the number of particles entering a given range of velocities and positions to the number leaving the same volume in 6n-dimensional phase space. This is a necessary condition for the gas to be in equilibrium. Boltzmann then used Newtonian physics to get the same result as Maxwell, which is thus called the Maxwell-Boltzmann distribution.
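Maxwell's assumption of independent velocity components can be tested by simulation. In this sketch (my own illustration; the Gaussian form of each component follows from that independence assumption), sampling vx, vy, vz independently reproduces the equipartition result that the mean kinetic energy is (3/2)kT.

```python
import math
import random

k_B = 1.380649e-23  # Boltzmann constant, J/K

def sample_velocity(T, m, rng):
    """Maxwell's assumption: vx, vy, vz are independent Gaussians,
    each with variance k_B*T/m."""
    sigma = math.sqrt(k_B * T / m)
    return tuple(rng.gauss(0.0, sigma) for _ in range(3))

rng = random.Random(1)
T, m = 300.0, 4.65e-26   # roughly an N2 molecule at room temperature
samples = [sample_velocity(T, m, rng) for _ in range(200_000)]
mean_ke = sum(0.5 * m * (vx*vx + vy*vy + vz*vz)
              for vx, vy, vz in samples) / len(samples)
ratio = mean_ke / (1.5 * k_B * T)
print(round(ratio, 3))  # close to 1.0
```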
Friedrich Nietzsche made this idea famous in the nineteenth century, at the same time as Boltzmann's hypothesis was being debated, as the "eternal return" in his Also Sprach Zarathustra. The recurrence objection was first noted in the early 1890's by French mathematician and physicist Henri Poincaré. In his celebrated work on the three-body problem, he noted that the configuration of three bodies returns arbitrarily close to its initial conditions after calculable times. Even for a handful of planets, the recurrence time is longer than the age of the universe, if the positions are specified precisely enough. Poincaré then proposed that the presumed "heat death" of the universe predicted by the second law of thermodynamics could be avoided by "a little patience." Another mathematician, Ernst Zermelo, a young colleague of Max Planck in Berlin, is more famous for this recurrence paradox. Boltzmann accepted the recurrence criticism. He calculated the extremely small probability that entropy would decrease noticeably, even for a gas with a very small number of particles (1000). He showed the time associated with such an event was 10^10^10 years. But the objections in principle to his work continued, especially from those who thought the atomic hypothesis was wrong. It is very important to understand that both Maxwell's original derivation of the velocity distribution and Boltzmann's H-theorem showing an entropy increase are only statistical or probabilistic arguments. Boltzmann's work was done twenty years before atoms were established as real and fifty years before the theory of quantum mechanics established that at the microscopic level all interactions of matter and energy are fundamentally and irreducibly statistical and probabilistic.
Entropy and Quantum Mechanics

A quantum mechanical analysis of the microscopic collisions of gas particles (these are usually molecules - or atoms in a noble gas) can provide revised analyses for the two problems of reversibility and recurrence. Note this requires more than quantum statistical mechanics. It needs the quantum kinetic theory of collisions in gases. There are great differences between Ideal, Classical, and Quantum Gases. Boltzmann assumed that collisions would result in random distributions of velocities and positions so that all the possible configurations would be realized in proportion to their number. He called this "molecular chaos." But if the path of a system of n particles in 6n-dimensional phase space should be closed and repeat itself after a short and finite time during which the system occupies only a small fraction of the possible states, Boltzmann's assumptions would be wrong. What is needed is for collisions to completely randomize the directions of particles after collisions, and this is just what the quantum theory of collisions can provide. Randomization of directions is the norm in some quantum phenomena, for example the absorption and re-emission of photons by atoms as well as Raman scattering of photons. In the deterministic evolution of the Schrödinger equation, just as in the classical path evolution of the Hamiltonian equations of motion, the time can be reversed and all the coherent information in the wave function will describe a particle that goes back exactly the way it came before the collision. But if when two particles collide the internal structure of one or both of the particles is changed, and particularly if the two particles form a temporary larger molecule (even a quasi-molecule in an unbound state), then the separating atoms or molecules lose the coherent wave functions that would be needed to allow time reversal back along the original path.
During the collision, one particle can transfer energy from one of its internal quantum states to the other particle. At room temperature, this will typically be a transition between rotational states that are populated. Another possibility is an exchange of energy with the background thermal radiation, which at room temperatures peaks at the frequencies of molecular rotational energy level differences. Such a quantum event can be analyzed by assuming a short-lived quasi-molecule is formed (the energy levels for such an unbound system form a continuum, so that almost any photon can cause a change of rotational state of the quasi-molecule). A short time later, the quasi-molecule dissociates into the two original particles but in different energy states. We can describe the overall process as a quasi-measurement, because there is temporary information present about the new structure. This information is lost as the particles separate in random directions (consistent with conservation of energy, momentum, and angular momentum). The decoherence associated with this quasi-measurement means that if the post-collision wave functions were to be time reversed, the reverse collision would be very unlikely to send the particles back along their incoming trajectories. Boltzmann's assumption of random occupancy of possible configurations is no longer necessary. Randomness in the form of "molecular chaos" is assured by quantum mechanics. The result is a statistical picture that shows that entropy would normally increase even if time could be reversed. This does not rule out the kind of departures from equilibrium that occur in small groups of particles as in Brownian motion, which Boltzmann anticipated long before Einstein's explanation of Brown's observations. These fluctuations can be described as forming short-lived information structures, brief and localized regions of negative entropy, that get destroyed in subsequent interactions.
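The statistical approach to equilibrium, complete with the small fluctuations described above, can be illustrated with the classic Ehrenfest urn model (a standard toy model, not from the text): once each step randomizes which particle moves, the system drifts irreversibly toward equilibrium while still fluctuating around it.

```python
import random

def ehrenfest(n_balls=100, steps=2000, seed=0):
    """Ehrenfest urn model: two urns, all balls start in the left one.
    Each step one ball, chosen at random, jumps to the other urn."""
    rng = random.Random(seed)
    left = n_balls                 # far from equilibrium: all balls left
    history = [left]
    for _ in range(steps):
        if rng.random() < left / n_balls:
            left -= 1              # the chosen ball was in the left urn
        else:
            left += 1
        history.append(left)
    return history

h = ehrenfest()
print(h[0], h[-1])  # starts at 100, ends fluctuating around 50
```

A return to the initial all-in-one-urn state is possible in principle but fantastically improbable, mirroring the recurrence discussion above.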
Nor does it change the remote possibility of a recurrence of any particular initial microstate of the system. But it does prove that Poincaré was wrong about such a recurrence being periodic. Periodicity depends on the dynamical paths of particles being classical, deterministic, and thus time reversible. Since quantum mechanical paths are fundamentally indeterministic, recurrences are simply statistically improbable departures from equilibrium, like the fluctuations that cause Brownian motion.
Entropy is Lost Information

Entropy increase can be easily understood as the loss of information as a system moves from an initially ordered state to a final disordered state. Although the physical dimensions of thermodynamic entropy (joules/K) are not the same as (dimensionless) mathematical information, apart from units they share the same famous formula.
S = − k ∑ pi ln pi

To see this very simply, let's consider the well-known example of a bottle of perfume in the corner of a room. We can represent the room as a grid of 64 squares. Suppose the air is filled with molecules moving randomly at room temperature. In the lower left corner are the perfume molecules, which will be released when we open the bottle. What is the quantity of information we have about the perfume molecules? We know their location in the lower left square, a bit less than 1/64th of the container. The quantity of information is determined by the minimum number of yes/no questions it takes to locate them. The best questions are those that split the locations evenly (a binary tree). For example:
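A binary-tree questioning scheme might look like the sketch below (my own illustration of the counting argument, not the author's demonstration): each yes/no answer halves the remaining candidate squares, so log₂(64) = 6 questions suffice to pin down one square.

```python
import math

def locate(square, n=64):
    """Binary-tree questioning: each yes/no answer halves the set of
    candidate squares, recording the answers as bits."""
    lo, hi = 0, n
    answers = []
    while hi - lo > 1:
        mid = (lo + hi) // 2
        in_upper_half = square >= mid     # the yes/no question
        answers.append(in_upper_half)
        lo, hi = (mid, hi) if in_upper_half else (lo, mid)
    return answers, lo

answers, found = locate(37)
print(math.log2(64), len(answers), found)  # 6.0, 6 questions, square 37
```

Six answers carry exactly six bits, matching the information we had about the perfume while it was confined to one square of the 64.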