John von Neumann

In his 1932 Mathematical Foundations of Quantum Mechanics (published in German; English edition 1955), John von Neumann explained that two fundamentally different processes go on in quantum mechanics (in a temporal sequence for a given particle, not at the same time).
Process 1. A non-causal process, in which the measured electron winds up randomly in one of the possible physical states (eigenstates) of the measuring apparatus plus electron.

The probability for each eigenstate is given by the absolute square |cₙ|² of the coefficient cₙ in the expansion of the original system state (wave function ψ) in an infinite set of wave functions φₙ that represent the eigenfunctions of the measuring apparatus plus electron.

cₙ = ⟨φₙ | ψ⟩
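These coefficients and the resulting Born-rule probabilities are easy to compute numerically. The following is a minimal sketch, assuming a made-up 3-dimensional Hilbert space whose standard basis vectors stand in for the eigenfunctions φₙ:

```python
# Hypothetical sketch: expansion coefficients c_n = <phi_n | psi> and the
# Born-rule probabilities |c_n|^2. The basis and state are made-up examples.
import numpy as np

# An orthonormal eigenbasis {phi_n} of the measuring apparatus plus electron
# (here simply the standard basis of a 3-dimensional Hilbert space).
phi = np.eye(3, dtype=complex)

# A normalized system state psi, expanded in that basis.
psi = np.array([1.0, 1.0j, 1.0])
psi /= np.linalg.norm(psi)

# c_n = <phi_n | psi>; np.vdot conjugates its first argument (the bra).
c = np.array([np.vdot(phi[n], psi) for n in range(3)])

# Born rule: probability of landing in eigenstate n is |c_n|^2; they sum to 1.
probs = np.abs(c) ** 2
print(probs, probs.sum())
```

Because the example state spreads equally over the three basis states, each eigenstate here gets probability 1/3.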

This is as close as we get to a description of the motion of the particle aspect of a quantum system. According to von Neumann, the particle simply shows up somewhere as a result of a measurement.

Information physics says that the particle shows up whenever a new stable information structure is created, information that can be observed.

Process 2. A causal process, in which the electron wave function ψ evolves deterministically according to Erwin Schrödinger's equation of motion for the wavelike aspect. This evolution describes the motion of the probability amplitude wave ψ between measurements. The wave function exhibits interference effects. But interference is destroyed if the particle has a definite position or momentum. The particle path itself cannot be observed.

(ih/2π) ∂ψ/∂t = Hψ
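The deterministic character of this equation can be illustrated numerically: given ψ(0) and the Hamiltonian H, the state at any later time is completely fixed. A minimal sketch, assuming a made-up 2×2 Hermitian Hamiltonian and units in which h/2π = 1, so that ψ(t) = e^(−iHt) ψ(0):

```python
# Hypothetical sketch of process 2: deterministic, unitary evolution of psi
# under the Schrodinger equation (i h / 2 pi) dpsi/dt = H psi.
# H is a made-up Hermitian matrix; units are chosen so that h / 2 pi = 1.
import numpy as np

H = np.array([[1.0, 0.5],
              [0.5, -1.0]])                 # assumed Hermitian Hamiltonian
psi0 = np.array([1.0, 0.0], dtype=complex)  # initial state psi(0)

# Build the propagator e^{-iHt} from the eigendecomposition of H.
t = 0.7
evals, evecs = np.linalg.eigh(H)
U = evecs @ np.diag(np.exp(-1j * evals * t)) @ evecs.conj().T

psi_t = U @ psi0  # the state at time t, fully determined by psi0 and H

# Unitary evolution preserves the norm: total probability stays 1.
print(np.linalg.norm(psi_t))
```

The propagator U is unitary, which is the formal reason process 2 conserves probability (and, as the article argues, information).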

Von Neumann claimed there is another major difference between these two processes. Process 1 is thermodynamically irreversible. Process 2 is reversible. This confirms the fundamental connection between quantum mechanics and thermodynamics that is explainable by information physics and the information interpretation of quantum mechanics.

Information physics establishes that process 1 may create information. It is always involved when information is created.

Process 2 is deterministic and information preserving or conserving.

The first of these processes has come to be called the collapse of the wave function.

It gave rise to the so-called problem of measurement, because its randomness prevents it from being a part of the deterministic mathematics of process 2.

Information physics has solved the problem of measurement by identifying the moment and place of the collapse of the wave function with the creation of an observable information structure. There are interactions which create collapses but do not create stable information structures. These can never be the basis of measurements.

The presence of a conscious observer is not necessary. It is enough that the new information created is observable, should a human observer try to look at it in the future. Information physics is thus subtly involved in the question of what humans can know (epistemology).

We must quote von Neumann at length where he relates irreversibility and reversibility to the time directions "future" and "past." He wrote:

The two interventions 1. and 2. are fundamentally different from one another. That both are formally unique, i.e., causal is unimportant; indeed, since we are working in terms of the statistical properties of mixtures, it is not surprising that each change, even if it is statistical, effects a causal change of the probabilities and the expectation values. Indeed, it is precisely for this reason that one introduces statistical ensembles and probabilities! On the other hand, it is important that 2. does not increase the statistical uncertainty existing in U, but that 1. does: 2. transforms states into states

P[φ] into P[e^(−(2πi/h)tH)φ]

while 1. can transform states into mixtures. In this sense, therefore, the development of a state according to 1. is statistical, while according to 2. it is causal.

Furthermore, for fixed H and t, 2. is simply a unitary transformation of all U: Uₜ = AUA⁻¹,
A = e^(−(2πi/h)tH) is unitary. That is, Uf = g implies that Uₜ(Af) = Ag, so that Uₜ results from U by the unitary transformation of Hilbert space, that is, by an isomorphism which leaves all our basic geometric concepts invariant (cf. the principles set down in I.4.). Therefore, it is reversible: it suffices to replace A by A⁻¹ -- and this is possible, since A, A⁻¹ can be regarded as entirely arbitrary unitary operators because of the far reaching freedom in the choice of H, t. Just as in classical mechanics therefore, 2. does not reproduce one of the most important and striking properties of the real world, namely its irreversibility, the fundamental difference between the time directions, "future" and "past."

1. behaves in fundamentally different fashion: the transition

U → U′ = ∑ₙ₌₁^∞ (Uφₙ, φₙ) P[φₙ]

is certainly not prima facie reversible. We shall soon see that it is in general irreversible, in the sense that it is not possible in general to come back from a given U′ to its U by repeated application of the processes 1., 2.

Therefore, we have reached a point at which it is desirable to utilize the thermodynamic method of analysis, because it alone makes it possible for us to understand correctly the difference between 1. and 2., into which reversibility questions obviously enter.

Mathematical Foundations of Quantum Mechanics, Princeton University Press, 1955, pp. 357-358 (English translation by Robert T. Beyer)
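Von Neumann's contrast between the two processes can be illustrated with a small density-matrix computation. This is a hedged sketch, assuming a made-up 2-dimensional example (the matrix `rho` plays the role of his statistical operator U): a unitary transformation leaves the von Neumann entropy unchanged and can be undone with A⁻¹, while the process 1 projection turns a pure state into a mixture whose entropy has increased.

```python
# Hypothetical sketch contrasting von Neumann's two processes on a
# density matrix rho (his statistical operator U).
# Process 2: rho -> A rho A^{-1} with A unitary -- reversible, entropy unchanged.
# Process 1: rho -> sum_n (rho phi_n, phi_n) P[phi_n] -- can turn a pure
# state into a mixture, and is in general irreversible.
import numpy as np

def vn_entropy(rho):
    """von Neumann entropy S = -Tr(rho log rho)."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(max(0.0, -(evals * np.log(evals)).sum()))

# Pure state psi = (|0> + |1>) / sqrt(2); its density matrix is P[psi].
psi = np.array([1.0, 1.0], dtype=complex) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

# Process 2: a unitary transformation; entropy stays zero and A^{-1} undoes it.
theta = 0.4
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]], dtype=complex)
rho2 = A @ rho @ A.conj().T
assert np.allclose(np.linalg.inv(A) @ rho2 @ A, rho)  # reversible

# Process 1: projection onto the basis {phi_n} -- keep only the diagonal
# weights (rho phi_n, phi_n); the pure state becomes a mixture.
rho1 = np.diag(np.diag(rho).real).astype(complex)

print(vn_entropy(rho), vn_entropy(rho2), vn_entropy(rho1))
```

The entropy rises from 0 (pure state, before or after any unitary change) to log 2 for the mixture, which is the quantitative sense in which process 1 is thermodynamically irreversible while process 2 is not.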

The Schnitt

Von Neumann described the collapse of the wave function as requiring a "cut" (Schnitt in German) between the microscopic quantum system and the observer. He said it did not matter where this cut was placed, because the mathematics would produce the same experimental results.

There has been a lot of controversy and confusion about this cut. Eugene Wigner placed it outside a room which includes the measuring apparatus and an observer A, and just before observer B makes a measurement of the physical state of the room, which is imagined to evolve deterministically according to process 2 and the Schrödinger equation.

The case of Schrödinger's Cat is thought to present a similar paradoxical problem.

Von Neumann contributed a lot to this confusion in his discussion of subjective perceptions and "psycho-physical parallelism," which was encouraged by Niels Bohr. Bohr interpreted his "complementarity principle" as explaining the difference between subjectivity and objectivity (as well as several other dualisms). Von Neumann wrote:

The difference between these two processes is a very fundamental one: aside from the different behaviors in regard to the principle of causality, they are also different in that the former is (thermodynamically) reversible, while the latter is not.

Let us now compare these circumstances with those which actually exist in nature or in its observation. First, it is inherently entirely correct that the measurement or the related process of the subjective perception is a new entity relative to the physical environment and is not reducible to the latter. Indeed, subjective perception leads us into the intellectual inner life of the individual, which is extra-observational by its very nature (since it must be taken for granted by any conceivable observation or experiment).

Nevertheless, it is a fundamental requirement of the scientific viewpoint -- the so-called principle of the psycho-physical parallelism -- that it must be possible so to describe the extra-physical process of the subjective perception as if it were in reality in the physical world -- i.e., to assign to its parts equivalent physical processes in the objective environment, in ordinary space. (Of course, in this correlating procedure there arises the frequent necessity of localizing some of these processes at points which lie within the portion of space occupied by our own bodies. But this does not alter the fact of their belonging to the "world about us," the objective environment referred to above.)

In a simple example, these concepts might be applied about as follows: We wish to measure a temperature. If we want, we can pursue this process numerically until we have the temperature of the environment of the mercury container of the thermometer, and then say: this temperature is measured by the thermometer. But we can carry the calculation further, and from the properties of the mercury, which can be explained in kinetic and molecular terms, we can calculate its heating, expansion, and the resultant length of the mercury column, and then say: this length is seen by the observer.

Going still further, and taking the light source into consideration, we could find out the reflection of the light quanta on the opaque mercury column, and the path of the remaining light quanta into the eye of the observer, their refraction in the eye lens, and the formation of an image on the retina, and then we would say: this image is registered by the retina of the observer.

And were our physiological knowledge more precise than it is today, we could go still further, tracing the chemical reactions which produce the impression of this image on the retina, in the optic nerve tract and in the brain, and then in the end say: these chemical changes of his brain cells are perceived by the observer. But in any case, no matter how far we calculate -- to the mercury vessel, to the scale of the thermometer, to the retina, or into the brain, at some time we must say: and this is perceived by the observer. That is, we must always divide the world into two parts, the one being the observed system, the other the observer. In the former, we can follow up all physical processes (in principle at least) arbitrarily precisely. In the latter, this is meaningless.

[Figure: the Schnitt]
The boundary between the two is arbitrary to a very large extent. In particular we saw in the four different possibilities in the example above, that the observer in this sense needs not to become identified with the body of the actual observer: In one instance in the above example, we included even the thermometer in it, while in another instance, even the eyes and optic nerve tract were not included. That this boundary can be pushed arbitrarily deeply into the interior of the body of the actual observer is the content of the principle of the psycho-physical parallelism -- but this does not change the fact that in each method of description the boundary must be put somewhere, if the method is not to proceed vacuously, i.e., if a comparison with experiment is to be possible. Indeed experience only makes statements of this type: an observer has made a certain (subjective) observation; and never any like this: a physical quantity has a certain value.

Now quantum mechanics describes the events which occur in the observed portions of the world, so long as they do not interact with the observing portion, with the aid of the process 2, but as soon as such an interaction occurs, i.e., a measurement, it requires the application of process 1. The dual form is therefore justified.* However, the danger lies in the fact that the principle of the psycho-physical parallelism is violated, so long as it is not shown that the boundary between the observed system and the observer can be displaced arbitrarily in the sense given above.

Information physics places the cut or boundary at the place and time of information creation. It is only after information is created that an observer could make an observation. Beforehand, there is no information to be observed.

We can adapt John Bell's illustration of the cut, which Bell called the "shifty split," to show the moment of new information creation.

Information creation occurs as a result of the interaction between the microscopic system and the measuring apparatus. It was a severe case of anthropomorphism to think it required the consciousness of an observer for the wave function to collapse.

The collapse of a wave function and the creation of information have been going on in the universe for billions of years, long before human consciousness emerged.

Statistical Regularities and Underlying Determinism
Adolphe Quetelet saw social statistics as implying underlying deterministic laws. Of course, most mathematicians (cf. de Moivre, Laplace) had believed that chance was merely epistemic, the result of human ignorance.

In his 1932 book, von Neumann attempted to demonstrate that the statistical laws of quantum mechanics could not be reduced to an underlying determinism through the introduction of "hidden variables." His theorem was not convincing to many outside physics, especially philosophers of science, who have continued to pursue "hidden variable" interpretations of quantum mechanics.
