John von Neumann
In his 1932 Mathematical Foundations of Quantum Mechanics (published in German; English edition 1955), John von Neumann explained that two fundamentally different processes go on in quantum mechanics (in a temporal sequence for a given particle, not at the same time): process 1, the discontinuous, non-causal "collapse" of the wave function that occurs in a measurement, and process 2, the continuous, deterministic, causal evolution of the wave function according to the Schrödinger equation.
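The contrast between the two processes can be sketched numerically. The following is a hedged illustration, not von Neumann's own notation: the two-level state, the Hamiltonian, and the measurement basis are arbitrary choices made for the example.

```python
import numpy as np

# Process 2: continuous, deterministic, unitary evolution U = exp(-iHt)
# of a two-level state under an (arbitrarily chosen) Hamiltonian H.
H = np.array([[1.0, 0.5], [0.5, -1.0]])          # Hermitian Hamiltonian
t = 0.3
eigvals, eigvecs = np.linalg.eigh(H)
U = eigvecs @ np.diag(np.exp(-1j * eigvals * t)) @ eigvecs.conj().T

psi = np.array([1.0, 0.0], dtype=complex)        # initial state |0>
psi_t = U @ psi                                  # still a single pure state
assert np.isclose(np.linalg.norm(psi_t), 1.0)    # norm preserved: reversible

# Process 1: discontinuous, non-causal "collapse" on measurement.
# The Born rule gives the probabilities of the two possible outcomes.
probs = np.abs(psi_t) ** 2
outcome = np.random.choice([0, 1], p=probs)      # irreducibly random
psi_collapsed = np.zeros(2, dtype=complex)
psi_collapsed[outcome] = 1.0                     # projected onto the result
```

Process 2 is invertible (applying the conjugate transpose of U recovers psi exactly), while process 1 discards the amplitude of the unrealized outcome and cannot be undone.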
This distinction gave rise to the so-called problem of measurement, because the randomness of process 1 prevents it from being part of the deterministic mathematics of process 2.

Information physics has solved the problem of measurement by identifying the moment and place of the collapse of the wave function with the creation of an observable information structure. There are interactions that cause collapses but do not create stable information structures; these can never be the basis of measurements. The presence of a conscious observer is not necessary. It is enough that the newly created information is observable, should a human observer try to look at it in the future. Information physics is thus subtly involved in the question of what humans can know (epistemology).

We must quote von Neumann where he relates irreversibility and reversibility to the time directions future and past. He wrote...
The two interventions 1. and 2. are fundamentally different from one another. That both are formally unique, i.e., causal is unimportant; indeed, since we are working in terms of the statistical properties of mixtures, it is not surprising that each change, even if it is statistical, effects a causal change of the probabilities and the expectation values. Indeed, it is precisely for this reason that one introduces statistical ensembles and probabilities! On the other hand, it is important that 2. does not increase the statistical uncertainty existing in U, but that 1. does: 2. transforms states into states, while 1. can transform states into mixtures.
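Von Neumann's point that 2. "transforms states into states" while 1. can "transform states into mixtures" can be checked with density matrices and the von Neumann entropy. This is a sketch under assumed choices (the particular superposition state, rotation angle, and measurement basis are arbitrary):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log rho), computed from the eigenvalues."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]                       # drop (numerically) zero eigenvalues
    return float(-np.sum(w * np.log(w)))

# A pure superposition state and its density matrix rho = |psi><psi|.
psi = np.array([1.0, 1.0], dtype=complex) / np.sqrt(2)
rho = np.outer(psi, psi.conj())
s_pure = von_neumann_entropy(rho)          # zero: a pure state, no uncertainty

# Process 2: any unitary rotation leaves the state pure; entropy stays zero,
# so the statistical uncertainty is not increased.
theta = 0.7
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]], dtype=complex)
rho_2 = U @ rho @ U.conj().T
s_after_2 = von_neumann_entropy(rho_2)     # still zero

# Process 1 (non-selective measurement in the computational basis):
# the off-diagonal terms are destroyed, leaving a classical mixture.
rho_1 = np.diag(np.diag(rho))
s_after_1 = von_neumann_entropy(rho_1)     # log 2: uncertainty has increased
```

The unitary step is reversible and entropy-preserving; the measurement step turns the pure state into a 50/50 mixture with entropy log 2, which is exactly the asymmetry von Neumann describes.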
The Schnitt
Von Neumann described the collapse of the wave function as requiring a "cut" (Schnitt in German) between the microscopic quantum system and the observer. He said it did not matter where this cut was placed, because the mathematics would produce the same experimental results. There has been a great deal of controversy and confusion about this cut. Eugene Wigner placed it outside a room containing the measuring apparatus and an observer A, just before an observer B makes a measurement of the physical state of the room, which is imagined to evolve deterministically according to process 2 and the Schrödinger equation. The case of Schrödinger's Cat is thought to present a similar paradoxical problem. Von Neumann contributed a great deal to this confusion in his discussion of subjective perceptions and "psycho-physical parallelism," which was encouraged by Niels Bohr. Bohr interpreted his "complementarity principle" as explaining the difference between subjectivity and objectivity (as well as several other dualisms). Von Neumann wrote:
The difference between these two processes is a very fundamental one: aside from the different behaviors in regard to the principle of causality, they are also different in that the former is (thermodynamically) reversible, while the latter is not. Let us now compare these circumstances with those which actually exist in nature or in its observation. First, it is inherently entirely correct that the measurement or the related process of the subjective perception is a new entity relative to the physical environment and is not reducible to the latter. Indeed, subjective perception leads us into the intellectual inner life of the individual, which is extra-observational by its very nature (since it must be taken for granted by any conceivable observation or experiment). Nevertheless, it is a fundamental requirement of the scientific viewpoint -- the so-called principle of the psycho-physical parallelism -- that it must be possible so to describe the extra-physical process of the subjective perception as if it were in reality in the physical world -- i.e., to assign to its parts equivalent physical processes in the objective environment, in ordinary space. (Of course, in this correlating procedure there arises the frequent necessity of localizing some of these processes at points which lie within the portion of space occupied by our own bodies. But this does not alter the fact of their belonging to the "world about us," the objective environment referred to above.) In a simple example, these concepts might be applied about as follows: We wish to measure a temperature. If we want, we can pursue this process numerically until we have the temperature of the environment of the mercury container of the thermometer, and then say: this temperature is measured by the thermometer. 
But we can carry the calculation further, and from the properties of the mercury, which can be explained in kinetic and molecular terms, we can calculate its heating, expansion, and the resultant length of the mercury column, and then say: this length is seen by the observer. Going still further, and taking the light source into consideration, we could find out the reflection of the light quanta on the opaque mercury column, and the path of the remaining light quanta into the eye of the observer, their refraction in the eye lens, and the formation of an image on the retina, and then we would say: this image is registered by the retina of the observer. And were our physiological knowledge more precise than it is today, we could go still further, tracing the chemical reactions which produce the impression of this image on the retina, in the optic nerve tract and in the brain, and then in the end say: these chemical changes of his brain cells are perceived by the observer. But in any case, no matter how far we calculate -- to the mercury vessel, to the scale of the thermometer, to the retina, or into the brain, at some time we must say: and this is perceived by the observer. That is, we must always divide the world into two parts, the one being the observed system, the other the observer. In the former, we can follow up all physical processes (in principle at least) arbitrarily precisely. In the latter, this is meaningless. The boundary between the two is arbitrary to a very large extent. In particular we saw in the four different possibilities in the example above, that the observer in this sense needs not to become identified with the body of the actual observer: In one instance in the above example, we included even the thermometer in it, while in another instance, even the eyes and optic nerve tract were not included. 
That this boundary can be pushed arbitrarily deeply into the interior of the body of the actual observer is the content of the principle of the psycho-physical parallelism -- but this does not change the fact that in each method of description the boundary must be put somewhere, if the method is not to proceed vacuously, i.e., if a comparison with experiment is to be possible. Indeed experience only makes statements of this type: an observer has made a certain (subjective) observation; and never any like this: a physical quantity has a certain value. Now quantum mechanics describes the events which occur in the observed portions of the world, so long as they do not interact with the observing portion, with the aid of the process 2, but as soon as such an interaction occurs, i.e., a measurement, it requires the application of process 1. The dual form is therefore justified.* However, the danger lies in the fact that the principle of the psycho-physical parallelism is violated, so long as it is not shown that the boundary between the observed system and the observer can be displaced arbitrarily in the sense given above.

Information physics places the cut or boundary at the place and time of information creation. It is only after information is created that an observer could make an observation; beforehand, there is no information to be observed. We can adapt John Bell's illustration of the cut, which Bell called the "shifty split," to show the moment of new information creation. Information creation occurs as a result of the interaction between the microscopic system and the measuring apparatus. It was a severe case of anthropomorphism to think that the collapse of the wave function required the consciousness of an observer. The collapse of wave functions and the creation of information had been going on in the universe for billions of years before human consciousness emerged.
Statistical Regularities and Underlying Determinism
Adolphe Quetelet saw social statistics as implying underlying deterministic laws. Indeed, most mathematicians (cf. De Moivre, Laplace) had believed that chance was merely epistemic, the result of human ignorance. In his 1932 book, von Neumann attempted to demonstrate that the statistical laws of quantum mechanics could not be reduced to an underlying determinism by the introduction of "hidden variables." His theorem was not convincing to many outside science, especially philosophers of science, who have continued to pursue "hidden variable" interpretations of quantum mechanics.