John Stewart Bell

In 1964 John Bell showed how the 1935 "thought experiments" of Einstein, Podolsky, and Rosen (EPR) could be made into real experiments. He placed limits on the local "hidden variables" that might restore a deterministic physics, in the form of what he called an "inequality," whose violation would confirm standard quantum mechanics.

Some thinkers, mostly philosophers of science rather than working quantum physicists, think that Bell's work has restored the determinism in physics that Einstein had wanted and that Bell recovered the "local elements of reality" that Einstein hoped for.

But Bell himself came to the conclusion that local "hidden variables" will never be found that give the same results as quantum mechanics. This has come to be known as Bell's Theorem.

All theories that reproduce the predictions of quantum mechanics will be "nonlocal," Bell concluded. Nonlocality is an element of physical reality and it has produced some remarkable new applications of quantum physics, including quantum cryptography and quantum computing.
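The violation Bell predicted can be illustrated numerically. A minimal sketch, assuming the standard quantum prediction E(a, b) = -cos(a - b) for spin measurements on singlet pairs and the conventional CHSH angle settings (the specific angles are an illustrative convention, not taken from Bell's paper):

```python
import math

# Quantum-mechanical correlation for spin measurements on a singlet pair
# at analyzer angles a and b (radians): E(a, b) = -cos(a - b).
def E(a, b):
    return -math.cos(a - b)

# Conventional CHSH angle choices (an illustrative assumption):
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4

# CHSH combination; any local hidden-variable theory obeys |S| <= 2.
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)

print(abs(S))   # ≈ 2.83, i.e. 2*sqrt(2), exceeding the classical bound of 2
```

The quantum prediction exceeds the local bound by a factor of √2, which is the margin the experiments test.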

Bell based his idea of real experiments on the 1952 work of David Bohm. Bohm proposed an improvement on the original EPR experiment (which measured position and momentum). Bohm's reformulation of quantum mechanics postulates (undetectable) deterministic positions and trajectories for atomic particles, where the instantaneous collapse happens in a new "quantum potential" field that can move faster than light speed. But it is still a "nonlocal" theory.

So Bohm (and Bell) believed that nonlocal "hidden variables" might exist, and that some form of information could come into existence at remote "space-like separations" at speeds faster than light, if not instantaneously.

The original EPR paper was based on a question of Einstein's about two electrons fired in opposite directions from a central source with equal velocities. Einstein imagined them starting at a distance from one another at time t0 and approaching each other at high velocities, then in contact for a short time interval from t1 to t1 + Δt, during which measurements could be made on their momenta, after which they separate. At a later time t2, a measurement of electron 1's position would then reveal the position of electron 2 without measuring it explicitly.

Einstein used the conservation of linear momentum to "know" the symmetric position of the other electron. This knowledge implies information about the remote electron that is available instantly. Einstein called this "spooky action-at-a-distance."

Bohm's 1952 thought experiment used two electrons that are prepared in an initial state of known total spin. If one electron spin is 1/2 in the up direction and the other is spin down or -1/2, the total spin is zero. The underlying physical law of importance is still a conservation law, in this case the conservation of angular momentum.

Since Bell's original work, many other physicists have defined other "Bell inequalities" and developed increasingly sophisticated experiments to test them. Most recent tests have used oppositely polarized photons coming from a central source. It is the total photon spin of zero that is conserved.
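The conserved total spin of zero can be checked directly against the quantum formalism. A minimal NumPy sketch computing the correlation of spin measurements on a singlet state (the measurement angles are illustrative choices):

```python
import numpy as np

# Pauli matrices
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# Singlet state |psi> = (|01> - |10>)/sqrt(2): total spin zero.
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def spin_along(theta):
    # Spin component along a unit vector at angle theta in the x-z plane.
    return np.cos(theta) * sz + np.sin(theta) * sx

def correlation(theta_a, theta_b):
    # E(a, b) = <psi| (sigma_1 . a) ⊗ (sigma_2 . b) |psi>
    op = np.kron(spin_along(theta_a), spin_along(theta_b))
    return np.real(psi.conj() @ op @ psi)

# Perfect anticorrelation at equal settings; -cos(difference) in general.
print(correlation(0.0, 0.0))         # ≈ -1.0
print(correlation(0.0, np.pi / 3))   # ≈ -0.5, i.e. -cos(60°)
```

Whatever axis both sides choose, equal settings always give opposite results, which is the angular-momentum conservation at work.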

In his 1964 paper "On the Einstein-Podolsky-Rosen Paradox," Bell made the case for nonlocality.

The paradox of Einstein, Podolsky and Rosen was advanced as an argument that quantum mechanics could not be a complete theory but should be supplemented by additional variables. These additional variables were to restore to the theory causality and locality. In this note that idea will be formulated mathematically and shown to be incompatible with the statistical predictions of quantum mechanics. It is the requirement of locality, or more precisely that the result of a measurement on one system be unaffected by operations on a distant system with which it has interacted in the past, that creates the essential difficulty. There have been attempts to show that even without such a separability or locality requirement no 'hidden variable' interpretation of quantum mechanics is possible. These attempts have been examined [by Bell] elsewhere and found wanting. Moreover, a hidden variable interpretation of elementary quantum theory has been explicitly constructed [by Bohm]. That particular interpretation has indeed a gross non-local structure. This is characteristic, according to the result to be proved here, of any such theory which reproduces exactly the quantum mechanical predictions.

With the example advocated by Bohm and Aharonov, the EPR argument is the following. Consider a pair of spin one-half particles formed somehow in the singlet spin state and moving freely in opposite directions. Measurements can be made, say by Stern-Gerlach magnets, on selected components of the spins σ1 and σ2. If measurement of the component σ1 · a, where a is some unit vector, yields the value + 1 then, according to quantum mechanics, measurement of σ2 · a must yield the value — 1 and vice versa. Now we make the hypothesis, and it seems one at least worth considering, that if the two measurements are made at places remote from one another the orientation of one magnet does not influence the result obtained with the other.

"pre-determination" is too strong a term. The previous measurement just "determines" the later measurement.
Since we can predict in advance the result of measuring any chosen component of σ2, by previously measuring the same component of σ1, it follows that the result of any such measurement must actually be predetermined. Since the initial quantum mechanical wave function does not determine the result of an individual measurement, this predetermination implies the possibility of a more complete specification of the state.
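The kind of local hidden-variable model Bell went on to rule out can be simulated. The sketch below is a common textbook model (not one from Bell's paper): each pair carries a shared hidden angle, and each side computes its result from its own setting and that angle alone. It reproduces the perfect anticorrelation at equal settings, yet its correlation is linear in the angle difference rather than the quantum -cos θ, which is why it cannot violate Bell's inequality:

```python
import math, random

random.seed(1)

def run(theta_a, theta_b, n=200_000):
    """Local hidden-variable model: each pair carries a shared random
    angle lam; outcomes depend only on the local setting and lam.
    Returns the estimated correlation E(a, b)."""
    total = 0
    for _ in range(n):
        lam = random.uniform(0, 2 * math.pi)        # shared hidden variable
        a = 1 if math.cos(theta_a - lam) >= 0 else -1
        b = -1 if math.cos(theta_b - lam) >= 0 else 1
        total += a * b
    return total / n

# Perfect anticorrelation at equal settings, just as quantum mechanics predicts:
print(run(0.0, 0.0))          # -1.0 exactly

# But at intermediate angles the model gives E = -1 + 2*theta/pi,
# not the quantum -cos(theta):
print(run(0.0, math.pi / 3))  # ≈ -0.33, whereas QM gives -0.5
```

The perfect correlations alone therefore do not discriminate; it is the intermediate angles that separate local models from quantum mechanics.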

Superdeterminism
During a mid-1980's interview by BBC Radio 3 organized by P. C. W. Davies and J. R. Brown, Bell proposed the idea of a "superdeterminism" that could explain the correlation of results in two-particle experiments without the need for faster-than-light signaling. The two experiments need only have been pre-determined by causes reaching both experiments from an earlier time.
I was going to ask whether it is still possible to maintain, in the light of experimental experience, the idea of a deterministic universe?

You know, one of the ways of understanding this business is to say that the world is super-deterministic. That not only is inanimate nature deterministic, but we, the experimenters who imagine we can choose to do one experiment rather than another, are also determined. If so, the difficulty which this experimental result creates disappears.

Free will is an illusion - that gets us out of the crisis, does it?

That's correct. In the analysis it is assumed that free will is genuine, and as a result of that one finds that the intervention of the experimenter at one point has to have consequences at a remote point, in a way that influences restricted by the finite velocity of light would not permit. If the experimenter is not free to make this intervention, if that also is determined in advance, the difficulty disappears.

Bell's superdeterminism would deny the experimenter's important "free choice," originally suggested by Niels Bohr and Werner Heisenberg and later explored by John Conway and Simon Kochen. Conway and Kochen claim that the experimenters' free choice requires that atoms must have free will, something they call their Free Will Theorem.

Following John Bell's idea, Nicolas Gisin and Antoine Suarez argue that something might be coming from "outside space and time" to correlate results in their own experimental tests of Bell's Theorem. Roger Penrose and Stuart Hameroff have proposed causes coming "backward in time" to achieve the perfect EPR correlations, as has philosopher Huw Price.

A Preferred Frame?

A little later in the same BBC interview, Bell suggested that a preferred frame of reference might help to explain nonlocality and entanglement.

[Davies] Bell's inequality is, as I understand it, rooted in two assumptions: the first is what we might call objective reality - the reality of the external world, independent of our observations; the second is locality, or non-separability, or no faster-than-light signalling. Now, Aspect's experiment appears to indicate that one of these two has to go. Which of the two would you like to hang on to?

[Bell] Well, you see, I don't really know. For me it's not something where I have a solution to sell! For me it's a dilemma. I think it's a deep dilemma, and the resolution of it will not be trivial; it will require a substantial change in the way we look at things. But I would say that the cheapest resolution is something like going back to relativity as it was before Einstein, when people like Lorentz and Poincare thought that there was an aether - a preferred frame of reference - but that our measuring instruments were distorted by motion in such a way that we could not detect motion through the aether. Now, in that way you can imagine that there is a preferred frame of reference, and in this preferred frame of reference things do go faster than light. But then in other frames of reference when they seem to go not only faster than light but backwards in time, that is an optical illusion.

The standard explanation of entangled particles usually begins with an observer A, often called Alice, and a distant observer B, known as Bob. Between them is a source of two entangled particles. The two-particle wave function describing the indistinguishable particles cannot be separated into a product of two single-particle wave functions.

The problem of faster-than-light signaling arises when Alice is said to measure particle A and then puzzle over how Bob's (later) measurements of particle B can be perfectly correlated, when there is not enough time for any "influence" to travel from A to B.

Now as John Bell knew very well, there are frames of reference moving with respect to the laboratory frame of the two observers in which the time order of the events can be reversed. In some moving frames Alice measures first, but in others Bob measures first.
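This frame dependence follows directly from the Lorentz transformation of the time separation between two events. A minimal sketch, in units where c = 1 and with hypothetical event coordinates for illustration:

```python
import math

# Work in units where the speed of light c = 1.
def boosted_dt(dt, dx, v):
    """Time separation t_B - t_A of two events as seen from a frame
    moving at velocity v along x: dt' = gamma * (dt - v*dx)."""
    gamma = 1.0 / math.sqrt(1.0 - v * v)
    return gamma * (dt - v * dx)

# Alice's and Bob's measurements: simultaneous (dt = 0) and spacelike
# separated (dx = 2) in the source's rest frame -- hypothetical numbers.
print(boosted_dt(0.0, 2.0, +0.5))   # negative: Bob's event comes first
print(boosted_dt(0.0, 2.0, -0.5))   # positive: Alice's event comes first
```

Because dt' changes sign with the direction of motion, neither ordering of the two spacelike-separated measurements is absolute.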

Back in the 1960s, C. W. Rietdijk and Hilary Putnam argued that physical determinism could be proved to be true by considering experiments and observers A and B in a "spacelike" separation and moving at high speed with respect to one another. Roger Penrose developed a similar argument in his book The Emperor's New Mind. It is called the Andromeda Paradox.

If there is a preferred frame of reference, surely it is the one in which the origin of the two entangled particles is at rest. Assuming that Alice and Bob are also at rest in this preferred frame and equidistant from the origin, we arrive at the simple picture in which any measurement that causes the two-particle wave function to collapse makes both particles appear simultaneously at determinate places (just what is needed to conserve energy, momentum, angular momentum, and spin).

The EPR "paradox" results from a naive non-relativistic description of events. Although the two events (measurements of particles A and B) are simultaneous in our preferred frame, the space-like separation of the events means that from Alice's point of view any knowledge of event B lies in her future, and Bob likewise sees Alice's event A in his future. Both claims cannot be true; yet both are true (and in some sense neither is). Thus the paradox.

Instead of just one particle making an appearance in the collapse of a single-particle wave function, in the two-particle case, when either particle is measured, we know instantly those properties of the other particle that satisfy the conservation laws, including its location equidistant from, but on the opposite side of, the source, and its other properties such as spin.

Let's look at an animation of the two-particle wave function expanding from the origin and what happens when, say, Alice makes a measurement.

You can compare the collapse of the two-particle probability amplitude above to the single-particle collapse here.

We can also ask what happens if Bob is not at the same distance from the origin as Alice. When Alice detects the particle (with say spin up), at that instant the other particle also becomes determinate (with spin down) at the same distance on the other side of the origin. It now continues, in that determinate state, to Bob's measuring apparatus.

Recall Bell's description of the process (quoted above), with its bias toward assuming that one measurement is made first and the other later.

If measurement of the component σ1 • a, where a is some unit vector, yields the value + 1 then, according to quantum mechanics, measurement of σ2 • a must yield the value — 1 and vice versa... Since we can predict in advance the result of measuring any chosen component of σ2, by previously measuring the same component of σ1, it follows that the result of any such measurement must actually be predetermined.
Since the collapse of the two-particle wave function is indeterminate, nothing is pre-determined, although σ2 is indeed determined once σ1 is measured.


In 1987, Bell contributed an article entitled "Are There Quantum Jumps?" to a centenary volume for Erwin Schrödinger, who denied such jumps or any collapses of the wave function. Bell's title was inspired by two 1952 articles of the same title by Schrödinger (Part I, Part II).

Just a year before Bell's death in 1990, physicists assembled for a conference on 62 Years of Uncertainty (referring to Werner Heisenberg's 1927 principle of indeterminacy).

John Bell's contribution to the conference was an article called "Against Measurement." In it he attacked Max Born's statistical interpretation of quantum mechanics. And he praised the new ideas of GianCarlo Ghirardi and his colleagues, Alberto Rimini and Tullio Weber:

In the beginning, Schrödinger tried to interpret his wavefunction as giving somehow the density of the stuff of which the world is made. He tried to think of an electron as represented by a wavepacket — a wave-function appreciably different from zero only over a small region in space. The extension of that region he thought of as the actual size of the electron — his electron was a bit fuzzy. At first he thought that small wavepackets, evolving according to the Schrödinger equation, would remain small. But that was wrong. Wavepackets diffuse, and with the passage of time become indefinitely extended, according to the Schrödinger equation. But however far the wavefunction has extended, the reaction of a detector to an electron remains spotty. So Schrödinger's 'realistic' interpretation of his wavefunction did not survive.

Then came the Born interpretation. The wavefunction gives not the density of stuff, but gives rather (on squaring its modulus) the density of probability. Probability of what exactly? Not of the electron being there, but of the electron being found there, if its position is 'measured.'

Why this aversion to 'being' and insistence on 'finding'? The founding fathers were unable to form a clear picture of things on the remote atomic scale. They became very aware of the intervening apparatus, and of the need for a 'classical' base from which to intervene on the quantum system. And so the shifty split.

The kinematics of the world, in this orthodox picture, is given a wavefunction (maybe more than one?) for the quantum part, and classical variables — variables which have values — for the classical part: (Ψ(t, q, ...), X(t),...). The Xs are somehow macroscopic. This is not spelled out very explicitly. The dynamics is not very precisely formulated either. It includes a Schrödinger equation for the quantum part, and some sort of classical mechanics for the classical part, and 'collapse' recipes for their interaction.

It seems to me that the only hope of precision with the dual (Ψ, x) kinematics is to omit completely the shifty split, and let both Ψ and x refer to the world as a whole. Then the xs must not be confined to some vague macroscopic scale, but must extend to all scales. In the picture of de Broglie and Bohm, every particle is attributed a position x(t). Then instrument pointers — assemblies of particles — have positions, and experiments have results. The dynamics is given by the world Schrödinger equation plus precise 'guiding' equations prescribing how the x(t)s move under the influence of Ψ. Particles are not attributed angular momenta, energies, etc., but only positions as functions of time. Peculiar 'measurement' results for angular momenta, energies, and so on, emerge as pointer positions in appropriate experimental setups. Considerations of KG [Kurt Gottfried] and vK [N. G. van Kampen] type, on the absence (FAPP) [For All Practical Purposes] of macroscopic interference, take their place here, and an important one, in showing how usually we do not have (FAPP) to pay attention to the whole world, but only to some subsystem and can simplify the wave-function... FAPP.

The Born-type kinematics (Ψ, X) has a duality that the original 'density of stuff' picture of Schrödinger did not. The position of the particle there was just a feature of the wavepacket, not something in addition. The Landau—Lifshitz approach can be seen as maintaining this simple non-dual kinematics, but with the wavefunction compact on a macroscopic rather than microscopic scale. We know, they seem to say, that macroscopic pointers have definite positions. And we think there is nothing but the wavefunction. So the wavefunction must be narrow as regards macroscopic variables. The Schrödinger equation does not preserve such narrowness (as Schrödinger himself dramatised with his cat). So there must be some kind of 'collapse' going on in addition, to enforce macroscopic narrowness. In the same way, if we had modified Schrödinger's evolution somehow we might have prevented the spreading of his wavepacket electrons. But actually the idea that an electron in a ground-state hydrogen atom is as big as the atom (which is then perfectly spherical) is perfectly tolerable — and maybe even attractive. The idea that a macroscopic pointer can point simultaneously in different directions, or that a cat can have several of its nine lives at the same time, is harder to swallow. And if we have no extra variables X to express macroscopic definiteness, the wavefunction itself must be narrow in macroscopic directions in the configuration space. This the Landau—Lifshitz collapse brings about. It does so in a rather vague way, at rather vaguely specified times.

In the Ghirardi—Rimini—Weber scheme (see the contributions of Ghirardi, Rimini, Weber, Pearle, Gisin and Diosi presented at 62 Years of Uncertainty, Erice, Italy, 5-14 August 1989) this vagueness is replaced by mathematical precision. The Schrödinger wavefunction even for a single particle, is supposed to be unstable, with a prescribed mean life per particle, against spontaneous collapse of a prescribed form. The lifetime and collapsed extension are such that departures of the Schrödinger equation show up very rarely and very weakly in few-particle systems. But in macroscopic systems, as a consequence of the prescribed equations, pointers very rapidly point, and cats are very quickly killed or spared.

The orthodox approaches, whether the authors think they have made derivations or assumptions, are just fine FAPP — when used with the good taste and discretion picked up from exposure to good examples. At least two roads are open from there towards a precise theory, it seems to me. Both eliminate the shifty split. The de Broglie—Bohm-type theories retain, exactly, the linear wave equation, and so necessarily add complementary variables to express the non-waviness of the world on the macroscopic scale. The GRW-type theories have nothing in the kinematics but the wavefunction. It gives the density (in a multidimensional configuration space!) of stuff. To account for the narrowness of that stuff in macroscopic dimensions, the linear Schrödinger equation has to be modified, in this GRW picture by a mathematically prescribed spontaneous collapse mechanism.

The big question, in my opinion, is which, if either, of these two precise pictures can be redeveloped in a Lorentz invariant way.

...All historical experience confirms that men might not achieve the possible if they had not, time and time again, reached out for the impossible. (Max Weber)

...we do not know where we are stupid until we stick our necks out. (R. P. Feynman)
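Bell's recollection above of Schrödinger's spreading wavepackets can be made quantitative. A minimal sketch using the textbook width formula for a free Gaussian wavepacket (the 1 Å initial width and the time of 1 second are illustrative assumptions, not from Bell's text):

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M_E  = 9.1093837015e-31  # electron mass, kg

def width(sigma0, t):
    """Width of a free Gaussian wavepacket after time t:
    sigma(t) = sigma0 * sqrt(1 + (hbar*t / (2*m*sigma0^2))^2)."""
    return sigma0 * math.sqrt(1.0 + (HBAR * t / (2.0 * M_E * sigma0**2))**2)

# An electron wavepacket initially confined to about an atomic size
# (1 angstrom) spreads to macroscopic scale in just one second:
sigma0 = 1e-10  # metres
print(width(sigma0, 1.0))   # ≈ 5.8e5 metres: hundreds of kilometres
```

This is the spreading Bell refers to: however small the packet starts, it becomes indefinitely extended, while detector responses remain spotty.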

On the 22nd of January 1990, Bell gave a talk explaining his theorem at CERN in Geneva, organized by Antoine Suarez, director of the Center for Quantum Philosophy.

There are links on the CERN website to the video of this talk, and to a transcription.

In this talk, Bell summarizes the situation as follows:

It just is a fact that quantum mechanical predictions and experiments, in so far as they have been done, do not agree with [my] inequality. And that's just a brutal fact of nature...that's just the fact of the situation; the Einstein program fails, that's too bad for Einstein, but should we worry about that?

I cannot say that action at a distance is required in physics. But I can say that you cannot get away with no action at a distance. You cannot separate off what happens in one place and what happens in another. Somehow they have to be described and explained jointly.

Bell gives three reasons for not worrying.
  1. Nonlocality is unavoidable, even if it looks like "action at a distance."
    [It does not, with a proper understanding of quantum physics. See our EPR page.]
  2. Because the events are in a spacelike separation, either one can occur before the other in some relativistic frame, so no "causal" connection can exist between them.
  3. No faster-than-light signals can be sent using entanglement and nonlocality.
He concludes:
So as a solution of this situation, I think we cannot just say 'Oh oh, nature is not like that.' I think you must find a picture in which perfect correlations are natural, without implying determinism, because that leads you back to nonlocality. And also in this independence as far as our individual experiences goes, our independence of the rest of the world is also natural. So the connections have to be very subtle, and I have told you all that I know about them. Thank you.

The work of GianCarlo Ghirardi that Bell endorsed is a scheme that makes the wave function collapse by adding small (of order 10⁻²⁴) nonlinear and stochastic terms to the linear Schrödinger equation. GRW cannot predict when and where a collapse occurs (it is simply random), but contact with macroscopic objects such as a measuring apparatus (with on the order of 10²⁴ atoms) makes the probability of collapse of order unity.
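The scaling that makes GRW work can be sketched with the standard parameter value from the GRW literature (the per-particle rate of 10⁻¹⁶ per second is an assumed textbook figure, not taken from the text above):

```python
# GRW spontaneous-collapse rate scaling. The per-particle rate below is
# the standard value quoted in the GRW literature (an assumption here):
LAMBDA = 1e-16   # collapse rate per particle, per second

def collapse_rate(n_particles):
    """Total collapse rate of an N-particle system: individual collapse
    rates add, so the total scales linearly with particle number."""
    return LAMBDA * n_particles

# A single particle collapses roughly once in 10^16 seconds (hundreds of
# millions of years), while a macroscopic apparatus of ~10^24 particles
# collapses about 10^8 times per second:
print(collapse_rate(1))      # 1e-16 per second
print(collapse_rate(1e24))   # ~1e8 per second
```

This is why the modification is invisible in few-particle experiments yet makes pointers point and cats live or die almost instantly.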

Information physics removes Bell's "shifty split" without "hidden variables" and without ad hoc non-linear additions like those of Ghirardi-Rimini-Weber to the linear Schrödinger equation. The boundary between the quantum and classical worlds is crossed at the moment that irreversible observable information enters the universe.

So we can now look at John Bell's diagram of possible locations for his "shifty split" and identify the correct moment - when irreversible information enters the universe.

In the information physics solution to the problem of measurement, the timing and location of Bell's "shifty split" (the "cut" or "Schnitt" of Heisenberg and von Neumann) are identified with the interaction between quantum system and classical apparatus that leaves the apparatus in an irreversible stable state providing information to the observer.

As Bell may have seen, it is therefore not a "measurement" by a conscious observer that is needed to "collapse" wave functions. It is the irreversible interaction of the quantum system with another system, whether quantum or approximately classical. The interaction must be one that changes the information about the system. And that means a local entropy decrease and overall entropy increase to make the information stable enough to be observed by an experimenter and therefore be a measurement.

References
Against Measurement (PDF)

On the Einstein-Podolsky-Rosen Paradox (PDF)

Are There Quantum Jumps? (PDF, Excerpt)

BBC Interview (PDF, Excerpt)
