Stephen G. Brush is a historian of science who has documented the history of the kinetic theory of gases. He translated from German some of Ludwig Boltzmann's major works on the second law of thermodynamics. In an important 1976 contribution to the Journal of the History of Ideas, Brush argues that many nineteenth-century scientists accepted chance and randomness at the atomic level and that they related it to irreversibility in natural phenomena. (Other historians, e.g., Ian Hacking, made similar claims at the same time.)
In reading the casual remarks, and even what seem to be carefully considered pronouncements by nineteenth-century scientists, the twentieth-century reader often finds a puzzling ambiguity. Words like randomness, irregularity, and indeterminism may appear to imply that molecules or other entities do not move in paths that are completely determined by the positions, velocities, and forces of all particles in the system; yet the same author may express himself in a manner that is completely consistent with the view that these paths really are determined but we simply cannot obtain complete knowledge of them. What we might now call a crucial distinction between ontological and epistemological indeterminism is frequently blurred in these writings. This ambiguity can be used to argue that almost every one of the scientists quoted here believed only in epistemological, not ontological randomness—but such an argument would conceal a gradual but extremely significant historical shift in the meaning of concepts.

Brush thus sees more continuity in the idea of indeterminism than do typical historians of modern quantum physics, who situate the beginnings of indeterminism in quantum mechanics and Werner Heisenberg's indeterminacy principle. He notes, however, that many nineteenth-century scientists thought that chance was not ontologically real but merely epistemic, a consequence of the limits on human knowledge.
It is generally recognized that statistical ideas and methods were first introduced into physics in connection with the kinetic theory of gases. At the same time, modern writers invariably point out that in this case statistics was used only as a matter of convenience in dealing with large numbers of particles whose precise positions and velocities at any instant are unknown, or, even if known, could not be used in practice to calculate the gross behavior of the gas. Nevertheless, it is claimed, nineteenth-century physicists always assumed that the gas is really a deterministic mechanical system, so that if the super-intelligence imagined by Laplace were supplied with complete information about all the individual atoms at one time he could compute their positions and motions at any other time as well as the macroscopic properties of the gas. As Laplace himself had written in 1783, "The word 'chance,' then expresses only our ignorance of the causes of the phenomena that we observe to occur and to succeed one another in no apparent order." This situation is to be sharply distinguished, according to the usual accounts of the history of modern physics, from the postulate of atomic randomness or indeterminism which was adopted only in the 1920's in connection with the development of quantum mechanics. Thus, part of the "scientific revolution" that occurred in the early twentieth century is supposed to have been a discontinuous change from classical determinism to quantum indeterminism. While I think it is legitimate to say that a revolution in physical thought has occurred since about 1800, I do not believe it is accurate to localize it in the two or three decades at the beginning of the twentieth century. Some of the most dramatic events did take place during that period, but they could not have had the impact they actually did if confidence in the mechanistic world view had not already been undermined to a considerable extent by developments in the nineteenth century. 
This argument has been presented elsewhere; here I want to focus on one particular component of the revolution, the rejection of determinism, and show that there was some degree of continuity between nineteenth- and twentieth-century ideas. I do not want to overstate the case for continuity; if it were possible to quantify the causal factors responsible for the ultimate effect, I would guess that twentieth-century events (including the discovery of radioactive decay, though it actually occurred just before 1900) accounted for perhaps 80% of the impetus toward atomic randomness, while the nineteenth-century background accounted for the remaining 20%. That 20% is still significant in view of the fact that most historians and scientists seem to give no weight at all to the role of the well-publicized debates on the statistical interpretation of thermodynamics in the 1890's, or to the well-established use of probability methods in kinetic theory.

But some prominent physicists, who knew very well the statistical nature of the kinetic theory of gases, were convinced that the new quantum indeterminism was of a very different kind. In particular, Arthur Stanley Eddington maintained that the determinism of classical physics, which presumably included chance and probability, was gone forever. In The Nature of the Physical World (1928), Eddington dramatically announced, "It is a consequence of the advent of the quantum theory that physics is no longer pledged to a scheme of deterministic law." Most mathematicians who specialized in probability (e.g., Abraham de Moivre and Pierre-Simon Laplace) believed that chance was only epistemic, the result of human ignorance. Indeed, the fact that the laws of error and the normal distribution of chance events followed precise statistical laws in the limit of large numbers of events was for them proof that an unknown deterministic law governed all events.
Brush notes that one could easily argue that Maxwell, Boltzmann, Planck, and Einstein would also "find it inconceivable that anything in nature could happen 'by chance' without any cause at all."
Hence whenever we find them using the words "probability," "chance," "statistical," or "spontaneous," we must assume that such terms refer only to our lack of knowledge of causes, not to the absence of causes. The fallacy of that interpretation is that it could apply equally well to Born and Heisenberg, or to anyone who believes in an "uncertainty principle" as distinct from an "indeterminacy principle."

There is perhaps more continuity between the nineteenth-century thinkers on probability and prominent dissenters from quantum theory such as Max Planck, Albert Einstein, Louis de Broglie, Erwin Schrödinger, and David Bohm, who always hoped that an underlying deterministic explanation would one day be found for quantum randomness. In his later History of Modern Science, Brush summarized these arguments about the origin of indeterminism and chance in physics dating back to about 1800.
Just as Darwin denied that he was postulating randomness as a basic principle in evolution even though his theory of natural selection seemed to provide no alternative to the assumption that variations occur by chance, Einstein supplied the basis for indeterminism in physics yet insisted that "God does not play dice." In a third paper published in 1905, Einstein treated the irregular motion of microscopic particles in a fluid ("Brownian movement") as a random process dependent on chance impacts of molecules. Eleven years later he presented a theory of spontaneous and stimulated emission of radiation — a theory that ultimately inspired the invention of the laser — in which he assumed that atomic processes are governed by chance. While Einstein himself was unhappy with this assumption, other physicists made it the basis of the quantum theory of the atom.

Randomness or indeterminism, another major theme of the Second Scientific Revolution, did not originate in quantum theory but emerged in nineteenth-century debates about the temporal direction of irreversible processes. Kelvin had proposed in 1852 a "universal tendency in nature to the dissipation of mechanical energy"; this became associated with the second law of thermodynamics, as the statement that "entropy" always increases or remains constant. (Entropy was originally defined as heat transfer divided by temperature.) Maxwell recognized that the natural tendency of heat to flow from hot to cold — the simplest example of energy dissipation or entropy increase — is equivalent to a tendency for molecules to become more and more mixed up as time goes on. Ludwig Boltzmann quantified this insight with a mathematical relation between entropy and disorder and reformulated the kinetic theory of gases so as to imply that all natural processes involve some degree of randomness at the atomic level. Entropy was thereby liberated from its moorings in physics and entered common language as a synonym for randomness.
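The two quantitative ideas mentioned above can be written compactly in modern notation (a sketch for the reader, not part of Brush's text; Boltzmann himself stated the statistical relation as a proportionality, and the constant k was later fixed by Planck):

```latex
% Clausius's original thermodynamic definition: the change in entropy S
% is the reversible heat transfer divided by the absolute temperature.
dS = \frac{\delta Q_{\mathrm{rev}}}{T}

% Boltzmann's relation between entropy and disorder: S grows with the
% number W of molecular arrangements (microstates) compatible with the
% observed macroscopic state; k is Boltzmann's constant.
S = k \ln W
```

The second formula is what licenses reading entropy increase as a tendency toward more probable, more "mixed up" molecular configurations.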
In the twentieth century we no longer fear randomness as a threat of chaos but welcome it as a possible haven for free will and a guarantee of "fairness" in statistical surveys and military draft selection procedures. In "quantum mechanics" (the mature version of quantum theory), randomness reached new heights of abstraction as mechanism retreated. Erwin Schrödinger proposed a mathematical equation from which can be deduced the observable properties of atoms, molecules, and (with a powerful enough computer or accurate approximate methods) any material system under ordinary conditions. But the symbols in the Schrödinger equation have no direct physical meaning. According to Max Born, one of these symbols, the "wave function," represents the probability that the system will follow a certain path or be found in a certain state. There is no longer a causal law to determine how the positions and velocities of atoms evolve; instead there is a causal law that determines how their probability distributions evolve.

Following the publication in 1926 of Schrödinger's equation and Max Born's statistical interpretation of it, there ensued a friendly but intense debate between Einstein and Niels Bohr, the philosophical leader of the new quantum mechanicians. The major issue was not so much randomness as realism. Einstein argued that quantum mechanics is not wrong but incomplete because it fails to account for some atomic properties that have a real existence. For example, according to Werner Heisenberg's principle, the position and momentum of a particle cannot be simultaneously determined: the more accurately one of these quantities is pinned down, the more indeterminate the other must be. To call this an "uncertainty principle," as is usually done, implies that the position and momentum actually have values but we can't find out what they are because any attempt to measure them must disturb the system.
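In standard modern notation, the three structures just described look like this (a sketch for the reader, not drawn from Brush's text):

```latex
% Time-dependent Schrodinger equation: the Hamiltonian operator H
% determines, deterministically, how the wave function psi evolves.
i\hbar \,\frac{\partial \psi}{\partial t} = \hat{H}\psi

% Born's statistical interpretation: psi itself is not observable; its
% squared modulus gives the probability density of finding the system
% at position x at time t.
P(x,t) = |\psi(x,t)|^{2}

% Heisenberg's principle: the product of the spreads in position and
% momentum has an irreducible lower bound.
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
```

Note that the first equation is itself perfectly deterministic; the randomness enters only through the second, which is exactly why the interpretive debate between Einstein and Bohr turned on what the wave function is, rather than on the mathematics.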
But Bohr and Heisenberg, the proponents of the "Copenhagen interpretation" of quantum mechanics, claimed that these properties cannot be said to have an objective existence. It is only the observation that gives them reality. Thus the term "indeterminacy principle" is more accurate.

Brush is quite right that indeterminacy is preferable to uncertainty, with the latter's epistemic implications. But the hope that randomness would be welcomed as a possible haven for free will was shared by just a few quantum scientists. Philosophers have rejected indeterminism as one of the two horns of a dilemma: for them, chance as the direct cause of human action undermines moral responsibility. Indeterminism is thus part of the standard argument against free will. We argue that it takes a two-stage process of randomness followed by adequate determinism to provide a workable model of free will.