
Emergent Determinism
Determinism in classical physics is an idealization and an approximation.
Determinism is an emergent property that shows up for large numbers of elementary particles, which as individuals or in small numbers are more accurately described with indeterministic quantum physics. All the laws of physics are statistical laws. All are the consequence of averaging over the irreducible quantum indeterminacy of the elementary particles. Macroscopic (phenomenological) physical laws are arbitrarily accurate in the limit of infinite numbers of particles. But it is beyond the possibility of experimental accuracy to "prove" the idea of perfect determinism. Nevertheless, we can "prove" that small numbers of elementary particles exhibit indeterminate behavior, to within the highest level of experimental accuracy achieved in modern physics.
Isaac Newton knew his mathematical laws were an approximation. Ludwig Boltzmann, arguably the first scientist to appreciate the atomic nature of matter, also knew it.
He said that the assumption of perfect determinism "goes beyond experience." His colleague at the University of Vienna, Franz Exner, said the same thing, as did Exner's student Erwin Schrödinger, before Schrödinger reversed his views, joining Einstein, Max Planck, and others in rejecting quantum indeterminacy.

Here is Boltzmann, for example:

"Since today it is popular to look forward to the time when our view of nature will have been completely changed, I will mention the possibility that the fundamental equations for the motion of individual molecules will turn out to be only approximate formulas which give average values, resulting according to the probability calculus from the interactions of many independent moving entities forming the surrounding medium — as for example in meteorology the laws are valid only for average values obtained by long series of observations using the probability calculus. These entities must of course be so numerous and must act so rapidly that the correct average values are attained in millionths of a second."

And here is Franz Exner:

"This significantly restricted version of the law of causality, as a purely empirical proposition, forces yet another question of fundamental importance: If certain conditions are met, and the lawful flow of phenomena takes place naturally in all its phases, is the state of the system then predetermined (voraus bestimmter) at every moment? Or is it random, with only the average state determined over a period of time?"

Finally, consider Erwin Schrödinger. In his early career, Schrödinger was a great exponent of fundamental chance in the universe. He followed his teacher Franz S. Exner, who was himself a colleague of the great Ludwig Boltzmann at the University of Vienna. Boltzmann used intrinsic randomness in molecular collisions to derive the increasing entropy of the Second Law of Thermodynamics.
Most nineteenth-century physicists, mathematicians, and philosophers believed that the chance described by the calculus of probabilities was actually completely determined. For them, chance was subjective and an epistemological problem, not objective and ontological. (Adolphe Quételet and Henry Thomas Buckle are important examples, very likely influenced by the views of Immanuel Kant.) The "bell curve" or "normal distribution" of random outcomes was itself so frequently observed that it seemed to imply deterministic laws governing individual events. Statistics and probability were thought to be epistemological questions: on this view, we simply lack the knowledge necessary to make exact predictions for individual events. Pierre-Simon Laplace saw in his "calculus of probabilities" a universal law that determines the motions of everything from the largest astronomical objects to the smallest particles.

On the other hand, in his inaugural lecture at Zurich in 1922, Schrödinger argued that the evidence did not justify our assumption that physical laws are deterministic and strictly causal. His lecture was modeled on Exner's inaugural lecture in Vienna in 1908:

"Exner's assertion amounts to this: It is quite possible that Nature's laws are of a thoroughly statistical character. The demand for an absolute law in the background of the statistical law — a demand which at the present day almost everybody considers imperative — goes beyond the reach of experience. Such a dual foundation for the orderly course of events in Nature is in itself improbable. The burden of proof falls on those who champion absolute causality, and not on those who question it. For a doubtful attitude in this respect is today by far the more natural."

Several years later, Schrödinger wrote:
"Fifty years ago it was simply a matter of taste or philosophic prejudice whether the preference was given to determinism or indeterminism. The former was favored by ancient custom, or possibly by an a priori belief. In favor of the latter it could be urged that this ancient habit demonstrably rested on the actual laws which we observe functioning in our surroundings. As soon, however, as the great majority or possibly all of these laws are seen to be of a statistical nature, they cease to provide a rational argument for the retention of determinism."

Despite these strong arguments against determinism, just after he completed the wave-mechanical formulation of quantum mechanics in June 1926 (the year Exner died), Schrödinger began to side with the determinists, especially Max Planck and Albert Einstein, and against those professing statistical interpretations of quantum mechanics: Max Born, Werner Heisenberg, and Niels Bohr.
Quantum Indeterminacy and Emergent Determinism
When small numbers of atoms and molecules interact, their motions and behaviors are indeterministic, governed by the rules of quantum mechanics.
Werner Heisenberg's principle of indeterminacy (mistakenly called "uncertainty," as if the problem is epistemic/subjective and not ontological/objective) gives us the minimum error in simultaneous measurements of position x and momentum p,
Δp Δx ≥ h,
where h is Planck's constant of action. To see how "adequate" determinism emerges for large numbers of particles, note that the momentum p = mv is the product of mass and velocity, so we can write the indeterminacy principle in terms of velocities and positions as
Δv Δx ≥ h / m.
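The scale of the bound Δv Δx ≥ h / m can be made concrete with a short calculation. The sketch below (the masses are illustrative round numbers, not claims from the text) compares the minimum indeterminacy product for an electron and for everyday objects:

```python
h = 6.626e-34  # Planck's constant (J·s = kg·m²/s)

def min_dv_dx(mass_kg):
    """Lower bound on the product Δv·Δx (in m²/s), from Δv Δx ≥ h/m."""
    return h / mass_kg

# Illustrative, approximate masses:
bodies = [
    ("electron", 9.11e-31),   # kg
    ("dust grain", 1e-9),     # kg, a hypothetical micron-scale grain
    ("baseball", 0.145),      # kg
]

for name, m in bodies:
    print(f"{name:10s}  Δv·Δx ≥ {min_dv_dx(m):.2e} m²/s")
```

For the electron the bound is roughly 7 × 10⁻⁴ m²/s, enormous on atomic scales; for the baseball it is of order 10⁻³³ m²/s, far below any achievable measurement precision. This is why quantum indeterminacy, while never strictly zero, becomes negligible for massive bodies.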
When large numbers of microscopic particles combine into massive aggregates, determinism is an emergent property. The "laws of nature," such as Newton's laws of motion, are all statistical in nature. They "emerge" when large numbers of atoms or molecules get together. For large enough numbers, the probabilistic laws of nature approach practical certainty, but the fundamental indeterminism of the component atoms never completely disappears. So determinism "emerges" today from microscopic quantum systems as they become part of larger, more classical systems. But we can say that determinism also emerged in time. In the earliest years of the universe, large massive objects did not yet exist; all matter was microscopic.
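The claim that statistical laws approach practical certainty for large numbers can be illustrated with a toy simulation (a sketch of the law of large numbers, not a physical model): the observed average of N independent random events fluctuates about its "law" value with a spread that shrinks roughly as 1/√N, so for macroscopic N (of order 10²³) the fluctuations are practically undetectable.

```python
import random
import statistics

random.seed(42)  # make the runs reproducible

def sample_mean(n):
    """Mean of n independent fair-coin outcomes; the 'law' value is 0.5."""
    return sum(random.random() < 0.5 for _ in range(n)) / n

# The spread of the observed mean across repeated trials shrinks ~ 1/sqrt(n):
for n in (10, 1_000, 100_000):
    spread = statistics.stdev(sample_mean(n) for _ in range(50))
    print(f"n = {n:>7}: spread of mean ≈ {spread:.4f}")
```

Each hundredfold increase in n cuts the spread by about a factor of ten, which is the sense in which a deterministic-looking macroscopic law emerges from indeterministic microscopic events.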
We can now identify the time in the evolution of the universe when determinism first could have emerged. Before the so-called "recombination era," when the universe cooled to a few thousand kelvin, a temperature at which atoms could form out of subatomic particles (protons, helium nuclei, and electrons), there were no macroscopic objects to exhibit deterministic behavior. The early universe was filled with positive ions and negatively charged electrons. The electrons scattered light photons, preventing them from traveling very far, so the universe was effectively opaque beyond very short distances. When the charged particles combined to form neutral atoms (hydrogen and helium), the photons suddenly could travel enormous distances. The universe first had the transparent sky that we take for granted today (on cloudless nights).
