Epistemic Freedom is a form of compatibilism based on the limits of human knowledge, that is, on human ignorance. Determinism implies that there is just one possible future. But physical determinism does not guarantee predictability, and some philosophers have used the unpredictability of the future to argue for a kind of free will called epistemic freedom. It is a freedom that we have because we do not know what the future will be. "God only knows," say those religious thinkers who are not bothered by the foreknowledge of a God or gods.

Although it is true that we cannot know what an open future will bring, to accept human ignorance as a basis for human freedom seems perverse. Why, then, do so many thinkers accept epistemic freedom? Perhaps the idea that deterministic laws or divine foreknowledge might exist relieves them of some moral responsibility?

The Marquis de Laplace imagined an intelligent being (Laplace's Demon) who knows the positions and velocities of the constituent atoms and uses Newton's equations of motion to predict the future (and retrodict the past) of the entire universe. Although many thinkers confuse predictability with determinism, neither Newton nor Laplace likely did so. Newton was painfully aware of the errors made in astronomical observations. Laplace invented his superintelligent being to contrast its infinite mind (like that of God) with the finite minds that must remain infinitely distant from such knowledge. He knew that the information needed to describe even a single particle exactly is mathematically infinite. When physical laws are expressible as mathematical functions of time, knowledge of the initial conditions at some time allows us to predict the conditions at all later (and retrodict all earlier) times. Long before Laplace, Gottfried Leibniz imagined a scientist who could see the events of all times, just as all times are thought to be present to the mind of God.
Predictability and Science

Predictability is an important characteristic of law-governed phenomena. Physical laws allow us to predict as well as possible, but of course not perfectly. Our inability to predict perfectly is not an indicator that we are free from deterministic laws, should they exist (though they do not).

Predictability (though limited) is an essential part of the scientific method, sometimes called the hypothetico-deductive-experimental-observation method. In the first step, freely invented hypotheses are proposed. In the second, reason, logic, and mathematics are used to deduce quantitative implications of the hypotheses. These deductions suggest observations and experiments (the third step), especially those that can provide quantitative measurements. The best hypothesis is the one that predicts observations or experiments capable of confirming (verifying) or denying (falsifying) the implied theory, and more specifically the one hypothesis that leads to quantitative measurements in agreement with the prediction.

But what does agreement mean? Predictions are never in perfect mathematical agreement with the observations and experimental measurements. When measurements are repeated (as they must be, preferably by independent observers), they scatter randomly around some average value in a "normal distribution." A prediction is confirmed when it lies within an acceptable error (usually reported as a number of standard deviations), a range of values around the average measured value.

Predictability is related to the idea of reproducibility. For an experiment to be accepted as scientific evidence, it must be reproducible and repeatable. But since experimental results are never exact, science offers no "proofs" of knowledge, just best predictions, best explanations, best theories. Like all knowledge, scientific knowledge is merely probable or statistical. Probabilities are a priori, produced by theories.
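The "agreement within standard deviations" criterion above can be sketched in a few lines of Python. This is a minimal illustration, not a real experiment: the predicted value and the repeated measurements are hypothetical numbers chosen for the example.

```python
# Minimal sketch of confirming a prediction within an acceptable error band.
# The predicted value and the measurement list below are hypothetical.
import statistics

predicted = 9.81  # value implied by the hypothesis (hypothetical)
measurements = [9.79, 9.83, 9.80, 9.82, 9.78, 9.84, 9.81, 9.80]  # repeated trials

mean = statistics.mean(measurements)    # average of the scattered measurements
sigma = statistics.stdev(measurements)  # sample standard deviation of the scatter

# Confirmation: does the prediction lie within an acceptable error,
# here taken to be two standard deviations around the measured average?
n_sigma = abs(predicted - mean) / sigma
confirmed = n_sigma <= 2.0

print(f"mean={mean:.4f}, sigma={sigma:.4f}, "
      f"deviation={n_sigma:.2f} sigma, confirmed={confirmed}")
```

With these sample numbers the prediction falls well inside the two-sigma band, so it would count as confirmed; a prediction several standard deviations away from the average would count as disconfirmed.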
Statistics are the a posteriori results of experiments.

The reason physical science is so well respected is that the precision and accuracy of its measurements are so high. Quantum physics, the most accurate physical theory, produces measurements accurate to more than fifteen significant figures, that is, with relative uncertainties on the order of one part in 10^16. Quantum mechanical predictions are unusual in that they contain a minimal indeterminism, consistent with the Heisenberg uncertainty (or indeterminacy) principle.

Classical deterministic laws of nature have traditionally been thought to be infinitely accurate, that is, accurate to an infinite number of decimal places. Philosophers (including many philosophers of science) have sometimes misread physical determinism as providing an infinitely accurate predictability.
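The contrast between a priori probabilities (produced by theories) and a posteriori statistics (produced by experiments) can be illustrated with a simulated trial. The fair-coin "experiment" below is a hypothetical stand-in for a physical measurement, with a fixed random seed so the run is repeatable.

```python
# Sketch contrasting an a priori probability with a posteriori statistics.
# The coin-flip experiment is hypothetical; the seed makes the run repeatable.
import random

p_theory = 0.5   # a priori: the probability the theory of a fair coin produces
random.seed(42)  # fixed seed, so this "experiment" can be reproduced exactly

trials = 100_000
heads = sum(random.random() < p_theory for _ in range(trials))
p_observed = heads / trials  # a posteriori: the statistics of the experiment

# The observed frequency approaches, but never exactly equals, the theoretical
# probability: the experimental result remains statistical, not exact.
print(f"a priori p = {p_theory}, a posteriori frequency = {p_observed:.4f}")
```

The observed frequency hovers near 0.5 but (for any finite number of trials) does not coincide with it exactly, mirroring the claim above that scientific knowledge is probable and statistical rather than proven.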