In probability theory, stochastic processes are random (indeterministic) processes that are contrasted with deterministic processes.
Stochasticity is judged by the probability distribution of the randomness in the process. Computer-generated stochastic noise may consist of random binary digit sequences (1's and 0's). As long as the sequence is truly random, with no statistical correlations or detectable patterns, it is described as stochastic.
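As a concrete illustration, here is a minimal Python sketch (with hypothetical function names and parameters) that generates a binary sequence and checks it for the simplest kind of statistical pattern, a lag-1 autocorrelation. Note that a pseudorandom generator is only a deterministic stand-in for a truly stochastic source:

```python
import random

def random_bits(n, seed=None):
    """Generate n pseudorandom bits (0s and 1s)."""
    rng = random.Random(seed)
    return [rng.randint(0, 1) for _ in range(n)]

def lag1_autocorrelation(bits):
    """Estimate lag-1 autocorrelation; a value near 0 suggests no simple pattern."""
    n = len(bits)
    mean = sum(bits) / n
    var = sum((b - mean) ** 2 for b in bits) / n
    cov = sum((bits[i] - mean) * (bits[i + 1] - mean) for i in range(n - 1)) / (n - 1)
    return cov / var if var > 0 else 0.0

bits = random_bits(100_000)
print(f"lag-1 autocorrelation: {lag1_autocorrelation(bits):+.4f}")  # ~0 for a good source
```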
The Wiener process is a mathematical construct based on white noise with a Gaussian probability distribution.
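A minimal sketch of the idea, under a standard discretization assumption (not specific to this text): a Wiener process can be approximated by accumulating independent Gaussian white-noise increments, each with variance equal to the time step.

```python
import math
import random

def wiener_path(n_steps, dt=0.001, seed=42):
    """Approximate a Wiener process W(t) by summing Gaussian increments.

    Each increment dW ~ Normal(0, dt), i.e. standard deviation sqrt(dt).
    """
    rng = random.Random(seed)
    w, path = 0.0, [0.0]
    for _ in range(n_steps):
        w += rng.gauss(0.0, math.sqrt(dt))
        path.append(w)
    return path

path = wiener_path(10_000)
print(f"W(10) ≈ {path[-1]:+.3f}")  # mean 0, variance ≈ n_steps * dt = 10
```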
Many naturally occurring processes exhibit stochasticity, including the Brownian motion of tiny particles suspended in a liquid.
The atmosphere is considered a source of stochastic noise by Random.org, which uses radio antennae tuned between radio stations to generate random digit sequences from "atmospheric" noise.
Whether this noise is genuinely random in the sense of irreducible quantum randomness is a question of the relationship between thermal noise and quantal noise.
Ultimately, this relationship depends on whether a classical gas is entirely deterministic (cf. deterministic chaos), and whether binary collisions of gas particles can be treated deterministically or must be treated quantum mechanically. If they are deterministic, then collisions are in principle time reversible.
In quantum mechanics, microscopic time reversibility is taken to mean that the deterministic linear Schrödinger equation is time reversible.
A careful quantum analysis shows that ideal reversibility fails even in the simplest case: two particles in collision.
When they collide, even structureless particles should not be treated as individual particles with single-particle wave functions, but as a single system with a two-particle wave function, because they are now entangled.
Treating two atoms as a temporary molecule means we must use molecular, rather than atomic, wave functions. The quantum description of the molecule now transforms the six independent degrees of freedom into three for the molecule's center of mass and three more that describe vibrational and rotational quantum states.
The possibility of quantum transitions between closely spaced vibrational and rotational energy levels in the "quasi-molecule" introduces uncertainty, which could be different for the hypothetical perfectly reversed path.
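In symbols, and as a sketch of the standard coordinate change (not specific to this text), the six position coordinates of the two atoms separate into center-of-mass and relative coordinates:

```latex
\mathbf{R} = \frac{m_1 \mathbf{r}_1 + m_2 \mathbf{r}_2}{m_1 + m_2},
\qquad
\mathbf{r} = \mathbf{r}_1 - \mathbf{r}_2
```

so that the six degrees of freedom of $(\mathbf{r}_1, \mathbf{r}_2)$ become three for the center of mass $\mathbf{R}$ and three for the relative coordinate $\mathbf{r}$, whose radial and angular parts carry the vibrational and rotational quantum states.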
In information science, noise is generally the enemy of information. But some noise is the friend of freedom, since it is the source of novelty, of creativity and invention, and of variation in the biological gene pool. Too much noise is simply entropic and destructive. With the right level of noise, the cosmic creation process is not overcome by the chaos.
When information is stored in any structure, from galaxies to minds, two fundamental physical processes occur. First is the collapse of a quantum mechanical wave function. Second is a local decrease in entropy corresponding to the increase in information. An amount of entropy greater than that decrease must be transferred away to satisfy the second law of thermodynamics.
If wave functions did not collapse, their evolution over time would be completely deterministic and information-preserving. Nothing new would emerge that was not implicitly present in the earlier states of the universe.
It is ironic that noise, in the form of quantum mechanical wave function collapses, should be the ultimate source of new information (low or negative entropy), the very opposite of noise (positive entropy).
Because quantum-level processes introduce noise, stored information may contain errors. When information is retrieved, it is again susceptible to noise, which may garble the information content.
Despite the continuous presence of noise around them and inside them, biological systems have maintained and increased their invariant information content over billions of generations. Humans increase their knowledge of the external world, despite logical, mathematical, and physical uncertainty. Biological and intellectual information handling balance random and orderly processes by means of sophisticated error detection and correction schemes. The scheme we use to correct human knowledge is science, a combination of freely invented theories and adequately determined experiments.
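As an illustration of the error-correction principle (not of any biological mechanism), here is a minimal Python sketch of the simplest such scheme, a triple-repetition code: each bit is stored three times, and majority voting corrects any single flip introduced by noise.

```python
import random

def encode(bits):
    """Triple-repetition code: store each bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def add_noise(stored, flip_prob, rng):
    """Flip each stored bit independently with probability flip_prob."""
    return [b ^ 1 if rng.random() < flip_prob else b for b in stored]

def decode(stored):
    """Majority vote over each triple corrects any single flipped copy."""
    return [1 if sum(stored[i:i + 3]) >= 2 else 0 for i in range(0, len(stored), 3)]

rng = random.Random(0)
message = [rng.randint(0, 1) for _ in range(10_000)]
noisy = add_noise(encode(message), flip_prob=0.01, rng=rng)
recovered = decode(noisy)
errors = sum(m != r for m, r in zip(message, recovered))
print(f"residual errors: {errors} of {len(message)}")  # far fewer than the raw 1% flips
```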
In Biology
Molecular biologists have assured neuroscientists for years that the molecular structures involved in neurons are too large to be affected significantly by quantum noise.
But neurobiologists know very well that there is noise in the nervous system in the form of spontaneous firings of action potentials, thought to be the result of random chemical changes at the synapses. This may or may not be quantum noise amplified to the macroscopic level.
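A common idealization of such spontaneous firing, offered here only as a hedged sketch, treats the spike times as a Poisson process with some background rate:

```python
import random

def poisson_spike_train(rate_hz, duration_s, seed=1):
    """Idealized spontaneous firing: spike times drawn from a Poisson process.

    Inter-spike intervals are exponentially distributed with mean 1/rate.
    """
    rng = random.Random(seed)
    t, spikes = 0.0, []
    while True:
        t += rng.expovariate(rate_hz)  # next random inter-spike interval
        if t > duration_s:
            return spikes
        spikes.append(t)

spikes = poisson_spike_train(rate_hz=5.0, duration_s=10.0)
print(f"{len(spikes)} spontaneous spikes in 10 s (expected ~50)")
```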
But there is no problem imagining a role for randomness in the brain in the form of quantum-level noise that affects the communication of knowledge. Noise can introduce random errors into stored memories. Noise can create random associations of ideas during memory recall.
Molecular biologists know that while most biological structures are remarkably stable, and thus adequately determined, quantum effects drive the mutations that provide variation in the gene pool. So our question is how the typical structures of the brain have evolved to deal with microscopic, atomic-level noise, both thermal and quantal. Can they ignore it because they are adequately determined large objects, or might they have remained sensitive to the noise for some reason?
We can expect that if quantum noise, or even ordinary thermal noise, offered an advantage, there would have been evolutionary pressure to take advantage of it.
That our sensory organs have evolved to work at or near quantum limits is evidenced by the eye's ability to detect a single photon (a quantum of light energy) and the nose's ability to smell a single molecule.
Biology provides many examples of ergodic creative processes following a trial-and-error model. They harness chance as a possibility generator, followed by an adequately determined selection mechanism with implicit information-value criteria.
Darwinian evolution is the first and greatest example of a two-stage creative process, random variation followed by critical selection, but we will briefly consider two other such processes. Both are analogous to our two-stage Cogito model for the mind. One is at the heart of the immune system; the other provides quality control in protein/enzyme factories.
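A minimal Python sketch of the two-stage logic, with hypothetical names and a toy fitness criterion: random variation (chance) proposes candidate possibilities, and an adequately determined selection step then keeps or rejects them against a fixed criterion.

```python
import random

def two_stage_search(evaluate, mutate, start, generations=1000, seed=0):
    """Two-stage creative process: random variation, then determined selection.

    Stage 1 (chance): mutate() proposes a random variant.
    Stage 2 (selection): evaluate() keeps the variant only if it scores better.
    """
    rng = random.Random(seed)
    best = start
    for _ in range(generations):
        candidate = mutate(best, rng)             # stage 1: random possibility
        if evaluate(candidate) > evaluate(best):  # stage 2: deterministic choice
            best = candidate
    return best

# Toy example: evolve a bit string toward all 1s.
target_len = 32
evaluate = lambda bits: sum(bits)
mutate = lambda bits, rng: [b ^ 1 if rng.random() < 1 / len(bits) else b for b in bits]
result = two_stage_search(evaluate, mutate, start=[0] * target_len)
print(f"fitness {evaluate(result)} of {target_len}")
```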
Stochastic Noise in the Cogito Model
The insoluble problem for previous two-stage models has been to explain how a random event in the brain can be timed and located (perfectly synchronized!) so as to be relevant to a specific decision. The answer is that it cannot be, for the simple reason that quantum events are totally unpredictable.
The Cogito solution is not single random events, one per decision, but many random events in the brain, the result of the ever-present noise, both quantum and thermal, that is inherent in any information storage and communication system.
The mind, like all biological systems, has evolved in the presence of stochastic noise and is able to ignore that noise, unless the noise provides a significant competitive advantage, which it clearly does as the basis for freedom and creativity.
The only reasonable model for an indeterministic contribution is ever-present stochastic noise throughout the neural circuitry. We call it the Micro Mind.
Quantum (and even some thermal) noise in the neurons is all we need to supply random, unpredictable alternative possibilities.
Note that indeterminism is NOT involved in the de-liberating Will.
The major difference between the Micro Mind and the Macro Mind is how they process noise in the brain circuits. The first accepts it; the second suppresses it.
Our "
adequately determined" Macro Mind can overcome the noise whenever it needs to make a determination on thought or action.