John Norton is a professor of the history and philosophy of science at the University of Pittsburgh. His major research includes a careful study of all the papers of
Albert Einstein, starting with Einstein's work on special and general relativity, but including an appreciation of Einstein's enormous contributions to quantum physics. Norton speculates that Einstein's foundational work in quantum physics has been overshadowed and perhaps largely forgotten because of Einstein's severe criticisms of the "new" quantum theory and its statistical interpretation.
Norton teaches a popular course on "Einstein for Everyone," which he published online as a "web-book™." His imaginative illustrations (many animated) explain difficult concepts to beginners in physics.
With his Pitt colleague
John Earman, Norton has for several years been criticizing the claim that digital computers can store data in their memory in a logically and physically reversible way, in apparent violation of the second law of thermodynamics. According to this claim, it is only the
erasure of such data that generates the entropy needed to satisfy the second law.
The idea of a reversible step was first introduced in a famous thought experiment by
Leo Szilard, who equated bits of information with the fundamental unit of entropy,
k log 2, where
k is
Boltzmann's constant. Years later, IBM scientist
Rolf Landauer enunciated a principle suggesting that computers have some of the properties of a hypothetical Maxwell's Demon, an intelligent being that could seemingly violate the second law.
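To make Szilard's identification of a bit with k log 2 concrete, here is a minimal sketch (illustrative only; it assumes the natural-logarithm convention and standard CODATA constants) computing the minimum entropy per bit and the corresponding minimum heat dissipated at room temperature:

```python
import math

# Boltzmann's constant (J/K) and an assumed room temperature (K)
k = 1.380649e-23
T = 300.0

# Szilard/Landauer: erasing one bit creates at least k ln 2 of entropy;
# the "log 2" in the text is the natural logarithm in this convention.
entropy_per_bit = k * math.log(2)      # J/K
heat_per_bit = T * entropy_per_bit     # J, minimum heat dissipated at T

print(f"minimum entropy per bit: {entropy_per_bit:.3e} J/K")
print(f"minimum heat at {T:.0f} K: {heat_per_bit:.3e} J")
```

The heat figure, roughly 3 × 10⁻²¹ J per bit, is the familiar Landauer bound that the debate over reversible computation turns on.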
Although computers have an intelligent aspect, they are made of material structures that cannot evade the second law in even their simplest steps, because they are made of molecules with fluctuations that increase the entropy. It turns out, says Norton, "that for any
process in which fluctuations were used to attempt a violation of the second law, there would be a
second process, also driven by fluctuations, that would undo the violation."
The thermodynamics of computation depends on the assumption that many computational
operations can be carried out at molecular scales by processes that are thermodynamically reversible,
or can be brought arbitrarily close to reversibility. Such processes include the measurement by the device of
another system’s state, or the passing of data from one device to another. This expectation seems
reasonable as long as we rely on intuitions adapted to macroscopic systems. They fail, however, when
applied to molecular-scale systems, for all such systems are subject to a continuing barrage
of thermal fluctuations: unlike in macroscopic systems, everything is shaking, rattling and bouncing
about. No single step in some computational process can proceed to completion unless these
fluctuations can be overcome.
A statistical-mechanical analysis shows that fluctuations will
fatally disrupt any effort to implement a thermodynamically reversible process on molecular scales;
and that quantities of thermodynamic entropy in excess of those tracked by Landauer’s principle must
be created to overcome the fluctuations.
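A back-of-the-envelope sketch may help convey why the excess over Landauer's principle arises. It assumes the Boltzmann-distribution form of the argument: if the stages of a molecular-scale process are thermally distributed, then driving the process to completion with odds O of success requires total entropy creation of roughly k ln O. The particular odds chosen below are arbitrary illustrations, not figures from the paper:

```python
import math

k = 1.380649e-23  # Boltzmann's constant, J/K

def entropy_to_overcome_fluctuations(odds):
    """Approximate entropy (J/K) that must be created to drive a
    molecular-scale process to completion with the given success odds,
    under the assumed Boltzmann-distribution argument."""
    return k * math.log(odds)

# Landauer's k ln 2 per erased bit, for comparison
landauer = k * math.log(2)

for odds in (3, 20, 1000):
    ds = entropy_to_overcome_fluctuations(odds)
    print(f"odds {odds:>4}:1 -> entropy {ds:.2e} J/K "
          f"= {ds / landauer:.1f} x Landauer's k ln 2")
```

Even modest odds of completion cost several multiples of k ln 2, which is the sense in which the fluctuation-suppression entropy exceeds what Landauer's principle tracks.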
We can still manipulate individual molecules in nanotechnology; the no-go result merely requires
that any such manipulation employ machinery that creates thermodynamic entropy in
overcoming molecular-scale fluctuations.
We can still employ thermodynamically reversible processes as a useful idealization in macroscopic
thermodynamics. The quantities of entropy that must be created to suppress molecular-scale
fluctuations are quite negligible by macroscopic standards.
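The macroscopic negligibility is easy to quantify. The sketch below compares the fluctuation-suppression entropy of a single molecular-scale step (at an arbitrarily chosen 20:1 odds of completion) with an everyday macroscopic entropy change, melting one gram of ice (standard latent heat of fusion, about 334 J/g at 273.15 K):

```python
import math

k = 1.380649e-23  # Boltzmann's constant, J/K

# Entropy created to suppress fluctuations in one molecular-scale step,
# at an illustrative 20:1 odds of completion (arbitrary choice)
step_entropy = k * math.log(20)   # ~4e-23 J/K

# For comparison: entropy change of melting one gram of ice
melt_entropy = 334.0 / 273.15     # ~1.22 J/K

print(f"per-step entropy:    {step_entropy:.2e} J/K")
print(f"melting 1 g of ice:  {melt_entropy:.2f} J/K")
print(f"ratio: ~{melt_entropy / step_entropy:.1e}")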
("All Shook Up: Fluctuations, Maxwell’s Demon and the Thermodynamics of Computation," prepared for the journal Entropy)