Henry Quastler was a medical doctor who became an expert in radiation after being appalled by the deaths caused by atomic weapons. He and his colleague Sidney Dancoff worked for several years to develop the use of information theory in biology. Quastler organized a conference at the University of Illinois whose proceedings were published as Essays on the Use of Information Theory in Biology in 1953.
Building on decades of prior work connecting information to thermodynamic entropy by Ludwig Boltzmann, Leo Szilard, Ludwig von Bertalanffy, Erwin Schrödinger, Norbert Wiener, Claude Shannon, Warren Weaver, John von Neumann, and Leon Brillouin, the conference contributors made the first estimates of the information content in a crystal, a protein structure, a bacterial cell, and even in a multicellular organism like a human being (excluding the contents of memory as unknown).
Quastler's introduction was perhaps the most ambitious attempt to put the idea of information, and its opposite, entropy, into words.
"Information Theory" is a name remarkably apt to be misunderstood. The theory deals, in a quantitative way, with something called information, which, however, has nothing to do with meaning. On the other hand, the "information" of the theory is related to such diverse activities as arranging, constraining, designing, determining, differentiating, messaging, ordering, organizing, planning, restricting, selecting, specializing, specifying, and systematizing; it can be used in connection with all operations which aim at decreasing such quantities as disorder, entropy, generality, ignorance, indistinctness, noise, randomness, uncertainty, variability, and at increasing the amount or degree of certainty, design, differentiation, distinctiveness, individualization, information, lawfulness, orderliness, particularity, regularity, specificity, uniqueness. All these quantities refer to some difference between general and specific; in this sense, they can be measured with a common yardstick. Furthermore, measures which are appropriate exist, due to the developments of Information Theory.
The terms specifying, specification, specificity, with proper qualifications where needed, seem to be most adaptable to the diverse situations mentioned, and will be used in this paper.
(Information Theory in Biology, "The Measure of Specificity," p.41, 1953)
The Equivalence of Information and Physical Entropy
Henry Linschitz, Department of Chemistry, Syracuse University
The purpose of this note is to establish a factor by which physical entropy may be converted to an equivalent amount of information when both are expressed in convenient units. The possibility of doing this is indicated by the work of Szilard ('29), Shannon ('49), Brillouin ('51), and others.
We make use of a simple physical model, in which complete knowledge of structure requires a definite amount of information, H, and in which complete randomness of structure leads to a definite amount of physical entropy, S. Then we may say that
H = -αS;
the independent evaluation of H and S then gives us the constant α.
Consider a crystal made up of molecules of type A-B, which can be placed at their lattice points in either of two, and only two, orientations, A-B or B-A. The specification of the orientation of each molecule thus requires a binary choice, or one "bit" of information. To specify the orientation of N molecules similarly requires N bits.
The physical entropy arising from randomness of orientation is S = k ln P, where P is the total number of ways in which the crystal can be assembled, considering only different orientation combinations. If we have N molecules, S = k ln 2^N and, for N = Avogadro's number, this is R ln 2. This formula for the physical entropy is verified by direct calorimetric measurement of the entropy of substances such as CO or N2O, and comparison with values computed from spectroscopic data. Discrepancies of magnitude R ln 2 are found between the calorimetric and spectroscopic values, which are interpreted as arising from the residual random orientation entropy frozen into the crystal and thus not experimentally observed in the heat capacity measurements.
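As an editorial illustration (not part of Linschitz's note), the R ln 2 figure can be checked in a few lines of Python; the constants below are standard approximate values.

```python
# Illustrative check: the orientational entropy of one mole of
# two-orientation molecules, S = k ln(2^N) = N k ln 2 = R ln 2.
import math

K_BOLTZMANN_CAL = 3.2998e-24   # Boltzmann constant in cal/deg (approx.)
N_AVOGADRO = 6.022e23          # molecules per mole

def orientational_entropy(n_molecules: float) -> float:
    """Entropy of n randomly oriented two-orientation molecules, in cal/deg."""
    return n_molecules * K_BOLTZMANN_CAL * math.log(2)

print(orientational_entropy(N_AVOGADRO))   # ~1.377 cal/deg/mole, the R ln 2 of the text
```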
Physical entropy amounting to R ln 2, or 1.377 cal./deg./mole, is thus removed when N bits of information are supplied. N bits per mole correspond to one bit per molecule. Therefore,
Information (bits/molecule) = S(cal./deg./mole) / 1.377 = 0.73 S
If we know the absolute molar entropy in cal./deg., division by 1.377 gives the number of bits of information required to specify the corresponding state of each molecule.
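A minimal sketch of this conversion, again as an editorial illustration rather than anything from the original note; the sample entropy value fed in at the end is hypothetical.

```python
# Dividing a molar entropy (cal/deg/mole) by R ln 2 = 1.377 gives the
# number of bits of information per molecule, as described above.
import math

R_CAL = 1.987                                # gas constant, cal/deg/mole (approx.)
CAL_PER_MOLE_PER_BIT = R_CAL * math.log(2)   # ~1.377

def bits_per_molecule(molar_entropy: float) -> float:
    """Bits needed to specify the state of each molecule, H = S / (R ln 2)."""
    return molar_entropy / CAL_PER_MOLE_PER_BIT

# Hypothetical example: a residual entropy of R ln 2 corresponds to one bit.
print(bits_per_molecule(1.377))              # ~1.0 bit per molecule
```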
Another way of regarding the matter is as follows: if P represents the total number of quantum states available to a molecule, then the number of binary choices required to select a single particular state is H, where H is obtained by solving the equation 2^H = P. The entropy, per molecule, is
s = k ln P = k ln 2^H = Hk ln 2,
from which
s / (k ln 2) = H.
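This per-molecule statement can likewise be illustrated numerically (an editorial sketch; the value of P below is an arbitrary assumption chosen only to make the equality visible).

```python
# With P accessible quantum states, H = log2(P) binary choices pick one
# state, and the per-molecule entropy is s = k ln P, so H = s / (k ln 2).
import math

K_BOLTZMANN_CAL = 3.2998e-24        # Boltzmann constant, cal/deg (approx.)

P = 16                              # hypothetical number of available states
H = math.log2(P)                    # 4.0 bits
s = K_BOLTZMANN_CAL * math.log(P)   # entropy per molecule, cal/deg

print(H, s / (K_BOLTZMANN_CAL * math.log(2)))   # both equal 4.0
```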
REFERENCES
Brillouin, L., J. App. Physics, 22:338-343 (1951).
Shannon, C., and Weaver, W., Mathematical Theory of Communication, University of Illinois Press, Urbana, 1949.
Szilard, L., Z. Physik, 53:840-856 (1929).