Henry Quastler was a medical doctor who became an expert in radiation after being appalled by the deaths caused by atomic weapons. He and his colleague Sidney Dancoff worked for several years to develop the use of information theory in biology. Quastler organized a conference at the University of Illinois whose proceedings were published as Essays in the Use of Information Theory in Biology in 1953. Building on decades of prior work by Ludwig Boltzmann, Leo Szilard, Ludwig von Bertalanffy, Erwin Schrödinger, Norbert Wiener, Claude Shannon, Warren Weaver, John von Neumann, and Leon Brillouin connecting information to thermodynamic entropy, the conference contributors made the first estimates of the information content of a crystal, a protein structure, a bacterial cell, and even a multicellular organism like a human being (excluding the contents of memory as unknown). Quastler's introduction was perhaps the most ambitious attempt to put the idea of information, and its opposite, entropy, into words.
"Information Theory" is a name remarkably apt to be misunderstood. The theory deals, in a quantitative way, with something called information, which, however, has nothing to do with meaning. On the other hand, the "information" of the theory is related to such diverse activities as arranging, constraining, designing, determining, differentiating, messaging, ordering, organizing, planning, restricting, selecting, specializing, specifying, and systematizing; it can be used in connection with all operations which aim at decreasing such quantities as disorder, entropy, generality, ignorance, indistinctness, noise, randomness, uncertainty, variability, and at increasing the amount or degree of certainty, design, differentiation, distinctiveness, individualization, information, lawfulness, orderliness, particularity, regularity, specificity, uniqueness. All these quantities refer to some difference between general and specific; in this sense, they can be measured with a common yardstick. Furthermore, measures which are appropriate exist, due to the developments of Information Theory. The terms specifying, specification, specificity, with proper qualifications where needed, seem to be most adaptable to the diverse situations mentioned, and will be used in this paper.
The Equivalence of Information and Physical Entropy
The purpose of this note is to establish a factor by which physical entropy may be converted to an equivalent amount of information when both are expressed in convenient units. The possibility of doing this is indicated by the work of Szilard ('29), Shannon ('49), Brillouin ('51), and others. We make use of a simple physical model, in which complete knowledge of structure requires a definite amount of information, H, and in which complete randomness of structure leads to a definite amount of physical entropy, S. Then we may say that