Dirk ter Haar
Dirk ter Haar was a Dutch physicist who studied physics in Scotland (St. Andrews) and England (Oxford).

During the last years of World War II he attended Hendrik Kramers' famous lectures on statistical mechanics, which led to the great school of Dutch statistical physicists (F. J. Belinfante, Max Dresden, Nico van Kampen, Abraham Pais).

After the war he went to study at Niels Bohr's Institute in Copenhagen, then returned to Leiden to receive his Ph.D. under Hendrik Kramers.

His academic career was spent largely at the University of Oxford. One of his students was the historian of statistical physics Steven Brush.

He wrote a number of important books on statistical mechanics in the 1950s and one on the Old Quantum Theory in 1967. Perhaps his most famous book was the 1954 Elements of Statistical Mechanics, which ter Haar said was largely a new version of the Kramers lectures of 1944-45.

ter Haar considered the implications of quantum mechanics for statistical mechanics. Many statistical physicists argued that quantum mechanics requires no changes to the conclusions of classical thermodynamics and statistical mechanics. These thinkers tended to be determinists who were uncomfortable with Werner Heisenberg's claim that quantum mechanics had eliminated causality in physics.

ter Haar also challenged the applicability of Claude Shannon's information theory and Norbert Wiener's cybernetics to statistical mechanics. But he accepted Leo Szilard's analysis of Maxwell's demon for a single particle, showing how a measurement of a single particle involves k ln 2 of entropy, or one bit of information, as well as Leon Brillouin's analysis, in which Brillouin coined the term "negentropy."
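Szilard's k ln 2 result can be checked with a few lines of arithmetic. The sketch below is illustrative only; the CODATA value of Boltzmann's constant and Shannon's formula are standard, and nothing in the script comes from ter Haar's text:

```python
import math

# Boltzmann's constant in J/K (CODATA value)
k = 1.380649e-23

# Szilard (1929): acquiring one bit of information about a single
# particle is accompanied by an entropy increase of at least k ln 2.
entropy_per_bit = k * math.log(2)          # J/K

# One bit in Shannon's measure: H = -sum p log2 p, with p = (1/2, 1/2)
p = [0.5, 0.5]
bits = -sum(pi * math.log2(pi) for pi in p)

print(f"k ln 2 = {entropy_per_bit:.3e} J/K per bit")
print(f"Shannon information of a fair binary choice: {bits} bit")
```

For a fair binary alternative the Shannon sum gives exactly one bit, and the corresponding thermodynamic entropy is k ln 2, about 9.57 × 10⁻²⁴ J/K.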

He wrote...

The relationship was introduced because Boltzmann's formula for entropy is identical to Shannon's formula with a minus sign.
S = −k ∑ pi ln pi.
If all pi are identical, S = k ln W.
Information is neither matter nor energy, but where an information structure is present, entropy is low and Gibbs free energy is high.
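The reduction of the Gibbs/Shannon form to Boltzmann's S = k ln W when all probabilities are equal can be verified numerically. This is a minimal sketch; the constant and formulas are standard, and the function name is my own:

```python
import math

k = 1.380649e-23  # Boltzmann's constant, J/K

def gibbs_entropy(probs):
    """S = -k * sum(p_i ln p_i), the Gibbs/Shannon form of the entropy."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

W = 1000                      # number of equally probable microstates
uniform = [1.0 / W] * W

# For equal p_i = 1/W the sum collapses to S = k ln W
s_gibbs = gibbs_entropy(uniform)
s_boltzmann = k * math.log(W)
print(f"Gibbs form:     {s_gibbs:.6e} J/K")
print(f"k ln W:         {s_boltzmann:.6e} J/K")
```

Substituting pi = 1/W into −k ∑ pi ln pi gives −k · W · (1/W) ln(1/W) = k ln W, which the two printed values confirm.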
The relationship between entropy and lack of information has led many authors, notably Shannon, to introduce “entropy” as a measure for the information transmitted by cables and so on, and in this way entropy has figured largely in recent discussions in information theory.

It must be stressed here that the entropy introduced in information theory is not a thermodynamical quantity and that the use of the same term is rather misleading. It was probably introduced because of a rather loose use of the term “information.”

In this connection we may briefly discuss Maxwell’s demon. Maxwell introduced in 1871 his famous demon, “a being whose faculties are so sharpened that he can follow every molecule in its course, and would be able to do what is at present impossible to us. . . . Let us suppose that a vessel is divided into two portions A and B by a division in which there is a small hole, and that a being who can see the individual molecules opens and closes this hole, so as to allow only the swifter molecules to pass from A to B, and only the slower ones to pass from B to A. He will, thus, without expenditure of work raise the temperature of B and lower that of A, in contradiction to the second law of thermodynamics.”

Maxwell’s demon has been widely discussed and various authors have set out to show that various attempts to circumvent the second law by using the demon are bound to fail. Although their discussions differ in some respects they have a few points in common. The first point is the observation that one should take the demon to be part of the total system and then one must consider the total entropy of the original system and the demon. The second point which was most clearly developed for the first time by Szilard is that the demon, in order to be able to operate the trapdoor through which the molecules pass, must receive information. Its own entropy increases therefore and it is now the question whether the increase of the demon’s entropy is smaller or larger than the decrease of the entropy of the gas. Both Szilard and Brillouin consider possible arrangements and show that in those cases the net change of entropy is positive. Szilard analyzes the problem very thoroughly and shows that one can describe a generalized Maxwell’s demon as follows. By some means an operation on a system is determined by the result of a measurement on the system which immediately precedes the operation. In Maxwell’s original scheme the operation was the opening of the trapdoor and the measurement was the determination of the velocity of an approaching molecule. The result of the operation will be a decrease of entropy, but the preceding measurement will be accompanied by an increase in entropy, and once again one must consider the balance. Wiener takes a simpler point of view. He considers the situation, where the demon acts, as a metastable state and writes: “In the long run, the Maxwell demon is itself subject to a random motion corresponding to the temperature of its environment and it receives a large number of small impressions until it falls into ‘a certain vertigo’ and is incapable of clear perceptions. In fact, it ceases to act as a Maxwell demon.”

This point of view is probably too simplified and we prefer that of Szilard’s and refer the reader to his paper for a more extensive discussion.
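Szilard's entropy balance for a one-particle engine can be sketched numerically. Only the k ln 2 bookkeeping reflects the argument quoted above; the choice of temperature and the structure of the script are illustrative assumptions:

```python
import math

k = 1.380649e-23   # Boltzmann's constant, J/K
T = 300.0          # bath temperature, K (arbitrary illustrative choice)

# One cycle of a Szilard engine:
# 1. Measure which half of the vessel the particle is in: one bit of
#    information, costing at least k ln 2 of entropy (Szilard's result).
measurement_entropy = k * math.log(2)

# 2. Isothermal expansion against the partition extracts W = kT ln 2
#    of work, lowering the entropy of the gas by k ln 2.
work_extracted = k * T * math.log(2)
gas_entropy_change = -k * math.log(2)

# The net entropy change of gas plus demon is non-negative,
# so the second law survives.
net = gas_entropy_change + measurement_entropy
print(f"work extracted per cycle: {work_extracted:.3e} J")
print(f"net entropy change      : {net:.3e} J/K (>= 0)")
```

The measurement's entropy cost exactly offsets the entropy decrease of the gas, which is the "balance" ter Haar describes: the demon gains nothing against the second law.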

We can note that in the most widely used modern textbook on statistical physics, Fred Reif says,

The quantity −ln Γ, i.e., the function Σ Pr ln Pr, can be used as a measure of nonrandomness, or information, available about systems in the ensemble.

This function plays a key role as a measure of information in problems of communication and general “information theory.”*

* See, for example, L. Brillouin, “Science and Information Theory,” 2d ed., Academic Press, New York, 1962; or J. R. Pierce, “Symbols, Signals, and Noise,” Harper, New York, 1961. Statistical mechanics is considered from the point of view of information theory by E. T. Jaynes in Phys. Rev., vol. 106, p. 620 (1957).