Knowledge
Retrieved June 30, 2025, from the Information Philosopher website: https://www.informationphilosopher.com/knowledge/
Metaphysics is the study of what there is, what exists, and how we know that it exists. The ancients described it as the problem of Being. We cannot know what there is without knowing how we can know anything.
Knowing how we know is the sub-discipline of metaphysics called epistemology. The study of what there is is ontology. Knowing how we know is a fundamentally circular problem when it is described in human language, normally as a set of logical propositions. And knowing something about what exists adds another complex circle, if the knowing being must itself be one of the things that exists. These circular definitions and inferences need not be vicious circles. They may simply be a coherent set of ideas that we use to describe ourselves and the external world. If the descriptions are logically valid and/or empirically verifiable, we think we are approaching the "truth" about things and acquiring knowledge.

How then do we describe the knowledge itself, as an existing thing in our existent minds and in the existing external world? Information philosophy does it by basing everything on the abstract but quantitative notion of information.

All information in the universe is created by a single two-step process. We call it the cosmic creation process. In the first step, something new and different is created at random. If it were determined by the past, it would not be new information. New information is a local reduction in the entropy. To satisfy the second law of thermodynamics, positive entropy must travel away from the new information structure, to the sink of the expanding universe. This is the second step. If it fails, the new information structure is destroyed, returning to its prior equilibrium state.

Information (or negative entropy) lies in the arrangement of matter. Boltzmann's definition of entropy is the logarithm of the number of microscopic arrangements of matter that are consistent with the macroscopic properties (the thermodynamics): S = k ln W.

For the first nine billion years or so, information structures were created by known physical forces, like the combination of elementary particles into subatomic particles, then into atoms. Eventually gravitation condensed these randomly distributed atoms into astrophysical objects like the planets, stars, and galaxies.

Then, on our planet, some complex molecules accidentally began to replicate themselves. Random accidents in replication began the process of biological evolution. Some very complicated replicators began to share information about their neighbors, perhaps by atoms or small molecules secreted into the environment. This was the beginning of communication between information structures, and the beginning of knowledge.

Information is stored or encoded in physical and biological structures. Structures in the world build themselves, following natural laws, including physical and biological laws. Structures in the mind are partly built by biological processes and partly built by intelligence, which is free, creative, and unpredictable.

For information philosophy, knowledge is information created and stored in minds and in human artifacts like stories, books, and internetworked computers. Information is neither matter nor energy, though it requires matter for its embodiment and energy for its communication. Knowledge is actionable information that forms the basis for thoughts and actions by the higher animals and humans. Knowledge includes all the cultural information created by human societies. We call it the Sum. It includes the theories and experiments of scientists, who collaborate to establish our knowledge of the external world.
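To make the quantitative claim about Boltzmann's formula concrete, here is a minimal numerical sketch (the microstate count W is an arbitrary illustrative choice, not a value from the text): S = k ln W gives the entropy in joules per kelvin, while log2 W reads the same arrangement count as information in bits.

import math

K_B = 1.380649e-23  # Boltzmann's constant, joules per kelvin

def boltzmann_entropy(W: int) -> float:
    # Thermodynamic entropy S = k ln W for W equally likely microscopic arrangements.
    return K_B * math.log(W)

def information_bits(W: int) -> float:
    # The same count read as information: bits needed to single out one of W arrangements.
    return math.log2(W)

W = 2 ** 20  # a toy system with about a million equally probable arrangements
print(f"S = k ln W  = {boltzmann_entropy(W):.3e} J/K")
print(f"information = {information_bits(W):.0f} bits")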
Scientific knowledge comes the closest of any knowledge to being independent of any human mind, though it is still dependent on an open, interdependent community of fundamentally subjective inquirers. To the extent of the correspondence, the isomorphism, the one-to-one mapping, between information structures (and processes) in the world and representative structures and functions in the mind, information philosophy claims that we as individuals have quantifiable personal or subjective knowledge of the world. To the extent of the agreement (again a correspondence or isomorphism) between information in the minds of an open community of inquirers seeking the best explanations for phenomena, information philosophy further claims that we have quantifiable inter-subjective knowledge of other minds and of an external world. Although science depends on this inter-subjective agreement, it is as close as we come to "objective" knowledge, to knowledge of objects, the Kantian "things in themselves."

Empiricists like John Locke thought the "primary" qualities of objects are inaccessible; our senses, he believed, can receive only "secondary" qualities. Information philosophy makes this a distinction without a difference.

Analytic language philosophers have a much narrower definition of knowledge. They identify it with language, logic, and human beliefs. For them, epistemology has been reduced to the "truth" of statements and propositions that can be logically analyzed and validated. Epistemologists say persons have knowledge only 1) if a statement is true, 2) if they believe that the statement is true, and 3) if their belief is "justified," where justification may be because the belief was the consequence of a "reliable" cognitive process, or because the belief was "caused" by the facts in the world that it is about. They trace these three-part conditions for knowledge back to Plato's Theaetetus and Aristotle's Posterior Analytics.

Plato did talk about opinions, which could be true or false. The true or "right" opinions could be further supported by giving an "account" of the reasons why an opinion is "true" and not "false." But like many Platonic dialogues, the Theaetetus reaches no resolution or agreement that these three elements can indeed produce knowledge.

The Greek word Plato used for knowledge was episteme, which translates more nearly as "know how" than the "know that" associated with knowledge of the "facts" in propositions. Our English word for knowledge comes from the Indo-European root gno, later the Greek gno as in gnosis. In Greek it meant a mark or token that was familiar and immediately recognizable, with an act of cognition or cognizance. It gives us the word ken (our close relatives are "kin"), the German cognate kennen, and the French connaissance.

Bertrand Russell distinguished "knowledge by acquaintance" as immediate (viz. non-mediated) direct awareness of a particular thing. He contrasted such basic knowledge with knowledge of concepts, ideas, or "universals," which can be used to describe many particular things. He called this "knowledge by description." He included the sense data of "red, here, now" in immediate knowledge, knowledge we are less likely to doubt and that can serve as a logical foundation.

All this works well for one idea of knowledge, but unfortunately for analytic language philosophy, the English language is philosophically impoverished, lacking another word for knowledge that is found in all other European languages, one based on words whose root means "to have seen."
Justified True Belief
Nevertheless, the modern field of epistemology has generally defined knowledge in three parts as "justified true belief," specifically the truth of beliefs about statements or propositions. For example,
S knows that P if and only if
1) P is true,
2) S believes that P, and
3) S is justified in believing that P.

In the long history of the problem of knowledge, all three of these knowledge or belief "conditions" have proven very difficult for epistemologists. Among the reasons: a belief is an internal mental state beyond the full comprehension of expert external observers. Even the subject herself has limited immediate access to all she knows or believes. On deeper reflection, or after consulting external sources of knowledge, she might "change her mind."

It is no surprise that epistemologists have failed in every effort to put knowledge on a sound basis, let alone to establish knowledge with apodeictic certainty, as Plato and Aristotle expected and as René Descartes thought he had established beyond any reasonable doubt.

Perhaps overreacting to the threat from science as a demonstrably more successful method for establishing knowledge, epistemologists have hoped to differentiate and preserve their own philosophical approach. Some have held on to the goal of logical positivism (e.g., Russell, the early Wittgenstein, and the Vienna Circle) that philosophical analysis would provide an a priori normative ground for merely empirical scientific knowledge. Logical positivist arguments for the non-inferential self-validation of logically atomic perceptions like "red, here, now" have perhaps misled some epistemologists to think that personal perceptions can directly justify some "foundationalist" beliefs. The philosophical method of linguistic analysis (inspired by the later Wittgenstein) has not achieved much more. It is unlikely that knowledge of any kind reduces simply to the careful conceptual analysis of sentences, statements, and propositions.

Information philosophy looks deeper than the surface ambiguities of language. It distinguishes at least three kinds of knowledge, each requiring its own special epistemological analysis:
1) personal or subjective knowledge in individual minds,
2) cultural knowledge stored in human artifacts such as stories, books, and internetworked computers (the Sum), and
3) inter-subjective scientific knowledge of the external world, established by an open community of inquirers.
This last kind of knowledge is based on the "scientific method," roughly defined as a combination of freely invented theories and adequately determined experiments.
When information is stored in any structure, whether in the world, in human artifacts like books and the Internet, or in human minds, two fundamental physical processes occur. These are the two parts of the cosmic creation process. First is a collapse of a quantum mechanical wave function, which is needed to create even a single "bit" of new information in an experimental measurement. Second is a local decrease in the entropy corresponding to the increase in information. Without this, the new bit would be erased and the system returned to equilibrium. Entropy greater than the increase in information (negative entropy) must be transferred away from the location of the new information to satisfy the second law of thermodynamics. Leo Szilard calculated the mean value of the entropy produced by a 1-bit measurement as S = k ln 2.

These quantum-level processes are susceptible to noise. Stored information may have errors. When information is retrieved, it is again susceptible to noise, which may garble the information content. In information science, noise is generally the enemy of information. But some noise is the friend of freedom, since it is the source of novelty, of creativity and invention, and of variation in the biological gene pool.

Biological systems have maintained and increased their invariant information content over billions of generations. Humans increase our knowledge of the external world despite logical, mathematical, and physical uncertainty or indeterminacy. Both do it in the face of random noise, bringing order (or cosmos) out of chaos. Both do it with sophisticated error detection and correction schemes that limit the effects of chance. The scheme we use to correct human knowledge is science, a combination of freely invented theories and adequately determined experiments.
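Both points above can be put in concrete terms. The short sketch below (the 300 K temperature, the 1,000-bit message, and the 10% noise rate are illustrative assumptions, not values from the text) computes Szilard's k ln 2 of entropy per bit and the corresponding minimum heat k T ln 2 at a fixed temperature, then shows a toy error detection and correction scheme, a threefold repetition code with majority-vote decoding, recovering a message sent through a noisy channel.

import math
import random

K_B = 1.380649e-23  # Boltzmann's constant, joules per kelvin

# Szilard's result: acquiring (or erasing) one bit of information produces
# at least k ln 2 of entropy; at temperature T that corresponds to a heat of k T ln 2.
entropy_per_bit = K_B * math.log(2)      # J/K
heat_at_300K = 300.0 * entropy_per_bit   # joules, at an assumed T = 300 K
print(f"k ln 2           = {entropy_per_bit:.3e} J/K")
print(f"k T ln 2 (300 K) = {heat_at_300K:.3e} J")

# A toy error detection and correction scheme: send every bit three times
# and decode by majority vote. The 10% flip probability is an arbitrary choice.
def encode(bits):
    return [b for b in bits for _ in range(3)]            # repeat each bit 3 times

def noisy_channel(bits, p_flip=0.10):
    return [b ^ 1 if random.random() < p_flip else b for b in bits]

def decode(bits):
    triples = [bits[i:i + 3] for i in range(0, len(bits), 3)]
    return [1 if sum(t) >= 2 else 0 for t in triples]     # majority vote

message = [random.randint(0, 1) for _ in range(1000)]
received = decode(noisy_channel(encode(message)))
errors = sum(m != r for m, r in zip(message, received))
print(f"residual errors after correction: {errors} of {len(message)} bits")

A threefold repetition code is far cruder than the error-correcting machinery of DNA replication or the self-correcting procedures of science, but it illustrates the point: redundancy plus a correction rule lets a structure preserve its information content in the face of random noise.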