Grammatical Man: Information, Entropy, Language, and Life is a 1982 book written by Jeremy Campbell, then Washington correspondent for the Evening Standard.[1] The book examines the topics of probability, information theory, cybernetics, genetics, and linguistics.
Information processes are used to frame and examine all of existence, from the Big Bang to DNA to human communication to artificial intelligence.
The chapter Not Too Dull, Not Too Exciting addresses the problem of distinguishing order from disorder in communication by highlighting the role that redundancy plays in information theory.
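The link between redundancy and disorder can be made concrete with Shannon's entropy measure. The sketch below is my own illustration, not from the book; the function name `entropy_bits` is hypothetical:

```python
# Per-symbol Shannon entropy of a string: H = -sum(p * log2(p)).
# Redundancy (repeated, predictable symbols) lowers the entropy.
from collections import Counter
from math import log2

def entropy_bits(text):
    """Shannon entropy of the symbol distribution in `text`, in bits per symbol."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in counts.values())

print(entropy_bits("abcd"))  # 2.0 bits: four equally likely symbols
print(entropy_bits("aaab"))  # ≈ 0.811 bits: repetition makes the text more predictable
```

A maximally "exciting" (uniform) alphabet carries the most entropy; a redundant one is more predictable, which is what makes error detection possible.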
In the last chapter of Part 1, The Struggle Against Randomness, Campbell addresses the result Claude Shannon published in 1948: that a message can be sent from one place to another, even under noisy conditions, and be as free from error as the sender cares to make it, as long as it is coded in the proper form.
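The simplest coding scheme in the spirit of Shannon's result is a repetition code. The following is a minimal sketch of my own (not code from the book): each bit is sent five times through a channel that flips bits with probability p, and the receiver decodes by majority vote.

```python
# Repetition code over a binary symmetric channel: redundancy in the
# encoding drives the residual error rate down.
import random

def send(bits, p, reps=5, rng=None):
    """Encode each bit by repetition, corrupt each copy with probability p,
    decode by majority vote. Returns the decoded bit list."""
    rng = rng or random.Random(0)
    decoded = []
    for b in bits:
        # channel: each repeated copy flips independently with probability p
        received = [b ^ (rng.random() < p) for _ in range(reps)]
        decoded.append(1 if sum(received) > reps // 2 else 0)
    return decoded

message = [1, 0, 1, 1, 0, 0, 1, 0] * 100
out = send(message, p=0.1)
errors = sum(a != b for a, b in zip(message, out))
print(f"residual errors: {errors} of {len(message)} bits")
```

With a 10% flip rate, an unprotected bit is wrong 10% of the time, but the majority of five copies is wrong less than 1% of the time; adding more repetitions pushes the error as low as desired, at the cost of a lower transmission rate.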
Part 2: Nature as an Information Process
In Arrows in All Directions, Campbell discusses the potential inverse relation between entropy and novelty, invoking such concepts as
Laplace's Superman. Campbell quotes
David Layzer:
For Laplace's "intelligence," as for the God of Plato, Galileo and Einstein, the past and future coexist on equal terms, like the two rays into which an arbitrarily chosen point divides a straight line. If the theories I have presented are correct, however, not even the ultimate computer --the universe itself-- ever contains enough information to specify completely its own future states. The present moment always contains an element of genuine novelty and the future is never wholly predictable. Because biological processes also generate information and because consciousness enables us to experience those processes directly, the intuitive perception of the world as unfolding in time captures one of the most deep-seated properties of the universe.
Chapter 8, Chemical Word and Chemical Deed, examines the processes of DNA as information processes. Campbell makes the distinction between first-order DNA messages and second-order, or structural, DNA messages (e.g., "how to bake a cake" versus "how to read a recipe"). He relates this distinction to the linguistic principles of
Noam Chomsky's Universal Grammar.
In Jumping the Complexity Barrier, Campbell discusses the concept of
emergence and notes that information theory, thermodynamics, linguistics, and the
theory of evolution make significant use of terms and phrases such as "complexity," "novelty," and "constraints on possibilities." Campbell writes:
To understand complex systems, such as a large computer or a living organism, we cannot use ordinary, formal logic, which deals with events that definitely will happen or definitely will not happen. A probabilistic logic is needed, one that makes statements about how likely or unlikely it is that various events will happen.
Campbell also discusses
John von Neumann in relating information theory, evolution, and linguistics to machines. The chapter closes with an examination of emergent systems and their relation to
Gödel's incompleteness theorems.
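Von Neumann's probabilistic logic addressed exactly the reliability problem the quoted passage describes: building dependable computation from components that sometimes fail. The sketch below is my own illustration of the core idea, not Campbell's or von Neumann's actual construction; the gate and function names are hypothetical.

```python
# Reliable computation from unreliable parts: a "gate" that errs with
# probability eps, made more reliable by majority-voting three copies.
# The voted error rate falls from eps to roughly 3 * eps**2.
import random

def unreliable_not(x, eps, rng):
    """NOT gate that returns the wrong answer with probability eps."""
    correct = 1 - x
    return correct if rng.random() >= eps else x

def majority_not(x, eps, rng):
    """Majority vote over three independent unreliable NOT gates."""
    votes = [unreliable_not(x, eps, rng) for _ in range(3)]
    return 1 if sum(votes) >= 2 else 0

rng = random.Random(1)
eps = 0.05
trials = 20000
raw = sum(unreliable_not(1, eps, rng) != 0 for _ in range(trials)) / trials
voted = sum(majority_not(1, eps, rng) != 0 for _ in range(trials)) / trials
print(f"single gate error ~{raw:.3f}, majority-of-3 error ~{voted:.3f}")
```

This is the probabilistic logic of the quotation in miniature: no single run of the gate is certain, but statements about how likely an error is remain exact, and redundancy can make the likelihood as small as desired.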
Something Rather Subtle
Part 3: Coding Language, Coding Life
Algorithms and Evolution
Partly Green Till the Day We Die
No Need for Ancient Astronauts
The Clear and the Noisy Messages of Language
A Mirror of the Mind
Part 4: How the Brain Puts It All Together
The Brain as Cat on a Hot Tin Roof and Other Fallacies