From Atoms to People to Economics
Pages xv-xvii: Information must not be confused with meaning:
"In 1949 Claude Shannon and Warren Weaver published a short book entitled The Mathematical Theory of Communication. In its first section, Weaver described the conceptual aspects of information. In the second, Shannon described the mathematics of what we now know as information theory.
For information theory to be properly understood, Shannon and Weaver needed to detach the word information from its colloquial meaning. Weaver made this distinction early in his essay: "The word information, in this theory, is used in a special sense that must not be confused with meaning."
Shannon also made this point early in his section, albeit invoking engineering arguments instead of semantic distinctions: "The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently, the messages have meaning ... These semantic aspects of communication [referring to the meaning of a message] are irrelevant to the engineering problem."
But why were Shannon and Weaver so eager to divorce information and meaning? They needed to separate information from meaning for both technical and philosophical reasons. On the technical side, Shannon was interested in the construction of machines that could help communicate information regardless of the meaning of the message. Mixing information and meaning obfuscated the engineering problem. On the philosophical side, Shannon and Weaver understood that their use of the words information and meaning referred to concepts that are fundamentally different. Humans, and some machines, have the ability to interpret messages and infuse them with meaning. But what travels through the wires or electromagnetic waves is not that meaning. It is simpler. It is just information.
It is hard for us humans to separate information from meaning because we cannot help interpreting messages. We infuse messages with meaning automatically, fooling ourselves into believing that the meaning of the message is carried in the message. But it is not. This is only an illusion. Meaning is derived from context and prior knowledge. Meaning is the interpretation that a knowledge agent, such as a human, gives to a message, and it is different from the message itself. Meaning emerges when a message reaches a lifeform or machine with the ability to process information; it is not carried in the blots of ink, sound waves, beams of light, or electric pulses that transmit information.
Think of the phrase "September 11." When I say that phrase, most Americans will automatically think of the 2001 attack on the Twin Towers. Chileans usually think about the 1973 coup d'état. But maybe when I am saying "September 11" I am telling my students that I will be back at MIT on that date. As you can see, the meaning of the message is something that you construct. It is not part of the message, even if it seems to be. Meaning is something that we attach seamlessly as we interpret messages, because humans cannot help interpreting incoming bursts of physical order. The seamlessness does not mean that meaning and information are the same.
To create machines that could transmit information regardless of the meaning of the message, Shannon needed a formula to estimate the minimum number of characters required to encode a message. Building on the work of Harry Nyquist and Ralph Hartley, Shannon estimated how much information was needed to transmit a message through a clean or noisy channel. He also estimated the economics of communications brought about by correlations in the structure of messages—such as the fact that in English the letter t is more likely to precede h than q. Shannon's philosophical excursions put him on a mathematical path similar to the one traversed by Boltzmann. At the end of the path, Shannon found a basic formula for encoding an arbitrary message with maximum efficiency. This formula allowed anyone to embody information in a magnetic disk, electromagnetic waves, or ink and paper. Shannon's formula was identical to the one Boltzmann had put forth almost fifty years earlier. This coincidence was not an accident.
The convergence of Shannon's formula with Boltzmann's points to the physical nature of information.
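A quick side note on the formula itself (standard textbook form, not quoted from the book): Shannon's entropy H = -sum(p_i * log2 p_i) gives the minimum average number of bits per symbol needed to encode a message, and Boltzmann's entropy S = -k_B * sum(p_i * ln p_i) has the same mathematical shape, differing only by a constant and the choice of logarithm. The hypothetical Python sketch below estimates that lower bound from single-character frequencies; the correlations mentioned above (t tending to precede h) would require statistics over longer blocks and push the bound lower still.

```python
# Minimal sketch (illustrative only): estimate the minimum number of bits
# per character needed to encode a message, using Shannon's entropy
# H = -sum(p * log2(p)) over the message's character frequencies.
from collections import Counter
from math import log2

def bits_per_character(message: str) -> float:
    """Shannon entropy of the character distribution, in bits per character."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * log2(n / total) for n in counts.values())

# Example: well under the 8 bits per character of a naive fixed-width code.
print(round(bits_per_character("the theory of information"), 2))  # ~3.46
```

Exploiting pairs or longer runs of characters, as Shannon did, lowers the estimate further; that saving is the "economics of communications" referred to above.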
Non-physical information
Information may exist as pure relations, i.e. without any physical appearance at all.
Mental configurations like music, lyrics, software, etc. are information in an abstract sense and should count as an economic category apart from goods (items made of atoms) and services (competence embedded in shared time). The comic strip may illustrate this concept.
The common component of all economic entities is competence, i.e. the combination of knowledge, skills, and attitudes.
That physical reality is critical to seeing how a study of atoms can help us understand the economy. For the most part, the natural sciences have focused on describing our universe from atoms to people, connecting the simplicity of the atom with the complexity of life. The social sciences have focused on the links among people, society, and economies, recasting humans as a fundamental unit—a social and economic atom, if I may. Yet this divorce is not lossless, as the mechanisms that allow information to grow transcend the barriers that separate the lifeless from the living, the living from the social, and the social from the economic.
So I will dedicate the following pages to an exploration of the mechanisms that contribute to the growth of information at all scales, from atoms to economies. Not from atoms to people, or from people to economies, as is usually done. This will help us create bridges between the physical, biological, social, and economic factors that contribute to the growth of information and also limit our capacity to process information. That information processing capacity involves computation, and at the scale of humans it requires the "software" we know colloquially as knowledge and knowhow. The result will be a book about the history of our universe, centered not on the arrow of time but on the arrow of complexity.