vol III Development:
Chapter 3: Cybernetics
page 3: Communication
Entropy and communication
From the point of view of engineering thermodynamics, entropy has had bad press over the years, being associated with the 'heat death' of the Universe and bounds on the efficiency of heat engines. Heat death of the universe - Wikipedia
Here we are less interested in the energy-momentum physical view of the Universe and more interested in its intelligence, complexity and creativity. From a physical point of view, the second law of thermodynamics seems to point to increasing disorder. From an information theoretical point of view, however, entropy and information are numerically the same thing.
The network model allows us to picture the flow of information in the Universe as a flow of entropy. The rate of flow of entropy is bandwidth.
We may see entropy as a measure of meaning. The 'amount of meaning' in an event is its cardinal, and places it somewhere in the transfinite hierarchy of meaning. The unreachable 'God' stands at the topless top of this hierarchy, Cantor's Absolute. The absolute is a mathematically impossible figment of the imagination that effectively bounds mathematics and any mathematical representation of the Universe. Hallett, Absolute infinite - Wikipedia
Communication is the process that brings distinct systems into similar states. When you and I have talked about x for long enough to agree on a position, we can say that our mental states (with regard to x) are correlated. Communication is essential for control: you cannot control something you cannot communicate with. We might assume that non-communicating (isolated) systems are at best randomly related to one another. This symmetry of randomness is broken (to some degree) when two systems communicate.
The aim of science and technology is to establish communication with the world, first to learn how it works and then to tell it what we want it to do for us.
The mathematical theory of communication
The mathematical theory of communication devised by Claude Shannon applies the concept of entropy due to Boltzmann and Gibbs to develop a measure of information. The basic idea is that the information carried by a point in a space is equal to the entropy of the space. The entropy of a space of W equiprobable events is measured by Boltzmann's formula, S = k log W, where k is a constant of proportionality. Boltzmann's entropy formula - Wikipedia, Entropy (statistical thermodynamics) - Wikipedia
Gibbs and Shannon's measure of entropy deals with events in a communication space that occur with different probabilities. In this case the entropy is
S = −∑ᵢ pᵢ log pᵢ
where the pᵢ are the probabilities of the individual states, symbols or letters. In English text like this, the letter frequencies are quite different: space and e, for instance, are far more frequent than q or z. When all the probabilities pᵢ are equal, the entropy is at a maximum and Boltzmann's and Gibbs' formulas become the same.
Shannon realized that the information carried by a symbol is numerically equal to the entropy of the space of symbols from which the symbol is drawn. Khinchin: The Mathematical Foundations of Information Theory
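To see how the two formulas fit together, the short Python sketch below computes the Gibbs/Shannon entropy of a table of approximate English letter frequencies and compares it with Boltzmann's equiprobable maximum, log W. The frequency figures are illustrative values, not a corpus measurement; the point is only that the unequal frequencies of real text always give an entropy below the equiprobable maximum.

    import math

    def entropy_bits(weights):
        # Gibbs/Shannon entropy S = -sum_i p_i log2 p_i, in bits,
        # after normalising the weights to probabilities.
        total = sum(weights)
        return -sum((w / total) * math.log2(w / total) for w in weights if w > 0)

    # Approximate English letter frequencies (per cent); illustrative values only.
    english = {
        'e': 12.7, 't': 9.1, 'a': 8.2, 'o': 7.5, 'i': 7.0, 'n': 6.7, 's': 6.3,
        'h': 6.1, 'r': 6.0, 'd': 4.3, 'l': 4.0, 'c': 2.8, 'u': 2.8, 'm': 2.4,
        'w': 2.4, 'f': 2.2, 'g': 2.0, 'y': 2.0, 'p': 1.9, 'b': 1.5, 'v': 1.0,
        'k': 0.8, 'j': 0.15, 'x': 0.15, 'q': 0.1, 'z': 0.07}

    print(f"English letter frequencies: {entropy_bits(english.values()):.2f} bits per letter")
    # Boltzmann's equiprobable case, S = log W (taking k = 1 and logs to base 2).
    print(f"26 equiprobable letters:    {math.log2(26):.2f} bits per letter")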
The mathematical theory of communication shows that we can make communication error free by coding our messages into packets that are so far apart in message space that the probability of their confusion is negligible. Shannon sought the limits of error free communication over noiseless and noisy channels. The theory he developed is now well known and lies at the heart of communication networks worldwide. Claude Shannon: Communication in the presence of noise, Claude E Shannon: A mathematical theory of communication
Messages are made further apart by making them longer. We may think of each message as a vector in a space with the same number of orthogonal dimensions as there are components in the vector. Linear increases in the number of components in vectors cause exponential increases in the volume of space occupied by these vectors so that they become further and further apart. At the same time, by making the packets larger, we have to send fewer of them.
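A rough numerical sketch of this separation, in Python, with an arbitrary choice of sixteen messages: the fragment below draws random binary codewords of increasing length and reports the smallest Hamming distance between any pair. As the codewords lengthen they drift further and further apart, so the number of symbol errors needed to turn one into another keeps growing and the probability of confusing them falls.

    import random

    def min_pairwise_distance(codewords):
        # Smallest Hamming distance between any two distinct codewords.
        return min(sum(a != b for a, b in zip(u, v))
                   for i, u in enumerate(codewords)
                   for v in codewords[i + 1:])

    random.seed(0)
    M = 16                          # number of distinct messages to encode
    for n in (8, 32, 128, 512):     # codeword length (dimension of the signal space)
        words = [[random.randint(0, 1) for _ in range(n)] for _ in range(M)]
        d = min_pairwise_distance(words)
        print(f"length {n:4d}: minimum Hamming distance {d:3d} ({d / n:.2f} per symbol)")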
The validity of these strategies is illustrated by our current ability to send gigabytes of information error free over noisy phone lines. The quantization of communication at the microscopic level supports the hypothesis that our world is a communication network that has evolved to resist error. Wojciech Hubert Zurek: Quantum origin of quantum jumps
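For a concrete sense of the limiting rate, the Shannon-Hartley theorem gives the capacity of a band-limited channel with Gaussian noise as C = B log2(1 + S/N). The bandwidth and signal-to-noise figures in the sketch below are assumed, illustrative values for a voice-band telephone line, not measurements of any particular channel.

    import math

    def shannon_hartley_capacity(bandwidth_hz, snr_db):
        # C = B * log2(1 + S/N): the limiting error free rate for a
        # band-limited channel with additive white Gaussian noise.
        snr_linear = 10 ** (snr_db / 10)
        return bandwidth_hz * math.log2(1 + snr_linear)

    B = 3000.0      # assumed bandwidth of a voice-band line, in hertz
    snr = 30.0      # assumed signal-to-noise ratio, in decibels

    print(f"capacity is roughly {shannon_hartley_capacity(B, snr) / 1000:.1f} kilobits per second")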
A system that transmits without errors at the limiting rate C predicted by Shannon’s theorems is called an ideal system. Some features of an ideal system are:
1. To avoid error there must be no overlap between signals representing different messages. They must, in other words, be orthogonal, as with the eigenfunctions of a quantum mechanical measurement operator.
2. Such ‘basis signals’ may be chosen at random in the signal space, provided only that they are orthogonal. The same message may be encoded into any satisfactory basis provided that the transformations used by the transmitter and receiver to encode the message into the signal and decode the signal back to the message are inverses of one another (see the sketch after this list).
3. The signals transmitted by an ideal system are indistinguishable from noise. The fact that a set of physical observations looks like a random sequence is not therefore evidence for meaninglessness. Until the algorithms used to encode and decode such a sequence are known, nothing can be said about its significance.
4. Only in the simplest cases are the mappings used to encode and decode messages linear and topological. For practical purposes, however, they must all be computable with available machines.
5. As a system approaches the ideal, the length of the transmitted packets, the delay at the transmitter while it takes in a chunk of message for encoding, and the corresponding delay at the receiver, increase indefinitely.
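A minimal sketch of points 2 and 3, using Python and NumPy under the assumption that signals are real vectors in an eight-dimensional space: any randomly chosen orthonormal basis serves as a code, the receiver recovers the message by applying the inverse (here the transpose) of the encoding transformation, and the encoded form of a Gaussian message is itself Gaussian, that is, noise-like to anyone who does not hold the basis.

    import numpy as np

    rng = np.random.default_rng(0)

    # A random orthonormal basis for an 8-dimensional signal space,
    # taken from the QR factorisation of a random matrix (point 2).
    Q, _ = np.linalg.qr(rng.normal(size=(8, 8)))

    message = rng.normal(size=8)      # an arbitrary message vector
    signal = Q @ message              # encode: express the message in the signal basis
    recovered = Q.T @ signal          # decode: apply the inverse transformation

    print("message recovered exactly:", np.allclose(message, recovered))
    # Without Q, the signal is just another Gaussian vector: noise-like (point 3).
    print("transmitted signal:", np.round(signal, 2))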
We may think of every observable entity in the Universe as a message. We see messages as fixed points, since they have a lifetime in transit from source to receiver: every thing in the Universe propagates as a message from some point in the past when it was created to some point in the future when it may be annihilated.
Encoding and decoding
We imagine a communication link as having five elements: 1: a source; 2: a computing system to encode the output of the source into a form suitable for transmission over a physical channel; 3: the physical channel itself; 4: a second computing system, the inverse of the first, to decode the physical message to recover the output of the source; and finally 5: the receiver of the message.
The output of the coding system in a digital network is a string of packets that have been designed to resist damage by noise in the physical channel. The code used must be both computable and reversible, so that the original message can be recovered exactly. This requirement places a boundary on error free messaging: we cannot use incomputable functions. Although there are ℵ1 mappings from the ℵ0 natural numbers to themselves, there are only ℵ0 different Turing machines.
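The five elements can be sketched as five small Python functions. The three-fold repetition code and the bit-flip probability below are assumptions chosen only to keep the example short; real networks use far stronger codes, but the shape of the link, and the requirement that the coding be computable and reversible, are the same.

    import random

    random.seed(1)

    def source(n_bits=16):                  # 1: the source
        return [random.randint(0, 1) for _ in range(n_bits)]

    def encode(bits, r=3):                  # 2: computable, reversible encoder
        return [b for bit in bits for b in [bit] * r]    # r-fold repetition

    def channel(bits, flip_prob=0.05):      # 3: the noisy physical channel
        return [b ^ (random.random() < flip_prob) for b in bits]

    def decode(bits, r=3):                  # 4: the inverse computation (majority vote)
        return [int(sum(bits[i:i + r]) > r // 2) for i in range(0, len(bits), r)]

    message = source()
    received = decode(channel(encode(message)))
    # 5: the receiver compares what was sent with what arrived.
    print("bits sent:          ", message)
    print("bits received:      ", received)
    print("residual bit errors:", sum(a != b for a, b in zip(message, received)))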
This boundary on computation also places a boundary on control. We may imagine that the world has a computable heart within which it is possible to transmit error free messages and maintain control, surrounded by a cloud of incomputable possibilities which we cannot control. The source of creativity lies in this uncontrollable sector of reality.
(revised 8 January 2019)