natural theology

We have just published a new book that summarizes the ideas of this site. Free at Scientific Theology, or, if you wish to support this project, buy at Scientific Theology: A New Vision of God

vol III Development:

Chapter 3: Cybernetics

page 3: Communication

Entropy and communication

From the point of view of engineering thermodynamics, entropy has had a bad press over the years, being associated with the 'heat death' of the Universe and with bounds on the efficiency of heat engines. Heat death of the universe - Wikipedia

Here we are less interested in the energy-momentum physical view of the Universe and more interested in its intelligence, complexity and creativity. From a physical point of view, the second law of thermodynamics seems to point to increasing disorder. From an information-theoretic point of view, however, entropy and information are numerically the same thing.

The network model allows us to picture the flow of information in the Universe as a flow of entropy. The rate at which entropy flows through a link is its bandwidth.

We may see entropy as a measure of meaning. The 'amount of meaning' in an event is measured by its cardinal number, which places it somewhere in the transfinite hierarchy of meaning. The unreachable 'God' stands at the topless top of this hierarchy, Cantor's Absolute. The Absolute is a mathematically impossible figment of the imagination that effectively bounds mathematics and any mathematical representation of the Universe. Hallett, Absolute infinite - Wikipedia

Communication is the process that brings distinct systems into similar states. When you and I have talked about x for long enough to agree on a position, we can say that our mental states (with regard to x) are correlated. Communication is essential for control: you cannot control something you cannot communicate with. We might assume that non-communicating (isolated) systems are at best randomly related to one another. This symmetry of randomness is broken (to some degree) when two systems communicate.

The aim of science and technology is to establish communication with the world, first to learn how it works and then to tell it what we want it to do for us.

The mathematical theory of communication

The mathematical theory of communication devised by Claude Shannon applies the concept of entropy developed by Boltzmann and Gibbs to construct a measure of information. The basic idea is that the information carried by a point in a space is equal to the entropy of the space. The entropy of a space of W equiprobable events is measured by Boltzmann's formula, S = k log W, where k is a constant of proportionality. Boltzmann's entropy formula - Wikipedia, Entropy (statistical thermodynamics) - Wikipedia

Gibbs' and Shannon's measure of entropy deals with events in a communication space whose probabilities differ. In this case the entropy is

S = − ∑i pi log pi

where pi is the probability of the i-th state, symbol or letter. In English text like this, for instance, the letter frequencies are quite different: space and e are far more frequent than q or z. When all the probabilities pi are equal, the entropy is at a maximum and the Boltzmann and Gibbs formulas become the same.
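
As a concrete illustration, here is a minimal sketch in Python (the probabilities are invented for the example, not real letter statistics) computing the Gibbs-Shannon entropy of a small alphabet and confirming that the equiprobable case reduces to Boltzmann's log W:

import math

def shannon_entropy(probabilities):
    # Gibbs-Shannon entropy S = -sum(p_i * log2(p_i)), measured in bits.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A skewed four-symbol alphabet (illustrative values only).
skewed = [0.5, 0.25, 0.15, 0.10]
uniform = [0.25, 0.25, 0.25, 0.25]

print(shannon_entropy(skewed))   # about 1.74 bits
print(shannon_entropy(uniform))  # exactly log2(4) = 2 bits, the equiprobable case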

Shannon realized that the information carried by a symbol is numerically equal to the entropy of the space of symbols from which the symbol is drawn. Khinchin: The Mathematical Foundations of Information Theory

The mathematical theory of communication shows that we can make communication error-free by coding our messages into packets that are so far apart in message space that the probability of their confusion is negligible. Shannon sought the limits of error-free communication over noiseless and noisy channels. The theory he developed is now well known and lies at the heart of communication networks worldwide. Claude Shannon: Communication in the presence of noise, Claude E Shannon: A mathematical theory of communication
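
Shannon's limiting rate can be made concrete with a one-line calculation. A minimal Python sketch, assuming the Shannon-Hartley form C = B log2(1 + S/N) for a band-limited channel with Gaussian noise (the channel figures are invented for the example):

import math

def channel_capacity(bandwidth_hz, signal_power, noise_power):
    # Shannon-Hartley limit: C = B * log2(1 + S/N), in bits per second.
    return bandwidth_hz * math.log2(1 + signal_power / noise_power)

# An illustrative telephone-like channel: 3 kHz bandwidth, 30 dB signal-to-noise ratio.
print(channel_capacity(3000, signal_power=1000, noise_power=1))  # roughly 29,900 bits per second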

Messages are moved further apart by making them longer. We may think of each message as a vector in a space with the same number of orthogonal dimensions as there are components in the vector. Linear increases in the number of components of the vectors cause exponential increases in the volume of the space they occupy, so that the vectors become further and further apart. At the same time, because the packets are larger, we have to send fewer of them.
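
A minimal Python sketch of this geometry (the codebook size and lengths are arbitrary): for a fixed set of messages encoded as random binary codewords, lengthening the codewords tends to increase the minimum Hamming distance between them, because the 2^n points of the message space multiply exponentially while the codebook stays the same size:

import itertools
import random

def min_pairwise_distance(codebook):
    # Smallest Hamming distance between any two codewords in the codebook.
    return min(sum(a != b for a, b in zip(u, v))
               for u, v in itertools.combinations(codebook, 2))

random.seed(0)
M = 8  # a fixed set of eight messages
for n in (12, 24, 48, 96):  # growing codeword length
    codebook = [tuple(random.randint(0, 1) for _ in range(n)) for _ in range(M)]
    print(n, min_pairwise_distance(codebook))  # the minimum distance tends to grow with n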

The validity of these strategies is illustrated by our current ability to send gigabytes of information error-free over noisy phone lines. The quantization of communication at the microscopic level supports the hypothesis that our world is a communication network that has evolved to resist error. Wojciech Hubert Zurek: Quantum origin of quantum jumps

A system that transmits without errors at the limiting rate C predicted by Shannon’s theorems is called an ideal system. Some features of an ideal system are:

1. To avoid error there must be no overlap between signals representing different messages. They must, in other words, be orthogonal, as with the eigenfunctions of a quantum mechanical measurement operator.

2. Such ‘basis signals’ may be chosen at random in the signal space, provided only that they are orthogonal. The same message may be encoded into any satisfactory basis, provided that the transformations used by the transmitter and receiver to encode the message into the signal and decode the signal back into the message are inverses of one another (see the sketch after this list).

3. The signals transmitted by an ideal system are indistinguishable from noise. The fact that a set of physical observations looks like a random sequence is not therefore evidence for meaninglessness. Until the algorithms used to encode and decode such a sequence are known, nothing can be said about its significance.

4. Only in the simplest cases are the mappings used to encode and decode messages linear and topological. For practical purposes, however, they must all be computable with available machines.

5. As a system approaches the ideal, the length of the transmitted packets, the delay at the transmitter while it takes in a chunk of message for encoding, and the corresponding delay at the receiver, increase indefinitely.
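
Feature 2, and the noise-like appearance noted in feature 3, can be illustrated with a minimal numpy sketch; the random basis and the message vector are arbitrary choices for the example, not Shannon's own construction:

import numpy as np

rng = np.random.default_rng(42)

# A random orthonormal basis: Q from the QR decomposition of a random matrix.
n = 8
Q, _ = np.linalg.qr(rng.normal(size=(n, n)))

message = np.array([1.0, 0.0, 0.0, 1.0, 1.0, 0.0, 1.0, 0.0])

signal = Q @ message      # encode: a structureless-looking vector of reals
recovered = Q.T @ signal  # decode: Q is orthogonal, so its transpose is its inverse

print(signal)                           # looks like noise
print(np.allclose(recovered, message))  # True: the message is recovered exactly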

We may think of every observable entity in the Universe as a message, propagating from some point in the past when it was created to some point in the future when it may be annihilated. We see messages as fixed points, since they have a definite lifetime in transit from source to receiver.

Encoding and decoding

We imagine a communication link as having five elements: 1: a source; 2: a computing system to encode the output of the source into a form suitable for transmission over a physical channel; 3: the physical channel itself; 4: a second computing system, the inverse of the first, to decode the physical signal and recover the output of the source; and finally 5: the receiver of the message.
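
A minimal end-to-end sketch of these five elements in Python, with a simple triple-repetition code standing in for a production error-correcting code (the error rate and message length are invented for the example):

import random

random.seed(1)

def encode(bits):
    # Element 2: a computable, reversible code (repeat each bit three times).
    return [b for bit in bits for b in (bit, bit, bit)]

def channel(signal, error_rate=0.05):
    # Element 3: a noisy physical channel that flips each bit with some probability.
    return [bit ^ 1 if random.random() < error_rate else bit for bit in signal]

def decode(signal):
    # Element 4: the inverse computation, a majority vote over each triple.
    return [1 if sum(signal[i:i + 3]) >= 2 else 0
            for i in range(0, len(signal), 3)]

source = [random.randint(0, 1) for _ in range(20)]  # element 1: the source
received = decode(channel(encode(source)))          # element 5: what the receiver gets
print(received == source)  # usually True; stronger codes drive the residual error toward zero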

The output of the coding system in a digital network is a string of packets designed to resist damage by noise in the physical channel. The code used must be both computable and reversible, so that the original message can be recovered exactly. This requirement places a boundary on error-free messaging: we cannot use incomputable functions. Although there are ℵ1 mappings from the ℵ0 natural numbers to themselves, there are only ℵ0 different Turing machines.

This boundary on computation also places a boundary on control. We may imagine that the world has a computable heart, within which it is possible to transmit error-free messages and maintain control, surrounded by a cloud of incomputable possibilities which we cannot control. The source of creativity lies in this uncontrollable sector of reality.

(revised 8 January 2019)

Copyright:

You may copy this material freely provided only that you quote fairly and provide a link (or reference) to your source.

Further reading

Books


Hallett, Michael, Cantorian Set Theory and Limitation of Size, Oxford UP 1984 Jacket: 'This book will be of use to a wide audience, from beginning students of set theory (who can gain from it a sense of how the subject reached its present form), to mathematical set theorists (who will find an expert guide to the early literature), and for anyone concerned with the philosophy of mathematics (who will be interested by the extensive and perceptive discussion of the set concept).' Daniel Isaacson. 

Khinchin, Aleksandr Yakovlevich, Mathematical Foundations of Information Theory (translated by P A Silvermann and M D Friedman), Dover 1957 Jacket: 'The first comprehensive introduction to information theory, this book places the work begun by Shannon and continued by McMillan, Feinstein and Khinchin on a rigorous mathematical basis. For the first time, mathematicians, statisticians, physicists, cyberneticists and communications engineers are offered a lucid, comprehensive introduction to this rapidly growing field.' 

Links

Absolute infinite - Wikipedia, Absolute infinite - Wikipedia, the free encyclopedia, 'The Absolute Infinite is mathematician Georg Cantor's concept of an "infinity" that transcended the transfinite numbers. Cantor equated the Absolute Infinite with God. He held that the Absolute Infinite had various mathematical properties, including that every property of the Absolute Infinite is also held by some smaller object.'

Boltzmann's entropy formula - Wikipedia, Boltzmann's entropy formula - Wikipedia, the free encyclopedia, 'In statistical mechanics, Boltzmann's equation is a probability equation relating the entropy S of an ideal gas to the quantity W, which is the number of microstates corresponding to a given macrostate:
S = k ln W
where k is the Boltzmann constant, . . . which is equal to 1.38062 × 10⁻²³ J/K.'

Claude E Shannon, A Mathematical Theory of Communication, 'The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem. The significant aspect is that the actual message is one selected from a set of possible messages.'

Claude Shannon, Communication in the Presence of Noise, 'A method is developed for representing any communication system geometrically. Messages and the corresponding signals are points in two “function spaces,” and the modulation process is a mapping of one space into the other. Using this representation, a number of results in communication theory are deduced concerning expansion and compression of bandwidth and the threshold effect. Formulas are found for the maximum rate of transmission of binary digits over a system when the signal is perturbed by various types of noise. Some of the properties of “ideal” systems which transmit at this maximum rate are discussed. The equivalent number of binary digits per second for certain information sources is calculated.'

Entropy (statistical thermodynamics) - Wikipedia, Entropy (statistical thermodynamics) - Wikipedia, the free encyclopedia, 'In classical statistical mechanics, the entropy function earlier introduced by Clausius is interpreted as statistical entropy using probability theory. The statistical entropy perspective was introduced in 1870 with the work of the Austrian physicist Ludwig Boltzmann.'

Heat death of the universe - Wikipedia, Heat death of the universe - Wikipedia, the free encyclopedia, 'The heat death is a possible final state of the universe, in which it has "run down" to a state of no thermodynamic free energy to sustain motion or life. In physical terms, it has reached maximum entropy. The hypothesis of a universal heat death stems from the 1850s ideas of William Thomson (Lord Kelvin) who extrapolated the theory of heat views of mechanical energy loss in nature, as embodied in the first two laws of thermodynamics, to universal operation.'

Wojciech Hubert Zurek, Quantum origin of quantum jumps: breaking of unitary symmetry induced by information transfer and the transition from quantum to classical, '(Submitted on 17 Mar 2007 (v1), last revised 18 Mar 2008 (this version, v3)) "Measurements transfer information about a system to the apparatus, and then further on -- to observers and (often inadvertently) to the environment. I show that even imperfect copying essential in such situations restricts possible unperturbed outcomes to an orthogonal subset of all possible states of the system, thus breaking the unitary symmetry of its Hilbert space implied by the quantum superposition principle. Preferred outcome states emerge as a result. They provide a framework for the "wavepacket collapse", designating terminal points of quantum jumps, and defining the measured observable by specifying its eigenstates. In quantum Darwinism, they are the progenitors of multiple copies spread throughout the environment -- the fittest quantum states that not only survive decoherence, but subvert it into carrying information about them -- into becoming a witness."'

www.naturaltheology.net is maintained by The Theology Company Proprietary Limited ACN 097 887 075 ABN 74 097 887 075 Copyright 2000-2019 © Jeffrey Nicholls