natural theology

This site is part of the natural religion project
dedicated to developing and promoting the art of peace.

volume II: Synopsis

section IV: Divine Dynamics

page 28: Claude Shannon


We are modelling the universe as a transfinite computer network. An essential element of a stable communication network is error free communication. In ordinary face to face communication we arrive at true communication by conversation, discussing things until we understand each other.

The first step in error free communication is to transmit the actual words without error, leaving questions of meaning and interpretation until later. Error free transmission of messages became an engineering issue with the invention of the telephone.

Because of attenuation in the wires, a long distance call requires the signal to be amplified many times along its way. Each amplifier is required to make the signal stronger without distorting it. A little distortion per amplifier, multiplied over a hundred amplifiers, may render the signal unintelligible. We want the amplifiers to be linear, error free channels, boosting the signal but not changing it. Linear amplifier - Wikipedia
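The cumulative effect of many slightly imperfect amplifiers can be sketched numerically. The gain and distortion figures below are invented for illustration, a toy model rather than a description of any real repeater:

```python
def amplifier(x, gain=1.0, distortion=0.001):
    # Each stage boosts the signal but adds a small cubic nonlinearity.
    # A perfectly linear amplifier would have distortion == 0.
    return gain * x + distortion * x**3

signal = 0.5
out = signal
for _ in range(100):          # one hundred amplifiers in cascade
    out = amplifier(out)

# The per-stage error is tiny, but after 100 stages it is no longer negligible.
error = abs(out - signal)
print(f"input {signal}, output {out:.4f}, accumulated error {error:.4f}")
```

Each stage on its own distorts the signal by roughly one part in a thousand, yet the cascade shifts it by more than a hundred times that.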

The signal to noise ratio measures how well a signal stands out against the background noise. Since one system's signal may be another system's noise, this ratio is tied to particular communication links. Noise is the source of error, so a low signal to noise ratio corresponds to a high error rate, and a high signal to noise ratio to high fidelity. Signal-to-noise ratio - Wikipedia
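The ratio is conventionally quoted in decibels. A minimal sketch (the power figures are arbitrary examples):

```python
import math

def snr_db(signal_power, noise_power):
    # Signal-to-noise ratio in decibels: 10 * log10(P_signal / P_noise).
    return 10 * math.log10(signal_power / noise_power)

print(snr_db(1.0, 0.001))  # clean link: signal 1000x the noise, 30 dB
print(snr_db(1.0, 0.5))    # noisy link: signal only 2x the noise, ~3 dB
```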

In the early days of telephony much ingenuity went into designing low noise linear amplifiers to multiply the power of continuous signals. Nevertheless it was often hard to understand long distance conversations, and a better approach was needed. Negative feedback amplifier - Wikipedia, Cybernetics - Wikipedia

Communication entered the digital era with Shannon's mathematical theory of communication. Shannon models a source emitting a string of symbols. Like Gödel and Turing, he is thinking in terms of text, a string of symbols, rather than speech: a discrete string rather than a modulated continuum. This simplifies the problem, because letters are easier to handle mathematically than sentences. Given the Nyquist-Shannon sampling theorem, nothing is lost by this approach, since all the information in a continuous signal can be encoded as a string of discrete samples taken at twice the maximum frequency of the continuous signal. Nyquist-Shannon sampling theorem - Wikipedia
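The force of the sampling theorem is easiest to see from the failure case: sample more slowly than twice the maximum frequency and distinct signals become indistinguishable. In this sketch a 9 Hz tone sampled at only 10 Hz produces exactly the same samples as a 1 Hz tone, the classic aliasing effect (the frequencies are chosen for illustration):

```python
import math

fs = 10.0      # sampling rate: 10 samples per second (too slow for a 9 Hz tone)
T = 1 / fs

# A 9 Hz tone violates the Nyquist criterion here (fs < 2 * 9), so its
# samples coincide exactly with those of its alias at 9 - 10 = -1 Hz.
for n in range(5):
    t = n * T
    s_high = math.sin(2 * math.pi * 9 * t)
    s_alias = math.sin(2 * math.pi * -1 * t)
    print(f"t={t:.1f}s  9 Hz tone: {s_high:+.4f}  1 Hz alias: {s_alias:+.4f}")
```

Sampled at 18 Hz or faster, the two tones would be cleanly distinguished; below that rate, information is irrecoverably lost.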

Information is 'that which removes uncertainty' or 'breaks symmetry'. The measure of an item of information is the amount of uncertainty it removes. Shannon measured the information output of a source by its entropy. Entropy measures the number of distinct states in a system. As the number of possible states of a system increases, more information is obtained when we learn the actual state of the system. Entropy (information theory) - Wikipedia
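Shannon's entropy formula, H = -Σ pᵢ log₂ pᵢ, makes this quantitative. A minimal sketch (the probability distributions are illustrative):

```python
import math

def entropy(probs):
    # Shannon entropy in bits: H = -sum(p * log2(p)) over states with p > 0.
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # fair coin: 1 bit of uncertainty removed per toss
print(entropy([0.9, 0.1]))   # biased coin: ~0.469 bits, less surprise per toss
print(entropy([0.25] * 4))   # four equiprobable states: 2 bits
```

The more states a system can occupy, and the more evenly probability is spread over them, the more information we gain on learning its actual state.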

We say that communication is error free if one message is not confused with another. If we think of messages as points in a message space, we may avoid confusion by placing individual messages as far apart as possible in the space. Ideally, messages should be orthogonal, each one completely independent of the others.

By discussing communication in a multidimensional function space, Shannon was able to show that one can send a message without error over a noisy channel. The theory he developed is now well known and lies at the heart of communication engineering. The first step is to translate the message from an alphabet in which the symbols are not equiprobable into another alphabet in which they are, thus in effect moving the symbols as far apart as possible. Shannon
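Huffman coding is one concrete realization of this re-encoding step (it is not Shannon's own construction, but a later practical method): frequent symbols receive short codewords, so that each transmitted bit carries close to one bit of entropy. A minimal sketch with an invented symbol distribution whose probabilities are exact powers of two, so the code is exactly optimal:

```python
import heapq

def huffman_lengths(freqs):
    # Build a Huffman tree bottom-up; return the code length of each symbol.
    # Heap entries are (frequency, unique id, {symbol: depth}); the id breaks
    # frequency ties so dicts are never compared.
    heap = [(f, i, {s: 0}) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    next_id = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: depth + 1 for s, depth in {**c1, **c2}.items()}
        heapq.heappush(heap, (f1 + f2, next_id, merged))
        next_id += 1
    return heap[0][2]

freqs = {'e': 0.5, 't': 0.25, 'a': 0.125, 'o': 0.125}
lengths = huffman_lengths(freqs)
avg = sum(freqs[s] * l for s, l in lengths.items())
print(lengths, avg)  # average code length equals the source entropy: 1.75 bits
```

Because this distribution is dyadic, the average codeword length exactly matches the entropy of the source; in general Huffman coding comes within one bit of it.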

The second step is to encode messages into large blocks or packets. By increasing their size, these packets can be made as far apart in a message space as we wish, so that the probability of confusing them (and so falling into error) approaches zero.
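The crudest example of trading block size for reliability is the repetition code: send each bit n times and decode by majority vote. It wastes bandwidth (unlike the capacity-approaching codes Shannon proved to exist), but it shows the error probability falling toward zero as blocks grow. A sketch, with an assumed 10% bit-flip probability:

```python
from math import comb

def repetition_error(p, n):
    # Probability that a majority vote over n repetitions decodes wrongly,
    # when each transmitted bit independently flips with probability p:
    # the chance that more than half the copies arrive corrupted.
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

for n in (1, 3, 11, 101):
    print(f"{n:3d} repetitions: error probability {repetition_error(0.1, n):.3e}")
```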

For a given channel, Shannon’s theorems define a maximum rate of information transmission C. A system that transmits without errors at the rate C is an ideal system. Shannon proved that ideal encodings exist, but they are not easy to find. Coding theory - Wikipedia
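For the special case of a channel limited by bandwidth and Gaussian noise, Shannon gave C explicitly: the Shannon-Hartley formula C = B log₂(1 + S/N). A sketch, using an assumed 3 kHz voice channel at 30 dB signal to noise ratio:

```python
import math

def capacity(bandwidth_hz, snr_linear):
    # Shannon-Hartley theorem: C = B * log2(1 + S/N) bits per second,
    # for a channel of bandwidth B perturbed by white Gaussian noise.
    return bandwidth_hz * math.log2(1 + snr_linear)

# A 3 kHz telephone channel with S/N = 1000 (i.e. 30 dB):
print(capacity(3000, 1000))  # roughly 30,000 bits per second
```

No scheme can signal faster than C without error; below C, Shannon showed, arbitrarily reliable transmission is possible.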

A comparison of the properties of an ideal coding system with the Universe revealed by quantum mechanics suggests that the quantization of the Universe arises from the need for the error free transmission of information. Quantization - Wikipedia

In order to avoid error, there must be no overlap between signals representing different messages. They must, in other words, be orthogonal, as with the eigenfunctions of a quantum mechanical basis. Wojciech Hubert Zurek

Such ‘basis signals’ may be chosen at random in the signal space, provided only that they are orthogonal. The same message may be encoded in any orthogonal basis provided that the transformations used by the transmitter and receiver to encode and decode the message are modified accordingly. Orthogonality - Wikipedia
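Orthogonal carriers can be seen at work in a small numerical sketch (the Walsh-style signals and message values below are invented for illustration): two values are superposed on two orthogonal signals, and each is recovered by projection with no interference from the other.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Two Walsh-style basis signals: their inner product is zero, so they
# are orthogonal and can carry independent message components.
s1 = [1, 1, 1, 1]
s2 = [1, -1, 1, -1]
assert dot(s1, s2) == 0

# Encode two message values on the orthogonal carriers...
b1, b2 = 3, -5
signal = [b1 * x + b2 * y for x, y in zip(s1, s2)]

# ...and recover each by projecting onto its own basis signal.
r1 = dot(signal, s1) / dot(s1, s1)
r2 = dot(signal, s2) / dot(s2, s2)
print(r1, r2)  # 3.0 -5.0
```

Any other orthogonal basis would serve equally well, provided transmitter and receiver agree on it.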

The signals transmitted by an ideal system are indistinguishable from noise. The fact that a set of physical observations looks like a random sequence is not therefore evidence for meaninglessness. Until the algorithms used to encode and decode such a sequence are known, nothing can be said about its significance. Chaitin
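The same effect is visible in everyday compression, which strips out the redundancy an ideal code would remove. In this sketch (zlib stands in for an ideal encoder; the sample text is invented) the compressed stream has a much higher empirical entropy per byte than the original, i.e. it looks much more like noise:

```python
import math
import zlib
from collections import Counter

def byte_entropy(data):
    # Empirical entropy per byte in bits (8 bits would be uniformly random).
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# Redundant, text-like data: a small alphabet with a predictable pattern.
text = b" ".join(b"packet %d" % n for n in range(2000))
packed = zlib.compress(text, 9)

print(byte_entropy(text))    # low: the text is highly redundant
print(byte_entropy(packed))  # much higher: the encoded stream looks like noise
```

Without the decoding algorithm, the compressed stream is indistinguishable from randomness; with it, the original text is recovered exactly.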

As the system approaches the ideal and the length of the transmitted packet increases, the delay at the transmitter while it takes in a chunk of message for encoding, and the corresponding delay at the receiver, increase indefinitely. The ideal rate C is only reached when packets comprise a countably infinite number of bits.

Only in the simplest cases are the mappings used to encode and decode messages linear and topological. For practical purposes, however, they must all be computable. In addition, in order to recover encoded messages, the computations used to encode messages must be invertible so that the decoded message is identical to the original.
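The invertibility requirement, decode(encode(m)) == m for every message m, can be illustrated with the simplest possible reversible map (a toy XOR encoding, chosen only because XOR is its own inverse; it is not a secure or practical code):

```python
def encode(message, key=0x5A):
    # A toy invertible encoding: XOR each byte with a fixed key.
    # Since (b ^ key) ^ key == b, applying encode twice is the identity,
    # so the same function serves as the decoder.
    return bytes(b ^ key for b in message)

message = b"the decoded message must equal the original"
encoded = encode(message)
decoded = encode(encoded)   # applying the inverse (here, the same map)

print(decoded == message)  # True
```

A non-invertible map, by contrast, would merge distinct messages into one codeword, and the lost distinctions could never be recovered at the receiver.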

We observe quantization of the Universe at all scales, from quanta of action through trees and planets to galaxies. This line of thought suggests that all the distinct observable elements of the universe are messages, carrying information from the past to the future through the present. We may understand this structure as a means to combat error, so that the Universe operates as a stable, relatively error free network. The mathematical theory of communication, coupled with the second law of thermodynamics, which encodes the tendency for entropy to increase, suggests that we may approach a true understanding of the Universe by considering it as a processor of information.

(revised 25 May 2013)



You may copy this material freely provided only that you quote fairly and provide a link (or reference) to your source.

Further reading



Brillouin, Leon, Science and Information Theory, Academic 1962 Introduction: 'A new territory was conquered for the sciences when the theory of information was recently developed. . . . Physics enters the picture when we discover a remarkable likeness between information and entropy. . . . The efficiency of an experiment can be defined as the ratio of information obtained to the associated increase in entropy. This efficiency is always smaller than unity, according to the generalised Carnot principle. . . . ' 
Campbell, Jeremy, Grammatical Man: Information, Entropy, Language and Life, Allen Lane 1982 Foreword: 'This book is an attempt to tell the story of information theory and how it evolved out of the ferment of scientific activity during the Second World War. ... The laws and theorems of this science stimulated exciting ideas in biology and language, probability theory, psychology, philosophy, art, computers and the study of society.' 
Chaitin, Gregory J, Information, Randomness & Incompleteness: Papers on Algorithmic Information Theory, World Scientific 1987 Jacket: 'Algorithmic information theory is a branch of computational complexity theory concerned with the size of computer programs rather than with their running time. ... The theory combines features of probability theory, information theory, statistical mechanics and thermodynamics, and recursive function or computability theory. ... [A] major application of algorithmic information theory has been the dramatic new light it throws on Goedel's famous incompleteness theorem and on the limitations of the axiomatic method. ...' 
Gatlin, Lila L, Information Theory and the Living System, Columbia University Press 1972 Chapter 1: 'Life may be defined operationally as an information processing system -- a structural hierarchy of functioning units -- that has acquired through evolution the ability to store and process the information necessary for its own accurate reproduction. The key word in the definition is information. This definition, like all definitions of life, is relative to the environment. My reference system is the natural environment we find on this planet. However, I do not think that life has ever been defined even operationally in terms of information. This entire book constitutes a first step toward such a definition.' 
Khinchin, A I, Mathematical Foundations of Information Theory (translated by P A Silvermann and M D Friedman), Dover 1957 Jacket: 'The first comprehensive introduction to information theory, this book places the work begun by Shannon and continued by McMillan, Feinstein and Khinchin on a rigorous mathematical basis. For the first time, mathematicians, statisticians, physicists, cyberneticists and communications engineers are offered a lucid, comprehensive introduction to this rapidly growing field.' 
Pierce, John Robinson, An Introduction to Information Theory: Symbols Signals and Noise, Dover 1980 Jacket: 'Behind the familiar surfaces of the telephone, radio and television lies a sophisticated and intriguing body of knowledge known as information theory. This is the theory that has permitted the rapid development of all forms of communication ... Even more revolutionary progress is expected in the future.'  
Shannon, Claude, and Warren Weaver, The Mathematical Theory of Communication, University of Illinois Press 1949 'Before this there was no universal way of measuring the complexities of messages or the capabilities of circuits to transmit them. Shannon gave us a mathematical way . . . invaluable . . . to scientists and engineers the world over.' Scientific American 
Chaitin, Gregory J, "Randomness and Mathematical Proof", Scientific American, 232, 5, May 1975, page 47-52. 'Although randomness can be precisely defined and can even be measured, a given number cannot be proved random. This enigma establishes a limit in what is possible in mathematics'. back
Shannon, Claude E, "Communication in the Presence of Noise", Proceedings of the IEEE, 86, 2, February 1998, page 447-457. Reprint of Shannon, Claude E. "Communication in the Presence of Noise." Proceedings of the IEEE, 37 (January 1949): 10-21. 'A method is developed for representing any communication system geometrically. Messages and the corresponding signals are points in two function spaces, and the modulation process is a mapping of one space into the other. Using this representation, a number of results in communication theory are deduced concerning expansion and compression of bandwidth and the threshold effect. Formulas are found for the maximum rate of transmission of binary digits over a system when the signal is perturbed by various types of noise. Some of the properties of "ideal" systems which transmit this maximum rate are discussed. The equivalent number of binary digits per second of certain information sources is calculated.' back
Coding theory - Wikipedia Coding theory - Wikipedia, the free encyclopedia 'Coding theory is the study of the properties of codes and their fitness for a specific application. Codes are used for data compression, cryptography, error-correction and more recently also for network coding. Codes are studied by various scientific disciplines—such as information theory, electrical engineering, mathematics, and computer science—for the purpose of designing efficient and reliable data transmission methods.' back
Cybernetics - Wikipedia Cybernetics - Wikipedia, the free encyclopedia 'Cybernetics is the interdisciplinary study of the structure of regulatory systems. Cybernetics is closely related to control theory and systems theory. Both in its origins and in its evolution in the second-half of the 20th century, cybernetics is equally applicable to physical and social (that is, language-based) systems.' back
Entropy (information theory) - Wikipedia Entropy (information theory) - Wikipedia, the free encyclopedia 'In information theory, entropy is a measure of the uncertainty associated with a random variable. In this context, the term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message, usually in units such as bits. In this context, a 'message' means a specific realization of the random variable. Equivalently, the Shannon entropy is a measure of the average information content one is missing when one does not know the value of the random variable. The concept was introduced by Claude E. Shannon in his 1948 paper "A Mathematical Theory of Communication".' back
Linear amplifier - Wikipedia Linear amplifier - Wikipedia, the free encyclopedia 'A linear amplifier is an electronic circuit whose output is proportional to its input, but capable of delivering more power into a load. The term usually refers to a type of radio-frequency (RF) power amplifier, some of which have output power measured in kilowatts, and are used in amateur radio. Other types of linear amplifier are used in audio and laboratory equipment.' back
Negative feedback amplifier - Wikipedia Negative feedback amplifier - Wikipedia, the free encyclopedia 'A negative feedback amplifier (or more commonly simply a feedback amplifier) is an amplifier which combines a fraction of the output with the input so that a negative feedback opposes the original signal. The applied negative feedback improves performance (gain stability, linearity, frequency response, step response) and reduces sensitivity to parameter variations due to manufacturing or environment. Because of these advantages, negative feedback is used in this way in many amplifiers and control systems.' back
Nyquist-Shannon sampling theorem - Wikipedia Nyquist-Shannon sampling theorem - Wikipedia, the free encyclopedia 'The Nyquist–Shannon sampling theorem, named after Harry Nyquist and Claude Shannon, is a fundamental result in the field of information theory, in particular telecommunications and signal processing. Sampling is the process of converting a signal (for example, a function of continuous time or space) into a numeric sequence (a function of discrete time or space). Shannon's version of the theorem states:

If a function x(t) contains no frequencies higher than B hertz, it is completely determined by giving its ordinates at a series of points spaced 1/(2B) seconds apart.' back

Orthogonality - Wikipedia Orthogonality - Wikipedia, the free encyclopedia Orthogonality occurs when two things can vary independently, they are uncorrelated, or they are perpendicular. back
Quantization - Wikipedia Quantization - Wikipedia, the free encyclopedia 'Quantization is the procedure of constraining something from a relatively large or continuous set of values (such as the real numbers) to a relatively small discrete set (such as the integers).' back
Signal-to-noise ratio - Wikipedia Signal-to-noise ratio - Wikipedia, the free encyclopedia 'Signal-to-noise ratio (often abbreviated SNR or S/N) is a measure used in science and engineering that compares the level of a desired signal to the level of background noise. It is defined as the ratio of signal power to the noise power. A ratio higher than 1:1 indicates more signal than noise. While SNR is commonly quoted for electrical signals, it can be applied to any form of signal (such as isotope levels in an ice core or biochemical signaling between cells). Signal-to-noise ratio is sometimes used informally to refer to the ratio of useful information to false or irrelevant data in a conversation or exchange. For example, in online discussion forums and other online communities, off-topic posts and spam are regarded as "noise" that interferes with the "signal" of appropriate discussion.' back
Wojciech Hubert Zurek Quantum origin of quantum jumps: breaking of unitary symmetry induced by information transfer and the transition from quantum to classical 'Submitted on 17 Mar 2007 (v1), last revised 18 Mar 2008 (this version, v3) "Measurements transfer information about a system to the apparatus, and then further on -- to observers and (often inadvertently) to the environment. I show that even imperfect copying essential in such situations restricts possible unperturbed outcomes to an orthogonal subset of all possible states of the system, thus breaking the unitary symmetry of its Hilbert space implied by the quantum superposition principle. Preferred outcome states emerge as a result. They provide framework for the "wavepacket collapse", designating terminal points of quantum jumps, and defining the measured observable by specifying its eigenstates. In quantum Darwinism, they are the progenitors of multiple copies spread throughout the environment -- the fittest quantum states that not only survive decoherence, but subvert it into carrying information about them -- into becoming a witness.' back

This site is maintained by The Theology Company Proprietary Limited ACN 097 887 075 ABN 74 097 887 075. Copyright 2000-2018 © Jeffrey Nicholls