
Essay 8: On the (non-) quantization of gravitation (2008)

Abstract

For more than 60 years much sweat, maybe some tears (and possibly a little blood) has been spent in the (so far) unsuccessful effort to quantize gravity. Here we interpret this situation as evidence that gravity is not quantized. The core argument is based on the notion that the universe may be modelled as a computer network. An important feature of useful networks is error free communication.

Shannon has shown that we can approach error free transmission over a noisy channel by encoding our messages into packets so far apart in message space that the probability of confusion is minimal. We assume that the exchange of particles in the physical universe corresponds to the exchange of messages in the network model.

This correspondence enables us to interpret quantization as a consequence of the error free communication which (we assume) underlies the stability of universal structure.

Conversely, we should not expect to find quantization where error is impossible, that is in a regime where every possible message is a valid message. Since gravitation couples universally to energy alone, and is blind to the particular nature of the particles or fields associated with energy, we can imagine that gravitation involves unconstrained and therefore intrinsically error free and non-quantized interaction.

Introduction

Classical physics does not ask for details of the interaction between massive bodies. Following Newton it assumes that some invisible power makes things happen, and that the physicists’ task is to devise algorithms which predict the consequences of various initial conditions. Isaac Newton: The General Scholium to the Principia Mathematica

Quantum mechanics goes deeper, producing vastly more detailed accounts of interaction represented by much fiercer looking mathematics. It has become a truism in physics that our space-time intuition, very helpful in imagining the orbits of planets, is of limited help in quantum mechanics. Bernard d'Espagnat

Following Faraday and Maxwell, quantum mechanics seeks to describe the inner workings of the world in terms of invisible fields that choreograph the creation, annihilation and interplay of the particles that we do see. Quantum field theory has often given us astronomically precise matches between calculation and observation. This has led to a certain amount of hubris in the community, some imagining that we are seeing the mind of Newton’s God in the models we use to predict the behaviour of observable particles. Davies: The Mind of God: Science and the Search for Ultimate Meaning

Despite its success, quantum field theory is prone to logical and mathematical problems that have so far made it impossible to bring gravitation into the field theoretical fold. Greene: The Elegant Universe

Field theory sees the forces acting between observable particles as the unobservable exchange either of the particles themselves, or of gauge particles (like photons or gravitons) created and annihilated by the interacting particles.

One of the fundamental principles of quantum mechanics is that what we see depends upon how we look. We observe only eigenvalues corresponding to the orthogonal eigenvectors of our measurement operator. So let us look at quantum mechanics not with space-time intuition, but with social, that is networking, intuition. Wojciech Hubert Zurek: The quantum origin of quantum jumps

From this point of view, the exchange of particles is an exchange of messages. Messages on a network act as a force, changing the states of both sender and receiver. So if you tell obedient children to tidy up, they will move as though impelled by a force. Others may not hear the message at all and continue their lives unperturbed.

From this point of view, the invisible substratum of the universe we call field becomes analogous to the computation in a real world network as it transforms ordered sets of data from one encoding to another.

Here we draw attention to two features of networks: the need for error correction if they are to be useful and stable, and the layering of software on a hardware foundation in real world computer networks. Andrew Tanenbaum (1996): Computer Networks

Why is the universe quantized?

From a mathematical point of view, a message is an ordered set of symbols. In practical networks, such messages are transmitted serially over physical channels. The purpose of error control technology is to make certain that the receiver receives the string that the transmitter sends.

Mathematical communication theory shows that by encoding messages into discrete packets, we can maximize the distance between different messages in message space, and so minimize the probability of their confusion. This theory enables us to send gigabytes of information error free over scratchy phone lines. The art lies in the packaging of information, that is, in quantization. We are thus led to assume that the quantization of the universe results from error prevention mechanisms that contribute to its stability. Claude E Shannon: A Mathematical Theory of Communication, Khinchin: Mathematical Foundations of Information Theory
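
To make the coding idea concrete, here is a minimal Python sketch (an illustration added to this commentary, not part of Shannon's paper) of the simplest error correcting code, the three-bit repetition code. The function names and the flip probability are assumptions of the example; Shannon's theorem concerns far more efficient codes, but the principle is the same: the two codewords 000 and 111 are so far apart in message space that a single corrupted bit in a block can be detected and undone.

```python
# Repetition-3 code: each message bit is sent three times, so the codewords
# 000 and 111 sit at Hamming distance 3 and any single bit flip per block
# can be corrected by majority vote. Illustrative sketch only.
import random

def encode(bits):
    """Repetition-3 encoder: 0 -> 000, 1 -> 111."""
    return [b for bit in bits for b in (bit, bit, bit)]

def noisy_channel(codeword, flip_probability=0.05):
    """Flip each transmitted bit independently with the given probability."""
    return [b ^ (random.random() < flip_probability) for b in codeword]

def decode(received):
    """Majority vote over each block of three received bits."""
    return [int(sum(received[i:i + 3]) >= 2) for i in range(0, len(received), 3)]

message = [random.randint(0, 1) for _ in range(1000)]
received = decode(noisy_channel(encode(message)))
print("residual errors:", sum(a != b for a, b in zip(message, received)))
```

Sent raw, a channel that flips 5% of bits corrupts about fifty of these thousand bits; packaged into repetition codewords, the residual error rate falls roughly sevenfold, to about seven.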

The layering of computer networks

One of the foundations of the theory of computation is the abstract machine that Turing used to prove the existence of incomputable functions. Turing began by specifying the elementary operations of his machine and then increased its power by recursively specifying subroutines which served as building blocks for more complex processes, until he arrived at a universal machine which could (formally speaking) compute any computable function. Alan Turing: On Computable Numbers
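
As an illustration of this idea (an assumption of this commentary, not anything in Turing's paper), the sketch below steps a toy Turing machine in Python: a finite table of elementary read-write-move operations which, applied deterministically, increments a binary number written on its tape. The state names and tape alphabet are invented for the example.

```python
# A toy Turing machine: a transition table maps (state, symbol) to
# (new state, symbol to write, head movement). This particular table
# increments a binary number: scan to the right-hand end, then carry left.
def run_turing_machine(tape, rules, state="scan_right", blank="_"):
    """Step the machine until it reaches the halting state 'done'."""
    tape, head = list(tape), 0
    while state != "done":
        symbol = tape[head] if 0 <= head < len(tape) else blank
        state, write, move = rules[(state, symbol)]
        if head < 0:                 # grow the tape to the left if needed
            tape.insert(0, blank); head = 0
        elif head >= len(tape):      # grow the tape to the right if needed
            tape.append(blank)
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape).strip(blank)

increment_rules = {
    ("scan_right", "0"): ("scan_right", "0", "R"),
    ("scan_right", "1"): ("scan_right", "1", "R"),
    ("scan_right", "_"): ("carry", "_", "L"),
    ("carry", "1"): ("carry", "0", "L"),
    ("carry", "0"): ("done", "1", "L"),
    ("carry", "_"): ("done", "1", "L"),
}

print(run_turing_machine("1011", increment_rules))  # 1011 + 1 = 1100
```

Small subroutines of this kind can be composed into ever larger tables, which is the recursive construction that leads to Turing's universal machine.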

The foundation of every concrete computer and computer network is the physical layer made of copper, silicon, glass and other materials which serve to represent information and move it about. The physical layer implements Landauer’s hypothesis that all information is represented physically. Rolf Landauer (1999): Information is a Physical Entity

Software engineering is devoted to marshalling physical representatives of information so as to represent formal algorithms. Software engineering - Wikipedia

In practical networks, the first layer after the physical layer is concerned with error correction, so that noisy physical channels are seen as noise free and deterministic by subsequent layers of the system.

Once errors are eliminated, an uninterrupted computation proceeds formally and deterministically according to the laws of propositional calculus. As such it imitates a formal Turing machine and may be considered to be in inertial motion, subject to no forces (messages).

It is well known that all the operations of propositional calculus can be modelled using the binary Sheffer stroke or NAND operation. We can thus imagine building up a layered computer network capable of any computable transformation, a universal computer, using a sufficient network of NAND gates. Whitehead & Russell: Principia Mathematica, Sheffer stroke - Wikipedia
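
A short Python sketch (added here for illustration, not part of the original text) makes the point: each of the familiar connectives NOT, AND, OR and XOR can be assembled from NAND gates alone, so any truth-functional circuit can in principle be built from a network of NANDs.

```python
# Building the standard connectives out of NAND alone and checking them
# against their truth tables.
def NAND(a, b):
    return int(not (a and b))

def NOT(a):
    return NAND(a, a)

def AND(a, b):
    return NOT(NAND(a, b))

def OR(a, b):
    return NAND(NOT(a), NOT(b))

def XOR(a, b):
    c = NAND(a, b)
    return NAND(NAND(a, c), NAND(b, c))

for a in (0, 1):
    for b in (0, 1):
        assert AND(a, b) == (a and b)
        assert OR(a, b) == (a or b)
        assert XOR(a, b) == (a ^ b)
print("NOT, AND, OR and XOR all realized with NAND gates alone")
```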

Using our social intuition, we might imagine a similar structure for the universe, interpreting fields as computer networks which send observable messages to one another in the form of quantized particles.

The continuum

Continuous mathematics gives us the impression that it carries information at the highest possible density. Cantor’s set theory has shown us how to treat continua as transfinite sets of discrete points. This theory is so convincing that we are inclined to treat such points as real. Much of the trouble in quantum field theory comes from the assumption that point particles really exist.

The field of quantum computation believes that the information density of the continuum is real and hopes to exploit it. It is inclined to see the quantum world as a perfect analogue computer. From the point of view of algorithmic information theory, however, a continuum is just that, a featureless nothing. Nielsen & Chuang: Quantum Computation and Quantum Information, Gregory J. Chaitin: Randomness and Mathematical Proof

Analytical continuity is the logical foundation of the argument from continuity, embedded in tensor calculus. Einstein’s principle of general covariance embodies the idea that if two points of view of the same observable event can be joined by a continuous transformation, both express the same information in a different coordinate frame (or language). Einstein: Relativity: The Special and General Theory, Continuity - Wikipedia

This idea is supported by algorithmic information theory, which holds that there can be no more information in a theory derived logically from a set of axioms than there is in the axioms themselves. This leads us to suspect that there is no more information in differential geometry than is contained in the notations used to represent it. Chaitin: Information, Randomness and Incompleteness: Papers on Algorithmic Information Theory

It also suggests that we identify the argument from continuity as a species of ‘logical continuity’. A logical continuum is equivalent to the deterministic argument (that is the computation) that binds the hypothesis of a theorem to its conclusion: if we accept Euclidean geometry, the interior angles of a triangle total 180 degrees. A Turing machine or deterministic computer is a logical continuum.

Continua, therefore, represent no information in themselves, but represent the transmission or transformation of information without error from one representation to another.

Gravitation

Let us assume that gravitation is the fundamental force in the universe and that, from the network point of view, it can be modelled as the lowest, most physical layer of the network. We have already noted that we can use a binary logical operation, NAND, to build a computable universe, but there are simpler unary operations: no operation (NOP) and NOT.

An important feature of both these operations is that they cannot go wrong, and therefore have no need for error correction, or, on our assumptions, quantization. In other words, we may think of them as continuous. They are in a sense empty transformations.

Let us associate dynamical action with the execution of a Turing machine. From one point of view, because it represents a deterministic transformation, we can think of a Turing machine as a NOP, since (following Einstein) we think of reality going its own way and in no way changing because someone observes it from a new frame of reference.

The NOT operation is more complex, in that it introduces the idea of time or sequence. First there was p, now there is not-p, NOT(p) = not-p. From this point of view the NOT operation is part of a clock, a ticker without a counter. If we could count the ticks of this clock, we would introduce energy, the rate of action. Since down here in the foundations of the universal network there are still no observers, we can imagine that energy exists, but is not measured.
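
As a toy illustration of this ‘ticker without a counter’ (an assumption of this commentary, not a claim of physics), the sketch below drives a single bit with repeated NOT operations and then, playing the observer, counts the ticks: a counted rate is a frequency, and quantum mechanics associates a frequency with an energy through E = hf.

```python
# A NOT-driven ticker. Counting its ticks over an interval yields a
# frequency; the Planck relation E = h f then assigns it an energy.
# The million-tick run and the one-second interval are arbitrary choices
# made for the example.
PLANCK_H = 6.62607015e-34  # joule seconds

def tick(state):
    """One elementary NOT operation: p -> not-p."""
    return 1 - state

state, ticks = 0, 0
for _ in range(1_000_000):
    state = tick(state)
    ticks += 1

frequency = ticks / 1.0                 # ticks per second
print("energy of the ticker:", PLANCK_H * frequency, "joules")
```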

Since gravitation couples only to energy, and sees only the energy of the particles with which it interacts, we might assume that it is a feature of an isolated system, that is one without outside observers. The universe as a whole is such a system.

Symmetry breaking

The mathematical axioms of quantum mechanics represent unobserved systems by a function space (set of algorithms) of finite or infinite dimension which evolves by unitary transformations of the functions represented in the space.

The observational axioms of quantum mechanics place constraints on this mathematical system. We assume that there can be but one actual outcome from each event, no matter how many possible outcomes there may be. This requirement is fixed by defining the length of permissible state vectors to be 1. Given the formalism, this gives a quantum mechanical event the same normalization as a communication source. For a quantum system the sum of the probabilities of possible outcomes is 1; for a communication source, the sum of the probabilities of possible messages is 1.
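
A small numerical sketch (an illustration added here, not from the original) shows the parallel: a state vector normalized to unit length yields Born probabilities that sum to 1, just as the probabilities of the possible messages of a communication source sum to 1, and a unitary transformation of the kind described above preserves that normalization.

```python
import numpy as np

rng = np.random.default_rng(0)

# A random 4-dimensional complex state, normalized to unit length.
psi = rng.normal(size=4) + 1j * rng.normal(size=4)
psi /= np.linalg.norm(psi)

# Born probabilities of the four possible outcomes in this basis sum to 1,
# just as the probabilities of the possible messages of a source sum to 1.
print("sum of outcome probabilities:", (np.abs(psi) ** 2).sum())

# A random unitary matrix (from the QR decomposition of a random complex
# matrix) evolves the state without changing its length.
q, _ = np.linalg.qr(rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4)))
print("norm after unitary evolution:", np.linalg.norm(q @ psi))
```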

The discussion above suggests that gravitation is concerned with events with only one possible outcome. Things stay the same (which can hardly be called an event) or they change from p to not-p. We might imagine this situation as characteristic of the initial singularity.

Let us now endow this rudimentary universe with memory, so that both p and not-p can exist simultaneously. Since the two parts of this broken symmetry are identical and indistinguishable, they interfere quantum mechanically and our initial singularity becomes a two state system.

Now we can imagine that p can observe not-p, that is communicate with it. We expect this to be possible because despite their difference, the two states have a common ancestry. In network terms, they are both users of the layer of processing beneath them. This layer is the undivided universe which we may imagine as a power distribution network, a flow of completely unmodulated (and therefore error free) energy.

The vector representing an observation in quantum mechanics exists in a Hilbert space which is the tensor product of the Hilbert spaces of the observing system and the observed system. Observation is thus quantum mechanically connected to the increase in complexity of the world and the breaking of symmetry. As the universe complexifies, it remains true that all emergent processes share the energy layer and so are subject to gravitation.
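
A one-line calculation (added for illustration; the two state vectors are arbitrary choices) shows the structure: the joint space of observer and observed is the Kronecker (tensor) product of the two component spaces, so its dimension is the product of theirs and the normalization is preserved.

```python
import numpy as np

observer = np.array([1.0, 0.0])                 # a 2-dimensional system
observed = np.array([1.0, 1.0]) / np.sqrt(2)    # another 2-dimensional system

joint = np.kron(observer, observed)             # state of the combined system
print("joint dimension:", joint.size)           # 2 x 2 = 4
print("still normalized:", np.isclose(np.linalg.norm(joint), 1.0))
```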

Conclusion

Memory is by definition something that stays the same through the passage of time, only changing when it is changed by some message or force. The origin of space is the origin of history - the origin of memory. We can get no closer to the initial singularity than the beginning of memory, because everything before that is forgotten. No particle can see before its own creation. Here we have a statement that applies to the universe as a whole as well as to every particle within it, a cosmological principle.

Whenever there is observation (communication), there is quantization. There can be no observation when there are no observers. This is the initial state of the universe described by gravitation.

With the breaking of the total symmetry of the initial singularity, space, time, memory and structure enter the universe to give us the system we observe today. Nevertheless, as in a network, the more primitive operations are to be found inherent in the more complex. So everything one does on a computer or network can be broken down into an ordered network of NANDs.

Gravitation, which describes nothing but the conserved flow of processing power, that is energy, is thus present at every point in the universe. In the context of our present four dimensional universe, this conserved flow is described by Einstein’s field equations. Because this flow cannot go wrong, it requires no error protection, and so it does not need to be quantized.
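
For reference (a standard result quoted here, not derived in this essay), the field equations and the conservation law built into them can be written:

```latex
% Einstein's field equations; the contracted Bianchi identity makes the
% divergence of the Einstein tensor vanish identically, so the flow of
% energy-momentum that sources gravitation is automatically conserved.
G_{\mu\nu} \equiv R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu}
   = \frac{8\pi G}{c^{4}} T_{\mu\nu},
\qquad
\nabla^{\mu} G_{\mu\nu} = 0
\;\Longrightarrow\;
\nabla^{\mu} T_{\mu\nu} = 0 .
```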

This line of argument suggests that efforts to quantize gravity may be misdirected. It may also give us some understanding of 'spooky action at a distance', whose speed (if it is finite at all) has been bounded below by experiment at some four orders of magnitude greater than the velocity of light. Let us guess that the velocity of light is limited by the need to encode and decode messages to prevent error. If there is no possibility of error, and so no need for coding, it may be that such 'empty' messages can travel with infinite velocity. Juan Yin et al: Bounding the speed of 'spooky action at a distance'

(revised 27 February 2022)


Copyright:

You may copy this material freely provided only that you quote fairly and provide a link (or reference) to your source.

Further reading

Books

Chaitin, Gregory J, Information, Randomness & Incompleteness: Papers on Algorithmic Information Theory, World Scientific 1987 Jacket: 'Algorithmic information theory is a branch of computational complexity theory concerned with the size of computer programs rather than with their running time. . . . The theory combines features of probability theory, information theory, statistical mechanics and thermodynamics, and recursive function or computability theory. ... [A] major application of algorithmic information theory has been the dramatic new light it throws on Goedel's famous incompleteness theorem and on the limitations of the axiomatic method. . . .' 

Davies, Paul, The Mind of God: Science and the Search for Ultimate Meaning, Penguin Books 1992 'Paul Davies' "The Mind of God: Science and the Search for Ultimate Meaning" explores how modern science is beginning to shed light on the mysteries of our existence. Is the universe - and our place in it - the result of random chance, or is there an ultimate meaning to existence? Where did the laws of nature come from? Were they created by a higher force, or can they be explained in some other way? How, for example, could a mechanism as complex as an eye have evolved without a creator? Paul Davies argues that the achievement of science and mathematics in unlocking the secrets of nature mean that there must be a deep and significant link between the human mind and the organization of the physical world. . . . ' 

Einstein, Albert, and Robert W Lawson (translator) Roger Penrose (Introduction), Robert Geroch (Commentary), David C Cassidy (Historical Essay), Relativity: The Special and General Theory, Pi Press 2005 Preface: 'The present book is intended, as far as possible, to give an exact insight into the theory of relativity to those readers who, from a general scientific and philosophical point of view, are interested in the theory, but who are not conversant with the mathematical apparatus of theoretical physics. ... The author has spared himself no pains in his endeavour to present the main ideas in the simplest and most intelligible form, and on the whole, in the sequence and connection in which they actually originated.' page 3  

Greene, Brian, The Elegant Universe: superstrings, hidden dimensions and the quest for the ultimate theory, W W Norton and Company 1999 Jacket: 'Brian Greene has come forth with a beautifully crafted account of string theory - a theory that appears to be a most promising way station to an ultimate theory of everything. His book gives a clear, simple, yet masterful account that makes a complex theory very accessible to nonscientists but is also a delightful read for the professional.' David M Lee 

Khinchin, Aleksandr Yakovlevich, Mathematical Foundations of Information Theory (translated by P A Silvermann and M D Friedman), Dover 1957 Jacket: 'The first comprehensive introduction to information theory, this book places the work begun by Shannon and continued by McMillan, Feinstein and Khinchin on a rigorous mathematical basis. For the first time, mathematicians, statisticians, physicists, cyberneticists and communications engineers are offered a lucid, comprehensive introduction to this rapidly growing field.' 

Nielsen, Michael A, and Isaac L Chuang, Quantum Computation and Quantum Information, Cambridge University Press 2000 Review: A rigorous, comprehensive text on quantum information is timely. The study of quantum information and computation represents a particularly direct route to understanding quantum mechanics. Unlike the traditional route to quantum mechanics via Schroedinger's equation and the hydrogen atom, the study of quantum information requires no calculus, merely a knowledge of complex numbers and matrix multiplication. In addition, quantum information processing gives direct access to the traditionally advanced topics of measurement of quantum systems and decoherence.' Seth Lloyd, Department of Quantum Mechanical Engineering, MIT, Nature 6876: vol 416 page 19, 7 March 2002. 

Tanenbaum (1996), Andrew S, Computer Networks, Prentice Hall International 1996 Preface: 'The key to designing a computer network was first enunciated by Julius Caesar: Divide and Conquer. The idea is to design a network as a sequence of layers, or abstract machines, each one based upon the previous one. . . . This book uses a model in which networks are divided into seven layers. The structure of the book follows the structure of the model to a considerable extent.'  

Links

Alan Turing, On Computable Numbers, with an application to the Entscheidungsproblem, 'The "computable" numbers may be described briefly as the real numbers whose expressions as a decimal are calculable by some finite means. Although the subject of this paper is ostensibly the computable numbers, it is almost equally easy to define and investigate computable functions of an integral variable or a real or computable variable, computable predicates and so forth. . . . '

Bernard d'Espagnat, The Quantum Theory and Reality, 'The doctrine that the world is made up of objects whose existence is independent of human consciousness turns out to be in conflict with quantum mechanics and with facts established by experiment'
Bernard d'Espagnat, "Quantum theory and reality", Scientific American 241, 5 (November 1979): 128.

Claude E Shannon, A Mathematical Theory of Communication, 'The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem. The significant aspect is that the actual message is one selected from a set of possible messages.'

Continuity - Wikipedia, the free encyclopedia, 'In mathematics, a continuous function is, roughly speaking, a function for which small changes in the input result in small changes in the output. Otherwise, a function is said to be a discontinuous function. A continuous function with a continuous inverse function is called a homeomorphism.'

Gregory J. Chaitin, Randomness and Mathematical Proof, 'Although randomness can be precisely defined and can even be measured, a given number cannot be proved to be random. This enigma establishes a limit to what is possible in mathematics.'
Scientific American 232, No. 5 (May 1975), pp. 47-52

Isaac Newton, The General Scholium to the Principia Mathematica, 'Published for the first time as an appendix to the 2nd (1713) edition of the Principia, the General Scholium reappeared in the 3rd (1726) edition with some amendments and additions. As well as countering the natural philosophy of Leibniz and the Cartesians, the General Scholium contains an excursion into natural theology and theology proper. In this short text, Newton articulates the design argument (which he fervently believed was furthered by the contents of his Principia), but also includes an oblique argument for a unitarian conception of God and an implicit attack on the doctrine of the Trinity, which Newton saw as a post-biblical corruption. The English translation here is that of Andrew Motte (1729). Italics and orthography as in original.'

Juan Yin et al, Lower Bound on the Speed of Nonlocal Correlations without Locality and Measurement Choice Loopholes, 'In their well-known paper, Einstein, Podolsky, and Rosen called the nonlocal correlation in quantum entanglement a “spooky action at a distance.” If the spooky action does exist, what is its speed? All previous experiments along this direction have locality and freedom-of-choice loopholes. Here, we strictly closed the loopholes by observing a 12 h continuous violation of the Bell inequality and concluded that the lower bound speed of spooky action was 4 orders of magnitude of the speed of light if Earth’s speed in any inertial reference frame was less than 10^-3 times the speed of light.'

Rolf Landauer (1999), Information is a Physical Entity, 'Abstract: This paper, associated with a broader conference talk on the fundamental physical limits of information handling, emphasizes the aspects still least appreciated. Information is not an abstract entity but exists only through a physical representation, thus tying it to all the restrictions and possibilities of our real physical universe. The mathematician's vision of an unlimited sequence of totally reliable operations is unlikely to be implementable in this real universe. Speculative remarks about the possible impact of that on the ultimate nature of the laws of physics are included.'

Sheffer stroke - Wikipedia, the free encyclopedia, 'In Boolean functions and propositional calculus, the Sheffer stroke, named after Henry M. Sheffer, written "|" . . . denotes a logical operation that is equivalent to the negation of the conjunction operation, expressed in ordinary language as "not both". It is also called nand ("not and") or the alternative denial, since it says in effect that at least one of its operands is false.'

Software engineering - Wikipedia, the free encyclopedia, 'Software engineering is the study and an application of engineering to the design, development, and maintenance of software.'

Wojciech Hubert Zurek, Quantum origin of quantum jumps: breaking of unitary symmetry induced by information transfer and the transition from quantum to classical, (Submitted on 17 Mar 2007 (v1), last revised 18 Mar 2008 (this version, v3)) 'Measurements transfer information about a system to the apparatus, and then further on – to observers and (often inadvertently) to the environment. I show that even imperfect copying essential in such situations restricts possible unperturbed outcomes to an orthogonal subset of all possible states of the system, thus breaking the unitary symmetry of its Hilbert space implied by the quantum superposition principle. Preferred outcome states emerge as a result. They provide framework for the “wavepacket collapse”, designating terminal points of quantum jumps, and defining the measured observable by specifying its eigenstates.'
