##### Vol 6: Essays

### On the (non-) quantization of gravitation

##### Abstract

For more than 60 years much sweat, maybe some tears (and possibly a little blood) have been spent in the (so far) unsuccessful effort to quantize gravity. Here we interpret this situation as evidence that gravity is not quantized. The core argument is based on the notion that the universe may be modelled as a computer network. An important feature of useful networks is error free communication.

Shannon has shown that we can approach error free transmission over a noisy channel by encoding our messages into packets so far apart in message space that the probability of confusion is minimal. We assume that the exchange of particles in the physical universe corresponds to the exchange of messages in the network model.

This correspondence enables us to interpret quantization as a consequence of the error free communication which (we assume) underlies the stability of universal structure.

Conversely, we should not expect to find quantization where error is impossible, that is in a regime where every possible message is a valid message. Since gravitation couples universally to energy alone, and is blind to the particular nature of the particles or fields associated with energy, we can imagine that gravitation involves unconstrained and therefore intrinsically error free and non-quantized interaction.

##### Introduction

Classical physics does not ask for details of the interaction between massive bodies. Following Newton it assumes that some invisible power makes things happen, and that the physicists’ task is to devise algorithms which predict the consequences of various initial conditions. Newton: *The Principia*, General Scholium

Quantum mechanics goes deeper, producing vastly more detailed accounts of interaction represented by much fiercer looking mathematics. It has become a truism in physics that our space-time intuition, very helpful in imagining the orbits of planets, is of limited help in quantum mechanics. Bernard d'Espagnat

Following Faraday and Maxwell, quantum mechanics seeks to describe the inner workings of the world in terms of invisible fields that choreograph the creation, annihilation and interplay of the particles that we do see. Quantum field theory has often given us astronomically precise matches between calculation and observation. This has led to a certain amount of hubris in the community, some imagining that we are seeing the mind of Newton’s God in the models we use to predict the behaviour of observable particles. Davies: *The Mind of God: Science and the Search for Ultimate Meaning*

Despite its success, quantum field theory is prone to logical and mathematical problems that have so far made it impossible to bring gravitation into the field theoretical fold. Greene: *The Elegant Universe*

Field theory sees the forces acting between observable particles as the unobservable exchange either of the particles themselves, or of gauge particles (like photons or gravitons) created and annihilated by the interagents.

One of the fundamental principles of quantum mechanics is that what we see depends upon how we look. We observe only eigenvalues corresponding to the orthogonal eigenvectors of our measurement operator. So let us look at quantum mechanics not with space-time intuition, but with social, that is networking, intuition. Wojciech Hubert Zurek: The quantum origin of quantum jumps

From this point of view, the exchange of particles is an exchange of messages. Messages on a network act as a force, changing the states of both sender and receiver. So if you tell obedient children to tidy up, they will move as though impelled by a force. Others may not hear the message at all and continue their lives unperturbed.

From this point of view, the invisible substratum of the universe we call field becomes analogous to the computation in a real world network as it transforms ordered sets of data from one encoding to another.

Here we draw attention to two features of networks, the need for error correction if they are to be useful and stable; and the layering of software on a hardware foundation in real world computer networks. Tanenbaum: *Computer Networks*

##### Why is the universe quantized?

From a mathematical point of view, a message is an ordered set of symbols. In practical networks, such messages are transmitted serially over physical channels. The purpose of error control technology is to make certain that the receiver receives the string that the transmitter sends.

Mathematical communication theory shows that by encoding messages into discrete packets, we can maximize the distance between different messages in message space, and so minimize the probability of their confusion. This theory enables us to send gigabytes of information error free over scratchy phone lines. The art lies in packaging information, that is quantization. We are thus led to assume that the quantization of the universe results from error prevention mechanisms that contribute to its stability. Claude E Shannon: A Mathematical Theory of Communication, Khinchin: *Mathematical Foundations of Information Theory*
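The coding idea can be sketched in a few lines of Python (a toy illustration of our own, not drawn from the cited texts): a 3-bit repetition code places its two codewords at Hamming distance 3 in message space, so any single-bit channel error is corrected by majority vote.

```python
# A minimal sketch of Shannon-style error control: keep valid codewords far
# apart in message space so that noise is unlikely to turn one into another.

def encode(bit):
    """Map a message bit to a 3-bit codeword; 000 and 111 are distance 3 apart."""
    return [bit] * 3

def hamming(a, b):
    """Distance between two codewords in message space."""
    return sum(x != y for x, y in zip(a, b))

def decode(word):
    """Majority vote: pick the nearest valid codeword."""
    return 1 if sum(word) >= 2 else 0

codeword = encode(1)          # [1, 1, 1]
noisy = codeword[:]
noisy[0] ^= 1                 # a single-bit error on the channel
assert hamming(encode(0), encode(1)) == 3
assert decode(noisy) == 1     # the error is corrected
```

Real codes trade far less redundancy for the same separation, but the principle is the same: quantized, well-separated packets make error practically impossible.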

##### The layering of computer networks

One of the foundations of the theory of computation is the abstract machine that Turing used to prove the existence of incomputable functions. Turing began by specifying the elementary operations of his machine and then increased its power by recursively specifying subroutines which served as building blocks for more complex processes, until he arrived at a universal machine which could (formally speaking) compute any computable function. Alan Turing: On Computable Numbers
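As a toy illustration (our sketch, not Turing's own formalism): a Turing machine is a finite rule table applied to a tape, each rule mapping (state, symbol) to (write, move, next state). This one inverts a binary string and halts at the first blank.

```python
# A minimal Turing machine interpreter: elementary operations only, yet such
# tables can be composed, as Turing showed, up to a universal machine.

def run(rules, tape, state="start", pos=0, blank="_", halt="halt"):
    tape = dict(enumerate(tape))          # sparse tape, indexed by position
    while state != halt:
        symbol = tape.get(pos, blank)
        write, move, state = rules[(state, symbol)]
        tape[pos] = write
        pos += move
    return "".join(tape[i] for i in sorted(tape) if tape[i] != blank)

# Rule table: flip each bit and move right; halt on the blank.
rules = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
    ("start", "_"): ("_", 0, "halt"),
}

print(run(rules, "1011"))  # -> 0100
```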

The foundation of every concrete computer and computer network is the physical layer made of copper, silicon, glass and other materials which serve to represent information and move it about. The physical layer implements Landauer’s hypothesis that all information is represented physically. Rolf Landauer

Software engineering is devoted to marshalling physical representatives of information so as to represent formal algorithms. Software engineering - Wikipedia

In practical networks, the first layer after the physical layer is concerned with error correction, so that noisy physical channels are seen as noise free and deterministic by subsequent layers of the system.

Once errors are eliminated, an uninterrupted computation proceeds formally and deterministically according to the laws of propositional calculus. As such it imitates a formal Turing machine and may be considered to be in inertial motion, subject to no forces (messages).

It is well known that all the operations of propositional calculus can be modelled using the binary Sheffer stroke or NAND operation. We can thus imagine building up a layered computer network capable of any computable transformation, a universal computer, using a sufficient network of NAND gates. Whitehead & Russell: *Principia Mathematica*, Sheffer stroke - Wikipedia
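The universality of NAND is easy to demonstrate (a self-contained sketch, independent of the cited texts): NOT, AND and OR built from NAND alone agree with Python's own boolean operators over all inputs.

```python
# NAND suffices for all of propositional calculus: build the other
# connectives from it and check them exhaustively.

def NAND(a, b):
    return not (a and b)

def NOT(a):
    return NAND(a, a)

def AND(a, b):
    return NAND(NAND(a, b), NAND(a, b))

def OR(a, b):
    return NAND(NAND(a, a), NAND(b, b))

for a in (False, True):
    assert NOT(a) == (not a)
    for b in (False, True):
        assert AND(a, b) == (a and b)
        assert OR(a, b) == (a or b)
```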

Using our social intuition, we might imagine a similar structure for the universe, interpreting fields as computer networks which send observable messages to one another in the form of quantized particles.

##### The continuum

Continuous mathematics gives us the impression that it carries information at the highest possible density. Cantor’s set theory has shown us how to treat continua as transfinite sets of discrete points. This theory is so convincing that we are inclined to treat such points as real. Much of the trouble in quantum field theory comes from the assumption that point particles really exist.

The field of quantum computation believes that the information density of the continuum is real and hopes to exploit it. It is inclined to see the quantum world as a perfect analogue computer. From the point of view of algorithmic information theory, however, a continuum is just that, a featureless nothing. Nielsen & Chuang: *Quantum Computation and Quantum Information*, Gregory J. Chaitin: Randomness and mathematical proof

Analytical continuity is the logical foundation of the argument from continuity, embedded in tensor calculus. Einstein’s principle of general covariance embodies the idea that if two points of view of the same observable event can be joined by a continuous transformation, both express the same information in a different coordinate frame (or language). Einstein: *Relativity: The Special and General Theory*, Continuity - Wikipedia

This idea is supported by algorithmic information theory, which holds that there can be no more information in a theory derived logically from a set of axioms than there is in the axioms themselves. This leads us to suspect that there is no more information in differential geometry than is contained in the notations used to represent it. Chaitin: *Information, Randomness and Incompleteness: Papers on Algorithmic Information Theory*

It also suggests that we identify the argument from continuity as a species of ‘logical continuity’. A logical continuum is equivalent to the deterministic argument (that is the computation) that binds the hypothesis of a theorem to its conclusion: if we accept Euclidean geometry, the interior angles of a triangle total 180 degrees. A Turing machine or deterministic computer is a logical continuum.

Continua, therefore, represent no information in themselves, but represent the transmission or transformation of information without error from one representation to another.

##### Gravitation

Let us assume that gravitation is the fundamental force in the universe and that, from the network point of view, it can be modelled as the physical layer. We have already noted that we can use a binary logical operation, NAND, to build a computable universe, but there are simpler unary operations: no operation (NOP) and NOT.

An important feature of both these operations is that they cannot go wrong, and therefore have no need for error correction, or, on our assumptions, quantization. In other words, we may think of them as continuous. They are in a sense empty transformations.

Let us associate dynamical action with the execution of a Turing machine. From one point of view, because it represents a deterministic transformation, we can think of a Turing machine as a NOP, since (following Einstein) we think of reality going its own way and in no way changing because someone observes it from a new frame of reference.

The NOT operation is more complex, in that it introduces the idea of time or sequence. First there was *p*, now there is *not-p*, NOT(*p*) = *not-p*. From this point of view the NOT operation is part of a clock, a ticker without a counter. If we could count the ticks of this clock, we would introduce energy, the rate of action. Since down here in the foundations of the universal network there are still no observers, we can imagine that energy exists, but is not measured.
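The ticker can be rendered as a toy generator (our illustration, not the essay's formalism): repeated NOT produces an endless alternation of *p* and *not-p*, and only an external counter, an observer of the ticks, would turn this into a measured rate.

```python
# NOT as a clock: a ticker without a counter. Each step applies NOT to the
# current state; counting the ticks is left to an observer that does not
# exist at this layer.
from itertools import islice

def ticker(p):
    """Endless alternation p, not-p, p, ... produced by repeated NOT."""
    while True:
        p = not p
        yield p

ticks = list(islice(ticker(True), 4))   # sample four ticks
assert ticks == [False, True, False, True]
```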

Since gravitation couples only to energy, and sees only the energy of the particles with which it interacts, we might assume that it is a feature of an isolated system, that is one without outside observers. The universe as a whole is such a system.

##### Symmetry breaking

The mathematical axioms of quantum mechanics represent unobserved systems by a function space (set of algorithms) of finite or infinite dimension which evolves by unitary transformations of the functions represented in the space.

The observational axioms of quantum mechanics place constraints on this mathematical system. We assume that there can be but one actual outcome from each event, no matter how many possible outcomes there may be. This requirement is fixed by defining the length of permissible state vectors to be 1. Given the formalism, this gives a quantum mechanical event the same normalization as a communication source. For a quantum system, the sum of the probabilities of possible outcomes is 1; for a communication source, the sum of the probabilities of possible messages is 1.
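The parallel normalization can be checked numerically (the amplitudes below are illustrative, not drawn from any particular physical system): a unit-length state vector yields outcome probabilities, via the Born rule |amplitude|², that sum to 1, exactly like the message probabilities of a communication source.

```python
# A unit-length quantum state vector is normalized like a communication
# source: squared amplitude moduli play the role of message probabilities.
import math

state = [complex(1, 1) / 2, complex(0, -1) / math.sqrt(2)]  # two-state system
probabilities = [abs(a) ** 2 for a in state]                # Born rule

assert math.isclose(sum(probabilities), 1.0)  # one actual outcome per event,
                                              # one message per emission
```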

The discussion above suggests that gravitation is concerned with events with only one possible outcome. Things stay the same (which can hardly be called an event) or they change from *p* to *not-p.* We might imagine this situation as characteristic of the initial singularity.

Let us now endow this rudimentary universe with memory, so that both *p* and *not-p* can exist simultaneously. Since the two parts of this broken symmetry are identical and indistinguishable, they interfere quantum mechanically and our initial singularity becomes a two state system.

Now we can imagine that *p* can observe *not-p*, that is communicate with it. We expect this to be possible because despite their difference, the two states have a common ancestry. In network terms, they are both users of the layer of processing beneath them. This layer is the undivided universe which we may imagine as a power distribution network, a flow of completely unmodulated (and therefore error free) energy.

The vector representing an observation in quantum mechanics exists in a Hilbert space which is the tensor product of the Hilbert spaces of the observing system and the observed system. Observation is thus quantum mechanically connected to the increase in complexity of the world and the breaking of symmetry. As the universe complexifies, it remains true that all emergent processes share the energy layer and so are subject to gravitation.

##### Conclusion

Memory is by definition something that stays the same through the passage of time, only changing when it is changed by some message or force. The origin of space is the origin of history - the origin of memory. We can get no closer to the initial singularity than the beginning of memory, because everything before that is forgotten. No particle can see before its own creation. Here we have a statement that applies to the universe as a whole as well as to every particle within it, a cosmological principle.

Whenever there is observation (communication), there is quantization. There can be no observation when there are no observers. This is the initial state of the universe described by gravitation.

With the breaking of the total symmetry of the initial singularity, space, time, memory and structure enter the universe to give us the system we observe today. Nevertheless, as in a network, the more primitive operations are to be found inherent in the more complex. So everything one does on a computer or network can be broken down into an ordered network of NANDs.

Gravitation, which describes nothing but the conserved flow of processing power, that is energy, is thus present at every point in the universe. In the context of our present four dimensional universe, this conserved flow is described by Einstein’s field equations. Because this flow cannot go wrong, it requires no error protection, and so it does not need to be quantized.

This line of argument suggests that efforts to quantize gravity may be misdirected. It may also give us some understanding of 'spooky action at a distance', which has been found to propagate much faster than the velocity of light. Let us guess that the velocity of light is limited by the need to encode and decode messages to prevent error. If there is no possibility of error, and so no need for coding, it may be that such 'empty' messages can travel with infinite velocity. Juan Yin *et al*: Bounding the speed of 'spooky action at a distance'

(revised 4 October 2015)