##### Vol. 6: Essays

### On modelling the world

*In the beginning God created the Heavens and the Earth.*

(Genesis 1:1, unknown authors, circa 1000 BCE)

0. Abstract

1. Introduction

2. A model: the computer network

3. The physical layer: some isolated fixed points

3.1 Aristotle’s ‘unmoved mover’

3.2 The world of light

3.3 The initial singularity

3.4 The isolated quantum system

3.5 The Turing machine

3.6 The isomorphism of these isolated points

4. From isolation to communication

4.1 The Word of God

4.2 Quantum measurement

5. Quantum field theory and the computer network

5.1 Why is the Universe quantized?

5.2 Calculus and the quantum

5.3 Action and the path integral

5.4 Feynman diagrams

6. A transfinite computer network

6.1 The Cantor Universe

6.2 The transfinite symmetric universe

6.3 Cantor symmetry

6.4 Computation: quantum information theory

6.5 Creative evolution

7. Some corollaries

7.1 ‘The unreasonable effectiveness of mathematics’

7.2 The view from inside the Universe

7.3 Network intelligence

7.4 Past and future

7.5 Entanglement and ‘spooky action at a distance’

7.6 Numerical results

#### 0. Abstract

Each of us follows a path through life. To follow a path, one needs navigation, and to navigate one needs both a model (map) of the space to be navigated and a method of locating oneself on the map. Traditionally, navigation in human space is studied by philosophers and theologians with very different world views.

Over the last few centuries, science has provided us with a quite comprehensive picture of our habitat that begins with the initial singularity predicted by general relativity and traces the evolution of the Universe to its present state.

To physicists, this work is an approach to a ‘theory of everything’. It provides a good foundation for practical technologies like computers, power grids, space vehicles and medicine, but has little to say about human spiritual issues. The purpose of this article is to extend the physical model into the spiritual realm, rather as Aristotle generated metaphysics out of physics. Aristotle

Traditionally the majority of philosophers and theologians have postulated a very real and sharp distinction between such categories of being as spiritual and material, living and non-living, intelligent and not-intelligent, eternal and ephemeral and so on.

Here I propose a model that embraces these and all other distinctions in the world in a single consistent picture.

#### 1. Introduction

Physical cosmology has taught us that in terms of space and time we are a small element of a large Universe. Yet we are still inclined to believe that we are unique, rather overlooking the intelligence of the Universe that made us. Here we assume that creative intelligence is universal and attempt to understand how it works to create the Universe, ourselves, and our theories about ourselves.

This essay is based on a model of the Universe derived from the structure of the computer communication networks that are rapidly evolving round us. Some of the oldest questions in philosophy concern the relationship between motion and stillness. As far as we know, Parmenides started the ball rolling when a Goddess told him that true reality was full and eternal, the moving world being somehow less than fully real and an imperfect object of true knowledge. John Palmer - Parmenides

Truth may be defined as an isomorphism between two meaningful entities, often a sentence and reality: the truth of a sentence consists in its agreement with (or correspondence to) reality. Parmenides is speaking of an eternal correspondence between eternal reality and eternal knowledge. Here we admit a time element to truth: *he is standing on his head* remains true as long as he is standing on his head, but becomes false when he falls over. Tarski

We live in a dynamic Universe which nevertheless exhibits points that remain fixed for various lengths of time. Some, the subjects of science, may remain true for all time. Others may be temporary, but we may track changes by revising our sentences as fast as the reality to which they refer is changing. This idea is encapsulated in mathematical sampling theorems, which tell us that as long as we sample a changing reality at least twice as fast as its maximum frequency of change, our samples will be a true representation of the reality sampled. Nyquist, Claude E Shannon
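A concrete sketch of the sampling idea, with frequencies chosen purely for illustration: a 5 Hz sine sampled at 20 Hz (comfortably above twice its frequency) is clearly distinguishable from a 3 Hz sine, but sampled at only 8 Hz it becomes an indistinguishable ‘alias’ of the slower tone.

```python
import math

def sample(freq_hz, rate_hz, n):
    """Sample a unit sine of the given frequency at rate_hz for n points."""
    return [math.sin(2 * math.pi * freq_hz * k / rate_hz) for k in range(n)]

# Sampled at 20 Hz, well above the Nyquist rate for a 5 Hz tone,
# the 5 Hz and 3 Hz sines produce clearly different sample sequences.
fast_5 = sample(5, 20, 16)
fast_3 = sample(3, 20, 16)
assert any(abs(a - b) > 0.1 for a, b in zip(fast_5, fast_3))

# Sampled at only 8 Hz, the 5 Hz tone aliases: its samples coincide with
# those of a sign-flipped 3 Hz tone, so the two cannot be told apart.
slow_5 = sample(5, 8, 16)
slow_3 = sample(3, 8, 16)
assert all(abs(a + b) < 1e-9 for a, b in zip(slow_5, slow_3))
```

Below twice the maximum frequency, the samples are no longer a true representation: two different realities produce identical records.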

Here we propose that a class of mathematical theorems known as fixed point theorems provide a relatively simple answer to the problematic relationship between motion and stillness. These theorems, pioneered by Brouwer, tell us that whenever a convex compact set is mapped onto itself, at least one point remains unmoved. van Heijenoort, Casti

It is easy to see that the Universe might fulfill the hypotheses of these
theorems. Convex means that it has no ‘holes’: one can get from any *A* to
any *B* without having to go outside the Universe; compact means that it
contains its own boundaries. By definition, there is nothing outside the
Universe. Brouwer allows us to turn Parmenides around, seeing the dynamics as
primary and the forms, that is the fixed points, as fixed points in the
motions. The consistency of the form is to be explained by the consistency
of the motion that contains it. Brouwer fixed point theorem - Wikipedia

This approach is consistent with the view that the Universe is pure act (in the sense used by Aquinas, *Summa* I, 2, 3) and so divine. It is also
consistent with the idea that we are inside rather than outside the divinity, so
that the dynamics of the Universe is equivalent to the life of God. Aquinas 13: *Does God exist?*, Aquinas 113: *Is life properly attributed to God?*

#### 2. A model: the computer network

A network is a set of computers (like the internet) physically connected to enable the transmission of information. Although the processes in individual computers are deterministic, a network is not, since any computer may be interrupted at any point in its process and steered onto a different course by information received.

At its most abstract, a network comprises a set of addressed memories and a set of agents that can read from and write to those memories. The engineering of robust practical computer networks is a complex business. Tanenbaum

To deal with these complex problems, networks are constructed in layers. The lowest layer is the physical (or hardware) layer: the wires, transistors and so on necessary to physically realize formal logical operators and connect them together. The physical layer is necessary because all observable information is represented physically, as electric currents, ink on paper, sounds and so on. Rolf Landauer

At the opposite extreme is the user layer, people sending and receiving data through the network. In between these two layers are the various software layers that transform user input into a suitable form for physical transmission and vice versa.

Each subsequent software layer uses the routines of the layer beneath it as an alphabet of operations to achieve its ends. Extrapolating beyond the computer network, a human in the user layer may be a part of a corporate network, reporting through further layers of management to the board of his organization. By analogy to this layered hierarchy, we may consider the Universe as a whole to be the ultimate user of a universal network.

Corresponding layers (‘peers’) of two nodes in a network may communicate if they share a suitable protocol. All such communication uses the services of all layers between the peers and the physical layer. These services are generally invisible or transparent to the peers unless they fail. Thus in conversation we are almost completely unaware of the complex physiological, chemical and physical processes that make our communication possible.
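The layering and transparency described above can be sketched in a few lines of Python. The layer names and the bracket ‘protocol’ here are purely illustrative, not any real network standard: each layer wraps the message on the way down to the hardware and unwraps it on the way up, so that peers at the top see only each other.

```python
# A minimal sketch of protocol layering: each layer uses the layer beneath
# it as a service, wrapping the message on the way down and unwrapping it
# on the way up. Layer names are illustrative only.

def wrap(layer, payload):
    return f"{layer}[{payload}]"

def unwrap(layer, packet):
    prefix, suffix = layer + "[", "]"
    assert packet.startswith(prefix) and packet.endswith(suffix)
    return packet[len(prefix):-len(suffix)]

LAYERS = ["application", "transport", "physical"]

def send(message):
    for layer in LAYERS:            # encode downwards towards the hardware
        message = wrap(layer, message)
    return message                  # what actually travels on the wire

def receive(packet):
    for layer in reversed(LAYERS):  # decode upwards towards the user
        packet = unwrap(layer, packet)
    return packet

signal = send("hello")
assert signal == "physical[transport[application[hello]]]"
assert receive(signal) == "hello"   # the lower layers are transparent
```

The user at each end sees only ‘hello’; the intervening layers are invisible unless one of the `unwrap` assertions fails.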

#### 3. The physical layer: some isolated fixed points

Using the network model, we interpret motion as the transmission and receipt of messages. Further, we accept Landauer’s hypothesis that all information is represented physically, so that there are no ‘pure spirits’. Everything is embodied. We begin our discussion with a set of isolated fixed points which have been defined in various historical contexts. We then note their identity before turning to the task of binding them into a communication network.

##### 3.1 Aristotle’s ‘unmoved mover’

Aristotle supposed that nothing could move itself since he saw motion as the realization of a potential, and held it to be axiomatic that no potential could actualize itself. This led him to propose an unmoved mover ‘whose essence is actuality’ as the source of all motion in the world [Aristotle 1071b3 sqq].

Aristotle notes that ‘the object of desire and the object of thought move without being moved’ [1072a26]. The first mover moves things, not by pushing them, but by attracting them as a ‘final cause’.

##### 3.2 The world of light

The fundamental axiom of special relativity is that the laws of physics appear the same to all observers moving without acceleration (inertially). In particular, all inertial observers will see the same velocity of light regardless of their state of motion.

A simple geometrical argument based on this observation leads to the Lorentz transformation. Some well known consequences of this transformation are that if I observe you going past me at a significant fraction of the velocity of light, your clocks will appear slow relative to mine, your rulers will appear foreshortened in the direction of motion and your mass will appear greater.

The mathematical apparatus of Lorentz transformation was greatly simplified by Minkowski. In the Minkowski representation the space-time interval between events appears to be the same regardless of the relative inertial motions of the observer and observed. Using this metric, we arrive at a property of photons and other particles travelling at the velocity of light: from the point of view of any observer a clock on a photon appears to be stopped and the photon has zero length. Given these facts, we might imagine that from the point of view of an hypothetical observer, photons exist in an eternal world of zero size, another fixed point. The velocity of light is not special here. A similar situation arises whenever the speed of communication is equal to the speed of the entity with which one attempts to communicate. Streater & Wightman

##### 3.3 The initial singularity

Einstein built the theory of general relativity on the special theory. The general theory predicts that the Universe is either expanding or contracting. Observation suggests that the former is true, and we can extrapolate toward the past to a point known as the initial singularity where space-time as we experience it ceases to exist [Hawking & Ellis 1975]. This point, representing the beginning of the whole Universe, is effectively fixed and isolated since there is, by the definition of Universe, nothing outside it. Misner, Thorne & Wheeler, Hawking & Ellis

##### 3.4 The isolated quantum system

Quantum mechanics falls naturally into two sections. The first describes the unitary evolution of state vectors in an isolated quantum system. The second describes what happens when such isolated systems communicate. Here we deal with the isolated system, turning to the theory of communication below.

Quantum theory is developed in a mathematical *Hilbert space*. Hilbert
spaces are a species of function space, each point in the space representing a
function or computation which transforms an input domain to an output
range. We may consider this text as a function from the natural numbers to
the English alphabet. The domain of this function is the natural numbers
which we use to number and order the symbols in the text, beginning with 1
for the first symbol and *n* for the last.

The value of this function at any location is the symbol appearing at that
point. We can represent this text in an *n* dimensional space, each dimension corresponding to a symbol. We can use the same space to represent any *n* symbol text. Using this approach, each text is represented by a vector or point in *n-symbol text space*.
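A minimal sketch in Python of this text-as-function idea (the sample texts are my own): a text is a function from positions 1 to *n* into an alphabet, and any other *n*-symbol text is another point in the same space.

```python
# A text as a function from the natural numbers 1..n to its symbols.
text = "hello"
as_function = {k + 1: symbol for k, symbol in enumerate(text)}

assert as_function[1] == "h"            # value of the function at position 1
assert as_function[len(text)] == "o"    # ... and at position n

# Any other 5-symbol text is another point in the same 5-dimensional space.
other = "world"
assert len(other) == len(text)
```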

Quantum states are represented by vectors in Hilbert space where the symbols are not letters of the alphabet but complex numbers. The evolution of isolated quantum systems is governed by the Schrödinger equation

*iℏ ∂ψ/∂τ = Hψ*

where *ψ* is a state vector, *H* represents the energy associated with each state vector, *ℏ* is Planck’s constant, *τ* is time and *i* is the symbol for the ‘imaginary’ basis vector of complex numbers.

This equation describes a *superposition* of an infinity of different frequencies, analogous to the sound of an orchestra. It is a generalized version of the ‘Einstein-de Broglie’ equation, *E = ℏω*, which represents the fixed mechanical relation between energy and frequency, *ω*.

Quantum processes are in perpetual motion as long as there is energy present. Given that there is nothing outside the Universe, such processes must map the Universe onto itself, and we would therefore expect fixed points. These fixed points are identified by the *eigenvalue* (special value) equation *Hψ = cψ*, where *c* is an eigenvalue. This equation is satisfied by those eigenfunctions of the operator *H* whose direction the evolution leaves unchanged, altering only their phase.

This representation is purely conjecture, since we cannot, by definition, observe an isolated system and gain information from it. Nevertheless, the quantum mechanics built on this idea works, making it practically credible.

##### 3.5 The Turing machine

A *Turing machine* is a formal deterministic machine generally agreed to be able to compute all computable functions. It thus serves as a definition of ‘computable’. From a practical point of view, a computer is the physical embodiment of an algorithm or set of algorithms. There is a countable infinity of different algorithms. A machine which can be programmed to compute any of these algorithms is called a universal Turing machine. Alan Turing, Davis

Any algorithm except the simplest can be constructed by a sequence of lesser algorithms (like the individual arithmetic operations in multiplication). Modern computers implement Russell’s idea that logic embraces the whole of computable (and communicable) mathematics, since in a binary digital computer all algorithms are ultimately broken down to a sequence of propositional functions. Further, all these operations can be modelled with a single operation known to logicians as the Sheffer Stroke and to computer people as the NAND gate. Russell, Mendelson
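The claim that a single NAND operation suffices can be checked directly. A sketch in Python, building the familiar propositional functions out of nothing but NAND and verifying them exhaustively:

```python
# Everything from the Sheffer stroke: all propositional functions built
# from the single NAND operation, checked against Python's own operators.

def nand(a, b):
    return not (a and b)

def not_(a):
    return nand(a, a)

def and_(a, b):
    return not_(nand(a, b))

def or_(a, b):
    return nand(not_(a), not_(b))

def xor(a, b):
    return and_(or_(a, b), nand(a, b))

for a in (False, True):
    for b in (False, True):
        assert not_(a) == (not a)
        assert and_(a, b) == (a and b)
        assert or_(a, b) == (a or b)
        assert xor(a, b) == (a != b)
```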

Part of the power of a digital computer lies in its ability to execute very simple logical functions repeatedly (periodically) at high frequency, and to build simple high frequency routines into longer complex routines which execute at a lower frequency. We may thus imagine a computation as a superposition of frequencies analogous to the superposition of frequencies we see in quantum systems.

Further, the eigenvalues selected by the eigenvalue equation correspond to eigenfunctions of the operator involved, which correspond in turn to algorithms or Turing machines. The similarity between quantum mechanics and computation was first noted by Feynman and has now become an important area of research. Feynman, Nielsen & Chuang.

##### 3.6 The isomorphism of these isolated points

Each of the systems outlined above contains an invisible isolated process: Aristotle’s god enjoys the pleasure of thinking about itself [1072b15] while remaining completely independent of the world; to communicate with an inertial system would be to exert a force upon it so that it was no longer inertial; the initial singularity is isolated since it contains the Universe outside which there is, by definition, nothing; according to the theory, isolated quantum systems cannot be observed without changing them; one cannot observe the internal state of a Turing machine without breaking into its process and so destroying its determinism.

Here we propose that motions of these isolated systems are equivalent to Turing machines and therefore isomorphic to one another despite their differences in definition and their historical cultural roles. This proposition is based on the premise that the motion of a universal Turing machine embraces all computable functions, that is all observable transformations.

#### 4. From isolation to communication

##### 4.1 The Word of God

In ancient times many cultures established a one to one correspondence between their Gods and different features of human existence like love, war, reproduction and so on. The Hebrews, on the contrary, became monotheist, attributing all functions to one God, who was ultimately transformed into the Christian God.

One of the thorniest problems facing Christian theologians was reconciling Hebrew monotheism with the Trinity that the early writers had woven into the New Testament: the Father (reminiscent of the Old Testament God), the Son (who came to Earth as a man) and the Holy Spirit (who guides the evolution of Christianity).

The standard psychological model of the Trinity was first suggested by Augustine and developed by Aquinas (Augustine ad 400; Aquinas ad 1265). Their theories of knowledge require that the known exist in the knower, a representation often called a ‘mental word’ by analogy to the spoken word derived from it. Augustine, Aquinas 160

Aquinas saw this representation as accidental in human knowledge but essential in God. Thus he considered God’s knowledge of Himself, the Word of God, equivalent to God, distinguished only by the relationships of paternity and sonship established by the procession of the Word from the Father (Lonergan 1997; 2007). Lonergan

From our point of view, the procession of the Son from the Father is equivalent to the creation of a minimal two unit network within an isolated system. Christianity, guided by its interpretation of the Biblical text, proposes that the Spirit is the substantial love between the Father and the Son, but stops the process at this point. Here we see this ancient model as representing the first steps in the creation of a transfinite network of independent but communicating agents which is isomorphic to the Universe as we know it.

##### 4.2 Quantum measurement

We learn about isolated quantum systems by observing or measuring them.
Mathematically, quantum mechanics represents a measurement as an
operator called an *observable*.

In quantum mechanics the outcomes of measurement are restricted to eigenvalues corresponding to eigenfunctions of the observable. The mathematical ideas involved here are complex but, since we are dealing with communication and we are natural communicators, quantum measurement is easily explained in terms of human experience and communication theory.

Quantum mechanics tells us, in effect, that the things we see are the things we look for. From this point of view, the information available from quantum mechanics lies not in what we see, but in the frequency of appearance of the different things seen.

The *measured observable* *S* is the operator representing the system we are using to sense the unknown state *ψ* of the system to be measured. We can only observe elements of the set of eigenfunctions {*s _{k}*} of the operator *S*. No other vectors are visible, although we might assume that they exist at some point in the evolution of the measured system.

The probability of finding a given outcome is given by Born’s rule: *p _{k} = |<s _{k}|ψ>|^{2}*, where *p _{k}* is the probability of observing the *k*th eigenfunction. Born rule - Wikipedia
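Born’s rule is easily illustrated numerically. The two-dimensional state and measurement basis below are arbitrary choices of my own, made only to show the arithmetic: the squared magnitudes of the inner products give the outcome probabilities, and they sum to one.

```python
import math

# A sketch of Born's rule for a two-dimensional system.

def inner(u, v):
    """Hermitian inner product <u|v> on complex vectors."""
    return sum(a.conjugate() * b for a, b in zip(u, v))

# An orthonormal eigenbasis {s_k} of some observable S (the standard basis
# rotated by 45 degrees), and a normalized state psi: illustrative only.
s = 1 / math.sqrt(2)
basis = [(s + 0j, s + 0j), (s + 0j, -s + 0j)]
psi = (0.6 + 0j, 0.8 + 0j)

# p_k = |<s_k|psi>|^2
probs = [abs(inner(s_k, psi)) ** 2 for s_k in basis]

# Exactly one outcome occurs at each measurement: the probabilities sum to 1.
assert abs(sum(probs) - 1.0) < 1e-9
```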

Perhaps the biggest surprise in quantum mechanics is that quantum
measurement does not reveal the fixed value of a given set of parameters at
a given time, like the position of Mars at time *t*.

Instead it yields a probability distribution, telling us that if the observed
system is in state *ψ* and we observe with operator *S*, we will see one or other of a fixed set of results, each appearing with a certain probability. When we measure the spectrum of an atom, we find ‘lines’ of fixed
frequency corresponding to various processes in the atom, and each line has a certain ‘weight’ which tells us how frequently the corresponding process
occurs.

The sum of the probabilities of the possible outcomes of an observation must be 1, since it is certain that one and only one of them will appear at each measurement. This constraint is identical to the constraint on a communication source: the sum of the probabilities of emission of the various letters of the source alphabet must also be 1.

This mathematical similarity leads us to consider that a sequence of quantum measurements is (at the level of letter frequencies) statistically identical to the output of a communication source.

#### 5. Quantum field theory and the computer network

The formalism of quantum mechanics makes no particular reference to ordinary physical space or time, but deals essentially with the motion of state vectors in Hilbert space. Historically, quantum mechanics was first applied to Newtonian space, but it soon became clear that a true description of the world must combine quantum mechanics with special relativity in Minkowski space. The result of this marriage is quantum field theory (QFT).

QFT is the foundation of the Standard Model which, although very successful, fails to comprehend gravitation and suffers from some logical and mathematical difficulties. Veltman

Perhaps the most counterintuitive feature of the Universe, from the point of
view of the continuous mathematical models of classical physics, is that *all* physical observations are quantized or ‘digital’. Experimental physics revolves around classifying (‘binning’) and counting events. When we
observe the output of physicists and mathematicians (‘the literature’) we see
that it too is quantized, into discrete volumes, articles, words and symbols.

Many of the problems of QFT arise from the attempt to explain these quantized observations using continuous mathematics. Here we propose that these problems can be overcome by applying discrete mathematics, which embraces integral (Diophantine) arithmetic, logic and the theory of computation (Chaitin 1987). Chaitin

##### 5.1 Why is the Universe quantized?

Here we take the quantization of the Universe as evidence that it can be modelled as a communication system. Science proceeds by measurement. Shannon, who founded the mathematical theory of communication, saw that entropy can be used as a measure of information [Shannon 1948, Khinchin 1957]. The information carried by a point in any space is equivalent to the entropy of the space. Claude E Shannon, Khinchin

Entropy is simply a count, usually converted to a logarithm for ease of computation. In communication theory we imagine a message source *A* with a source alphabet of *i* letters *a _{i}* whose probabilities of emission are *p _{i}*. The sum of these probabilities is taken to be 1, meaning that at any moment the source is emitting one and only one letter. The entropy *H* of such a source is defined to be *H = − Σ _{i} p _{i} log _{2} p _{i}*. By using the logarithm to base 2 we measure entropy (and information) in *bits*.
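This definition is straightforward to compute. A sketch in Python (the example sources are illustrative): a fair binary source carries exactly one bit per letter, a biased source carries less, and a deterministic source carries none.

```python
import math

def entropy_bits(probabilities):
    """Shannon entropy H = -sum_i p_i log2 p_i of a source, in bits."""
    assert abs(sum(probabilities) - 1.0) < 1e-12  # one letter at a time
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair binary source carries exactly one bit per letter ...
assert abs(entropy_bits([0.5, 0.5]) - 1.0) < 1e-12
# ... a biased source carries less ...
assert entropy_bits([0.9, 0.1]) < 1.0
# ... and a deterministic source carries no information at all.
assert entropy_bits([1.0]) == 0.0
```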

The mathematical theory of communication is not concerned with the meaning of messages, only with the rate of error free transmission of strings of symbols from a certain source over a certain channel.

Given this measure of information, Shannon sought limits on the rate of error free communication over noiseless and noisy channels (Shannon 1948). The theory he developed is now well known and lies at the heart of communication engineering. Claude Shannon

In essence, Shannon showed that by encoding messages into larger blocks or packets, these packets can be made so far apart in message space that the probability of confusing them (and so falling into error) approaches zero. This is identical to the quantization observed wherever we look in the Universe.
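Shannon’s ideal codes are subtle, but the flavour of the result can be sketched with the crudest possible code: simple repetition with majority voting (my example, not Shannon’s construction). As the blocks grow longer, the probability of confusing two messages falls toward zero.

```python
import math

def repetition_error_prob(n, p):
    """Probability that majority vote over n noisy copies of a bit fails,
    each copy being flipped independently with probability p."""
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

# With a 10% flip rate, spacing messages further apart in signal space
# (longer blocks) drives the probability of confusion towards zero.
errs = [repetition_error_prob(n, 0.1) for n in (1, 3, 9, 27)]
assert all(a > b for a, b in zip(errs, errs[1:]))  # strictly decreasing
assert errs[-1] < 1e-6
```

Repetition is far from Shannon’s capacity-achieving codes, but it shows the mechanism: longer packets sit further apart in message space.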

For a given channel, Shannon’s theorems define a maximum rate of information transmission *C*. A system that transmits without errors at the rate *C* is an *ideal system*. Features of an ideal system that are relevant here are:

1. In order to avoid error, there must be no overlap between signals representing different messages. They must, in other words, be orthogonal, as with the eigenfunctions of a quantum mechanical basis [Zurek 2007]. In other words, error free communication demands quantization of messages. Wojciech Hubert Zurek

2. Such ‘basis signals’ may be chosen at random in the signal space, provided only that they are orthogonal. The same message may be encoded in any orthogonal basis provided that the transformations used by the transmitter and receiver to encode and decode the message are modified accordingly.

3. The signals transmitted by an ideal system are indistinguishable from noise. The fact that a set of physical observations looks like a random sequence is not therefore evidence for meaninglessness. Until the algorithms used to encode and decode such a sequence are known, nothing can be said about its significance.

4. As the system approaches the ideal and the length of the transmitted signal increases, the delay at the transmitter while it takes in a chunk of message for encoding, and the corresponding delay at the receiver, increase indefinitely. The ideal rate *C* is only reached when packets comprise a countably infinite number of bits.

5. Only in the simplest cases are the mappings used to encode and decode messages linear and topological. For practical purposes, however, they must all be computable. In addition, in order to recover encoded messages, the computations used to encode messages must be invertible so that the decoded message is identical to the original.

##### 5.2 Calculus and the quantum

Experience has shown that mathematical models are extraordinarily effective tools for predicting the behaviour of the physical world (Wigner 1960). The general approach to modelling a physical system is to create a ‘configuration space’ large enough to be placed into one to one correspondence with all possible physical states of the system and then to seek ‘laws’ or ‘symmetries’ (often expressed as equations) that constrain events in the configuration space to just those that are actually observed in the world. Eugene Wigner

We may date modern mathematical physics from Newton’s discovery of calculus which provides a means to discuss ephemeral states of motion using static mathematical symbolism to express the invariant features (stationary points) of a motion.

The configuration space for the Newtonian dynamics of the Solar system is
a three dimensional Euclidean space and a universal time that applies
equally to all points in the space. Newton’s fundamental insight is that
acceleration (*a*) is proportional to force (*F*), expressed by the equation *a = F/m*, where the mass *m* is the constant of proportionality between force and acceleration.

Acceleration is the rate of change of velocity (*v*) with respect to time, in symbols *a = dv/dt*. Velocity itself is the rate of change of position (*x*) with respect to time: *v = dx/dt*. Putting these together we write *a = d ^{2}x/dt^{2} = F/m*.

Such differential equations are statements of differential calculus, and enable us to calculate accelerations and forces from measurements of position. The moving system changes, the differential equation does not. Using the laws discovered by Kepler in the astronomical measurements of Brahe, Newton was able to arrive at his law of gravitation: the gravitational force acting between two heavenly bodies is proportional to the product of their masses divided by the square of the distance between them.

The inverse of differentiation is integration, which enables us to work in the opposite direction, from forces to positions. Given the law of gravitation, integration enables us to predict the future positions of the planets given their current positions. Together differentiation and integration are the subject of calculus.
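The step-by-step logic of integration can be sketched numerically. Here a body falls under a constant force per unit mass (the parameters are illustrative), and the exact answer *x = gt²/2* is known for comparison: accumulating *dv = a dt* and *dx = v dt* over many small steps predicts the future position from the force.

```python
# Integrating Newton's a = F/m numerically: small time steps carry us from
# forces to future positions, the inverse of differentiation.

g = 9.8            # constant acceleration, m/s^2 (illustrative)
dt = 1e-4          # time step, s
x, v = 0.0, 0.0    # initial position and velocity

steps = 20000      # 2 seconds of simulated time
for _ in range(steps):
    a = g          # constant force per unit mass
    v += a * dt    # dv = a dt
    x += v * dt    # dx = v dt

exact = 0.5 * g * 2.0**2   # the analytic result: 19.6 m after two seconds
assert abs(x - exact) < 0.01
```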

Mathematically, calculus is closely related to continuity. Newton’s discoveries led to close reexamination of the problems of continuity reflected in Zeno’s paradoxes and the ancient discovery that no rational number corresponds to certain geometric magnitudes like the diagonal of a unit square.

The remarkable success of Newton’s methods led most physicists and philosophers to assume that the physical Universe is continuous. Quantum mechanics was born, however, when Planck discovered that continuous mathematics cannot describe the relationship between ‘matter’ and ‘radiation’.

The development of quantum mechanics was rather slow and painful, since it was not clear how to apply the traditional mathematical methods of calculus to discrete quantum events. The eventual solution is Born’s rule, stated above (section 4.2).

The continuous equations of quantum mechanics do not describe the motions of particles in the Newtonian sense. They describe instead the probabilities of various discrete events. This raises the problem of reconciling the quantum mechanical and Newtonian descriptions of the world. A common route begins with Feynman’s path integral method (Feynman 1965). Feynman

##### 5.3 Action and the path integral

Although Newton’s method of predicting the future positions of the planets looks simple in theory, the difficulty of applying it to systems of three or more bodies motivated the search for more powerful methods.

An important result of this search is *Hamilton’s principle*: that the world appears to optimize itself using the *principle of stationary action*. Hamilton defined action (*S*) as the time integral of the *Lagrangian function* (*L*). The Lagrangian is the difference between the kinetic (*T*) and potential (*V*) energy of a system: *L = T − V*; *S = ∫ L dt*. The action is said to be stationary when small variations in the path do not change the action.
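Hamilton’s principle can be illustrated by direct computation on a discretized path. The free particle (*V* = 0, so *L* = *T*) and the step sizes below are my own illustrative choices: the straight-line path of uniform motion has the least action, and bending it in any direction only increases *S*.

```python
# A sketch of Hamilton's principle for a free particle (V = 0, L = T).
# The action of a discretized path is S = sum over steps of (m/2)(dx/dt)^2 dt.

m, dt = 1.0, 0.1

def action(path):
    return sum(0.5 * m * ((b - a) / dt) ** 2 * dt
               for a, b in zip(path, path[1:]))

n = 20
straight = [k / n for k in range(n + 1)]   # uniform motion from 0 to 1

s0 = action(straight)
for bump in (0.1, -0.1, 0.01):
    wiggled = list(straight)
    wiggled[n // 2] += bump                # perturb the midpoint
    assert action(wiggled) > s0            # any variation raises the action
```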

Hamilton’s principle has proven enormously fruitful, and serves as a bridge between classical and quantum mechanics. One way to understand this is Feynman’s path integral formulation of quantum mechanics.

A classical particle moving from *a* to *b* moves on a definite path through space and time which may be precisely computed using Hamilton’s
principle. In contrast, Feynman assumed that all possible paths from *a* to *b* are equally probable, and then used Hamilton’s principle and the quantum mechanical principle of superposition to select a particular path.

It is clear from the solution to the Schrödinger equation set down above that one quantum of action corresponds to one period of the state vector or, in our model, one elementary computation. The rate of rotation of a state vector is directly proportional to the energy associated with it so that the total number of revolutions made by the state vector associated with a particle following a certain path is exactly equivalent to the action associated with that path.

It is an axiom of quantum mechanics that the amplitude for an event which can occur in many independent and indistinguishable ways is obtained by adding the amplitudes for the individual events: *ψ = Σ _{i} ψ _{i}*.

This is the superposition principle, and is the case in the famous two slit experiment. Feynman added amplitudes for all possible indistinguishable paths to obtain the total amplitude for the event, an idea that originated with Dirac.

Feynman’s principal contribution was to apply the principle of stationary action to this superposition. The path of stationary action is that path whose neighbours have almost the same action, so that the amplitudes add ‘constructively’ thus (according to Born’s rule) maximizing the probability that this is the path actually taken. Feynman’s approach thus provides a rational basis for Hamilton’s principle: the classical trajectory is the one whose neighbours all have the same action.
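This constructive addition near the stationary path can be sketched numerically. The one-parameter family of ‘paths’ and the quadratic action below are a deliberately simplified illustration of my own, not a real path integral: paths are labelled by their deviation *d* from the classical path, with action *S(d) = d²*, stationary at *d* = 0. Amplitudes near the stationary point add almost in phase, while an equally wide window of distant paths largely cancels itself out.

```python
import cmath

# Each 'path' d contributes an amplitude exp(i S(d)) with S(d) = d^2.
# Summing over a window of paths approximates the path integral over it.

def window_amplitude(d_min, d_max, steps=2000):
    width = (d_max - d_min) / steps
    return sum(cmath.exp(1j * (d_min + (k + 0.5) * width) ** 2) * width
               for k in range(steps))

near = abs(window_amplitude(-0.5, 0.5))  # neighbours of the stationary path
far = abs(window_amplitude(10.0, 11.0))  # an equal width of distant paths

# Constructive addition near the stationary action dominates the total.
assert near > 5 * far
```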

How does this look from the computational point of view? We regard each quantum of action as the physical manifestation of one logical operation. A ‘path’ then corresponds to an algorithm, an ordered sequence of logical operations.

The efficiency of such algorithms is measured by the number of operations necessary to go from the initial state *a* to the final state *b*. More efficient algorithms require fewer operations and are very similar to one another, so that the algorithms clustered around the most efficient algorithm are very close together; that is, their action is stationary under slight variations.

##### 5.4 Feynman diagrams

The Feynman diagram is a natural complement to the path integral. The path integral gives equal weight to all possible paths between two quantum states and uses superposition and Hamilton’s principle to select the most probable path. A Feynman diagram is a representation of such a path, and the probability of a particular transition is calculated as a superposition of all the relevant Feynman diagrams.

A Feynman diagram represents a network in which the lines represent particles (messages) and the vertices or nodes where the lines meet represent interactions between the particles. The path integral approach assigns equal weight to all possible paths, and from that computes the probability of a particular transition. The Feynman diagram aggregates these weights into a larger structure that represents the fact that particles interact in every possible way: many path integrals contribute to one Feynman diagram.

Taking an even broader view, one may see the whole Universe as an aggregate of aggregates of Feynman diagrams, thus building up a universal network that embraces all physical interactions. In the network picture, we see particles as messages and vertices in the Feynman diagram as computers in the network whose inputs and outputs are messages. We can now turn from this intuitive approach to the Universe as a computer network to a more formal development of the idea.

### 6. A transfinite computer network

##### 6.1 The Cantor Universe

The purpose of this section is to construct a logical phase space in which to model the behaviour of the universal network.

We see a network as a set of memory locations whose states can be changed by the users of the network. As in everyday networks and computers, every memory location has an address and content which may be read from and written to by a processor.

In a finite network, these addresses may be placed into correspondence with the natural numbers. To address the memory in an infinite network, we turn to the transfinite numbers invented by Georg Cantor (1898).

Cantor established the transfinite cardinal and ordinal numbers by Cantor’s theorem: given a set of any cardinal *n*, there exists a set with a greater cardinal. He proved this using his famous diagonal argument. A modern axiomatic proof relies on the axiom of the power set: given a set *S*, there exists the set *P(S)* of all subsets of *S*, called the power set of *S*. If the cardinal of *S* is *n*, the cardinal of *P(S)* is *2^{n}*. This is true for all finite and infinite values of *n*.
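
For finite sets the power-set construction can be checked directly. The following sketch (written for this edition, using only Python’s standard library) enumerates all subsets of a three element set and confirms that *|P(S)| = 2^{n}*:

```python
from itertools import combinations

# Enumerate the power set P(S) of a small finite set and confirm that
# its cardinal is 2^n, where n is the cardinal of S.
def power_set(s):
    items = list(s)
    return [set(c) for r in range(len(items) + 1)
                   for c in combinations(items, r)]

S = {'a', 'b', 'c'}
P = power_set(S)
print(len(P) == 2 ** len(S))   # True: 8 subsets of a 3 element set
```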

Using this theorem, Cantor constructed the formal ‘Cantor universe’ by transfinite recursion. The set *N* of natural numbers is said to be countably infinite. Cantor used the symbol *ℵ_{0}* to represent the cardinal of *N*. The power set of *N*, *P(N)*, has the next greatest cardinal *ℵ_{1}*, and so on without end. Cantor hypothesized that *ℵ_{1}* was the cardinal of the continuum. Cohen (1966) later found that this continuum hypothesis is independent of the axioms of set theory.

##### 6.2 A transfinite symmetric universe

Here we construct a Cantor universe in a slightly different way, using
permutations rather than subsets to generate greater cardinals. Permutation
simply means rearrangement, so the triplet {*a, b, c*} may be arranged in six ways *abc, acb, bac, bca, cab, cba*. In general *n* different things can be arranged in *n × (n-1) × . . . × 2 × 1* different ways, called factorial *n* and written *n!*. The set of permutations of a number of objects forms a group called the symmetric group.
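
For finite sets this is easy to verify. The sketch below (illustrative, standard library only) generates the six arrangements of the triplet and checks the count against *n!*:

```python
from itertools import permutations
from math import factorial

# The six arrangements of {a, b, c}: abc, acb, bac, bca, cab, cba.
triplet = ('a', 'b', 'c')
perms = list(permutations(triplet))
for p in perms:
    print(''.join(p))

# In general n objects can be arranged in n! = n x (n-1) x ... x 1 ways.
print(len(perms) == factorial(len(triplet)))   # True: 3! = 6
```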

We begin with the natural numbers, *N*. These numbers have a natural order, but we can also order them in any other way we like to give a set of permutations of *N*. We know that there are *n!* permutations of *n* objects and conjecture that there are *ℵ_{n+1}* permutations of *ℵ_{n}* things. Every one of these permutations is formally different.

We take this structure as the configuration or memory space for a transfinite, symmetric network. The essential feature of this model is the fact that by ordering sets to create position significant numerals, we gain a huge increase in the complexity of numbers that can be represented with a given stock of symbols. This fact, it seems, lies at the root of the creative power of the Universe.

Cantor made a very bold claim when he presented his transfinite cardinal and ordinal numbers to the world. He wrote:

The concept of "ordinal type" developed here, when it is transferred in like manner to "multiply ordered aggregates", embraces, in conjunction with the concept of "cardinal number" or "power" introduced in §1, everything capable of being numbered (Anzahlmassige) that is thinkable and in this sense cannot be further generalized. It contains nothing arbitrary, but is the natural extension of the concept of number. (Cantor 1898; 1955: 117)

We might call this structure a transfinite symmetric network. Transfinite because it is isomorphic to Cantor’s transfinite cardinal and ordinal numbers. Symmetric because it contains all possible permutation (= symmetric) groups which in turn contain all possible groups.

##### 6.3 ‘Cantor symmetry’

Cantor [1955: 109], explaining the genesis of transfinite numbers, writes:

We shall show that the transfinite cardinal numbers can be arranged according to their magnitude, and, in this order, form, like the finite numbers, a “well ordered aggregate” in an extended sense of the words. Out of *ℵ_{0}* proceeds, by a definite law, the next greater cardinal number *ℵ_{1}*, and out of this by the same law the next greater, *ℵ_{2}*, and so on . . . without end.

This ‘definite law’ is indifferent to the size of the number upon which it operates to create the successor to that number. Let us call this definite law Cantor symmetry, or symmetry with respect to complexity.

In the symmetric universe as we have described it, the ‘definite law’ is the process of generating all the permutations of a given set of operations. The order of operations in the real world is often important (to make an omelette, beat the eggs then cook rather than cook then beat the eggs).

Such operations are ‘*non-commutative*’. One of the many features that
distinguishes quantum from classical mechanics is non-commutative
multiplication. Cantor symmetry provides us with a heuristic principle for
understanding interactions at all levels of the Universe. At this level of
abstraction, the gossip at a cocktail party and the inner workings of an atom
share the same network features, and we may use one to throw light upon
the other.

Cantor symmetry applies to quantum mechanics, so that our understanding of two state systems is essentially the same as our understanding of a system with a transfinite number of states.

##### 6.4 Computation and quantum information theory

The transfinite symmetric universe is a 'Platonic' structure whose principal properties are:

1. Unlimited size, sufficient to be placed in one to one correspondence with any real system of events, no matter how complex. We measure the size of an event by the number of quanta of action involved in its execution. By this measure, the largest event is the life of the Universe itself, which contains all other events.

2. Layered, since each new level of complexity arises from permutations of the elements of the layer beneath it, giving a sequence of ever increasing complexity *ℵ_{0}*, *ℵ_{1}*, *ℵ_{2}* . . .

Our task now becomes to find the constraints on this space which enable us
to map it to the Universe of experience. We may imagine a communication
system as a set of computable functions that can be strung together (as in an
ordinary computer) to transform the input of the channel to its output.
Isolated Turing machines are deterministic. This determinism is broken,
however, when processes communicate. Turing envisaged this situation
when he described the *oracle-* or *o-machine*. An oracle machine, like any real computer, can stop and wait for input from an outside source or oracle.

In practical communication networks, processing may also be interrupted by a message which claims priority over the current process. A message entering a network using an acceptable protocol changes the state of the machine receiving it. Given communication delay (which may effectively cause space-like separation of machines) it is impossible to predict when a machine will be interrupted and what the effect of the interrupt may be. Uncertainty (ie unpredictability) thus enters a network, even if all the functions driving the network are deterministically computable.
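
A minimal sketch (invented for illustration, not from the text) of how deterministic parts yield an unpredictable whole: two machines each send one deterministic operation to a shared register, but the arrival order of their messages is not fixed, and the operations do not commute, so different interleavings leave the register in different states:

```python
from itertools import permutations

# Two deterministic "machines" each send one operation to a shared register.
# Each operation is perfectly deterministic, but the arrival order of
# messages from separated senders is not determined, and the operations
# do not commute, so the final state is unpredictable.
messages = [('A', lambda x: x + 3),   # machine A: add 3
            ('B', lambda x: x * 2)]   # machine B: double

outcomes = set()
for order in permutations(messages):
    state = 1
    for _sender, op in order:
        state = op(state)             # each individual step is deterministic
    outcomes.add(state)

print(sorted(outcomes))               # [5, 8]: (1*2)+3 = 5 or (1+3)*2 = 8
```

The same sketch also illustrates the non-commutativity discussed in section 6.3: beating the eggs before cooking differs from cooking before beating.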

This opens the way to understanding the probabilistic nature of quantum mechanics if we interpret quantum systems as networks. In this way we replace a continuous deterministic formalism carrying a probabilistic interpretation with a set of deterministic machines whose communication with one another produces a set of events whose occurrences are unpredictable.

The events themselves, however, fall into equivalence classes which are determined by the communication protocols in use. The number of possible protocols is restricted to the number of possible Turing machines, that is to a countable infinity. If we consider quantum eigenfunctions to be computable, this suggests that the number of eigenfunctions (ie observables) in the Universe is itself countably infinite.

One motivation for the development of quantum computers is the belief that they will be much more powerful than digital machines. This hope is based on two features of current quantum mechanics, continuity and superposition.

First, because quantum mechanical superpositions in Hilbert space have continuous complex amplitudes, it is felt that even a vector in two dimensional Hilbert space (a ‘qubit’) may encode an infinity of detail and therefore be equal to an infinite digital memory.

Second, because in an infinite dimensional Hilbert space each vector may be decomposed into a superposition of an infinity of other vectors, all of which are acted upon simultaneously by operators on the space, it is felt that quantum computations can be massively parallel, computing all the instances of an algorithm represented by each element of the superposition simultaneously.

To the contrary, we have the approach taken by algorithmic information theory, which measures the total information in a transformation by the length of the program required to implement the transformation (Chaitin 1987).

From this point of view, the equation x = y defined on the real numbers does not represent an infinite amount of information, but only the few bits represented by the symbolic expression ‘x = y’. This alerts us to the fact that the entropy we assign to a set depends on how we decide to count its elements (Chaitin 1975). From a computational point of view, the algorithmic measure seems most appropriate.

Further, from the point of view of communication theory, a continuum is the essence of confusion. In continuous mathematics, the real information is carried only by the singularities, eigenvalues and other fixed points we may find within or on the boundaries of a continuous set.

From this we conclude that one can neither compute a continuum, nor use a continuum to compute, except insofar as it can be broken into disjoint pieces. All computation modelled by the Turing machine must be digital or quantized, that is, executed using discrete symbols or discrete physical entities.

Quantum mechanics makes sense if we look at it from the point of view of communication. It is an intuitively satisfactory point of view, since we are born natural communicators. It emerges from the above discussion that when we communicate with the world, even a quantum system, we are part of a network.

##### 6.5 Creative evolution

Cosmological theory, supported by a wealth of observation, suggests that the Universe began as a ‘formless and void’ initial singularity (Genesis; Hawking & Ellis 1975). We have a quite comprehensive picture of the development of the Universe within this singularity, but very little information about (i) why the initial singularity existed in the first place and (ii) what motivated its *complexification* (Teilhard de Chardin 1980).

There is really nothing to be said about (i), except to accept that we are here. The second question is more amenable to treatment. As Parmenides noted, we can only know dynamic systems through their stationary points. The stationary points in the transfinite computer network are the elements of the Cantor universe which we understand to be memory locations which carry information.

The ‘expansion’ of the Cantor universe is driven by Cantor’s theorem, which tells us that given any set, there exists a set of higher cardinal. The proof of Cantor’s theorem is non-constructive: it is proved by showing that its falsity implies a contradiction. Insofar as the Universe is consistent, it necessarily increases its cardinality. We take this ‘Cantor force’ to be the ultimate driver of universal expansion.

The theory of evolution is founded on the Malthusian observation that the exponential growth of reproducing organisms will eventually exhaust any resource supply, no matter how great. As a result the number of organisms has an upper limit and fitter organisms will survive at the expense of the less fit.

We note first that while there are only a countably infinite number of different Turing machines (computable functions), the phase space of the model is transfinitely bigger, so that we may see computation as a limiting resource in the Universe. This lays the foundation for an evolutionary scheme: many possibilities confronting limited resources, which select for those possibilities that use the resource most efficiently to secure their own survival and replication.

Although the evolutionary paradigm was first developed in the biological realm, the layered network model suggests that it accounts for the selection of all structure in the Universe, from the most elementary particles in the physical layer through all user layers to the Universe as a whole.

### 7 Some corollaries

##### 7.1 ‘The Unreasonable effectiveness of mathematics’

Eugene Wigner noted the ‘unreasonable effectiveness of mathematics’ in the physical sciences (Wigner 1960). If there is any truth in the picture painted here, this fact may have a simple explanation. Mathematics is a consistent symbolic system devised by the mathematical community. We may think of the mathematical literature as a set of stationary points in the dynamics of this community.

More generally, stationary points (particles or messages) in the observed Universe also form a symbolic system whose consistency is guaranteed by the dynamic processes which contain them. Given that the symmetric network spans the whole space of consistent symbolic systems, it may not be surprising to find that mathematics is wonderfully effective as a descriptor of the world, as Galileo proposed.

##### 7.2 The view from inside the Universe

The transfinite computer network is a dynamical model whose size is limited only by consistency. It was originally developed in a theological context as a candidate model of God. The idea is to map it onto the observed world as evidence for the thesis that the Universe is divine. Traditional models of God and the World suggest that they could not be more different. This difference is partly based on ‘proofs’ for the existence of God which prove, in effect, that God is other than the World (Aquinas 1981).

In the picture proposed here, we see ourselves as part of the living God. As noted above, Christian theologians long ago developed models of God that allowed for a Trinity of personalities in God. Here we allow for a transfinity of personalities, where we mean by personality a communication source, any entity capable of sending and receiving messages, whether it be an atom or a God.

Our existence depends upon many layers of structure beginning with fundamental particles and moving through atoms, molecules and cells, each of which is a layered network within itself and connected to its peers through an ‘internet’.

Further we ourselves are elements of larger networks, families, tribes and nations. In all of these structures, the fundamental binding element is communication, and it is hard to imagine any feature of human experience that does not fall within the network paradigm.

##### 7.3 Network intelligence

Only two sources may be involved in any particular communication within a network, but, since each of these machines is connected to many other machines, what we see if we look at a given machine for long enough is not just the machine we are looking at, but a superposition of all the inputs from all the machines in the network, weighted by the probability of communication between them and the machine we are watching.

We may see this scenario as analogous to the structure of a biological neural network like our own central nervous systems, where memory in the network is encoded in the weights of synaptic junctions between the individual neurons in the network. We may see these weights as analogous to the amplitudes for the propagation of signals between the various vertices of a Feynman diagram.

There can be little doubt that the neural networking in our central nervous systems is the hardware layer of the human mind and the seat of our intelligence. In the network paradigm proposed here, the Universe as a whole is a network similar to our individual neural networks so that we may see the cosmic intelligence as isomorphic (up to a Cantor symmetry) with our microcosmic minds.

##### 7.4 Past and future

Observable events lie on the boundary between past and future. Turing
answered Hilbert’s *Entscheidungsproblem* by devising a machine that could
perform anything that might reasonably be called a computation and then
showing that there was a set of incomputable problems which such a
machine could never complete.

In the context of Cantorian set theory, Turing showed that there is just a countable infinity of different Turing machines, capable of evaluating a countable infinity of computable functions. Since there is a transfinite number of possible functions mapping the set of natural numbers onto itself, computable functions represent at most a small fraction of possible functions.
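
The argument rests on Cantor’s diagonal construction, which can be sketched with a finite list (an illustrative toy written for this edition, not the full proof): given any enumeration of functions from *N* to *N*, a function built to differ from the *n*-th function at argument *n* escapes the enumeration.

```python
# Diagonal sketch: given an enumeration of functions from N to N, build a
# function that differs from the n-th function at the argument n. For a
# countable enumeration the same construction shows that not every function
# can appear on the list. (Toy illustration with a finite list.)
enumerated = [lambda n: n,           # identity
              lambda n: n * n,       # square
              lambda n: 2 * n + 1]   # odd numbers

def diagonal(n):
    """Guaranteed to disagree with the n-th enumerated function at n."""
    return enumerated[n](n) + 1

disagrees = all(diagonal(i) != enumerated[i](i)
                for i in range(len(enumerated)))
print(disagrees)   # True: the diagonal function escapes the enumeration
```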

Until a computer has halted, its state is uncertain. Once it has halted, its state becomes definite. In the network, no computer can determine its inputs, and so its output, if any, is also indeterminate. As a consequence, the past is definite, and not subject to any uncertainty principle. The future, however, cannot be completely determined by the past. The network model therefore suggests that the boundary between past and future can be modelled by the boundary between halted and not-halted computations.

##### 7.5 ‘Entanglement’ and ‘spooky action at a distance’

In a layered network, each layer provides an alphabet of processes or tools which are used by higher layers for their own purposes. Thus molecules use atoms, cells use molecules, people use computers, corporations use people and so on. If we assume that the fundamental hardware of the Universe is the initial singularity, peer processes in all the layers of complexity which have arisen from this singularity still use it as the ultimate physical layer for all their communications with one another.

In this picture, space-time as we experience it is an emergent property of the Universe, a layer built on the lower layer described by quantum mechanics. The structure of space-time is described by special relativity and depends upon the finite velocity of light.

As noted above, the algorithms used to defeat error in communication delay
the transmission and reception of messages. In order to encode a message of
*n* symbols into a transmissible signal function, the encoding machine must
wait for the source to emit *n* symbols, even if the computation required for
encoding is instantaneous. Since computation itself involves error free
communication within the computer, we can expect it too to add delay.
We therefore are led to believe that communication delay and quantization
are connected. Further, since the least delay corresponds to the maximum
velocity, we propose that the algorithm used to encode information into
signals that travel at the velocity of light is the fastest algorithm in the
Universe.

As ‘nature abhors a vacuum’ physicists traditionally abhor ‘action at a distance’. This abhorrence has been the historical motivation for field theories and the notion that the forces between systems are mediated by the exchange of particles.

The Einstein, Podolsky, Rosen thought experiment (Einstein, Podolsky and Rosen 1935), since frequently realized, proposed that entangled particles could act upon one another at a distance even though their separation was spacelike, requiring something greater than the velocity of light to account for their correlation, if it is due to communication (Aspect et al 1982).

We assume that the velocity of light is finite rather than infinite because of the delay in the transmission of a photon from point to point due to error preventative encoding. Conversely, we might speculate that communications that cannot go wrong, that is communications that effectively carry no information, might require no encoding and therefore travel at infinite velocity.

The observed correlations between entangled photons have been found to propagate at many times *c*, and the measurements are compatible with infinite velocity, ie instantaneous transmission over any distance (Salart et al, *Testing the speed of ‘spooky action at a distance’*).

In a practical communication network a message originating with user *A* is passed down through the software layers in *A*’s computer to the physical layer which carries it to *B*’s machine. It is then passed up through *B*’s software layers until it reaches a form that *B* can read. By analogy, communication between one system and another in the Universe must pass down to the ultimate physical layer (which we might identify as the initial singularity) then up again to the peer system receiving the message. It may be that the simplicity of the lower layers of this network makes encoding for error prevention unnecessary, so that instantaneous communication is possible.

##### 7.6 Numerical results

The first successful quantum field theory, quantum electrodynamics, has enabled physicists to compute the outcome of experiments with precision in the realm of parts per billion (Feynman 1988). Such precision compels one to believe that the theory is an accurate description of reality.

Quantum mechanical measurements are essentially frequencies, that is rational numbers expressing the ratio of the occurrences of two events. A probability is a prediction of a frequency. When we say that the probability of a tossed fair coin coming up heads is 1/2 we are predicting that if we throw the coin a sufficient number of times, the ratio of heads to total tosses will approach as closely as we like to 1/2.
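
The convergence of frequency to probability is easy to demonstrate (an illustrative simulation written for this edition, with a fixed seed so the run is repeatable):

```python
import random

# Simulated coin tosses: the observed frequency of heads approaches the
# predicted probability 1/2 as the number of tosses grows.
random.seed(42)   # fixed seed so the run is repeatable

for n in (100, 10_000, 1_000_000):
    heads = sum(random.randint(0, 1) for _ in range(n))
    print(n, heads / n)   # frequency drifts toward 0.5
```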

The measurement and computation of frequencies requires both definitions of the events to be compared (heads, tails) and a count of these events. The quantum mechanical eigenvalue equation embodies both these requirements: the eigenfunction which defines the event and the eigenvalue which measures its frequency.
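
The eigenvalue equation *A v = λ v* can be illustrated with a small symmetric matrix standing in for an observable (a minimal standard-library sketch; the matrix and its numbers are hypothetical, and a real calculation would use a linear-algebra library):

```python
import math

# Eigenvalue equation A v = lambda v for a 2x2 symmetric "observable".
# The eigenvectors define the possible events; the eigenvalues are the
# quantities measured for them.
A = [[2.0, 1.0],
     [1.0, 2.0]]

# Eigenvalues of [[a, b], [b, a]] are a + b and a - b.
eigenvalues = [A[0][0] + A[0][1], A[0][0] - A[0][1]]        # [3.0, 1.0]
eigenvectors = [(1 / math.sqrt(2), 1 / math.sqrt(2)),       # for 3.0
                (1 / math.sqrt(2), -1 / math.sqrt(2))]      # for 1.0

def matvec(M, v):
    return tuple(sum(M[i][j] * v[j] for j in range(2)) for i in range(2))

for lam, v in zip(eigenvalues, eigenvectors):
    Av = matvec(A, v)
    assert all(abs(Av[i] - lam * v[i]) < 1e-12 for i in range(2))
print("A v = lambda v verified for both eigenpairs")
```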

One must ask: if the continuum calculus methods of conventional quantum mechanics work so well, how are they to be reconciled with the digital approach suggested here? The matter clearly requires much further discussion, in a genre more appropriately published in physics journals, but an outline of the answer seems to be present in the layering of networks. Physics deals with the physical layer of the universal network, which may nevertheless comprise a number of layers, beginning with the totally simple, isolated and meaningless initial singularity upon which are built quantum mechanics and space-time.

A foundation of quantum mechanics is the principle of superposition, which works because quantum mechanics is linear. When the amplitudes of two states are added the resulting amplitude is simply the sum of the two added states: the whole is no greater or less than the sum of its parts. This contrasts with more complex non-linear systems, for which the relationship between whole and parts is more complex, as illustrated by the old adage ‘the straw that breaks the camel’s back’: a minor input leading to a major effect. Thus, in the quantum mechanical context, continuous and digital mathematics are indistinguishable.