
On the quantization of gravitation

Einstein to Besso, 1954: 'I consider it quite possible that physics cannot be based on the field principle, i.e. on continuous structures. In that case nothing remains of my entire castle in the air, gravitation theory included.' Kevin Brown
It struck me as limiting that after some 75 years, the whole subject of quantum field theory remains rooted in the harmonic paradigm, to use a dreadfully pretentious word. We have not been able to get away from the basic notions of oscillations and wave packets. Indeed, string theory, the heir to quantum field theory, is still founded firmly on this harmonic paradigm. Surely a brilliant young physicist, perhaps a reader of this book, will take us beyond. Anthony Zee: Quantum Field Theory in a Nutshell

Outline

0: Abstract
1: Logical continuity = proof: a mathematical foundation for physics
2: The Cantor universe
3: The transfinite computer network
4: Why is the Universe quantized?
5: Quantum mechanics describes a communication network
6: Fixed points and invisibility
7: Symmetry, renormalization and symmetry breaking
9: Symmetry and symmetry breaking
10: Wave mechanics: superposition and timing
Gravitation
Symmetry breaking
Conclusion
The continuum

0: Abstract

For more than 60 years much sweat, maybe some tears (and possibly a little blood) has been spent in the (so far) unsuccessful effort to quantize gravity. Here we interpret this situation as evidence that gravity is not quantized. The argument is based on the notion that the Universe may be modelled as a computer network.

An important feature of useful networks is error free communication. Shannon has shown that we can approach this ideal over a noisy channel by encoding our messages into packets so far apart in message space that the probability of confusion is minimal. We assume that this accounts for the quantization of our observations and that the exchange of particles in the physical world corresponds to the exchange of messages in the network model.

Conversely, we should not expect to find quantization where error is impossible, that is in a regime where every possible message is a valid message. Since gravitation couples universally to energy alone, and is blind to the particular nature of the particles or fields associated with energy, we can imagine that gravitation involves unconstrained and therefore intrinsically error free and non-quantized interaction.

Back to top

1: Logical continuity = proof: a mathematical foundation for physics

We begin with the initial singularity and follow the emergence of the observable Universe within this singularity. Parmenides posed the fundamental scientific question about 2500 years ago: how can we write the truth about a changing world, for what is true at this moment may be false at the next? His answer remains true: we must seek out and record the invariant features of the world that do not change over time. One important feature of the Universe is its evolutionary history which has demonstrated a record of creativity. The purpose of this essay is to cast mathematical light on the creative power of the Universe.

The initial singularity had no structure, yet it had the nous to become the world. We assume that at some scale the world still has no structure locally; in other words, every point in the world is inside the initial singularity, bounded only by inconsistency and free to do anything consistent with that bound. The Cantor universe, defined non-constructively, gives a measure of the possibilities within the bound of consistency.

From a dynamic point of view, mathematics is the work of the mathematical community. The invariant features of this work are valid mathematical theorems, which are considered to be true for all time. Theorems can be represented by static eternal texts which are shared between mathematicians and serve both as records of past achievement and the foundations for further work.

Here we follow Hilbert’s notion that mathematics is a purely formal system, a set of games played with symbols whose rules may be freely created by mathematicians. The principal criterion is that the games created should be self consistent and interesting. Such formal mathematics may be inspired by observed reality but does not necessarily depend upon it for its validity. Hilbert accepted the classical view that mathematics is consistent, complete and computable.

If we assume that the Universe is all that there is and consequently that it is subject to no external constraint, we might expect that consistency is also the sole constraint acting on reality. This would go toward explaining Wigner's observation of the effectiveness of mathematics in the natural sciences. It would also support Dirac's firm belief that 'it is more important to have beauty in one's equations than to have them fit experiment'. Eugene Wigner: The Unreasonable Effectiveness of Mathematics in the Natural Sciences, P. A. M. Dirac: The Evolution of the Physicist's Picture of Nature

As is well known, Gödel and Turing showed that Hilbert's view is untenable: there are consistent symbolic systems which are both incomplete and incomputable. Gödel's incompleteness theorems - Wikipedia

Turing's proof established a boundary on provability which serves as a limit to mathematically provable theorems and hence a limit on the fixed points in the mathematical structures that mathematicians share. He did this by devising a formal machine (now known as a Turing machine) capable of logically connecting a set of hypotheses to a conclusion. Turing devised software for this machine which enabled it to perform anything which might reasonably be called a proof, and showed that there exist initial conditions (hypotheses) for this machine which would never yield a result, thus establishing the existence of unprovable or incomputable functions. On the other hand, any completed computation is effectively a proof, and Turing's idea helped to found the current exponential deployment of computing machinery. Alan Turing: On Computable Numbers, with an application to the Entscheidungsproblem

We take the Turing machine and its derivatives to be the archetype of a logical continuum, that is a logically deterministic chain of events leading from one point to another in a symbolic space. We guess that the mechanism that couples the fixed points in the universe is a logical continuum implemented by a symbolic computer.
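
To make the idea concrete, here is a minimal sketch of a Turing machine in Python. The transition table is a hypothetical example (it increments a binary number); the point is only that the machine is a deterministic chain of steps, a logical continuum, from hypotheses (the initial tape) to conclusion (the halted tape).

```python
# A minimal Turing machine: a deterministic chain of steps leading
# from an initial tape (hypothesis) to a halted tape (conclusion).
# The rule table below is an illustrative example: binary increment.

def run_turing_machine(tape, rules, state="start", head=0, max_steps=10_000):
    """Run a one-tape Turing machine until it halts or gives up."""
    cells = dict(enumerate(tape))          # sparse tape: position -> symbol
    for _ in range(max_steps):
        if state == "halt":
            span = range(min(cells), max(cells) + 1)
            return "".join(cells.get(i, "_") for i in span).strip("_")
        symbol = cells.get(head, "_")      # '_' is the blank symbol
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return None                            # never halted: an incomputable start

# Binary increment: run to the right end of the number, then carry left.
rules = {
    ("start", "0"): ("0", "R", "start"),
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "L", "carry"),
    ("carry", "1"): ("0", "L", "carry"),
    ("carry", "0"): ("1", "L", "halt"),
    ("carry", "_"): ("1", "L", "halt"),
}

print(run_turing_machine("1011", rules))   # '1100': 11 + 1 = 12
```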

We take the address space for these computers to be the Cantor universe, the hierarchy of transfinite cardinal and ordinal numbers. The processors that work within this memory are envisaged to have the power of universal Turing machines.

Back to top

2: The Cantor universe

The Cantor universe serves as the phase space for the transfinite computer network, addressing all the processes in the network. The symbolic space and methodology of mathematics was greatly expanded with the publication of Georg Cantor's papers on transfinite numbers in 1895 and 1897. Georg Cantor: Contributions to the Founding of the Theory of Transfinite Numbers (pdf)

We may consider a point as a named entity. It has been known for a long time that neither the natural nor the rational numbers are sufficient to name all the points in a continuous geometric line. This problem prompted the invention of the real numbers to provide a number corresponding to every point in a real line. The cardinal of the continuum then becomes the cardinal of the set of real numbers. Cantor set out to find a representation of this number. Square root of 2 - Wikipedia, Real number - Wikipedia, Cardinality of the continuum - Wikipedia

Cantor's system generates new cardinal numbers by considering the ordinal numbers of sets with fixed cardinals. The foundation of the whole system is the set N of natural numbers, which is said to be countably infinite.

There is no greatest natural number, since we can always add 1 to get the next number. Cantor therefore invented the symbol ℵ₀ to represent the cardinal of N.

N has a natural order, 0, 1, 2, . . . We can permute this order in ℵ₀! ways to create the set of all permutations of the natural numbers, whose cardinal we assume to be ℵ₁.

The structure we are creating is a layered hierarchy of permutation groups, each constructed by permuting the elements of the group before it. The subscripts on the alephs number the layers in the transfinite universe. Each of these layers is already a permutation group, at least formally. We assume, therefore, that when we study the world, we will see only computable permutations of its elements.

As with the natural numbers, there is no greatest transfinite cardinal. We can always generate a new one by considering all the permutations of the largest set we have so far.
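
A finite sketch of this layering in Python: each layer is the set of permutations of the layer before it, so its cardinal grows factorially. The transfinite step, taking ℵ₀! to be ℵ₁, is an assumption of this essay; the code only illustrates the finite analogue of the growth.

```python
# Finite analogue of the Cantor hierarchy: each layer has as many
# elements as there are permutations of the layer before it.
from math import factorial

size = 3                      # cardinal of the bottom layer (illustrative)
for layer in range(3):
    print(f"layer {layer}: cardinal {size}")
    size = factorial(size)    # next layer: all permutations of this one
# layer 0: cardinal 3
# layer 1: cardinal 6
# layer 2: cardinal 720
```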

Cantor believed that the transfinite number system is capable of enumerating anything enumerable, and so cannot be further generalized. Thus the transfinite numbers provide a space large enough to encompass anything that mathematicians may imagine.

It is clear, from their method of development, that the higher transfinite numbers are enormously complex. As Cantor notes, they can be placed into correspondence with anything with fixed points, so that we may envisage, for instance, the transfinite layer corresponding to a human being, or to the whole system of the Earth, including all its life forms.

Back to top

3: The transfinite computer network

We are modelling the Universe as a layered communication network. The lowest layer in such networks is the physical layer, which handles the formation and transport of the physical symbols that carry information. The modern foundation of physics is quantum mechanics, which describes the dynamics of the physical Universe. We will take up the study of quantum mechanics in detail in chapter 4. Here we are interested in just one point: that quantum mechanical observables are the fixed points of the universal dynamics. Tanenbaum: Computer Networks

We use networks through a user interface which might be a computer or telephone. The interface enables us to transmit data to and receive data from the network. Behind the interfaces is the system which transmits data from one interface to another, the coding and switching network.

The work that goes on between the user interfaces is invisible or transparent to us. We do not become aware of it unless it breaks down and we need to understand it to fix it. From a physical point of view, the user interface of the world is the space-time in which we live. The messages we receive from the Universe are written in spacetime, and we move in spacetime to send messages back to the Universe.

A foundation of the theory of computation is the abstract machine that Turing devised to prove the existence of incomputable functions. Turing began by specifying the elementary operations of his machine and then began increasing its power by recursively specifying subroutines which served as building blocks for more complex processes, until he arrived at a universal machine which could (formally) compute any computable function. By modern standards of computer design, this is a slow and cumbersome machine, but it has the universality necessary to prove Turing's thesis. Concrete realizations of Turing's idea work much more efficiently but are subject to physical constraints that reduce their power. Alan Turing: On Computable Numbers

Every concrete computer and computer network is built on a concrete physical layer made of copper, silicon, glass and other materials which serve to represent information and move it about. The physical layer implements Landauer's hypothesis that all information is represented physically. Rolf Landauer

Software engineering is devoted to instructing the physical machinery to manipulate the physical representatives of information so as to execute formal algorithms. Software engineering - Wikipedia

In practical networks, the first layer after the physical layer is usually concerned with error correction, so that noisy physical channels are seen as noise free and deterministic by subsequent layers of the system.

Once errors are eliminated, an uninterrupted computation proceeds formally and deterministically according to the laws of propositional calculus. As such it imitates a formal Turing machine and may be considered to be in inertial motion, subject to no forces (messages).

It is well known that all the operations of propositional calculus can be modelled using the binary Sheffer stroke or NAND operation. We can thus imagine building up a layered computer network capable of any computable transformation, a universal computer, using a suitable array of NAND gates. Whitehead & Russell: Principia Mathematica, Sheffer stroke - Wikipedia
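
As a sketch of this universality, the Python below builds the standard connectives out of a single nand() function and checks them by brute force. The gate constructions are textbook; the function names are ours.

```python
# Building NOT, AND, OR and XOR from NAND alone (Sheffer stroke).

def nand(a: bool, b: bool) -> bool:
    return not (a and b)

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))

def xor_(a, b):                  # exclusive or, from four NAND gates
    c = nand(a, b)
    return nand(nand(a, c), nand(b, c))

# Verify every connective against Python's own operators.
for a in (False, True):
    for b in (False, True):
        assert and_(a, b) == (a and b)
        assert or_(a, b) == (a or b)
        assert xor_(a, b) == (a != b)
print("all gates check out")
```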

An ‘atomic’ communication is represented by the transmission of a single packet from one source to another. Practical point to point communication networks connect many sources, all of which are assigned addresses so that addressed packets may be steered to their proper recipient. This ‘post office’ work is implemented by further network layers.

Each subsequent software layer uses the layer beneath it as an alphabet of operations to achieve its ends. The topmost layer usually comprises human users. These people may be a part of a corporate network, reporting through further layers of management to the board of an organization. By analogy to this layered hierarchy, we may consider the Universe as a whole as the ultimate user of the universal network.

Processes in corresponding layers (‘peers’) of two nodes in a network may communicate if they share a suitable protocol. All such communication uses the services of all layers between the peers and the physical layer. These services are generally invisible or transparent to the peers unless they fail. Thus two people in conversation are generally unaware of the huge psychological, physiological and physical complexity of the systems that make their communication possible.

The Cantor universe is an address space created by exploiting the power of permutation to increase complexity. We now imagine that the permutation operations in this universe are performed by computing machines which transform one permutation into another. Each of these permutations may be understood as a vector from some origin to a point in the permutation space. When applied to the universe as a whole, we may imagine the origin of all these vectors as the initial singularity.

We use the fact that the cardinal of the set of computing machines is ℵ₀, the cardinal of the set of natural numbers, to set up a correspondence between computers and numbers and associate each element in these permutations with a computing process. The vector representing a point in permutation space can then be imagined as a sequence of processes joined together in a particular order leading to a particular point. The permutation space then acquires a countably infinite basis of computable functions. We may consider the processes corresponding to each basis vector of the permutation space as the steps in the process of getting from the origin of the vector to its tip.

There are ℵ₁ permutations of N but there are only ℵ₀ algorithms available for computing these functions. This suggests that a large proportion of all possible permutations are incomputable. That is, the majority of the points in the permutation space cannot be reached by deterministic means. This constraint imposes boundaries on the stable, that is computable, structures in the universe.

A stable network requires relatively error free communication between its nodes. The transfinite network is constructed as a layered network whose finite analogues are to be found in structures such as the internet. A useful network must be error free. Shannon developed the mathematical theory of communication to show how this can be done. As suggested below, this may account for the quantization of the observed universe.

From an abstract point of view there is very little difference between a computer and a network of computers. A computer is itself a network. The principal difference of importance here is clock speed or frequency of operation. We will consider a single computer to be a network whose operations are synchronized by a central clock. This clock serves to keep all the computation operations in their correct sequential order, and to provide a time interval during which dynamic changes in the hardware are ignored until the system settles into a stable state which is the starting point for the next cycle of operation. A computer network, then, is an asynchronous set of synchronous computers with the necessary arrangements, like buffering, to deal with the frequency differences.
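
A toy illustration in Python of this last point: two 'computers' ticking at different rates exchange packets through a buffer that absorbs the frequency difference. The periods chosen are arbitrary assumptions.

```python
# An asynchronous pair of synchronous 'computers' joined by a buffer.
from collections import deque

buffer = deque()
producer_period = 2     # fast machine: sends every 2 time units
consumer_period = 5     # slow machine: reads every 5 time units

for t in range(20):
    if t % producer_period == 0:
        buffer.append(f"packet@{t}")        # synchronous transmission
    if t % consumer_period == 0 and buffer:
        packet = buffer.popleft()           # asynchronous reception
        print(f"t={t:2d} consumed {packet} (backlog {len(buffer)})")
```

The growing backlog in the output shows why real networks need flow control as well as buffering when the rates differ persistently.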

Back to top

4: Why is the Universe quantized?

Quantum theory is the mathematical machinery we use to discover the stationary states of the universe and explain the transitions between them. When we know them we can possibly use them to construct things. The application of quantum mechanics to the design and construction of computing machinery is the basis of a growing industry.

Despite the fact that much of the foundation of physical theory is expressed in continuous mathematics, there is no doubt that everything we can see, from galaxies to fundamental particles, is discrete or quantized. The reason for this seems to be that a stable Universe requires error free communication, and the theory of communication shows that error can be defeated by packing information into discrete symbols.

From a mathematical point of view, a message is an ordered set of symbols. In practical networks, such messages are transmitted serially over physical channels. The purpose of error control technology is to make certain that the receiver receives a string identical to the transmitted string.

The mathematical theory of communication developed by Shannon shows that by encoding messages into discrete packets, we can maximize the distance between different signals in signal space, and so minimize the probability of their confusion. This theory enables us to send gigabytes of information error free over scratchy phone lines. We can see in our own bodies that quantum processes enable trillions of cells, each comprising trillions of atoms and molecules, to function as a stable system for many years. Claude E Shannon: A Mathematical Theory of Communication, Khinchin: Mathematical Foundations of Information Theory

A system that transmits without errors at the limiting rate C predicted by Shannon’s theorems is called an ideal system. Some features of an ideal system are:

1. To avoid error there must be no overlap between signals representing different messages. They must, in other words, be orthogonal, as with the eigenfunctions of a quantum observable.

2. Such ‘basis signals’ may be chosen at random in the signal space, provided only that they are orthogonal. The same message may be encoded into any satisfactory basis provided that the transformations used by the transmitter and receiver to encode the message into the signal and decode the signal back to the message are inverses of one another. Quantum processes are reversible.

3. The signals transmitted by an ideal system are indistinguishable from noise. The fact that a set of physical observations looks like a random sequence is not therefore evidence for meaninglessness. Until the algorithms used to encode and decode such a sequence are known, nothing can be said about its significance.

4. Only in the simplest cases are the mappings used to encode and decode messages linear and topological. For practical purposes, however, they must all be computable with available machines.

5. As a system approaches the ideal, the length of the transmitted packets, the delay at the transmitter while it takes in a chunk of message for encoding, and the corresponding delay at the receiver, increase indefinitely. Claude Shannon: Communication in the presence of noise

Shannon's idea lies at the foundation of binary digital computing. Many machines operate using signals ranging between 0 and 5 volts. One binary symbol, say 0, is represented by a low voltage in the range, say, 0 to 1. The other symbol, 1, is represented by a higher voltage range, say 4 to 5. The voltage gap between 1 and 4 separates these two signals. Given the noise level in a computer, this separation may result in less than 1 in a trillion symbols being mistaken.
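
A rough calculation along these lines, assuming (purely for illustration) additive Gaussian noise of a given RMS voltage: a symbol is mistaken when a noise sample carries the signal across the midpoint of the gap.

```python
# Per-symbol error probability for the two-level code above, under an
# assumed Gaussian noise model (the sigma value is illustrative).
from math import erfc, sqrt

gap = 3.0     # volts between the top of '0' (1 V) and the bottom of '1' (4 V)
sigma = 0.2   # assumed RMS noise voltage

# Probability that a Gaussian noise sample exceeds half the gap:
p_error = 0.5 * erfc((gap / 2) / (sqrt(2) * sigma))
print(f"per-symbol error probability ≈ {p_error:.1e}")   # ≈ 3e-14
```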

Shannon's theory tells us that an error free computation process must be discrete, symbolic or digital. The symbols used for computation may themselves be computations which we conceive as subroutines. A large computation is an ordered set of symbols which may themselves be ordered sets of symbols. When we apply this model, we see this hierarchy rooted in the initial singularity, which is considered to be at once structureless and the source of all structure.

Back to top

5: Quantum mechanics describes a communication network

When we fit the transfinite network with computers it can behave like quantum mechanics. Quantum mechanics is written in complex numbers, which are well adapted to represent periodic functions. The processes of computation are periodic, measured by phase, with the property that f(θ + 2π) = f(θ).

I assume that my intelligence and consequent insights are products of the network structure of my central nervous system. The insight that insight and quantum events are the same phenomenon at different scales then suggests that a quantum event like the emission or absorption of a particle is also mediated by a network process, and so supports the idea that quantum mechanics describes the motion of messages in a network.

The mathematical formalism of quantum mechanics assumes that the state space of the physical Universe can be represented by state vectors in complex Hilbert space of finite or infinite dimension. The joint state of two communicating quantum systems is represented by vectors in the tensor product space of the Hilbert spaces of the constituent systems.
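
A two-qubit numpy sketch of this tensor product construction (the particular states chosen are arbitrary):

```python
# Joint state of two quantum systems: the Kronecker (tensor) product
# of their state vectors.
import numpy as np

zero = np.array([1, 0], dtype=complex)               # basis state |0>
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)  # a superposition state

joint = np.kron(zero, plus)        # vector in the 4-dimensional product space
print(joint)                       # [0.707+0j 0.707+0j 0+0j 0+0j]
print(np.linalg.norm(joint))       # 1.0: the joint state remains normalized
```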

The continuous evolution of state vectors in an isolated quantum system is described by unitary operators on their Hilbert space governed by differential equations or algorithms. Since such a system is isolated, however, this continuous evolution is not directly observed but is inferred from the observed success of its consequences.

Mathematically this evolution is deterministic and reversible so that we may think of it as a process of encoding the same message in different bases. Quantum mechanics applies equally at all energies and all levels of complexity of state vectors, spaces and operators. The only truly isolated system is the Universe as a whole, represented in its simplest state by the initial singularity. If we imagine the universe as a layered network, we may understand the initial singularity as the lowest physical layer, and assume that all communications in the universe pass through this layer. Given that the universe is all that there is, we imagine the emergence of the universal network to take place within the initial singularity, and for this structureless entity to be present at every point in the universe we observe. Hawking & Ellis: The Large-Scale Structure of Space-time.

The continuous evolution of an isolated quantum system is understood to be interrupted by an observation or measurement. When we observe a system, we do not see the whole continuous system, but only an eigenvalue corresponding to one of the basis states (eigenvectors) of the operator representing the measurement process. The mathematical formalism of quantum mechanics can predict eigenvalues to a precision limited only by the skill of the theorist and the computing resources applied. On the other hand it cannot predict which eigenvalue we will observe in a particular event, only the relative frequencies of the observed eigenvalues.

Here we look upon a quantum measurement as an act of communication. A communication source A is characterized by a source alphabet ai and the corresponding probabilities pi for the emission of the symbol ai. These probabilities are normalized on the principle that the source emits one symbol at a time, so that the sum of the pi is 1.

Quantum sources are normalized in a similar way, subject to the requirement of the Born rule we use to compute the probabilities of the alphabet of measurement outcomes. The probability of finding a given outcome is pi = |<ai|ψ>|², where |ψ> is the preexisting state of the system and |ai> is an eigenvector of a measurement operator A. Born rule - Wikipedia

The ai are the fixed points of the measured system which correspond to the eigenvectors of the measurement operator. These fixed points are selected by the eigenvalue equation A|ai> = ai|ai>. Eigenvalues and eigenvectors - Wikipedia

The normalization of the probabilities of the results of quantum measurements places a constraint on the wave function |ψ>. In a Hilbert space of dimension n we may expand |ψ> = ∑i ci|i>, where the |i> are a set of basis vectors of the space. Normalization then requires that the sum of the squared moduli |ci|² of the complex amplitudes in the expansion must be 1.
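
The following numpy sketch pulls these paragraphs together: it normalizes an arbitrary state, applies the Born rule in a chosen basis, and checks that repeated sampled 'measurements' reproduce the probabilities as relative frequencies. The state vector itself is an arbitrary assumption.

```python
# Born rule and normalization: probabilities are squared moduli of
# amplitudes, they sum to 1, and sampling reproduces them.
import numpy as np

rng = np.random.default_rng(0)

psi = np.array([3 + 4j, 1 - 2j, 2j])       # arbitrary unnormalized state
psi = psi / np.linalg.norm(psi)            # normalization constraint

p = np.abs(psi) ** 2                       # Born rule: p_i = |<i|psi>|^2
print(p, p.sum())                          # probabilities, summing to 1.0

outcomes = rng.choice(len(psi), size=100_000, p=p)
print(np.bincount(outcomes) / len(outcomes))   # observed frequencies ≈ p
```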

Quantum measurements therefore have the same statistics as communication sources, and the outputs of quantum sources are quantized into a definite alphabet of symbols, which suggests a design evolved to prevent error. This is considered to be the case even when we consider continuous spectra and the dimensionality of the Hilbert space is the cardinal of the continuum.

From a communication point of view, quantum mechanics may be seen as modelling the flow of information in a network. The eigenvalue equation yields the actual values of the symbols transmitted, and the Born rule gives us the frequencies of transmission on each of the channels represented by a basis vector in the observable operator. From this point of view, particles are the messages that we receive in the act of observation or communication. They are real and have lifetimes ranging from almost zero to almost forever.

The similarity in the statistics of quantum and communication sources is reflected in the processes that generate their output. The implementation of Shannon's ideas requires the transmitter to encode messages in a noiseproof form and the receiver to decode the signals received. Engineers first achieved this coding and decoding using analogue methods, but the full power of error control requires digital computation. The layering of engineered network software transforms the human interface to the physical interface in a series of steps. All of these transformations are performed by digital computers. The functions used must be invertible (injective) and computable with the machinery available.

From a network point of view, we may consider the eigenfunctions as computations of computable functions and the eigenvalues as the fixed points reached when these computations halt. This suggests that the number of discrete eigenvalues available in quantum mechanics is equal to the number of distinct computers, that is ℵ₀.

All the machinery of quantum mechanics can be applied in the energy/time domain without considering space. From this point of view it may be considered as a single serial channel, like music transmitted from point to point in time alone. Each quantum system may be considered to operate in its own inertial frame, since quantum mechanics can exist in the absence of space. Relativity becomes relevant when we consider the interactions of quantum systems in different inertial frames. We apply the network model to this situation below.

Back to top

6: Fixed points and invisibility

The 'waves' of quantum mechanics are represented by complex periodic functions which cannot be observed. All observations are consequences of interactions between two quantum systems. At the moment of observation, the infinite set of linearly superposed solutions to the relevant differential equation is understood to collapse, so that we see the eigenvalue of just one element of the superposition. The relative frequencies of observation of each of these elements are predicted by the Born rule. We have faith in the wave equations because they work so well. Their invisibility, however, leaves their interpretation an open question which has followed quantum mechanics since its birth.

The machinery of quantum transformation is represented by the energy (or Hamiltonian) operator, which maps Hilbert space onto itself. This mapping fulfills the hypotheses of mathematical fixed point theorems like that found by Brouwer: any continuous function f mapping a compact convex set into itself has a point x0 such that f(x0) = x0. The results of quantum observations may be understood as the fixed points predicted by this theorem. Brouwer fixed point theorem - Wikipedia
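
A one-dimensional illustration: cos maps the compact convex interval [0, 1] into itself, so a fixed point is guaranteed, and simple iteration finds it.

```python
# Fixed point of a continuous map of [0, 1] into itself.
from math import cos

x = 0.5
for _ in range(100):
    x = cos(x)       # repeatedly apply the mapping
print(x)             # ≈ 0.7390851..., the point where cos(x) = x
```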

Physicists often seem to presume that measurement is something specific to physicists. Here we eliminate physicists with the assumption that elements of the Universe continually interact with one another, that is measure one another, and so communicate with one another, through the exchange of particles. The creation and annihilation of particles is a reflection of the evolution of wave functions and also controls this evolution, so that we consider the two processes to be duals of one another, carrying the same information in different forms in a manner similar to the relationship between covariant and contravariant vectors in the general theory of relativity.

Wave functions are complex vectors, and if we use the Schrödinger picture, these vectors are in continuous motion, each element of them evolving at a frequency determined by its energy.

Quantum mechanics envisages two operations on vectors, addition (superposition) and the inner product. We consider each vector as a function from the natural numbers (which index the basis states) to the complex coefficients of each element of the vector. Superposition involves the addition of the coefficients of corresponding basis vectors. The inner product involves the product of corresponding coefficients and their addition. Both these operations are simple computations. The inner product of a vector with itself is a real number which corresponds to the length of the vector, so that Hilbert space is a metric space.
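
Written as code, both operations are indeed simple computations (the vectors here are arbitrary examples):

```python
# Superposition and inner product as elementary computations.
import numpy as np

u = np.array([1 + 1j, 0, 2j])
v = np.array([1, 1j, 1 - 1j])

superposition = u + v                  # add corresponding coefficients
inner = np.vdot(u, v)                  # sum of conj(u_i) * v_i
length = np.sqrt(np.vdot(u, u).real)   # <u|u> is real: the metric on the space
print(superposition, inner, length)
```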

We understand these operations on Hilbert space to be the computations that encode and decode the messages that quantum systems in the universal network exchange with one another. These coding operations are represented by operators which are represented mathematically by unitary matrices. A feature of quantum operations is that they are reversible, as we expect of a lossless codec.

The evolution of the quantum wave function is assumed to be deterministic. Veltman writes:

A vector in Hilbert space represents a physical state. What is a physical state? A physical state is a possible physical situation with particles moving here and there, with collisions, dogs chasing cats, with people living and dying. With all sorts of things happening. Often people make the mistake of identifying a physical state with the situation at a given moment, one picture from a movie. The situation at some moment may be seen as a boundary condition: if one knows the whole situation at some moment, and one knows the laws of nature, then in principle we can deduce the rest.

The evolution of the wave function is invisible because it can only be seen if it is communicated, and the communication process also requires computation to encode and decode the message. If a computer were to stop to explain itself after every operation, it would be required to explain the operations of communication associated with this stop, and so would not be able to make any progress on the original computation. For this reason we can only see the results of halted processes.

We can see this feature of a communication network at work in the classical quantum mechanical two slit experiment, described in detail by Feynman. When we fire particles at a barrier with two slits and do not check which slit the particle goes through, we get an interference pattern. If we make a measurement to check which slit the particle went through, however, we lose the interference pattern. Feynman: Feynman lectures on physics: vol III, Quantum mechanics, Double-slit experiment - Wikipedia

Our observation has the effect of stopping the interference process before it is complete, so that there is no interference. So we cannot both have our process and observe it. All we see are the fixed points that serve as the messages between quantum processes each operating in their own inertial frame.

Feynman writes:

'One might still ask: "How does it work? What is the machinery behind the law?" No one has found any machinery behind the law. No one can "explain" any more than we have "explained". No one will give you any deeper representation of the situation. We have no ideas about a more basic mechanism from which these results can be deduced.' Feynman, Leighton & Sands FLP III:01: Chapter 1: Quantum behaviour, §7

The network model, digital to the core, may give us some idea of an underlying mechanism. We assume that the observable fixed points in the Universe are the states of halting computers and that the invisible dynamics of the Universe are executed by invisible computers. We suspect the presence of deterministic digital computers because of the precision with which nature determines the eigenvalues of various observations.

Until the advent of quantum mechanics, physicists were generally inclined to believe that the world was deterministic. They still attribute determinism to the invisible process that underlies quantum observations, but they now have to accept that even though this process may be deterministic, it does not determine the actual outcome of events, but rather the relative frequencies of the various outcomes that may result from a particular event. Laplace's demon - Wikipedia

Back to top

7: Symmetry, renormalization and symmetry breaking

The Cantor universe is an enormous formal system. Although we might see it as sufficient to talk about all the structure in the Universe, it seems far too large to deal with the local world in which we live, which usually contains thousands of people and things, not a countable infinity, the smallest cardinal in the Cantor universe.

How do we cut this enormous system down enough to deal with local situations like neighbourhoods, atoms and subatomic particles? It seems quite reasonable to assign the number zero to the initial singularity. It is a structureless existent, which has no parts and therefore no cardinal, like the empty set. Where is zero in the Cantor universe?

The idea we need is 'machine infinity'.

Renormalization is a process of changing the resolution of our view of the world. The resolution in each layer is a function of the computing power in that layer. Each layer in the network is in effect a stand-alone system, which takes its input from the layer above and delivers instructions to the layer below and vice versa. This is how software makes the hardware work, and the hardware the software. The software is created by human platonic gods, the software engineers; the hardware by chipmakers and their chains of inputs and outputs.

The effect of the renormalization group is to pick out the computable permutations in the layer. This work is bounded by computability, and uncomputable permutations are effectively excluded by working around them. This is the strategy of the mathematical theory of communication, choosing code points which are surrounded by a ball of noise protection, modelled in function space. Claude Shannon: Communication in the presence of noise

The computable transformations in the transfinite network stand out as a set of fixed points, each described by a fixed algorithm which can be executed by a computer. The layers are all self similar in that any deterministic and repeatable processes to be found within them must be the product of some computation. They all have this limitation in common, and it serves as a symmetry to guide our understanding of the world. Although the power of computers is limited by Turing's theorem, there is no practical limit on the number of computers that can be connected into a network. New computers can be connected as long as there are symbolic addresses available.

Back to top

9: Symmetry and symmetry breaking

Symmetries are situations where we imagine some change but nothing observable happens. They are the practical boundaries of the dynamic Universe. We may picture this to a degree by imagining the string of a piano or guitar. When struck, the string vibrates at every point except at the two ends, which are held still by the structure of the instrument. Symmetry - Wikipedia

When we consider the Universe as divine, we can imagine the symmetries discovered by physics as the boundaries of the divinity. From a logical point of view, the dynamics of the Universe is consistent. The boundaries of the dynamics are the points beyond which it would become inconsistent, that is non-existent.

Back to top

10: Wave mechanics: superposition and timing

Quantum mechanics began in 1900 when Planck discovered that the energy emitted by a hot body came in 'quanta', each with an energy proportional to its frequency. The constant of proportionality has come to be known as Planck's constant, which appears in the Planck-Einstein relation E = ℏω.

The next major step came in 1913 when Niels Bohr and Ernest Rutherford modelled the hydrogen atom as a positive nucleus orbited by a negative electron. Classical electrodynamics predicts that the acceleration of the orbiting electron would cause it to radiate energy and quickly fall into the nucleus. Spectroscopic measurements of hydrogen line frequencies led Bohr to guess that this did not happen in the atom. Instead the electron energy could remain fixed in distinct orbits ('orbitals') around the nucleus. In this model, successively larger orbitals have orbital angular momenta measured in integral multiples of Planck's constant. Bohr model - Wikipedia

Bohr guessed that the energies corresponding to spectral lines are the energy differences between different possible electronic orbitals. Using classical electrodynamics, he was able to calculate these energies from the electric charge of the electron and the proton and the assumption of quantized orbital angular momentum. His results closely approximated the observed spectrum of hydrogen, but did not work well for more complex atoms with more than one electron.

The next step came with Louis de Broglie's 1923 hypothesis that not just photons but all particles are associated with a wave whose spatial frequency is a function of their momentum. From this point of view, the stationary states of the orbiting electron may be seen as standing waves whose wavelength is an integral sub-multiple of the circumference of the electron orbit, similar to the fundamental and overtones of a vibrating string. Louis de Broglie: Radiation, waves and quanta

Soon afterwards, in 1925, Heisenberg and Schrödinger published two versions of quantum mechanics, matrix mechanics and wave mechanics, which were later shown to be equivalent. The new quantum mechanics provided a more comprehensive application of de Broglie's idea which came to be understood as the application of a Hamiltonian operator to a complex Hilbert space whose bases could be understood as stationary quantum states interpreted as standing waves in a probability amplitude described by a wave function. As described above, the probability amplitude is a complex number whose absolute square represents the probability of finding the system in a particular state. The overall state of a quantum system is understood to be the superposition of all the wave functions represented by the eigenfunctions of the Hamiltonian.

A quantum system may be understood as the superposition of a large (possibly infinite) number of basis states defined by their relative frequencies. The frequencies of stationary states in nature appear to be defined with absolute precision. There is an uncertainty of about one cycle in counting the number of cycles in a wave train, which is represented by the relation ΔE Δt ≥ ℏ/2. As the wave train and observation time become longer, the proportional error in the energy estimate decreases linearly. If we were able to observe a stationary quantum state for an infinite time, we could establish its energy with absolute precision.
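
A quick numerical illustration of this linear decrease, for an assumed 1 GHz oscillation observed for longer and longer times:

```python
# Counting cycles with an uncertainty of about one cycle: the
# proportional frequency (energy) error falls as 1/(f * T).
f = 1.0e9                        # assumed frequency: 1 GHz
for T in (1e-6, 1e-3, 1.0):      # observation times in seconds
    cycles = f * T
    print(f"T = {T:>6} s: ~{cycles:.0e} cycles, "
          f"proportional error ~ {1 / cycles:.0e}")
```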

The execution of an abstract Turing machine is an ordered sequence of operations which do not involve time or frequency. A practical physical computer, on the other hand, operates in space-time and its power is measured by a combination of its memory space and its processing frequency.

A computer is a network of 'gates', small elements that can execute Boolean algebra. This algebra is quite simple. It is a set of elements, functions and axioms. The elements have two states which we may represent by '0' and '1', 'true' and 'false', 'high' and 'low' or any other duality. There are three functions, often written 'and', 'or' and 'not', which are defined by truth tables. There are four axioms. Boolean algebra is closed, meaning that boolean operations on boolean variables always lead to boolean results. It is also commutative and associative, like ordinary algebra, and distributive, the 'and' taking precedence over 'or'. Such a network can (in principle) do anything that a Turing machine can do. Boolean algebra - Wikipedia
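
These laws can be checked exhaustively over the two-element algebra, as this short sketch does:

```python
# Brute-force check of the Boolean laws named above over {0, 1}.
from itertools import product

B = (False, True)
AND = lambda a, b: a and b
OR = lambda a, b: a or b

for a, b, c in product(B, repeat=3):
    assert AND(a, b) == AND(b, a) and OR(a, b) == OR(b, a)    # commutative
    assert AND(AND(a, b), c) == AND(a, AND(b, c))             # associative
    assert AND(a, OR(b, c)) == OR(AND(a, b), AND(a, c))       # distributive
print("Boolean laws verified on {0, 1}")
```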

The operations of a single computer are controlled by a clock which produces a square wave by executing the not operation at a frequency determined by a physical oscillator. In modern machines, this frequency lies in the gigahertz range. The clock pulses maintain synchronicity between all the logical operations of the machine. From a quantum mechanical point of view, the clock represents a stationary state.

We may connect single computers into a network, even though the clock frequencies may not be the same. In general the rate of communication between networked computers can be expected to be much slower than the rate of communication within each computer.

Superposition

Zurek has shown that this restriction on the completeness of observation is necessary if we are to obtain information from a quantum system. This suggests that the quantization of observation and the requirements of mathematical communication theory are consistent with one another.

From a communication point of view, quantum mechanics does not reveal actual messages but rather the traffic on various links. If we assume that the transmission of a message corresponds to a quantum of action, the rate of transmission in a channel is equivalent to the energy on that channel.

We may see the communication theoretic equivalent of quantum mechanical transformations as the computational transformation of messages between different encodings using different alphabets.

Gravitation

Let us assume that gravitation is the fundamental force in the universe and that, from the network point of view, it can be modelled as the most physical layer. We have already noted that we can use a binary logical operation, NAND, to build a computable universe, but there are simpler unary operations, no operation, NOP, and NOT.

An important feature of both these operations is that they cannot go wrong, and therefore have no need for error correction, or, on our assumptions, quantization. In other words, we may think of them as continuous. They are in a sense empty transformations.

Let us associate dynamical action with the execution of a Turing machine. From one point of view, because it represents a deterministic transformation, we can think of a Turing machine as a NOP, since (following Einstein) we think of reality going its own way and in no way changing because someone observes it from a new frame of reference.

The NOT operation is more complex, in that it introduces the idea of time or sequence. First there was p, now there is not-p, NOT(p) = not-p. From this point of view the NOT operation is part of a clock, a ticker without a counter. If we could count the ticks of this clock, we would introduce energy, the rate of action. Since down here in the foundations of the universal network there are still no observers, we can imagine that energy exists, but is not measured.
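
A two-line illustration of NOT as a ticker:

```python
# Iterating p -> not p yields a period-2 'clock' without a counter.
p = False
for tick in range(6):
    print(int(p), end=" ")   # prints: 0 1 0 1 0 1
    p = not p
```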

Since gravitation couples only energy, and sees only the energy of the particles with which it interacts, we might assume that it is a feature of an isolated system, that is one without outside observers. The universe as a whole is such a system.

Symmetry breaking

The mathematical axioms of quantum mechanics represent unobserved systems by a function space (set of algorithms) of finite or infinite dimension which evolves by unitary transformations of the functions represented in the space.

The observational axioms of quantum mechanics place constraints on this mathematical system. We assume that there can be but one actual outcome from each event, no matter how many possible outcomes there may be. This requirement is fixed by defining the length of permissible state vectors to be 1. Given the formalism, this gives a quantum mechanical event the same normalization as a communication source. For a quantum system the sum of the probabilities of possible outcomes is 1; for a communication source, the sum of the probabilities of possible messages is 1.

The discussion above suggests that gravitation is concerned with events with only one possible outcome. Things stay the same (which can hardly be called an event) or they change from p to not-p. We might imagine this situation as characteristic of the initial singularity.

Let us now endow this rudimentary universe with memory, so that both p and not-p can exist simultaneously. Since the two parts of this broken symmetry are identical and indistinguishable, they interfere quantum mechanically and our initial singularity becomes a two state system.

Now we can imagine that p can observe not-p, that is communicate with it. We expect this to be possible because despite their difference, the two states have a common ancestry. In network terms, they are both users of the layer of processing beneath them. This layer is the undivided universe which we may imagine as a power distribution network, a flow of completely unmodulated (and therefore error free) energy.

The vector representing an observation in quantum mechanics exists in a Hilbert space which is the tensor product of the Hilbert spaces of the observing system and the observed system. Observation is thus quantum mechanically connected to the increase in complexity of the world and the breaking of symmetry. As the universe complexifies, it remains true that all emergent processes share the energy layer and so are subject to gravitation.

Conclusion

Memory is by definition something that stays the same through the passage of time, only changing when it is changed by some message or force. The origin of space is the origin of history - the origin of memory. We can get no closer to the initial singularity than the beginning of memory, because everything before that is forgotten. No particle can see before its own creation. Here we have a statement that applies to the universe as a whole as well as to every particle within it, a cosmological principle.

Whenever there is observation (communication), there is quantization. There can be no observation when there are no observers. This is the initial state of the universe described by gravitation.

With the breaking of the initial total symmetry of the initial singularity space, time, memory and structure enter the universe to give us the system we observe today. Nevertheless, as in a network, the more primitive operations are to be found inherent in the more complex. So everything one does on a computer or network can be broken down into an ordered network of NANDs.

Gravitation, which describes nothing but the conserved flow of processing power, that is energy, is thus present at every point in the universe. In the context of our present four dimensional universe, this conserved flow is described by Einstein’s field equations. Because this flow cannot go wrong, it requires no error protection, and so it does not need to be quantized.

This line of argument suggests that efforts to quantize gravity may be misdirected. It may also give us some understanding of 'spooky action at a distance', which has been found to propagate much faster than the velocity of light. Let us guess that the velocity of light is limited by the need to encode and decode messages to prevent error. If there is no possibility of error, and so no need for coding, it may be that such 'empty' messages can travel with infinite velocity. Juan Yin et al: Bounding the speed of 'spooky action at a distance'

Logical continuity does not require an ether or a vacuum, it exists independently of any continuous substrate and its whole reality can be expressed in the truth tables for the logical connectives. This is a very hard idea to absorb because it looks like action at a distance, but there is no distance in logic, no metric, just the logical operation of a deterministic computing machine.

Isaac Newton described the motions of the heavens using the law of universal gravitation. He refused to speculate about the mechanism that implements this law, attributing this work to God: 'In him are all things contained and moved'. Newton's mathematical development of gravitation implied instantaneous action at a distance, which many saw as a defect in his work. The Newton Project Canada: Newton's General Scholium

The discovery of electromagnetism raised a similar problem, which led to the notion of a field to carry forces between bodies. The term was first used by Michael Faraday. Faraday realised that electric and magnetic fields could be understood as independent entities that carried energy. The field concept has since become central to physics, and some authors consider fields to be the fundamental reality. The particles which we observe are then considered to be 'epiphenomena', energetic excitations in the underlying field. Royal Institution: Michael Faraday's iron filings

Quantum mechanics goes deeper, producing vastly more detailed accounts of interaction represented by much fiercer looking mathematics. It has become a truism in physics that our space-time intuition, very helpful in imagining the orbits of planets, is of limited help in quantum mechanics. Bernard d'Espagnat

Following Faraday and Maxwell, quantum mechanics seeks to describe the inner workings of the world in terms of invisible fields that choreograph the creation, annihilation and interplay of the particles that we do see. Quantum field theory has often worked to give us astronomically precise matches between calculation and observation. This has led to a certain amount of hubris in the community, some imagining that we are seeing the mind of Newton’s God in the models we use to predict the behaviour of observable particles. Davies: The Mind of God: Science and the Search for Ultimate Meaning

Despite its success, quantum field theory is prone to logical and mathematical problems that have so far made it impossible to bring gravitation into the field theoretical fold. Greene: The Elegant Universe

Field theory sees the forces acting between observable particles as the unobservable exchange either of the particles themselves, or of gauge particles (like photons or gravitons) created and annihilated by the interagents.

One of the fundamental principles of quantum mechanics is that what we see depends upon how we look. We observe only eigenvalues corresponding to the orthogonal eigenvectors of our measurement operator. So let us look at quantum mechanics not with space-time intuition, but with social, that is networking, intuition. Wojciech Hubert Zurek: The quantum origin of quantum jumps

From this point of view, the exchange of particles is an exchange of messages. Messages on a network act as a force, changing the states of both sender and receiver. So if you tell obedient children to tidy up, they will move as though impelled by a force. Others may not hear the message at all and continue their lives unperturbed.

From this point of view, the invisible substratum of the universe we call field becomes analogous to the computation in a real world network as it transforms ordered sets of data from one encoding to another.

Here we draw attention to two features of networks, the need for error correction if they are to be useful and stable; and the layering of software on a hardware foundation in real world computer networks. Tanenbaum: Computer Networks

The continuum

Continuous mathematics gives us the impression that it carries information at the highest possible density. Cantor’s set theory has shown us how to treat continua as transfinite sets of discrete points. This theory is so convincing that we are inclined to treat such points as real. Much of the trouble in quantum field theory comes from the assumption that point particles really exist.

The field of quantum computation believes that the information density of the continuum is real and hopes to exploit it. It is inclined to see the quantum world as a perfect analogue computer. From the point of view of algorithmic information theory, however, a continuum is just that, a featureless nothing. Nielsen & Chuang: Quantum Information and Quantum Computation, Gregory J. Chaitin: Randomness and mathematical proof

Analytical continuity is the logical foundation of the argument from continuity, embedded in tensor calculus. Einstein’s principle of general covariance embodies the idea that if two points of view of the same observable event can be joined by a continuous transformation, both express the same information in a different coordinate frame (or language). Einstein: Relativity: The Special and General Theory, Continuity - Wikipedia

This idea is supported by algorithmic information theory, which holds that there can be no more information in a theory derived logically from a set of axioms than there is in the axioms themselves. This leads us to suspect that there is no more information in differential geometry than is contained in the notations used to represent it. Chaitin: Information, Randomness and Incompleteness: Papers on Algorithmic Information Theory

It also suggests that we identify the argument from continuity as a species of ‘logical continuity’. A logical continuum is equivalent to the deterministic argument (that is, the computation) that binds the hypothesis of a theorem to its conclusion: if we accept Euclidean geometry, the interior angles of a triangle total 180 degrees. A Turing machine or deterministic computer is a logical continuum.

Continua, therefore, represent no information in themselves, but represent the transmission or transformation of information without error from one representation to another.

Gravitation

Let us assume that gravitation is the fundamental force in the universe and that, from the network point of view, it can be modelled as the most physical layer. We have already noted that we can use a binary logical operation, NAND, to build a computable universe, but there are simpler unary operations, no operation, NOP, and NOT.

An important feature of both these operations is that they cannot go wrong, and therefore have no need for error correction, or, on our assumptions, quantization. In other words, we may think of them as continuous. They are in a sense empty transformations.

Let us associate dynamical action with the execution of a Turing machine. From one point of view, because it represents a deterministic transformation, we can think of a Turing machine as a NOP, since (following Einstein) we think of reality going its own way and in no way changing because someone observes it from a new frame of reference.

The NOT operation is more complex, in that it introduces the idea of time or sequence. First there was p, now there is not-p, NOT(p) = not-p. From this point of view the NOT operation is part of a clock, a ticker without a counter. If we could count the ticks of this clock, we would introduce energy, the rate of action. Since down here in the foundations of the universal network there are still no observers, we can imagine that energy exists, but is not measured.

Since gravitation couples only to energy, and sees only the energy of the particles with which it interacts, we might assume that it is a feature of an isolated system, that is, one without outside observers. The universe as a whole is such a system.

Symmetry breaking

The mathematical axioms of quantum mechanics represent unobserved systems by a function space (set of algorithms) of finite or infinite dimension which evolves by unitary transformations of the functions represented in the space.

The observational axioms of quantum mechanics place constraints on this mathematical system. We assume that there can be but one actual outcome from each event, no matter how many possible outcomes there may be. This requirement is fixed by defining the length of permissible state vectors to be 1. Given the formalism, this gives a quantum mechanical event the same normalization as a communication source. For a quantum system, the sum of the probabilities of the possible outcomes is 1; for a communication source, the sum of the probabilities of the possible messages is 1.
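
The shared normalization can be shown directly. In this minimal numpy sketch (the amplitudes are arbitrary inventions), normalizing the state vector to length 1 makes the Born-rule probabilities of the outcomes sum to 1, just as the letter probabilities of a communication source do:

```python
import numpy as np

# A two-outcome quantum 'source': normalize the state vector, then read
# off outcome probabilities as the absolute squares of the amplitudes.

psi = np.array([3.0, 4.0j], dtype=complex)   # arbitrary amplitudes
psi = psi / np.linalg.norm(psi)              # enforce |psi| = 1

probabilities = np.abs(psi) ** 2             # Born rule: p_i = |a_i|^2
print(probabilities, probabilities.sum())    # [0.36 0.64] 1.0
```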

The discussion above suggests that gravitation is concerned with events with only one possible outcome. Things stay the same (which can hardly be called an event) or they change from p to not-p. We might imagine this situation as characteristic of the initial singularity.

Let us now endow this rudimentary universe with memory, so that both p and not-p can exist simultaneously. Since the two parts of this broken symmetry are identical and indistinguishable, they interfere quantum mechanically and our initial singularity becomes a two state system.

Now we can imagine that p can observe not-p, that is communicate with it. We expect this to be possible because despite their difference, the two states have a common ancestry. In network terms, they are both users of the layer of processing beneath them. This layer is the undivided universe which we may imagine as a power distribution network, a flow of completely unmodulated (and therefore error free) energy.

The vector representing an observation in quantum mechanics exists in a Hilbert space which is the tensor product of the Hilbert spaces of the observing system and the observed system. Observation is thus quantum mechanically connected to the increase in complexity of the world and the breaking of symmetry. As the universe complexifies, it remains true that all emergent processes share the energy layer and so are subject to gravitation.
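
The multiplication of complexity under observation can be illustrated with the tensor product itself. In this sketch (the states are invented for illustration), numpy's kron computes the product state, and the dimensions multiply rather than add:

```python
import numpy as np

# The joint Hilbert space of observer and observed is the tensor product
# of their spaces, so a 2-state system observing a 3-state system yields
# a 6-dimensional joint state.

observer = np.array([1.0, 0.0])               # 2-dimensional system
observed = np.array([0.6, 0.0, 0.8])          # 3-dimensional system

joint = np.kron(observer, observed)           # tensor product state
print(joint.shape)                            # (6,): 2 x 3, not 2 + 3
```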

Conclusion

Memory is by definition something that stays the same through the passage of time, only changing when it is changed by some message or force. The origin of space is the origin of history - the origin of memory. We can get no closer to the initial singularity than the beginning of memory, because everything before that is forgotten. No particle can see before its own creation. Here we have a statement that applies to the universe as a whole as well as to every particle within it: a cosmological principle.

Whenever there is observation (communication), there is quantization. There can be no observation when there are no observers. This is the initial state of the universe described by gravitation.

With the breaking of the total symmetry of the initial singularity, space, time, memory and structure enter the universe to give us the system we observe today. Nevertheless, as in a network, the more primitive operations remain inherent in the more complex, so that everything done on a computer or network can be broken down into an ordered network of NANDs.

Gravitation, which describes nothing but the conserved flow of processing power, that is energy, is thus present at every point in the universe. In the context of our present four dimensional universe, this conserved flow is described by Einstein’s field equations. Because this flow cannot go wrong, it requires no error protection, and so it does not need to be quantized.

This line of argument suggests that efforts to quantize gravity may be misdirected. It may also give us some understanding of 'spooky action at a distance', whose speed of propagation has been experimentally bounded at many times the velocity of light. Let us guess that the velocity of light is limited by the need to encode and decode messages to prevent error. If there is no possibility of error, and so no need for coding, it may be that such 'empty' messages can travel with infinite velocity. Juan Yin et al: Bounding the speed of 'spooky action at a distance'


Space-time is the momentum-energy layer of the universe: momentum and energy transform under the Lorentz group in the same way as distance and time. All the fundamental particles are subroutines of the universe. Some, like the photon and the electron, need only space-time / momentum-energy to exist. Others, like quarks, need the more complex toolkit found in baryons and mesons, and so cannot exist outside these systems; they are confined [maybe the fractional electric charge says something about this?].

5: The mathematics of continuity

Heisenberg: "The problem of elementary particles is a mathematical one, it is simply a question of how to construct a non-linear relativistically invariant quantized wave equation without any fundamental constants." Pais, page 428


Mathematicians have studied the question of continuity for a long time. Here we suggest that continuity in the mathematical sense is not a characteristic of the functioning of the universe, but rather the result of applying the law of large numbers to vast numbers of discrete logical operations, each of which may be imagined as a step in a potential. The continuous representation serves very well, however, for treating the statistics of the fundamental processes of the Universe, and it serves that role effectively in field theories, which imagine continuous processes whose continuity is occasionally broken by an interaction.

Since the first extant records of scientific and philosophical thought, people seem to have considered the world to be continuous. This is despite the fact that it is populated by a vast number of discrete objects ranging from grains of dust to people, trees and cities. The only exception to the hypothesis of continuity that we know of was Democritus, and even he considered the space occupied by atoms to be continuous.

The development of modern physics that began with Galileo Galilei motivated mathematicians to the study of functions. One of the most important steps forward was the application of calculus to the study of motion by Newton and others. The physicists raised many difficult problems that led to functional analysis becoming a central theme of mathematics. Many theorems dealing with the representation, continuity and differentiability of real and complex functions were discovered and applied to physical problems. Among the most important developments are the differentiable manifolds of relativity and the function spaces of quantum theory.

Intuitively, 'continuous and differentiable' means without gaps or jumps. Dedekind formalized 'without gaps' with the 'Dedekind cut', which provides a method to construct real numbers in the gaps between the rational numbers. The cut is defined logically and is generally agreed by the mathematical community to be a valid means to construct a real number like the square root of 2. Dedekind cut - Wikipedia
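
The logical character of the cut lends itself to a computational sketch. Assuming only the rational arithmetic of Python's fractions module, bisection narrows the gap between the rationals below and above the cut for the square root of 2, using nothing but the logical test q·q < 2:

```python
from fractions import Fraction

# The Dedekind cut for the square root of 2: rationals whose square is
# less than 2 form the lower class, the rest the upper class. Bisection
# squeezes the two classes together without ever naming an irrational.

lo, hi = Fraction(1), Fraction(2)     # lo*lo < 2 <= hi*hi
for _ in range(30):
    mid = (lo + hi) / 2
    if mid * mid < 2:
        lo = mid                      # mid falls in the lower class
    else:
        hi = mid                      # mid falls in the upper class

print(float(lo), float(hi))           # both ~ 1.41421356..., gap ~ 1e-9
```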

The notion 'without jumps' means intuitively that an infinitesimal change in the domain of a function yields an infinitesimal change in its range. This idea can be formalized with Weierstrass's 'epsilon-delta' argument and similar proofs. Once again a logical process is used to describe a property of a continuum. Differentiable functions are continuous, since the limit that defines the derivative can exist only where the function has no jumps. Continuous function - Wikipedia, Differentiable function - Wikipedia

The mathematical properties of continuous and differentiable functions depend heavily on the concept of a limit, which in turn depends upon the notion of continuity. We define the derivative f'(x) of the function f(x) as the limit, as h approaches 0, of (f(x + h) - f(x)) / h. This expression makes no sense when h = 0, for then we would be dividing by zero, but we assume that everything goes smoothly as h gets smaller, and we reach a fixed value of f'(x) somewhat before h actually reaches 0. This is in effect an argument from continuity, which depends ultimately on the logical definition of continuity. Limit (mathematics) - Wikipedia
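
This behaviour is easy to watch numerically. In the sketch below (the function and step sizes are chosen arbitrarily), the difference quotient settles toward f'(1) = 2 as h shrinks, and in floating point it degrades again before h ever reaches 0, where the quotient is undefined:

```python
# The difference quotient (f(x+h) - f(x)) / h for f(x) = x**2 at x = 1.
# It approaches the derivative 2 as h shrinks, then floating-point
# cancellation spoils it: the limit is reached 'before h reaches 0'.

def f(x):
    return x ** 2

x = 1.0
for h in [10.0 ** -k for k in range(1, 13, 2)]:
    print(f"h = {h:.0e}  quotient = {(f(x + h) - f(x)) / h:.12f}")
```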

This suggests that logical continuity is prior to and more powerful than geometric continuity. If we can find a way to substitute logical arguments for the arguments from continuity employed in physics we may be able to put the subject on a stronger foundation. Cantor's set theory effectively digitized the continuum into sets of points. There is something of a paradox in trying to describe something continuous with discrete points. Aristotle considered things to be continuous if they had points in common, like a chain. The modern mathematical definition of continuity accepts that all the points in a line are distinct, yet they are so crowded that we can find no gaps between them.

Much of the trouble in quantum field theory comes from the assumption, noted above, that such points, and the point particles modelled on them, really exist. The problem is solved if we assume that a particle is not a geometrical but a logical object: a self-sustaining proof, proving itself as I prove myself.



Back to top


The world is psychological, made of distinct events which can best be understood by modelling them as a computer network. The actual network is invisible, but we see its inputs and outputs, the observable world. We kick the ball, it flies toward the goal, but the keeper intercepts it: an enormously complex yet infinitesimally small part of the overall functioning of the cosmos. Everything is done by [quantum] jumps which can be represented by truth tables. Some organizations of logical jumps come to exist in preference to others depending on their reproduction and resistance to noise. The only things we can observe are logical jumps, that is, discrete events [modelled as mathematical symbols]. There is nothing to be seen between events, but we can speculate. So far the dominant method of speculation has been continuous mathematics. Here we go a bit deeper, consider the logical foundation of this apparently continuous motion, and realize that the continuity we see is the law of large numbers at work.

5: Continuity and Noether's theorem

Even when talking about continuous quantities, mathematics is expressed in symbolic or digital form. Physical motion appears continuous, and so it has been accepted since ancient times that the space and time in which we observe motion are also continuous. From the point of view of communication theory and algorithmic information theory, however, a continuum, with no marks or modulation, can carry no information. This idea is supported by Emmy Noether's theorem, which links symmetries to invariances and conservation laws, different expressions of the fixed points in the dynamic Universe. Neuenschwander, Nina Byers, Noether's theorem - Wikipedia

Each of these three terms is equivalent to the statement 'no observable motion'. From the observer's point of view nothing happens, although we may imagine and model some invisible motion or transformation, like the rotation of a perfect sphere. Gauge theory - Wikipedia

The Universe is believed to have evolved from a structureless initial singularity to its current state. The evolutionary process is a product of variation and selection. The variation is made possible by the existence of non-deterministic (ie non-computable) processes. The duplication of genetic material during cell division, for instance, is subject to a certain small error rate which may ultimately affect the fate of the daughter cells. Evolution - Wikipedia

Selection culls the variations. The net effect of variation and selection is to optimize systems for survival, that is for stability or the occupation of a fixed point (which may be in a space of transfinite dimension). Before the explicit modelling of evolution, however, writers like de Maupertuis and others speculated that the processes of the world were as perfect as possible. Yourgrau & Mandelstam: Variational principles in dynamics and quantum theory

Mathematical physics eventually captured this feeling using Lagrangian mechanics. An important result of this search is Hamilton's principle: that the world appears to optimize itself using a principle of stationary action. Noether succeeded in coupling the action functional to invariance and symmetry, to give us a broad picture of the bounds on the Universe as those fixed points where nothing happens. The conservation of action (angular momentum), energy and momentum forms the backbone of modern physics.
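
Hamilton's principle can be sketched numerically. Assuming a free particle travelling from x = 0 at t = 0 to x = 1 at t = 1, the discretized action below is computed for the straight-line path plus a variable bulge, and is stationary (here a minimum) when the bulge vanishes:

```python
import numpy as np

# Discretized action S = sum (m/2) v^2 dt for trial paths with fixed
# endpoints. The classical straight line makes the action stationary.

def action(bulge, n=100, m=1.0):
    t = np.linspace(0.0, 1.0, n)
    x = t + bulge * np.sin(np.pi * t)   # trial path; endpoints stay fixed
    v = np.diff(x) / np.diff(t)         # velocity on each segment
    return np.sum(0.5 * m * v ** 2 * np.diff(t))

for b in (-0.2, -0.1, 0.0, 0.1, 0.2):
    print(f"bulge {b:+.1f}  action {action(b):.4f}")
# The action is least at bulge 0.0: the straight, classical path.
```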

Noether's work is based on continuous transformations represented by Lie groups. Symmetry also applies to discrete transformations, as we can see by rotating a triangle or a snowflake. We understand symmetries by using the theory of probability. We may consider all the 'points' in a continuous symmetry as equiprobable, and for some discrete symmetries this is also true as we see in a fair coin or an unloaded die (ignoring the identifying marks on the faces). Lie Group - Wikipedia

Communication theory also introduces the statistics of a communication source whose discrete letters are not equiprobable: each letter a_i of the alphabet has probability p_i such that Σ_i p_i = 1. Shannon then defines a source entropy H = -Σ_i p_i log p_i, which is maximized when the p_i are all equal. We may consider a quantum system as a communication source emitting discrete letters (events) with a certain probability structure. The condition Σ_i p_i = 1 is enforced in quantum mechanics by normalization.
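
Shannon's formula is simple to compute. The sketch below (with arbitrary example distributions) evaluates H = -Σ_i p_i log p_i and confirms that, for a fixed alphabet, the entropy is greatest when the letters are equiprobable:

```python
import numpy as np

# Source entropy in bits. The term 0 log 0 is taken as 0 by convention.

def entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

print(entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0 bits: the maximum
print(entropy([0.7, 0.1, 0.1, 0.1]))       # ~1.357 bits
print(entropy([1.0, 0.0, 0.0, 0.0]))       # 0.0 bits: no information
```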

From this point of view, quantum mechanics predicts the frequency of traffic on different legs of the universal network, and quantum field theory enables us to model the nature and behaviour of the messages (particles) passing through this network.

Back to top

5: How do we prove that the evolution of ψ is deterministic?

The evolution of the wave function is considered to be both continuous and deterministic. The uncertainty in quantum mechanics is believed to arise at the moment of observation, when the deterministic evolution of all the superposed solutions to the wave equation 'collapses' and one solution is picked at random. Dirac writes: '. . . a measurement always causes the system to jump into an eigenstate of the dynamical variable that is being measured . . .'. The statistics of this choice are described by the Born Rule. Dirac: The Principles of Quantum Mechanics

If we see a quantum observation as a communication source, we may imagine the eigenvalues as the source alphabet, and their relative probabilities as a measure of the source entropy computed by the Shannon formula

H = -Σ_i p_i log p_i

where p_i is the probability of the letter s_i of the source S.

The Born Rule embodies two features of quantum mechanics. First, that the probability of an event is the absolute square of the probability amplitude for that event: p = |ψ|². Second, that the probability amplitude to observe the state ψ_1 in a system whose state is ψ_2 is the inner product of ψ_1 and ψ_2, which we write <ψ_2|ψ_1>. In Hilbert space, the inner product is a measure of the angle between the two vectors ψ_1 and ψ_2. Its absolute square, the corresponding probability, may take any value between 0 (orthogonality) and 1 (parallelism).
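
Both features can be checked in a few lines. In this sketch (the states are arbitrary inventions) the inner product <ψ_2|ψ_1> is computed with numpy, and its absolute square gives a probability between 0 and 1:

```python
import numpy as np

# The Born rule: p = |<psi2|psi1>|^2, ranging from 0 for orthogonal
# states to 1 for parallel states.

psi1 = np.array([1.0, 0.0], dtype=complex)
psi2 = np.array([np.cos(0.3), np.sin(0.3)], dtype=complex)

amplitude = np.vdot(psi2, psi1)     # <psi2|psi1>: conjugate dot product
print(abs(amplitude) ** 2)          # ~0.9127, i.e. cos(0.3)**2
```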

Zurek

In his axiomatization of Hilbert space, von Neumann took care to make the space complete, so that every Cauchy sequence of vectors converges to a limit within the space.

9: Gravitation: the zero entropy physical network

Here we understand the transfinite universe as a layered set of function spaces generated by permutation. The cardinal of the set of mappings of a set of n symbols onto itself is n^n: even a set of 0 symbols maps onto itself in exactly one way, and the hierarchy continues without end. This structure is capable of representing any group, since the permutation group of any finite cardinality contains all possible groups of that power.
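
A finite shadow of this tower is easy to tabulate. Assuming nothing beyond elementary counting, the sketch below lists, for small n, the n^n mappings of an n-symbol set onto itself and the n! permutations among them; iterating 'the mappings of the mappings' grows without end:

```python
from math import factorial

# Symbols, self-mappings (n**n) and permutations (n!) for small sets.
# Even the empty set maps onto itself in exactly one way (0**0 = 1).

for n in range(5):
    print(n, n ** n, factorial(n))
```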

In an engineered network all messages between users drill down through the software layers to the physical layer at the transmitting end. At the receiving end the message works its way up through layers of software to the receiving user. How does this process look in the physical world?

The general structure of the Universe as we know it is layered in various ways. Following this trail backwards, we come to the initial singularity, which we assume to be pure action primed to differentiate into the current Universe.

We imagine gravitation to describe the layer of the Universal network which is concerned with the transmission of meaningless identical symbols, simply quanta of action. Gravitation sees only undifferentiated energy, and is therefore blind to the higher layers of the Universe which introduce memory, correspondences and meaning. Quantum mechanics is also non-local, suggesting that it describes a layer of the universal network antecedent to the spatial layer. This view is reinforced by the observation that quantum mechanics requires no memory, each state deriving from the state immediately preceding it.

Back to top

www.naturaltheology.net is maintained by The Theology Company Proprietary Limited ACN 097 887 075 ABN 74 097 887 075 Copyright 2000-2020 © Jeffrey Nicholls