Essay 28: On creating the world
[Chapter 6 of scientific-theology.com]

0: Abstract
1: Introduction
2: Some principles: The mechanism of creation
3: Reconstruction: some problems
4: A new domain for physics: a transfinite network of logical computation
5: Parmenides, Plato, Aristotle, Aquinas, pure act, and god
6: Action to energy
7: Hilbert space interpreted as the transfinite logical domain
8: Space-time and the differentiable manifold
9: The emergence of space-time: the velocity of light
10: Relativistic quantum mechanics
11: The mathematical theory of communication
12: Noether’s theorem and symmetry
13: Computer networks: ‘logical continuity’ is more powerful than physical continuity
14: Cognitive cosmology
15: The bounds on the universe introduced by undecidability, incomputability and unpredictability
16: A theological conclusion
0: Abstract
We are working here on the assumption that the initial singularity predicted by general relativity is both structureless and the source of the universe, and therefore identical to the traditional God. Traditional theology holds that God created a universe independent of themself. Here we assume that the universe we inhabit emerged with the divinity by a process analogous to that pioneered by Augustine and Aquinas to explain the multiplication of divinity in the Christian doctrine of the Trinity. The creative process, which we might call cosmic darwinism, relies on the random reproduction of the initial singularity constrained by a selective process that demands local consistency.
1: Introduction
This chapter is intended to lay a foundation for the union of physics and theology, so it comprises features of both. Its methodology, following Aquinas and Einstein, is to derive conclusions logically from a set of principles. Its motivation is to lay the foundation for the logically consistent world which science suggests we occupy. Whenever we encounter an apparent contradiction, further research usually uncovers a rational explanation. We might attribute the explosive development in mathematics since the end of the nineteenth century to the philosophy of formalism, which liberated mathematics from its connection to observed reality and required only that it be an internally consistent symbolic system. Tradition places the same constraint on the omnipotence of God: the only thing God cannot do is create a real local contradiction. Aquinas, Summa I, 25, 3: Is God omnipotent?
An important proponent of formalist mathematics was David Hilbert, and one can imagine that he was motivated by Georg Cantor's discovery of set theory and the transfinite numbers. Cantor's work drew criticism from theologians who thought that God is the only actual infinity. Hilbert saw Cantor's work as a mathematical paradise and opposed any attempt to deny that it was the true home of mathematics. He stated in a lecture given in 1925 that "From the paradise, that Cantor created for us, no-one shall be able to expel us." Cantor's paradise - Wikipedia
We know now that the basic mechanisms of the universe are described by quantum theory. We communicate with the quantum world through classical space-time which serves as our interface for inputs to quantum processes and their outputs. The union of physics and theology must therefore be based on quantum mechanics.
Traditional theology and quantum theory are both built around action. Aristotle arrived at the unmoved movers through his theory of potential and action. Aquinas rebuilt the theory of the Christian God on Aristotle's work, agreeing that the unmoved mover and God can both be defined as pure action. Two thousand years after Aristotle, action reappeared in early modern physics through the work of de Maupertuis. He thought that the creator would have made the world so that everything was achieved with the minimum of action. This idea was refined until the principle of extremal action introduced to physics by Gauss and Lagrange was found to be an effective pointer to natural dynamics in both classical and quantum physics.
While individual quanta of action are intrinsically simple, like letters of an alphabet, taken together they create a complex system which differentiates them by the relationships between them. We follow this buildup for a while until we come to the fundamental particles and gravitation and then turn back to reconcile it to quantum field theory. We then expand these ideas to the evolution of the Universe, our solar system and our lives. The backbone for this expansion is the network model. Martinus Veltman (2003): Facts and Mysteries in Elementary Particle Physics
2: Some principles: the mechanism of creation
2.1: Action and creation God is traditionally pure action. Aristotle first reached this conclusion using his ideas of potential and actuality, together with his axiom that no potential could actualize itself. This led him to propose an unmoved mover motivating the world. Aquinas introduced the same idea to prove the existence of God.
Aristotle may have been motivated to think of potential as a passive state by his use of the ideas of matter and form to explain change. His mentor Plato imagined that the structure of the world was determined through some unexplained and imperfect mode of participation in an invisible world of perfect eternal forms. Aristotle brought these forms down to earth and proposed that change was the result of some passive matter changing from one form to another. To manage this change, he proposed four causes: matter and form, which constitute real substances, an agent which causes the change of form and a final cause, the purpose motivating an agent.
Aquinas adapted this cosmology to Christianity. God, the creator, takes the place of the unmoved mover and motivates the changing forms of the material world. God creates and acts for a purpose. In traditional Christianity God made the world to expose their glory to the appreciation of intelligent beings, humans and angels. The forms which guide God's creation of the world are the ideas in the mind of God. God did not act for their own good, which is already perfect, but to give obedient angels and Christians an eternal vision of divine magnificence. Divine justice dictates that the disobedient suffer an eternity of punishment.
In modern physics the dichotomy between potential and actuality remains, but the axiom that no potential can actualize itself is history. Potential and kinetic energy are exactly equivalent, as we see in harmonic oscillators like the pendulum which (in the absence of friction) would change potential energy to kinetic energy and back again forever.
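The exchange between the two forms of energy can be checked numerically. A minimal sketch, modelling the frictionless pendulum as a harmonic oscillator with unit mass and unit spring constant (both assumptions for illustration):

```python
import math

# Frictionless harmonic oscillator: d2x/dt2 = -x (unit mass and spring
# constant). Integrated with the energy-preserving velocity Verlet scheme.
def simulate(x=1.0, v=0.0, dt=0.001, steps=10_000):
    energies = []
    for _ in range(steps):
        v_half = v + 0.5 * dt * (-x)      # half-step velocity update
        x = x + dt * v_half               # full-step position update
        v = v_half + 0.5 * dt * (-x)      # second half-step velocity update
        energies.append(0.5 * v * v + 0.5 * x * x)  # kinetic + potential
    return x, v, energies

x, v, energies = simulate()
# Kinetic and potential energy trade places forever while their total
# stays constant (to integration error).
print(max(energies) - min(energies))
```

The total energy here is 0.5 throughout, while the kinetic and potential parts oscillate between 0 and 0.5, an illustration of the exact equivalence the paragraph describes.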
The fundamental equation of quantum theory says that energy is the time rate of action, E = hf. This simple scalar equation is expanded into the vector equation known as Schrödinger's energy equation:

iħ ∂|ψ> / ∂t = H|ψ>

where ∂|ψ> / ∂t is the rate of change of |ψ>, which is a vector corresponding to f. H is the energy (or Hamiltonian) operator, a matrix of the same dimension as |ψ>, which may be infinite.
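The relation E = hf and Schrödinger's equation can be sketched together numerically. A minimal example, assuming a two-level system with a diagonal Hamiltonian and natural units in which h = 1:

```python
import cmath, math

h = 1.0                    # Planck's constant in natural units (assumption)
hbar = h / (2 * math.pi)

# Two-level system with a diagonal Hamiltonian H|k> = E_k|k>, so
# Schrodinger's equation i*hbar d|psi>/dt = H|psi> integrates to
# psi_k(t) = exp(-i E_k t / hbar) psi_k(0): each component oscillates
# at frequency f_k = E_k / h, i.e. E = hf.
E = [1.0, 3.0]
psi0 = [1 / math.sqrt(2), 1 / math.sqrt(2)]

def evolve(t):
    return [cmath.exp(-1j * Ek * t / hbar) * c for Ek, c in zip(E, psi0)]

# Unitary evolution conserves the norm <psi|psi> = 1 at all times,
# and component k returns to itself after one period 1/f_k = h/E_k.
for t in (0.0, 0.3, 1.7):
    print(sum(abs(c) ** 2 for c in evolve(t)))
```

The printed norms all equal 1 to floating point error, illustrating that the evolution is a rotation in state space rather than a gain or loss.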
Here we identify the initial singularity (and therefore God) with action. Action appears in the modern world as a quantum embodied in fundamental particles or processes. It may seem counterintuitive to identify the smallest action in the universe with the universe as a whole. This is made possible by the creative property of action. The Trinity stopped at three for doctrinal reasons. Here we suggest that the same mechanism has no bound and may lead to structures measured by Cantor's transfinite numbers. As Shannon demonstrated, quantization is required for error free communication.
2.2: Entropy and control Having identified divinity with action, Aquinas then goes on to derive the properties of God from the properties of action as he understood it. The first of these is simplicity. This is a hard attribute to understand from a modern point of view, given that it is held that God has an intimate knowledge of every feature of every moment of the world. How is this information to be stored in an entity without marks? It is consistent with the ancient view that knowledge is related to spirituality rather than being a physical entity. Here we do not require God to know everything about the universe from the beginning in order to create it in detail, but we do accept that the initial singularity is absolutely simple in the Thomistic sense, with no internal structure to carry information. A consequence of this, in the light of the cybernetic principle of requisite variety, is that the initial singularity has no control over its future. This is consistent with our experience of quantum mechanics and the reason for Einstein's feeling that the theory is incomplete. We can predict the precise nature of quantum events and calculate the probability of defined quantum events with precision, but we cannot predict exactly when they will happen.
Lack of entropy correlates with inability to control, but from the point of view of statistical mechanics, entropy is itself a measure of a potential gradient that points toward increasing complexity. Quantum states are inclined to decay when the decayed state (eg an electron in a lower energy state and a photon free in the universe) has higher entropy than the undecayed state. Ultimately the increase in entropy correlates with creation, which encompasses both increasing control and increasing randomness and points to the future, so we see it as the arrow of time. Here we come to the bifurcation between unitary quantum transformations and the increase in entropy caused by "observation". Practically, this tension is resolved by evolution, which selects control out of randomness.
2.4 P and NP The P versus NP problem in the theory of computation brings the computer network structure into play.
2.5 Evolution, randomness and selection Given that the initial singularity has the potential to act, but no control over when it acts, we expect a random sequence of actions. Since energy is inversely proportional to the period between actions, we expect this to create a spectrum of energies analogous to the vacuum of quantum field theory. At this point we have time but no space, so we can imagine a Hilbert space of vectors which are differentiated by period, or inverse frequency, and a superposition of all these vectors which becomes more complex as time goes by. They may all be considered to be entangled since they have a common origin, and this entanglement will create correlations in the set of vectors.

2.6 From Hilbert space to Minkowski space The fundamental formal quantum mechanical representation of the universe is Hilbert space, and we imagine the initial singularity to be represented by a ray (normalized vector) in one dimensional Hilbert space. In the traditional theology of the Trinity the Son or Word of God is understood to be the Father's image of themself. Here we represent this situation by the creation of a two dimensional Hilbert space through the tensor product of the one dimensional space with itself, and imagine this process continuing in an unbounded way represented by the creation of the Cantor universe of transfinite numbers, a process captured in the colloquial term "big bang". Wojciech Hubert Zurek: Quantum origin of quantum jumps: breaking of unitary symmetry induced by information transfer and the transition from quantum to classical

2.7 Fixed point theory and the closed universe / totality of divinity Traditional theology interprets the divinity as the realization of all possible being, sometimes known as the fullness of being. The exact meaning of this phrase is unclear, and it raises some difficulty with the existence of a universe other than God, since if God is everything, there is no room outside God for something else.
The identification of God and the Universe removes this difficulty, and raises the issue of what the fullness of being might mean in the context of the observable universe. Here we take it to mean that there is nothing outside it. The universe embraces all possible consistent structure, and to go outside it is to go into inconsistency which is, by definition, not being. From this point of view we can say the universe is closed, the boundary of the closure being the boundary between logical or formal consistency and inconsistency. Fixed point theory holds that any closed, convex and continuous system whose dynamics can be represented by a function f(x) of states x contains points which are stationary in the sense that they map onto themselves, f(x) = x.
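The stationary points of fixed point theory are easy to exhibit numerically. A minimal sketch, using f = cos as an arbitrary illustrative contraction mapping on [0, 1]:

```python
import math

# Iterating a contraction mapping drives any starting point to a
# point that maps onto itself, f(x) = x (Banach's fixed point theorem;
# f = cos is chosen purely for illustration).
def fixed_point(f, x=0.0, tol=1e-12, max_iter=10_000):
    for _ in range(max_iter):
        nxt = f(x)
        if abs(nxt - x) < tol:
            return nxt
        x = nxt
    raise RuntimeError("did not converge")

x_star = fixed_point(math.cos)
print(x_star)             # ~0.7390851332, the stationary point
print(math.cos(x_star))   # the same value: f(x*) = x*
```

The iteration stops at the point where applying the dynamics changes nothing, the computational analogue of the "stationary" states discussed in the text.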
3: Reconstruction: some problems
Cosmological studies suggest that the idea that the universe began within an initial singularity is plausible, but this notion comes with a few problems.
First, Hawking and Ellis conclude (364): ". . . the actual point of creation, the singularity, is outside the scope of presently known laws of physics." An interpretation of the principle of conservation of energy suggests that the initial singularity is a scene of infinite temperature and energy density, which may seem to many an unphysical hypothesis.
Second, one can imagine that an initial point bearing a huge amount of energy could be the source of a big bang but details of this emergence are not particularly clear. High energy physics does show, however, that particles appear wherever there is sufficient concentration of energy.
Third, the theory of black holes suggests that an initial singularity analogous to a black hole is unlikely to explode. Hawking devised a quantum mechanism through which black holes can ‘evaporate’ by releasing black body radiation, but this is very slow. A black hole the size of the Sun, for instance, would take many times the age of the Universe to decay.
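The slowness of the evaporation is easy to quantify with the standard order-of-magnitude formula for the Hawking evaporation time, t = 5120 π G² M³ / (ħ c⁴):

```python
import math

# Standard order-of-magnitude estimate of black hole evaporation by
# Hawking radiation: t = 5120 * pi * G^2 * M^3 / (hbar * c^4).
G = 6.674e-11           # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.055e-34        # reduced Planck constant, J s
c = 2.998e8             # speed of light, m s^-1
M_sun = 1.989e30        # solar mass, kg
age_universe = 4.35e17  # ~13.8 billion years, in seconds

t_evap = 5120 * math.pi * G**2 * M_sun**3 / (hbar * c**4)
print(f"{t_evap:.1e} s")               # ~6.6e74 s, ~2e67 years
print(f"{t_evap / age_universe:.1e}")  # ~1.5e57 universe ages
```

A solar-mass hole thus outlives the present age of the universe by more than fifty orders of magnitude, which is the force of the "many times" in the text.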
4: A new domain for physics: a transfinite network of logical computation
If a time-reversed black hole is not an option, how could the universe have emerged from an initial singularity?
Misner, Thorne and Wheeler speculate about the existence of a “pregeometry” which pre-existed the emergence of space-time. This pregeometry is imagined to be implemented by formal logic rather than metric space.
Although we write logical arguments out at length on surfaces such as paper, the formal logic itself lies outside both space and time. The essence of a long non-constructive proof comes down to a statement (a logical consequence of the initial hypotheses) that p = not-p, thus proving the hypothesis false.
Although our arguments are represented in time and space, formal mathematical proofs are considered to be local and eternal, outside space-time. Since a computer is a deterministic machine, it is essentially isomorphic to a proof: any proof can be represented by such a machine.
A real computer existing in space-time is a network of physically embodied logical operators, memory and a clock. The problem of devising a practical computer was to imitate the formality of mathematical proof. This is achieved by using a clock to hide the dynamics of the logical operators.
A clock pulse sets the elements of the computer in motion. After an interval long enough for everything to reach a steady state, the next phase of the clock causes the resulting state to be written to memory. The next pulse carries that computation another step forward. As Turing noted, a human computer can stop work after the completion of any computational step and begin again after an indefinite interval. Alan Turing: On Computable Numbers, with an application to the Entscheidungsproblem
A computer is a periodic function. The essential difference between a computer and a computer network is that the clocks in a network are not synchronised, so that buffering mechanisms are necessary to synchronise network operations.
Network computation is a formal structure invariant with respect to complexity, as we can see when we complete a non-constructive proof of whatever length with the operator not-and (nand), where, for instance, the inputs nanded are number theory and the proposition that there is a largest prime number.
We can create a logical domain for physics by combining Cantor’s theory of transfinite numbers with Turing’s theory of computation. The cardinal of the set of Turing machines is equivalent to the cardinal of the set of natural numbers, so that we can map one onto the other.
Cantor developed the transfinite numbers by considering combinations and permutations of natural numbers. Local elements of a network are sequences of logical operations which may be modelled by Turing machines and combined and permuted by analogy to the transfinite numbers. Computers are connected by mapping the output of one to the input of the next.
The simplest computer does nothing. Its output is identical to its input, so it is identical to an error free communication channel. The next most complex operations are not and and, which combine to give nand, also known as the Sheffer stroke. Any computation may be executed by a suitable network of memory and nand operators.
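The functional completeness of nand claimed here is easy to verify directly; a minimal sketch:

```python
# NAND is functionally complete: every Boolean operation can be built
# from networks of nand alone, as the text claims.
def nand(a, b):
    return not (a and b)

def NOT(a):     return nand(a, a)
def AND(a, b):  return nand(nand(a, b), nand(a, b))
def OR(a, b):   return nand(nand(a, a), nand(b, b))
def XOR(a, b):  return OR(AND(a, NOT(b)), AND(NOT(a), b))

bits = (False, True)
assert all(NOT(a) == (not a) for a in bits)
assert all(AND(a, b) == (a and b) for a in bits for b in bits)
assert all(OR(a, b) == (a or b) for a in bits for b in bits)
assert all(XOR(a, b) == (a != b) for a in bits for b in bits)
print("NAND reproduces NOT, AND, OR and XOR on all inputs")
```

Each derived operator is itself a tiny network of nand gates, so any larger circuit built from these operators is, by substitution, a network of nand alone.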
The resulting transfinite computer network may serve as a domain in which to explore the transition from initial singularity to universe. We assume that this network operates over both real and complex numbers so it can model the emergence of the stationary real world from the dynamic world of complex amplitudes, exemplified by the Born rule for quantum measurements.
Like the set theoretical growth of the transfinite numbers, the transfinite computer network may be seen as a construction process beginning with the empty set representing the initial singularity and growing without end, each step in this growth embodied in the next.
5: Parmenides, Plato, Aristotle, Aquinas, pure act, and god
Formalism in science has a long history.
The first gods to emerge in human history were modelled on contemporary warlords, kings, queens and emperors, but about 500 BCE Greek philosophers began to produce more abstract cosmic models of god which are still current in western theology.
A goddess revealed to Parmenides that there are two sources of truth: what we see in ephemeral day to day life, and the deep, perfect, eternal heart of the universe. Parmenides' idea was taken up by Plato with his theory of ideas. He saw our world as a pale shadow of a perfect formal heaven.
In his Physics, a study of motion, Aristotle brought Plato's forms down to earth with his theory of matter and form. He elaborated this idea into a theory of potency and act, and defined motion as the transition from potential being to actual being. The fundamental axiom of this theory states that no potential can actualize itself.
From this idea Aristotle arrived at the notion of an unmoved mover responsible for all the action in the universe. The fundamental axiom requires that this entity be pure actuality. The unmoved mover was coopted in its entirety by the medieval Christian theologian Aquinas (12xx), the only difference being that, as required by his dogmatic faith, Aquinas put God outside rather than inside the universe. Here I revert to Aristotle's view.
In his discussion of the first mover, Aristotle noted that forms may guide action, but are not a source of action.
6: Action to energy
Let us assume that the initial singularity is a quantum of action. A natural logical definition of action is that it is the operation that changes some p (eg red) into not-p (eg green). In the binary space of formal logic we assume that not-(not p) = p, but if we remove the binary constraint we may find that not-(not p) = pᵢ (eg blue . . .).
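The difference between the binary and the unconstrained "not" can be sketched with three hypothetical states standing in for red, green and blue:

```python
# In binary logic "not" is an involution: not(not(p)) == p.
# Dropping the binary constraint, as the text suggests, a generalized
# "not" can cycle through more states: red -> green -> blue -> red.
states = ["red", "green", "blue"]

def general_not(p):
    return states[(states.index(p) + 1) % len(states)]

assert general_not("red") == "green"
assert general_not(general_not("red")) == "blue"   # not-(not p) != p
# Only after as many applications as there are states do we return to p:
p = "red"
for _ in range(len(states)):
    p = general_not(p)
assert p == "red"
print("binary 'not' has order 2; this ternary 'not' has order 3")
```

The binary "not" is the special case with two states, where the cycle closes after two steps and we recover not-(not p) = p.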
It may seem difficult to reconcile the physical quantum of action with the initial state of the universe. The logical definition of action, however, presupposes no spacetime measure. We may attribute the frequent execution of quanta of action in the current universe to the replication of the initial singularity by self reference.
Energy may exist in a binary logical space by repeated execution of the quantum of action to give us a formal waveform. Let us assume that the quantum of action can be identified with the execution of any halting computation in the logical domain.
Recursive complexification is central to Turing’s paper on computation and the universe appears to follow this pattern, developing through fundamental particles, atoms, molecules, crystals, cells and so on.
The cybernetic principle of requisite variety requires that one system can control another only if the entropy of the controlling system is greater than the entropy of the system controlled. Given that the entropy of the initial singularity is zero, it can exercise no control over the sequence of states emerging within it, which are therefore random, so that the rate of action, the formal measure of energy, has no constraint on its value. Ross Ashby: An Introduction to Cybernetics
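Ashby's principle can be sketched in counting form (the numbers are illustrative): a regulator with R distinct responses facing D equally likely disturbances can confine the outcome to at best ⌈D / R⌉ states, or in bits, H(outcome) ≥ H(disturbance) − H(regulator).

```python
import math

# Requisite variety, counting version: only variety in the regulator
# can absorb variety in the disturbances.
def min_outcome_variety(D, R):
    return math.ceil(D / R)

D = 16  # disturbance variety (illustrative)
for R in (1, 2, 4, 16):
    print(R, min_outcome_variety(D, R))

# A regulator with zero variety (R = 1, one fixed response, like the
# structureless initial singularity) absorbs nothing: outcomes = D.
assert min_outcome_variety(D, 1) == D
```

With R = 1 the outcome variety equals the disturbance variety, which is the formal sense in which the zero-entropy initial singularity can exercise no control.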
7: Hilbert space interpreted as the transfinite logical domain
Given a complex waveform, we are in a position to introduce quantum theory. The non-locality of quantum mechanics revealed by Bell suggests that quantum mechanics emerged before the advent of space-time. John Bell: Speakable and Unspeakable in Quantum Mechanics
We proceed therefore on the assumption that quantum mechanics is a theory of energy defined as repeated action, one quantum of action per cycle. We are here in the purely formal phase of the emergence of the universe where syntactic structures stand by themselves, obeying only the demand of consistency without reference to any meaning.
The lack of any constraint arising from the initial singularity means that the Hilbert space of the universe's amplitudes comprises vectors representing every element of the transfinite domain of computation.
Quantum mechanics comprises three axioms.
First: Vectors representing different energy states, that is different frequencies, are orthogonal. We measure the distance between vectors by an inner product and we may create new vectors by adding old ones (superposition).
Second: the sequential states of a quantum system are related to one another by a unitary transformation which is in effect a reversible (entropy conserving) rotation in state space.
Third: this dynamical system mapping onto itself fulfills the requirements of mathematical fixed point theory. When it is embedded in four dimensional spacetime this process is described as measurement and the selection of fixed points is sometimes called “the collapse of the wave function”.
Communication requires sources to share a code or basis. Zurek notes that real values in quantum mechanics (eigenvalues) result from the inner product of a vector with its complex conjugate. This is the selective process that selects stationary real states in quantum observations. Wojciech Hubert Zurek: Quantum origin of quantum jumps: breaking of unitary symmetry induced by information transfer and the transition from quantum to classical
Although the terms ‘observation’ and ‘measurement’ seem to imply the presence of a human physicist, and some commentators have felt that human consciousness is somehow involved in quantum mechanics, I assume here that states of the world can observe each other, and the selective process that reveals ‘real’ states is the conversion of periodic complex states of motion into fixed real states.
Non-relativistic quantum mechanics does not require space-time for its execution, but is the logical source of space-time.
8: Space-time and the differentiable manifold
The domain of theories of relativity is a continuous differentiable manifold with a Minkowski metric signature, say (1, 1, 1, -1). This manifold may be interpreted as a computer network executing the discrete logical operations selected by quantum mechanics.
Einstein’s project to complete the special theory of relativity, which established the properties of inertial motion, with a general theory that embraced acceleration was set in motion by ‘the happiest thought’ of his life: a person in free fall will not feel their own weight. Inertial motion thus served him as a reference point to study accelerated motion.
The general theory emerged with the help of Riemann’s differential geometry, which provided a framework to use communication between flat inertial spaces to arrive at a dynamic space whose curvature, defined by a metric, gives mathematical expression to gravitation.
The effect of curvature is to establish a potential whose gradient causes accelerations between nearby geodesics (that is, paths of free fall) without subjecting them to force which would break their inertial condition, fulfilling Einstein's insight that a (small enough) person freely falling in a gravitational field would not feel their own weight.
Here we feel a connection with entanglement, which, as demonstrated by the EPR thought experiment and subsequent real experiments, introduces correlation without causality; that is, it is a formalism without agency, without the transmission of force or information.
9: The emergence of space-time: the velocity of light
After the quantum of action, the velocity of light is the next most significant fixed point in the universe, and defines the metric structure of spacetime.
The formal property of space is that both p and not-p may exist simultaneously, enabling the existence of discrete orthogonal vectors in Hilbert space, each corresponding to an energy state.
We now seek to understand the emergence of real space-time. We have noted that state vectors may “observe” one another to create fixed points. The mechanism for this process in the amplitude world is logical contact. Given the constructive nature of complexification, we should expect this contact to be maintained with the emergence of real spacetime. We guess that the representation of this contact in real space-time is the null geodesic, which carries contact in the amplitude world to contact in the observed world.
Entanglement maintains correlations between states which share a wavefunction, and we find that when these states are separated in space the correlation operates at superluminal speeds. We may assume that since all the states of the world emerge from the initial singularity they are entangled, and this entanglement reflects their origin in the pregeometric world. Daniel Salart, et al: Testing the speed of 'spooky action at a distance'
10: Relativistic quantum mechanics
Von Neumann placed quantum mechanics on a firm footing by unifying the Heisenberg and Schrödinger approaches in an abstract complex Hilbert space with a metric defined as a real inner product which we interpret as a probability. Physical states are represented in this space by orthogonal vectors ψ whose absolute value (computed by the inner product with themselves) is 1, with the dynamic property that e^iθψ represents the same state as ψ. In standard quantum theory the probability of quantum interactions between two states represented by ψ1 and ψ2 is computed from the inner product of the two state vectors (the Born rule), and each interaction is accompanied by a quantum of action. This suggests that from an observational point of view we may identify a quantum interaction with one cycle of a wave. The energy of a photon, for instance, is given by the relation E = hf. John von Neumann: Mathematical Foundations of Quantum Mechanics
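The Born rule and the phase-invariance of physical states can be sketched with two-component state vectors (the particular vectors are illustrative):

```python
import cmath, math

# Born rule sketch: the probability of an interaction between unit
# vectors psi1 and psi2 is |<psi1|psi2>|^2.
def inner(a, b):
    return sum(x.conjugate() * y for x, y in zip(a, b))

psi1 = [1 / math.sqrt(2), 1j / math.sqrt(2)]
psi2 = [1.0, 0.0]

assert abs(inner(psi1, psi1) - 1) < 1e-12   # normalized: <psi|psi> = 1
p = abs(inner(psi1, psi2)) ** 2
print(p)  # 0.5

# A global phase e^{i theta} changes no probability, so psi and
# e^{i theta} psi represent the same physical state (the same ray).
psi1_rot = [cmath.exp(0.7j) * c for c in psi1]
assert abs(abs(inner(psi1_rot, psi2)) ** 2 - p) < 1e-12
```

The last assertion is the computational content of "e^iθψ represents the same state as ψ": no observable probability distinguishes the two vectors.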
The information represented by state vectors is a dynamic function of phase, θ, and quantum mechanical information processing is represented by differences of phase. The finite velocity of communication in space-time means that phase is a function of space-time interval. The assumption that space-time is continuous implies the existence of infinitesimal and infinite phase differences, which imply in turn infinite and infinitesimal differences in energy and momentum, which lead to infinities in relativistic computations that must be removed by renormalization to reconcile events at different spatial scales.
A particular difficulty with the current approach to relativistic quantum mechanics is that gravitation is not renormalisable and so remains outside the Standard Model. The root of the problem is the assumption of continuity in communication. Cantor's work was, in effect, to digitize the continuum, and the invariance of the computer network with respect to scale provides us with a route around the problems of continuity by respecting the fact that a stable world is quantized at all scales.
11: The mathematical theory of communication
The mathematical theory of communication devised by Shannon provides further insight into the role of quantization. A sine qua non for a stable computer network is error free communication. Claude Shannon: Communication in the Presence of Noise
Shannon defines an information source A by enumerating the set of symbols ai that it can emit and their probabilities pi. The theory assumes that the symbols are discrete and independent so that the sum of their emission probabilities is 1: Σi pi = 1. Communication errors occur when noise in the communication channel causes symbols to be confused. The theory avoids this confusion by encoding symbols into packets. Alexandr Khinchin: Mathematical Foundations of Information Theory
Linear increases in packet size are equivalent to increasing the dimension of communication space, whose volume therefore increases exponentially, enabling packets to be placed further apart, reducing the probability of confusion. This is equivalent to quantization.
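The effect of packet size on reliability can be illustrated with the crudest of block codes, a repetition code with majority voting (a toy stand-in for Shannon's far more efficient codes; the channel error rate is illustrative):

```python
import math

# Repeat each bit n times over a channel that flips bits with
# probability p; decode by majority vote. The residual error is the
# probability that more than half the copies are flipped.
def majority_error(n, p):
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

p = 0.1  # raw channel bit-flip probability (assumption for illustration)
for n in (1, 3, 5, 9):
    print(n, majority_error(n, p))

assert majority_error(3, p) < p                      # coding helps
assert majority_error(9, p) < majority_error(3, p)   # and more coding helps more
```

As n grows the valid codewords move further apart in the higher-dimensional signal space, and the residual error falls roughly exponentially, which is the geometric point the paragraph makes.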
The encoding and decoding of messages is a computation process built around two conjugate algorithms, one to encode and the other to decode the message, a coder-decoder or codec.
The coding and decoding process introduces delay into communication. The encoding process must wait for the source to emit sufficient symbols to construct one packet. The decoding process must wait for the channel to deliver a complete packet for decoding. We may see in this delay an explanation for the finite velocity of light in a network universe.
12: Noether’s theorem and symmetry
A consequence of the fact that digitization is required for error free information transmission is that a mathematical continuum carries no information because it embodies no marks or symbols. This notion plays a central role in Noether’s theorem which explains the existence of symmetries or laws of nature. Dwight Neuenschwander: Emmy Noether's Wonderful Theorem
Noether’s theorem states that if the Lagrangian function for a physical system is not affected by a continuous transformation in the coordinate system used to describe it, then there will be a corresponding conservation law. Here the fact that continuity means that nothing happens is equivalent to symmetry. If we continuously move the origin of our time coordinate, for instance, the result is conservation of energy.
The general theory of relativity may be derived from a Lagrangian, and the effect of Noether’s theorem is that changes in the coordinate systems used to describe the universe have no effect on the metric which defines the large scale structure of space-time and points to the existence of the initial singularity.
A computer network like the internet does not require a reference frame per se; it is self-addressing. Users have relatively transparent access to the addresses of all the files they wish to access and can obtain addressed spaces to store their own files and make them publicly available if they so wish.
If the initial singularity is a point, or equivalently a structureless continuum, it has no capacity to carry any information that might specify the initial conditions of the universe, so we must dismiss the idea of initial conditions. The only constraint on the universe is self consistency, the same constraint as we place on mathematics. A self-consistent symbolic system is automatically granted mathematical existence. The structureless initial singularity enjoys what we might call null consistency, rather like the empty set, ∅.
13: Computer networks: ‘logical continuity’ is more powerful than physical continuity
Physics generally assumes that the universe is geometrically continuous, a natural assumption, given the continuous appearance of macroscopic spatial motion. On the other hand all communications, observations and discussions in the Universe are quantized. Experimental physics revolves around classifying (‘binning’) and counting events. When we observe the output of physicists and mathematicians (‘the literature’) we see that it too is quantized, into discrete volumes, articles, words and symbols, like this.
Continuity is a very strong constraint on the variety of functions and models. The number of invertible functions (permutations) on a domain of cardinal n is n!; the set of all functions from the domain to itself is larger still, of cardinal n^n. Of the permutations, only 2n are continuous in the sense that each element is mapped to a nearest neighbour, realizing the rigid motions of a cycle within the group of permutations. The ratio of continuous permutations to all permutations is thus 2n/n!, which falls rapidly toward zero as n grows.
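This ratio can be checked by direct enumeration. The sketch below adopts one concrete reading of the essay’s count of 2n: the rigid motions of a ring of n labelled points, namely the n rotations together with the n reflections (which also send each point to a point at the same distance from its neighbours); the function names are illustrative, not drawn from the essay.

```python
from math import factorial

def rigid_motions(n):
    """The 2n rigid motions of a ring of n labelled points:
    n rotations i -> (i + k) mod n and n reflections i -> (k - i) mod n.
    These are the 'continuous' permutations in the essay's sense."""
    rotations = {tuple((i + k) % n for i in range(n)) for k in range(n)}
    reflections = {tuple((k - i) % n for i in range(n)) for k in range(n)}
    return rotations | reflections

for n in range(3, 9):
    continuous = len(rigid_motions(n))   # 2n for n >= 3
    ratio = continuous / factorial(n)    # 2n / n!
    print(f"n={n}: {continuous} continuous of {factorial(n)} "
          f"permutations, ratio {ratio:.6f}")
```

Running this shows the ratio collapsing from 1.0 at n = 3 (every permutation of a triangle is a rigid motion) to well under one percent by n = 7, illustrating how severely continuity restricts the available functions.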
Continua are essentially unobservable, since there are no ‘marks’ to observe. So we cannot observe the continuous unitary evolution of an isolated quantum system. When we observe such a system, we do not see the whole continuous system, but only one or other of the basis states (eigenvectors) of the operator we use to observe the system.
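As a toy illustration of this point (not the author’s formalism), a simulated measurement of a two-state quantum system never returns the continuous superposition itself, only one of the basis states, with the Born-rule probabilities given by the squared amplitudes. The state and sample size here are arbitrary choices for the sketch.

```python
import random

# A qubit as amplitudes over the two eigenvectors (basis states) of
# the measurement operator; normalized: |0.6|^2 + |0.8j|^2 = 1.
psi = (0.6, 0.8j)

def measure(state):
    """Observation yields a basis state, never the superposition,
    with probability |amplitude|^2 for each outcome (Born rule)."""
    p0 = abs(state[0]) ** 2
    return 0 if random.random() < p0 else 1

random.seed(0)                 # fixed seed for reproducibility
counts = [0, 0]
for _ in range(10_000):
    counts[measure(psi)] += 1
print(counts)                  # roughly [3600, 6400]
```

Every individual observation is a discrete ‘mark’; the continuous state between observations is never seen directly, only inferred from the statistics of many such marks.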
All our mathematical proofs, including those relating to the analysis of continua, are logical continua. This suggests further that the correct understanding of continuity in the universe is not ‘geometric continuity’ but ‘logical continuity’.
14: Cognitive cosmology
Cognition can be understood as a logical process. Our daily lives are controlled by a large number of neural processing pipelines running from our internal and external sensors to the muscle fibres, endocrine glands and other agents of our living processes. The logical network structure hinted at in this essay has sufficient power and variety to describe every quantum of action in the universe at whatever scale, and so might fittingly be called a cognitive model of cosmology.
15: The bounds on the universe introduced by undecidability, incomputability and unpredictability
Hilbert dreamt that every mathematical problem is inherently soluble, but Gödel and Turing crushed his dream by demonstrating that there are bounds on consistent mathematics. These bounds destroy the determinism once thought to be an attribute of an infinite, omniscient and omnipotent divinity capable of knowing and controlling every moment in the life of the universe. By revealing the formal uncertainties embedded in our universe, they explain the existence of the variation which makes evolution by natural selection possible and has brought us to our current state of somewhat uncertain existence. Andrew Hodges: Alan Turing: The Enigma
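Turing’s argument can be sketched in miniature. The helper below is illustrative only: given any claimed halting decider, it constructs a program on which that decider must be wrong, which is why no total, correct decider can exist.

```python
def make_diagonal(halts):
    """Given any claimed halting decider (a function that answers
    True/False for 'does this program halt?'), build the diagonal
    program that does the opposite of whatever the decider predicts."""
    def diagonal():
        if halts(diagonal):
            while True:      # loop forever exactly when declared to halt
                pass
        # otherwise halt immediately, refuting a 'does not halt' verdict
    return diagonal

# A decider that always answers 'does not halt' is refuted directly:
d = make_diagonal(lambda program: False)
d()   # halts at once, contradicting the decider's verdict
print("any total halting decider is wrong on its own diagonal program")
```

A decider that answered ‘halts’ for the diagonal program would be refuted just as surely, since the program would then loop forever; the contradiction is unavoidable whichever answer is given.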
16: A theological conclusion
Our most ancient written records indicate that theology, in one form or another, is the traditional theory of everything. Its basic function is to provide a description of our environment which gives us some understanding of our place in the world and suggests guidance on how we should behave to optimize our lives. As noted above, Greek philosophers (and many others) began, some time ago, to replace capricious gods endowed with the very human characteristics of lust, senseless violence, pride, jealousy and so on with more realistic theologies based on reliable historical observations, the data input to science.
The project underlying this essay is an attempt to carry this endeavour to its ultimate conclusion by recognising that the universe of our experience plays all the roles traditionally attributed to gods. The overall effect of this change of theological paradigm is to replace invisible and capricious divinities, whose desires are often interpreted by powerful people acting in their own self-interest, with the system of the universe revealed by science, which provides an epistemological foundation for an infinite variety of reliable technological routes to improving our collective condition.
From the point of view of physics, the process of renormalization (which worried Feynman) might be eliminated by establishing the domain of the universe as a network of computation which is invariant with respect to complexity, drawing this quality from Cantor’s proof of the existence of transfinite numbers, which is likewise invariant with respect to complexity. In this domain, which I might call cognitive cosmology, the so-called quantum ‘collapse of the wave function’ and the human conception of an idea by insight, like Archimedes’ eureka moment, are examples of identical computational processes at vastly different scales. Richard P. Feynman: Nobel Lecture: The Development of the Space-Time View of Quantum Electrodynamics; Bernard Lonergan: Insight: A Study of Human Understanding
(24 April 2020: written in a time of plague, as if this helps, but perhaps it helped Newton!)
Note: This essay is a speculation developed from my honours thesis, written in the Department of Philosophy, Adelaide University. Jeffrey Nicholls (2019): Prolegomenon to Scientific Theology