natural theology

We have just published a new website that summarizes the ideas of this site. Available at Cognitive Cosmology.com.


Essay 28: On creating the world

[Chapter 6 of scientific-theology.com]
0: Abstract:
1: Introduction:
2: Some principles: The mechanism of creation
3: Reconstruction: some problems
4: A new domain for physics: a transfinite network of logical computation
5: Parmenides, Plato, Aristotle, Aquinas, pure act, and god
6: Action to energy
7: Hilbert space interpreted as the transfinite logical domain
8: Space-time and the differentiable manifold
9: The emergence of space-time: the velocity of light
10: Relativistic quantum mechanics
11: The mathematical theory of communication
12: Noether’s theorem and symmetry
13: Computer networks: ‘logical continuity’ is more powerful than physical continuity
14: Cognitive cosmology
15: The bounds on the universe introduced by undecidability, incomputability and unpredictability
16: A theological conclusion
0: Abstract

We are working here on the assumption that the initial singularity predicted by general relativity is both structureless and the source of the universe, and is therefore identical to the traditional God. Traditional theology holds that God created a universe independent of themself. Here we assume that the universe we inhabit emerged within the divinity by a process analogous to that pioneered by Augustine and Aquinas to explain the multiplication of divinity in the Christian doctrine of the Trinity. The creative process, which we might call cosmic darwinism, relies on the random reproduction of the initial singularity constrained by a selective process that demands local consistency.

Back to top

1: Introduction
This chapter is intended to lay a foundation for the union of physics and theology, so it comprises features of both. Its methodology, following Aquinas and Einstein, is to derive conclusions logically from a set of principles. Its motivation is to lay the foundation for the logically consistent world which science suggests we occupy. Whenever we encounter an apparent contradiction, further research usually uncovers a rational explanation.

We might attribute the explosive development in mathematics since the end of the nineteenth century to the philosophy of formalism which liberated mathematics from its connection to observed reality and required only that it be an internally consistent symbolic system. Tradition places the same constraint on the omnipotence of God: The only thing God cannot do is create a real local contradiction. Aquinas, Summa I, 25, 3: Is God omnipotent?

An important proponent of formalist mathematics was David Hilbert, and one can imagine that he was motivated by Georg Cantor's discovery of set theory and the transfinite numbers. Cantor's work drew criticism from theologians who held that God is the only actual infinity. Hilbert saw Cantor's work as a mathematical paradise and opposed any attempt to deny that it was the true home of mathematics. He stated in a lecture given in 1925 that 'From the paradise that Cantor created for us, no one shall be able to expel us.' Cantor's paradise - Wikipedia

We know now that the basic mechanisms of the universe are described by quantum theory. We communicate with the quantum world through classical space-time which serves as our interface for inputs to quantum processes and their outputs. The union of physics and theology must therefore be based on quantum mechanics.

Traditional theology and quantum theory are both built around action. Aristotle arrived at the unmoved movers through his theory of potential and action. Aquinas rebuilt the theory of the Christian God on Aristotle's work, agreeing that the unmoved mover and God can both be defined as pure action. Two thousand years after Aristotle, action reappeared in early modern physics through the work of de Maupertuis. He thought that the creator would have made the world so that everything was achieved with the minimum of action. This idea was refined until the principle of extremal action introduced to physics by Gauss and Lagrange was found to be an effective pointer to natural dynamics in both classical and quantum physics.

While individual quanta of action are intrinsically simple, like letters of an alphabet, taken together they create a complex system which differentiates them by the relationships between them. We follow this buildup for a while until we come to the fundamental particles and gravitation and then turn back to reconcile it with quantum field theory. We then expand these ideas to the evolution of the Universe, our solar system and our lives. The backbone for this expansion is the network model. Martinus Veltman (2003): Facts and Mysteries in Elementary Particle Physics

Back to top

2: Some principles: the mechanism of creation

2.1: Action and creation God is traditionally pure action. Aristotle first reached this conclusion using his ideas of potential and actuality, together with his axiom that no potential could actualize itself. This led him to propose an unmoved mover motivating the world. Aquinas introduced the same idea to prove the existence of God.

Aristotle may have been motivated to think of potential as a passive state by his use of the ideas of matter and form to explain change. His mentor Plato imagined that the structure of the world was determined through some unexplained and imperfect mode of participation in an invisible world of perfect eternal forms. Aristotle brought these forms down to earth and proposed that change was the result of some passive matter changing from one form to another. To manage this change, he proposed four causes: matter and form, which constitute real substances, an agent which causes the change of form and a final cause, the purpose motivating an agent.

Aquinas adapted this cosmology to Christianity. God, the creator, takes the place of the unmoved mover and motivates the changing forms of the material world. God creates and acts for a purpose. In traditional Christianity God made the world to expose their glory to the appreciation of intelligent beings, humans and angels. The forms which guide God's action are the ideas in the mind of God which guide the creation of the world. God did not act for their own good, which is already perfect, but to give obedient angels and humans an eternal vision of divine magnificence. Divine justice dictates that the disobedient suffer an eternity of punishment.

In modern physics the dichotomy between potential and actuality remains, but the axiom that no potential can actualize itself is history. Potential and kinetic energy are exactly equivalent, as we see in harmonic oscillators like the pendulum which (in the absence of friction) would change potential energy to kinetic energy and back again forever.
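
The pendulum example can be sketched numerically. The parameters (gravity, length, initial angle) below are illustrative choices, not from the text; the point is only that kinetic and potential energy trade places while their sum stays constant.

```python
import math

# A frictionless pendulum in the small-angle approximation: potential and
# kinetic energy exchange forever while the total is conserved.
g, length, theta0 = 9.81, 1.0, 0.1          # gravity, pendulum length, initial angle (rad)
omega = math.sqrt(g / length)               # angular frequency

def energies(t, m=1.0):
    theta = theta0 * math.cos(omega * t)               # angular displacement
    theta_dot = -theta0 * omega * math.sin(omega * t)  # angular velocity
    kinetic = 0.5 * m * (length * theta_dot) ** 2
    potential = 0.5 * m * g * length * theta ** 2      # small-angle potential energy
    return kinetic, potential

total0 = sum(energies(0.0))
for t in (0.0, 0.5, 1.0, 1.5):
    k, p = energies(t)
    assert abs((k + p) - total0) < 1e-12 * max(total0, 1)  # total energy is conserved
```

At t = 0 all the energy is potential; a quarter period later it is all kinetic, and so on back and forth without end.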

The fundamental equation of quantum theory says that energy is the time rate of action, E = hf. This simple scalar equation is expanded into the vector equation known as Schrödinger's energy equation:

i ℏ ∂|ψ> / ∂t = H|ψ>

where ∂|ψ> / ∂t is the rate of change of |ψ>, which is a vector corresponding to f. H is the energy (or Hamiltonian) operator, a matrix of the same dimension as |ψ>, which may be infinite.
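
A minimal numerical sketch of this equation, assuming a hypothetical two-state system with a diagonal Hamiltonian and ℏ set to 1: each basis vector simply rotates in phase at its own frequency, so energy is the rate of phase rotation, as in E = hf.

```python
import cmath, math

hbar = 1.0
E = [1.0, 2.5]                                # illustrative energy eigenvalues of H
psi0 = [1 / math.sqrt(2), 1 / math.sqrt(2)]   # equal superposition, norm 1

def evolve(psi, t):
    # Exact solution of i*hbar d|psi>/dt = H|psi> for diagonal H:
    # each component acquires the phase exp(-i E t / hbar)
    return [c * cmath.exp(-1j * En * t / hbar) for c, En in zip(psi, E)]

def norm(psi):
    return math.sqrt(sum(abs(c) ** 2 for c in psi))

psi_t = evolve(psi0, t=3.7)
assert abs(norm(psi_t) - 1.0) < 1e-12   # unitary evolution conserves total probability
```

The conserved norm is the formal expression of the unitarity discussed later in section 7.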

Here we identify the initial singularity (and therefore God) with action. Action appears in the modern world as a quantum embodied in fundamental particles or processes. It may seem counterintuitive to identify the smallest action in the universe with the universe as a whole. This is made possible by the creative property of action. The Trinity stopped at three for doctrinal reasons. Here we suggest that the same mechanism has no bound and may lead to structures measured by Cantor's transfinite numbers. As Shannon demonstrated, quantization is required for error free communication.

2.2: Entropy and control Having identified divinity with action, Aquinas then goes on to derive the properties of God from the properties of action as he understood it. The first of these is simplicity. This is a hard attribute to understand from a modern point of view, given that it is held that God has an intimate knowledge of every feature of every moment of the world. How is this information to be stored in an entity without marks? It is consistent with the ancient view that knowledge is related to spirituality rather than being a physical entity.

Here we do not require God to know everything about the universe from the beginning in order to create it in detail, but we do accept that the initial singularity is absolutely simple in the Thomistic sense, with no internal structure to carry information. A consequence of this, in the light of the cybernetic principle of requisite variety, is that the initial singularity has no control over its future. This is consistent with our experience of quantum mechanics and the reason for Einstein's feeling that the theory is incomplete. We can predict the precise nature of quantum events and calculate the probabilities of defined outcomes with precision, but we cannot predict exactly when individual events will happen.

Lack of entropy correlates with inability to control, but from the point of view of statistical mechanics, entropy is itself a measure of a potential gradient that points toward increasing complexity. Quantum states are inclined to decay when the decayed state (eg electron in a lower energy state and photon free in the universe) has higher entropy than the undecayed state. Ultimately the increase in entropy correlates with creation, which encompasses both increasing control and increasing randomness and points to the future, so we see it as the arrow of time. Here we come to the bifurcation between unitary quantum transformations and the increase in entropy caused by "observation". Practically, this tension is resolved by evolution, which selects control out of randomness.

2.4 P and NP The P versus NP problem in the theory of computation brings the computer network structure into play.

2.5 Evolution, randomness and selection Given that the initial singularity has the potential to act, but no control over when it acts, we expect a random sequence of actions. Since energy is inversely proportional to the period between actions, we expect this to create a spectrum of energies analogous to the vacuum of quantum field theory. At this point we have time but no space, so we can imagine a Hilbert space of vectors which are differentiated by period, or inverse frequency, and a superposition of all these vectors which becomes more complex as time goes by. These vectors may all be considered to be entangled since they have a common origin, and this entanglement will create correlations in the set of vectors.

2.6 From Hilbert space to Minkowski space The fundamental formal quantum mechanical representation of the universe is Hilbert space, and we imagine the initial singularity to be represented by a ray (normalized vector) in one dimensional Hilbert space. In the traditional theology of the Trinity the Son or Word of God is understood to be the Father's image of themself. Here we represent this situation by the creation of a two dimensional Hilbert space through the tensor product of the one dimensional space with itself, and imagine this process continuing in an unbounded way represented by the creation of the Cantor universe of transfinite numbers, a process captured in the colloquial term "big bang". Wojciech Hubert Zurek: Quantum origin of quantum jumps: breaking of unitary symmetry induced by information transfer and the transition from quantum to classical

2.7 Fixed point theory and the closed universe / totality of divinity Traditional theology interprets the divinity as the realization of all possible being, sometimes known as the fullness of being. The exact meaning of this phrase is unclear, and it raises some difficulty with the existence of a universe other than God, since if God is everything, there is no room outside God for something else. The identification of God and the Universe removes this difficulty, and raises the issue of what the fullness of being might mean in the context of the observable universe. Here we take it to mean that there is nothing outside it. The universe embraces all possible consistent structure, and to go outside it is to go into inconsistency which is, by definition, not being. From this point of view we can say the universe is closed, the boundary of the closure being the boundary between logical or formal consistency and inconsistency.

Fixed point theory holds that the dynamics of any closed, convex and continuous system, represented by a function f(x) of states x, contains points which are stationary in the sense that they map onto themselves, f(x) = x.
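
A simple sketch of this idea, using the contraction f(x) = cos(x) on the closed interval [0, 1] as an illustrative dynamics: repeatedly applying the function converges to a stationary point satisfying f(x) = x (the Dottie number, approximately 0.739085).

```python
import math

def find_fixed_point(f, x=0.5, tol=1e-12, max_iter=10_000):
    # Iterate the dynamics until the state stops changing
    for _ in range(max_iter):
        x_next = f(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    raise RuntimeError("did not converge")

x_star = find_fixed_point(math.cos)
assert abs(math.cos(x_star) - x_star) < 1e-10   # x_star maps onto itself: f(x) = x
```

This iterative convergence is the contraction-mapping case; the Brouwer-style theorem invoked in the text guarantees existence of such points for any continuous self-map of a closed convex set, even without a constructive route to them.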

Back to top

3: Reconstruction: some problems

Cosmological studies suggest that the idea that the universe began within an initial singularity is plausible, but this notion comes with a few problems.

First, Hawking and Ellis conclude (364): . . . the actual point of creation, the singularity, is outside the scope of presently known laws of physics. An interpretation of the principle of conservation of energy suggests that the initial singularity is a scene of infinite temperature and energy density, which may seem to many an unphysical hypothesis.

Second, one can imagine that an initial point bearing a huge amount of energy could be the source of a big bang but details of this emergence are not particularly clear. High energy physics does show, however, that particles appear wherever there is sufficient concentration of energy.

Third, the theory of black holes suggests that an initial singularity analogous to a black hole is unlikely to explode. Hawking devised a quantum mechanism through which black holes can ‘evaporate’ by releasing black body radiation, but this is very slow. A black hole the size of the Sun, for instance, would take many times the age of the Universe to decay.

Back to top

4: A new domain for physics: a transfinite network of logical computation

If a time-reversed black hole is not an option, how could the universe have emerged from an initial singularity?

Misner, Thorne and Wheeler speculate about the existence of a “pregeometry” which pre-existed the emergence of space-time. This pregeometry is imagined to be implemented by formal logic rather than by metric space.

Although we write logical arguments out at length on spaces such as paper, the formal logic itself lies outside both space and time. The essence of a long non-constructive proof comes down to a statement (a logical consequence of the initial hypotheses) that p = not-p, thus proving the hypothesis false.

Although our arguments are represented in time and space, formal mathematical proofs are considered to be local and eternal, outside space-time. Since a computer is a deterministic machine, it is essentially isomorphic to a proof: any proof can be represented by such a machine.

A real computer existing in space-time is a network of physically embodied logical operators, memory and a clock. The problem of devising a practical computer was to imitate the formality of mathematical proof. This is achieved by using a clock to hide the dynamics of the logical operators.

A clock pulse sets the elements of the computer in motion. After an interval long enough for everything to reach a steady state, the next phase of the clock causes the resulting state to be written in memory. The next pulse carries that computation another step forward. As Turing noted, a human computer can stop work after the completion of any computational step and begin again after an indefinite interval. Alan Turing: On Computable Numbers, with an application to the Entscheidungsproblem
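
The clocked cycle described above can be sketched as a toy model. The machine here is a hypothetical 4-bit counter, chosen only to illustrate the settle-then-latch rhythm; any combinational function could take its place.

```python
# On each clock cycle the combinational logic computes a new state from the
# latched memory (phase 1: the logic settles), then the settled result is
# written back to memory (phase 2: the latch), hiding the logic's dynamics.
def combinational_logic(state):
    return (state + 1) % 16          # settles to a deterministic function of its input

memory = 0                           # latched state carried between clock pulses
trace = []
for _tick in range(5):               # each iteration is one clock cycle
    settled = combinational_logic(memory)   # phase 1: logic reaches steady state
    memory = settled                        # phase 2: result latched into memory
    trace.append(memory)

assert trace == [1, 2, 3, 4, 5]      # each pulse carries the computation one step forward
```

Stopping the loop and resuming later leaves the computation unchanged, mirroring Turing's observation about the human computer.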

A computer is a periodic function. The essential difference between a computer and a computer network is that the clocks in a network are not synchronised, so that buffering mechanisms are necessary to synchronise network operations.

Network computation is a formal structure invariant with respect to complexity, as we can see when we complete a non-constructive proof of whatever length with the operator not-and where, for instance, the inputs nanded are number theory and the proposition that there is a largest prime number.

We can create a logical domain for physics by combining Cantor’s theory of transfinite numbers with Turing’s theory of computation. The cardinal of the set of Turing machines is equivalent to the cardinal of the set of natural numbers, so that we can map one onto the other.

Cantor developed the transfinite numbers by considering combinations and permutations of natural numbers. Local elements of a network are sequences of logical operations which may be modelled by Turing machines and combined and permuted by analogy to the transfinite numbers. Computers are connected by mapping the output of one to the input of the next.

The simplest computer does nothing. Its output is identical to its input, so it is identical to an error free communication channel. The next most complex operations are not and and, which combine to give nand, aka the Sheffer stroke. Any computation may be executed by a suitable network of memory and nand operators.
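
The universality of nand can be demonstrated directly: a few nand gates suffice to rebuild the other Boolean operators.

```python
def nand(a, b):
    # the Sheffer stroke: not-and
    return not (a and b)

def NOT(a):        return nand(a, a)
def AND(a, b):     return nand(nand(a, b), nand(a, b))
def OR(a, b):      return nand(nand(a, a), nand(b, b))
def XOR(a, b):
    c = nand(a, b)
    return nand(nand(a, c), nand(b, c))   # the standard four-gate construction

# every two-input truth table agrees with the native operators
for a in (False, True):
    for b in (False, True):
        assert AND(a, b) == (a and b)
        assert OR(a, b) == (a or b)
        assert XOR(a, b) == (a != b)
```

Add memory to such a network of gates and, as the text notes, any computation can be executed.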

The resulting transfinite computer network may serve as a domain in which to explore the transition from initial singularity to universe. We assume that this network operates with both real and complex numbers so it can model the emergence of the stationary real world from the dynamic world of complex amplitudes, exemplified by the Born rule for quantum measurements.

Like the set theoretical growth of the transfinite numbers, the transfinite computer network may be seen as a construction process beginning with the empty set representing the initial singularity and growing without end, each step in this growth embodied in the next.

Back to top

5: Parmenides, Plato, Aristotle, Aquinas, pure act, and god

Formalism in science has a long history.

The first gods to emerge in human history were modelled on contemporary warlords, kings, queens and emperors, but about 500 bce Greek philosophers began to produce more abstract cosmic models of god which are still current in western theology.

A goddess revealed to Parmenides that there are two sources of truth, what we see in ephemeral day to day life and the deep perfect eternal heart of the universe. Parmenides' idea was taken up by Plato with his theory of ideas. He saw our world as a pale shadow of a perfect formal heaven.

In his Physics, a study of motion, Aristotle brought Plato’s forms down to earth with his theory of matter and form. He elaborated this idea into a theory of potency and act, and defined motion as the transition from potential being to actual being. The fundamental axiom of this theory states that no potential can actualize itself.

From this idea Aristotle arrived at the notion of an unmoved mover responsible for all the action in the universe. The fundamental axiom requires that this entity be pure actuality. The unmoved mover was coopted in its entirety by the medieval Christian theologian Aquinas (12xx). The only difference was that, as required by his dogmatic faith, Aquinas put God outside rather than inside the universe. Here I revert to Aristotle’s view.

In his discussion of the first mover, Aristotle noted that forms may guide action, but are not a source of action.

Back to top

6: Action to energy

Let us assume that the initial singularity is a quantum of action. A natural logical definition of action is that it is the operation that changes some p (eg red) into not-p (eg green). In the binary space of formal logic we assume that not-(not-p) = p, but if we remove the binary constraint we may find that not-(not-p) = pi (eg blue . . .).
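
This non-binary 'not' can be sketched in a hypothetical k-valued logic: one quantum of action moves the state one step around a cycle, so not-(not-p) need not return to p.

```python
# An illustrative three-valued state space: red -> green -> blue -> red
STATES = ["red", "green", "blue"]

def act(p):
    # one quantum of action: carry p to 'not-p', the next state in the cycle
    return STATES[(STATES.index(p) + 1) % len(STATES)]

assert act("red") == "green"
assert act(act("red")) == "blue"        # not-(not-p) != p once the binary constraint is removed
assert act(act(act("red"))) == "red"    # the cycle closes after k steps
```

With a two-element state space the same function reduces to ordinary binary negation, where not-(not-p) = p.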

It may seem difficult to reconcile the physical quantum of action with the initial state of the universe. The logical definition of action, however, presupposes no spacetime measure. We may attribute the frequent execution of quanta of action in the current universe to the replication of the initial singularity by self reference.

Energy may exist in a binary logical space by repeated execution of the quantum of action to give us a formal waveform. Let us assume that the quantum of action can be identified with the execution of any halting computation in the logical domain.

Recursive complexification is central to Turing’s paper on computation and the universe appears to follow this pattern, developing through fundamental particles, atoms, molecules, crystals, cells and so on.

The cybernetic principle of requisite variety requires that one system can control another only if the entropy of the controlling system is greater than the entropy of the system controlled. Given that the entropy of the initial singularity is zero, it can exercise no control over the sequence of states emerging within it, which are therefore random, so that the rate of action, the formal measure of energy, has no constraint on its value. Ross Ashby: An Introduction to Cybernetics

Back to top

7: Hilbert space interpreted as the transfinite logical domain

Given a complex waveform, we are in a position to introduce quantum theory. The non-locality of quantum mechanics revealed by Bell suggests that quantum mechanics emerged before the advent of space-time. John Bell: Speakable and Unspeakable in Quantum Mechanics

We proceed therefore on the assumption that quantum mechanics is a theory of energy defined as repeated action, one quantum of action per cycle. We are here in the purely formal phase of the emergence of the universe where syntactic structures stand by themselves, obeying only the demand of consistency without reference to any meaning.

The lack of any constraint arising from the initial singularity means that the Hilbert space of the universe's amplitudes comprises vectors representing every element of the transfinite domain of computation.

Quantum mechanics comprises three axioms.

First: Vectors representing different energy states, that is different frequencies, are orthogonal. We measure the distance between vectors by an inner product and we may create new vectors by adding old ones (superposition).

Second: the sequential states of a quantum system are related to one another by a unitary transformation which is in effect a reversible (entropy conserving) rotation in state space.

Third: this dynamical system mapping onto itself fulfills the requirements of mathematical fixed point theory. When it is embedded in four dimensional spacetime this process is described as measurement and the selection of fixed points is sometimes called “the collapse of the wave function”.
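
The three axioms can be sketched together in a toy measurement. The amplitudes below are an illustrative two-state superposition; the probability of each fixed outcome is the squared modulus of its amplitude (the Born rule), and sampling reproduces those probabilities as frequencies.

```python
import math, random

amplitudes = [complex(3, 0) / 5, complex(0, 4) / 5]   # |3/5|^2 + |4/5|^2 = 1
probs = [abs(a) ** 2 for a in amplitudes]             # Born rule: squared moduli
assert abs(sum(probs) - 1.0) < 1e-12                  # normalization of the state vector

def observe(rng=random):
    # the periodic complex state is replaced by one fixed real basis state
    return 0 if rng.random() < probs[0] else 1

counts = [0, 0]
random.seed(1)
for _ in range(100_000):
    counts[observe()] += 1
assert abs(counts[0] / 100_000 - probs[0]) < 0.01     # frequencies approach 9/25 and 16/25
```

Between observations the evolution is unitary and reversible; only the act of observation selects a fixed point from the superposition.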

Communication requires sources to share a code or basis. Zurek notes that real values in quantum mechanics (eigenvalues) result from the inner product of a vector with its complex conjugate. This is the selective process that selects stationary real states in quantum observations. Wojciech Hubert Zurek: Quantum origin of quantum jumps: breaking of unitary symmetry induced by information transfer and the transition from quantum to classical

Although the terms ‘observation’ and ‘measurement’ seem to imply the presence of a human physicist, and some commentators have felt that human consciousness is somehow involved in quantum mechanics, I assume here that states of the world can observe each other, and the selective process that reveals ‘real’ states is the conversion of periodic complex states of motion into fixed real states.

Non-relativistic quantum mechanics does not require space-time for its execution, but is the logical source of space-time.

Back to top

8: Space-time and the differentiable manifold

The domain of theories of relativity is a continuous differentiable manifold with a Minkowski metric signature, say (1, 1, 1, -1). This manifold may be interpreted as a computer network executing the discrete logical operations selected by quantum mechanics.

Einstein’s project to complete the special theory of relativity, which established the properties of inertial motion, with a general theory that embraced acceleration was set in motion by ‘the happiest thought’ of his life: a person in free fall will not feel his own weight. Inertial motion thus served him as a reference point to study accelerated motion.

The general theory emerged with the help of Riemann’s differential geometry, which provided a framework to use communication between flat inertial spaces to arrive at a dynamic space whose curvature, defined by a metric, gives mathematical expression to gravitation.

The effect of curvature is to establish a potential whose gradient causes accelerations between nearby geodesics (that is, paths of free fall) without subjecting them to force which would break their inertial condition, fulfilling Einstein’s insight that a (small enough) person freely falling in a gravitational field would not feel their own weight.

Here we feel a connection with entanglement, which, as demonstrated by the EPR thought experiment and subsequent real experiments, introduces correlation without causality; that is, it is a formalism without agency, without the transmission of force or information.

Back to top

9: The emergence of space-time: the velocity of light

After the quantum of action, the velocity of light is the next most significant fixed point in the universe, and defines the metric structure of spacetime.

The formal property of space is that both p and not-p may exist simultaneously, enabling the existence of discrete orthogonal vectors in Hilbert space, each corresponding to an energy state.

We now seek to understand the emergence of real space-time. We have noted that state vectors may “observe” one another to create fixed points. The mechanism for this process in the amplitude world is logical contact. Given the constructive nature of complexification, we should expect this contact to be maintained with the emergence of real spacetime. We guess that the representation of this contact in real space-time is the null geodesic, which carries contact in the amplitude world to contact in the observed world.

Entanglement maintains correlations between states which share a wavefunction, and we find that when these states are separated in space the correlation operates at superluminal speeds. We may assume that since all the states of the world emerge from the initial singularity they are entangled, and this entanglement reflects their origin in the pregeometric world. Daniel Salart, et al: Testing the speed of 'spooky action at a distance'

Back to top

10: Relativistic quantum mechanics

Von Neumann placed quantum mechanics on a firm footing by unifying the Heisenberg and Schrödinger approaches in an abstract complex Hilbert space with a metric defined as an inner product whose squared modulus we interpret as a probability. Physical states are represented in this space by orthogonal vectors ψ whose absolute value (computed by the inner product with themselves) is 1, with the dynamic property that a global phase leaves the physical state unchanged: e^(iθ)ψ represents the same state as ψ. In standard quantum theory the probability of a quantum interaction between two states represented by ψ1 and ψ2 is computed from the inner product of the two state vectors (the Born rule: probability = |<ψ1|ψ2>|²), and each interaction is accompanied by a quantum of action. This suggests that from an observational point of view we may identify a quantum interaction with one cycle of a wave. The energy of a photon, for instance, is given by the relation E = hf. John von Neumann: Mathematical Foundations of Quantum Mechanics

The information represented by state vectors is a dynamic function of phase, θ, and quantum mechanical information processing is represented by differences of phase. The finite velocity of communication in space-time means that phase is a function of space-time interval. The assumption that space-time is continuous implies the existence of infinitesimal and infinite phase differences, which imply in turn infinite and infinitesimal differences in energy and momentum, which lead to infinities in relativistic computations that must be removed by renormalization to reconcile events at different spatial scales.

A particular difficulty with the current approach to relativistic quantum mechanics is that gravitation is not renormalisable and so remains outside the Standard Model. The root of the problem is the assumption of continuity in communication. Cantor's work was, in effect, to digitize the continuum, and the invariance of the computer network with respect to scale provides us with a route around the problems of continuity by respecting the fact that a stable world is quantized at all scales.

Back to top

11: The mathematical theory of communication

The mathematical theory of communication devised by Shannon provides further insight into the role of quantization. A sine qua non for a stable computer network is error free communication. Claude Shannon: Communication in the Presence of Noise

Shannon defines an information source A by enumerating the set of symbols ai that it can emit and their probabilities pi. The theory assumes that the symbols are discrete and independent so that the sum of their emission probabilities is 1: Σi pi = 1. Communication errors occur when noise in the communication channel causes symbols to be confused. The theory avoids this confusion by encoding symbols into packets. Alexandr Khinchin: Mathematical Foundations of Information Theory

Linear increases in packet size are equivalent to increasing the dimension of communication space, whose volume therefore increases exponentially, enabling packets to be placed further apart, reducing the probability of confusion. This is equivalent to quantization.
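
The packet-size argument can be made concrete with a hypothetical repetition code, the simplest possible packet scheme: each bit is sent n times over a channel that flips bits with probability p, and majority decoding fails only if more than half the copies are flipped, a probability that falls rapidly as the packet grows.

```python
from math import comb

def majority_error(n, p):
    # probability that more than half of n transmitted copies are flipped,
    # defeating the majority decoder (n odd)
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range((n // 2) + 1, n + 1))

p = 0.1                                       # illustrative raw channel error rate
errs = [majority_error(n, p) for n in (1, 3, 5, 7, 9)]
assert errs[0] == p                           # no coding: the raw channel error rate
assert all(a > b for a, b in zip(errs, errs[1:]))  # longer packets, fewer confusions
```

Shannon's theorem shows that far more efficient codes exist, but the qualitative point is the same: spreading symbols out in a higher-dimensional packet space drives the probability of confusion toward zero.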

The encoding and decoding of messages is a computation process built around two conjugate algorithms, one to encode and the other to decode the message, a coder-decoder or codec.

The coding and decoding process introduces delay into communication. The encoding process must wait for the source to emit sufficient symbols to construct one packet. The decoding process must wait for the channel to deliver a complete packet for decoding. We may see in this delay an explanation for the finite velocity of light in a network universe.

Back to top

12: Noether’s theorem and symmetry

A consequence of the fact that digitization is required for error free information transmission is that a mathematical continuum carries no information because it embodies no marks or symbols. This notion plays a central role in Noether’s theorem which explains the existence of symmetries or laws of nature. Dwight Neuenschwander: Emmy Noether's Wonderful Theorem

Noether’s theorem states that if the Lagrangian function for a physical system is not affected by a continuous transformation in the coordinate system used to describe it, then there will be a corresponding conservation law. Here the fact that continuity means that nothing happens is equivalent to symmetry. If we continuously move the origin of our time coordinate, for instance, the result is conservation of energy.
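
The time-translation case can be sketched as a short derivation for a one-dimensional Lagrangian L(q, q̇) with no explicit time dependence. Define the energy function and differentiate it along a trajectory obeying the Euler-Lagrange equation:

```latex
E = \dot q\,\frac{\partial L}{\partial \dot q} - L, \qquad
\frac{dE}{dt}
= \ddot q\,\frac{\partial L}{\partial \dot q}
+ \dot q\,\frac{d}{dt}\frac{\partial L}{\partial \dot q}
- \left( \frac{\partial L}{\partial q}\,\dot q
+ \frac{\partial L}{\partial \dot q}\,\ddot q
+ \frac{\partial L}{\partial t} \right)
= \dot q \left( \frac{d}{dt}\frac{\partial L}{\partial \dot q}
- \frac{\partial L}{\partial q} \right)
- \frac{\partial L}{\partial t}.
```

The bracket vanishes by the equation of motion, so if L is unaffected by continuously moving the origin of the time coordinate (∂L/∂t = 0), then dE/dt = 0: energy is conserved.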

The general theory of relativity may be derived from a Lagrangian, and the effect of Noether’s theorem is that changes in the coordinate systems used to describe the universe have no effect on the metric which defines the large scale structure of space-time and points to the existence of the initial singularity.

A computer network like the internet does not require a reference frame per se; it is self addressing. Users have relatively transparent access to the addresses of all the files they wish to access, and can obtain addressed spaces to store their own files and make them publicly available if they so wish.

If the initial singularity is a point, or equivalently a structureless continuum, it has no capacity to carry any information that might specify the initial conditions of the universe, so we must dismiss the idea of initial conditions. The only constraint on the universe is self consistency, the same constraint as we place on mathematics. A self-consistent symbolic system is automatically granted mathematical existence. The structureless initial singularity enjoys what we might call null consistency, rather like the empty set, ∅.

Back to top

13: Computer networks: ‘logical continuity’ is more powerful than physical continuity

Physics generally assumes that the universe is geometrically continuous, a natural assumption, given the continuous appearance of macroscopic spatial motion. On the other hand all communications, observations and discussions in the Universe are quantized. Experimental physics revolves around classifying (‘binning’) and counting events. When we observe the output of physicists and mathematicians (‘the literature’) we see that it too is quantized, into discrete volumes, articles, words and symbols, like this.

Continuity is a very strong constraint on the variety of functions and models. The number of invertible functions (permutations) of a set of n elements is n!. Of these, only 2n are continuous in the sense that each element moves to a nearest neighbour: the n cyclic shifts and their n reversals, which form the dihedral subgroup of the group of permutations. The ratio of continuous functions to all permutations is thus 2n/n!, which vanishes rapidly as n grows.
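A sketch of this counting claim makes the point concrete. The function below simply computes the ratio of the 2n neighbour-preserving permutations to all n! permutations, showing how quickly it collapses.

```python
# Among the n! permutations of n points arranged on a ring, only the
# 2n "rigid" motions (n rotations together with their n reflections,
# the dihedral group) send every point to a neighbouring position.
# Their share of all permutations vanishes very fast.

from math import factorial

def continuous_fraction(n: int) -> float:
    """Ratio of the 2n neighbour-preserving permutations to all n!."""
    return (2 * n) / factorial(n)

for n in range(3, 8):
    print(n, continuous_fraction(n))
# At n = 3 every permutation qualifies (6/6 = 1.0); by n = 7 the
# ratio is already 14/5040, roughly 0.0028.
```

Continuity, in other words, excludes almost everything: almost all rearrangements of a large system are discontinuous.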

Continua are essentially unobservable, since there are no ‘marks’ to observe. So we cannot observe the continuous unitary evolution of an isolated quantum system. When we observe such a system, we do not see the whole continuous system, but only one or other of the basis states (eigenvectors) of the operator we use to observe the system.
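The observational point above can be simulated in a few lines of plain Python. This is a hedged, minimal illustration of the Born rule for a single qubit, not a general quantum simulator: the amplitudes are chosen for the example.

```python
# Minimal sketch (illustrative only): an observation of a qubit state
# a|0> + b|1> never returns the continuous amplitude pair (a, b); it
# yields exactly one basis state, |0> with probability |a|^2.

import random

def measure(a: complex, b: complex) -> int:
    """Projective measurement in the {|0>, |1>} basis (Born rule)."""
    p0 = abs(a) ** 2 / (abs(a) ** 2 + abs(b) ** 2)  # normalize defensively
    return 0 if random.random() < p0 else 1

# An equal superposition gives outcome 0 roughly half the time;
# the continuous state itself is never seen, only discrete outcomes.
random.seed(1)
counts = sum(measure(2 ** -0.5, 2 ** -0.5) for _ in range(10_000))
print(counts)  # close to 5000
```

However finely tuned the amplitudes, every run of `measure` returns a discrete mark, 0 or 1; the continuum behind it remains unobservable.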

All our mathematical proofs, including those relating to the analysis of continua, are themselves logical continua: unbroken chains of discrete logical steps. This suggests further that the correct understanding of continuity in the universe is not ‘geometric continuity’ but ‘logical continuity’.

Back to top

14: Cognitive cosmology

Cognition can be understood as a logical process. Our daily lives are controlled by a large number of neural processing pipelines running from our internal and external sensors to the muscle fibres, endocrine glands and other agents of our living processes. The logical network structure hinted at in this essay has sufficient power and variety to describe every quantum of action in the universe at whatever scale, and so might fittingly be called a cognitive model of cosmology.

Back to top

15: The bounds on the universe introduced by undecidability, incomputability and unpredictability

Hilbert dreamt that all mathematical problems are inherently soluble, but Gödel and Turing crushed his dream by demonstrating that there are bounds on consistent mathematics. These bounds destroy the determinism which was once thought to be an attribute of an infinite, omniscient and omnipotent divinity capable of knowing and controlling every moment in the life of the universe. By revealing the formal uncertainties embedded in our universe, they explain the existence of the variation which makes evolution by natural selection possible, and which has brought us to our current state of somewhat uncertain existence. Andrew Hodges: Alan Turing: The Enigma
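The engine behind both Gödel's and Turing's results is the diagonal argument, which can be run directly. The sketch below (the three listed functions are arbitrary examples) shows that any list of total functions can be escaped: the diagonal function g disagrees with every function on the list, so no computable listing can be complete.

```python
# A hedged illustration of the diagonal argument underlying Gödel's
# and Turing's limitative results: given any listed family of total
# functions f_0, f_1, ..., the function g(n) = f_n(n) + 1 differs
# from every f_i at the argument i, so the list cannot be complete.

listed = [
    lambda n: 0,          # f_0: constant zero
    lambda n: n,          # f_1: identity
    lambda n: n * n,      # f_2: squaring
]

def g(n: int) -> int:
    """Diagonal function: guaranteed to disagree with f_n at input n."""
    return listed[n](n) + 1

for i, f in enumerate(listed):
    assert g(i) != f(i)   # g escapes every listed function
print([g(i) for i in range(3)])  # [1, 2, 5]
```

Applied to machines rather than toy lambdas, the same diagonal move yields the unsolvability of the halting problem: no single procedure can decide the behaviour of all procedures, including itself.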

Back to top

16: A theological conclusion

Our most ancient written records indicate that theology, in one form or another, is the traditional theory of everything. Its basic function is to provide a description of our environment which gives us some understanding of our place in the world and suggests guidance on how we should behave to optimize our lives. As noted above, Greek philosophers (and many others) began, some time ago, to replace capricious gods with the very human characteristics of lust, senseless violence, pride, jealousy and so on with more realistic theologies based on reliable historical observations, the data input to science.

The project underlying this essay is an attempt to carry this endeavour to its ultimate conclusion by recognising that the universe of our experience plays all the roles traditionally attributed to gods. The overall effect of this change of theological paradigm is to replace invisible and capricious divinities whose desires are often interpreted by powerful people acting in their own self interest with the system of the universe revealed by science which provides an epistemological foundation for an infinite variety of reliable technological routes to improving our collective condition.

From the point of view of physics, the process of renormalization (which worried Feynman) might be eliminated by establishing the domain of the universe as a network of computation which is invariant with respect to complexity, drawing this quality from Cantor’s proof of the existence of transfinite numbers, which is likewise invariant with respect to complexity. In this domain, which I might call cognitive cosmology, the so-called quantum “collapse of the wave function” and the human conception of an idea by insight, like Archimedes’ eureka moment, are examples of identical computational processes at vastly different scales. Richard P. Feynman: Nobel Lecture: The Development of the Space-Time View of Quantum Electrodynamics; Bernard Lonergan: Insight: A Study of Human Understanding

(24 April 2020: written in a time of plague, as if this helps; but perhaps it helped Newton!)

Note: This essay is a speculation developed from my honours thesis, written in the Department of Philosophy, Adelaide University. Jeffrey Nicholls (2019): Prolegomenon to Scientific Theology

Back to top

Copyright:

You may copy this material freely provided only that you quote fairly and provide a link (or reference) to your source.

Further reading

Books

Ashby, W Ross, An Introduction to Cybernetics, Methuen 1956, 1964 'This book is intended to provide [an introduction to cybernetics]. It starts from common-place and well understood concepts, and proceeds step by step to show how these concepts can be made exact, and how they can be developed until they lead into such subjects as feedback, stability, regulation, ultrastability, information, coding, noise and other cybernetic topics.' 
Amazon
  back

Bell, John S, Speakable and Unspeakable in Quantum Mechanics, Cambridge University Press 1987 Jacket: JB ... is particularly famous for his discovery of a crucial difference between the predictions of conventional quantum mechanics and the implications of local causality . . . . This work has played a major role in the development of our current understanding of the profound nature of quantum concepts and of the fundamental limitations they impose on the applicability of classical ideas of space, time and locality. 
Amazon
  back

Hawking, Stephen W, and G F R Ellis, The Large Scale Structure of Space-Time, Cambridge UP 1975 Preface: 'Einstein's General Theory of Relativity . . . leads to two remarkable predictions about the universe: first that the final fate of massive stars is to collapse behind an event horizon to form a 'black hole' which will contain a singularity; and secondly that there is a singularity in our past which constitutes, in some sense, a beginning to our universe. Our discussion is principally aimed at developing these two results.' 
Amazon
  back

Hodges, Andrew, Alan Turing: The Enigma, Burnett 1983 Author's note: '. . . modern papers often employ the usage turing machine. Sinking without a capital letter into the collective mathematical consciousness (as with the abelian group, or the riemannian manifold) is probably the best that science can offer in the way of canonisation.' (530) 
Amazon
  back

Khinchin, Aleksandr Yakovlevich, Mathematical Foundations of Information Theory (translated by P A Silvermann and M D Friedman), Dover 1957 Jacket: 'The first comprehensive introduction to information theory, this book places the work begun by Shannon and continued by McMillan, Feinstein and Khinchin on a rigorous mathematical basis. For the first time, mathematicians, statisticians, physicists, cyberneticists and communications engineers are offered a lucid, comprehensive introduction to this rapidly growing field.' 
Amazon
  back

Lonergan, Bernard J F, Insight: A Study of Human Understanding (Collected Works of Bernard Lonergan : Volume 3), University of Toronto Press 1992 '. . . Bernard Lonergan's masterwork. Its aim is nothing less than insight into insight itself, an understanding of understanding' 
Amazon
  back

Neuenschwander, Dwight E, Emmy Noether's Wonderful Theorem, Johns Hopkins University Press 2011 Jacket: 'A beautiful piece of mathematics, Noether's theorem touches on every aspect of physics. Emmy Noether proved her theorem in 1915 and published it in 1918. This profound concept demonstrates the connection between conservation laws and symmetries. For instance, the theorem shows that a system invariant under translations of time, space or rotation will obey the laws of conservation of energy, linear momentum or angular momentum respectively. This exciting result offers a rich unifying principle for all of physics.' 
Amazon
  back

Veltman, Martinus, Facts and Mysteries in Elementary Particle Physics, World Scientific 2003 'Introduction: The twentieth century has seen an enormous progress in physics. The fundamental physics of the first half of the century was dominated by the theory of relativity, Einstein's theory of gravitation and the theory of quantum mechanics. The second half of the century saw the rise of elementary particle physics. . . . Through this development there has been a subtle change in point of view. In Einstein's theory space and time play an overwhelming dominant role. . . . The view that we would like to defend can perhaps best be explained by an analogy. To us, space-time and the laws of quantum mechanics are like the decor, the setting of a play. The elementary particles are the actors, and physics is what they do. . . . Thus in this book the elementary particles are the central objects.' 
Amazon
  back

von Neumann, John, and Robert T Beyer (translator), Mathematical Foundations of Quantum Mechanics, Princeton University Press 1983 Jacket: '. . . a revolutionary book that caused a sea change in theoretical physics. . . . JvN begins by presenting the theory of Hermitean operators and Hilbert spaces. These provide the framework for transformation theory, which JvN regards as the definitive form of quantum mechanics. . . . Regarded as a tour de force at the time of its publication, this book is still indispensable for those interested in the fundamental issues of quantum mechanics.' 
Amazon
  back

Papers

Salart, Daniel, et al, "Testing the speed of 'spooky action at a distance'", Nature, 454, 14 August 2008, pages 861-864. 'Correlations are generally described by one of two mechanisms: either a first event influences a second one by sending information encoded in bosons or other physical carriers, or the correlated events have some common causes in their shared history. Quantum physics predicts an entirely different kind of cause for some correlations, named entanglement. This reveals itself in correlations that violate Bell inequalities (implying that they cannot be described by common causes) between space-like separated events (implying that they cannot be described by classical communication). Many Bell tests have been performed, and loopholes related to locality and detection have been closed in several independent experiments. It is still possible that a first event could influence a second, but the speed of this hypothetical influence (Einstein's 'spooky action at a distance') would need to be defined in some universal privileged reference frame and be greater than the speed of light. Here we put stringent experimental bounds on the speed of all such hypothetical influences. We performed a Bell test over more than 24 hours between two villages separated by 18 km and approximately east–west oriented, with the source located precisely in the middle. We continuously observed two-photon interferences well above the Bell inequality threshold. Taking advantage of the Earth's rotation, the configuration of our experiment allowed us to determine, for any hypothetically privileged frame, a lower bound for the speed of the influence. For example, if such a privileged reference frame exists and is such that the Earth's speed in this frame is less than 10⁻³ times that of the speed of light, then the speed of the influence would have to exceed that of light by at least four orders of magnitude.' back

Links

Alan Turing, On Computable Numbers, with an application to the Entscheidungsproblem, 'The "computable" numbers may be described briefly as the real numbers whose expressions as a decimal are calculable by some finite means. Although the subject of this paper is ostensibly the computable numbers, it is almost equally easy to define and investigate computable functions of an integral variable of a real or computable variable, computable predicates and so forth. . . . ' back

Claude Shannon, Communication in the Presence of Noise, 'A method is developed for representing any communication system geometrically. Messages and the corresponding signals are points in two “function spaces,” and the modulation process is a mapping of one space into the other. Using this representation, a number of results in communication theory are deduced concerning expansion and compression of bandwidth and the threshold effect. Formulas are found for the maximum rate of transmission of binary digits over a system when the signal is perturbed by various types of noise. Some of the properties of “ideal” systems which transmit at this maximum rate are discussed. The equivalent number of binary digits per second for certain information sources is calculated.' [C. E. Shannon , “Communication in the presence of noise,” Proc. IRE, vol. 37, pp. 10–21, Jan. 1949.] back

Eugene Wigner, The Unreasonable Effectiveness of Mathematics in the Natural Sciences, ' The first point is that the enormous usefulness of mathematics in the natural sciences is something bordering on the mysterious and that there is no rational explanation for it. Second, it is just this uncanny usefulness of mathematical concepts that raises the question of the uniqueness of our physical theories.' back

Jeffrey Nicholls (2019), A prolegomenon to scientific theology, ' This thesis is an attempt to carry speculative theology beyond the apogee it reached in the medieval work of Thomas Aquinas into the world of empirical science. Since the time of Aquinas, our understanding of the Universe has increased enormously. The ancient theologians not only conceived a perfect God, but they also saw the world as a very imperfect place. Their reaction was to place God outside the world. I will argue that we live in a Universe which approaches infinity in size and complexity, is as perfect as can be, and fulfils all the roles traditionally attributed to God, creator, lawmaker and judge.' back

Richard P. Feynman, Nobel Lecture: The Development of the Space-Time View of Quantum Electrodynamics, Nobel Lecture, December 11, 1965: We have a habit in writing articles published in scientific journals to make the work as finished as possible, to cover all the tracks, to not worry about the blind alleys or to describe how you had the wrong idea first, and so on. So there isn’t any place to publish, in a dignified manner, what you actually did in order to get to do the work, although, there has been in these days, some interest in this kind of thing. Since winning the prize is a personal thing, I thought I could be excused in this particular situation, if I were to talk personally about my relationship to quantum electrodynamics, rather than to discuss the subject itself in a refined and finished fashion. Furthermore, since there are three people who have won the prize in physics, if they are all going to be talking about quantum electrodynamics itself, one might become bored with the subject. So, what I would like to tell you about today are the sequence of events, really the sequence of ideas, which occurred, and by which I finally came out the other end with an unsolved problem for which I ultimately received a prize.' back

Thomas Aquinas, Summa, I, 2, 3, Does God exist?, 'I answer that, The existence of God can be proved in five ways. The first and more manifest way is the argument from motion. . . . ' back

Wojciech Hubert Zurek, Quantum origin of quantum jumps: breaking of unitary symmetry induced by information transfer and the transition from quantum to classical, '(Submitted on 17 Mar 2007 (v1), last revised 18 Mar 2008 (this version, v3)) Measurements transfer information about a system to the apparatus, and then further on – to observers and (often inadvertently) to the environment. I show that even imperfect copying essential in such situations restricts possible unperturbed outcomes to an orthogonal subset of all possible states of the system, thus breaking the unitary symmetry of its Hilbert space implied by the quantum superposition principle. Preferred outcome states emerge as a result. They provide framework for the “wavepacket collapse”, designating terminal points of quantum jumps, and defining the measured observable by specifying its eigenstates.' back

www.naturaltheology.net is maintained by The Theology Company Proprietary Limited ACN 097 887 075 ABN 74 097 887 075 Copyright 2000-2022 © Jeffrey Nicholls