natural theology

We have just published a new book that summarizes the ideas of this site. Free at Scientific Theology, or, if you wish to support this project, buy at Scientific Theology: A New Vision of God

vol III Development:

Chapter 3: Cybernetics

page 4: Creation

We are here imagining the world as a transfinite communication network. We understand this as an infinite version of the finite engineered communication networks that we have created around the world: roads, railways, pipelines, powerlines, and information networks among many others.

We may think of a network as a set of nodes, which we model as computers, and a set of messages, which we model as ordered sets of symbols. The messages interact with one another in the nodes through the operation of the computers.
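
We might sketch this picture in code as a toy illustration (the names and the sample message are our own inventions, not part of any established model):

    # A minimal sketch: nodes are modelled as computers (functions from
    # message to message) and messages as ordered sequences of symbols.
    class Node:
        def __init__(self, name, program):
            self.name = name
            self.program = program          # the computer's behaviour

        def process(self, message):
            return self.program(message)    # messages interact in the node

    # Two illustrative nodes and a message passing through them.
    reverser = Node("reverser", lambda msg: msg[::-1])
    shouter = Node("shouter", lambda msg: msg.upper())

    message = "creation"
    for node in (reverser, shouter):
        message = node.process(message)
    print(message)                          # NOITAERC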

Classical physics sees the world as a deterministic automaton, a view first articulated by Laplace. The standard interpretation of quantum mechanics maintains that the wave function of a system evolves deterministically until the system is observed, at which point the system 'jumps' into one of the many superposed states consistent with the wave function. This jump is random, with a probability distribution given by the Born rule. Laplace's demon - Wikipedia, Born rule - Wikipedia

We should note here the property of a perfectly coded message mentioned on the previous page:

3. The signals transmitted by an ideal system are indistinguishable from noise. The fact that a set of physical observations looks like a random sequence is not therefore evidence for meaninglessness. Until the algorithms used to encode and decode such a sequence are known, nothing can be said about its significance.
Determinism and creation

A Turing machine is deterministic. Once set in motion in an initial state, each step is a logical consequence of the step before it. It may or may not come to a halt, depending upon whether the function encoded in its initial state is computable or incomputable.

Computers in a network are 'oracle' machines: their processes can be interrupted to accept input which will change the course of their computation. Most computers these days are oracle machines. My computer sits here waiting for each keystroke, does what it is told to do and waits again. Oracle machine - Wikipedia

A deterministic machine is not creative: if we run it again with the same inputs, we will get the same outputs, and every step along the way will be the same. An oracle machine in a network, however, tends to be unpredictable because it is interrupted from outside with random inputs at random points in its process. My computer does not know what key I am going to hit next, but it does know how to react to each available keystroke.
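
The contrast can be made concrete in a few lines of code (a sketch only; the update rule and the rate of interruption are arbitrary choices):

    import random

    # A deterministic machine: rerunning it with the same input always
    # reproduces the same output and the same intermediate steps.
    def deterministic_machine(state):
        for _ in range(10):
            state = (3 * state + 1) % 17    # each step follows from the last
        return state

    assert deterministic_machine(5) == deterministic_machine(5)

    # An oracle machine in a network: random input arrives at random
    # points in the process, so reruns need not agree.
    def keystrokes():
        while True:
            yield random.randrange(10)      # unpredictable outside input

    def oracle_machine(state, keys):
        for _ in range(10):
            state = (3 * state + 1) % 17
            if random.random() < 0.3:                # an interruption...
                state = (state + next(keys)) % 17    # ...changes the course
        return state

    keys = keystrokes()
    print(oracle_machine(5, keys), oracle_machine(5, keys))  # usually differ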

The difference between a computer and a network
When we look into a typical computer, we see that it is in effect a network. From an abstract point of view, both a computer and a network can be modelled as a set of memories and a set of processors which are able to read the memories and write to them. The difference is that the operations in a computer are synchronous, usually driven by one clock.

Each operation of the computer is initiated by a clock pulse that propagates throughout the machine. On receipt of the pulse, the physical elements of the computer change from one state to the next as defined by the software, and then settle down in their new state awaiting the next clock pulse. The effect of the clock is to hide the dynamic processes of the computer and reveal the sequence of static logical states which formally represent the computation. The synchronicity established by the clock enables the machine to act deterministically.

The computers in a network, on the other hand, may operate asynchronously, since each has its own clock and its own clock rate, and their operations may begin at relatively random times.
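
A small simulation makes the difference visible (the update rules here are illustrative): with a single clock the two registers move in lockstep and the history is reproducible, while with independent clocks the interleaving varies from run to run.

    import random

    # Synchronous: one clock pulse updates every element at once.
    a, b = 0, 0
    for _ in range(4):
        a, b = b + 1, a + 1                 # both change on the same pulse
    print("synchronous:", a, b)             # identical on every run

    # Asynchronous: each node fires on its own clock, at random times.
    a, b = 0, 0
    for _ in range(8):
        if random.random() < 0.5:
            a = b + 1                       # node A's clock fires
        else:
            b = a + 1                       # node B's clock fires
    print("asynchronous:", a, b)            # interleaving varies between runs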

Computation is a periodic process, involving the repetition of many simple operations. Given sufficient ingenuity, any computation can be represented by a spacetime network of nand gates, which implement the logical Sheffer stroke. This function, defined by a truth table, has two inputs and one output. Sheffer stroke - Wikipedia
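
The truth table, and the standard constructions of the other logical operations from nand alone, can be written out directly (a routine exercise, included here for concreteness):

    # The Sheffer stroke (nand):   a  b | a nand b
    #                              0  0 |    1
    #                              0  1 |    1
    #                              1  0 |    1
    #                              1  1 |    0
    def nand(a, b):
        return 0 if (a and b) else 1

    # Every other Boolean function can be wired from nand gates alone.
    def not_(a):
        return nand(a, a)

    def and_(a, b):
        return nand(nand(a, b), nand(a, b))

    def or_(a, b):
        return nand(nand(a, a), nand(b, b))

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, not_(a), and_(a, b), or_(a, b))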

Quantum mechanics describes the Universe by the relationships between the phases of state vectors in a complex space of any dimension. The probability of an event is measured by the overlap of the two state vectors input to the event, computed by the Born rule.

Let us assume that one complete revolution of phase is equivalent to the execution of one computation of a computable function. This suggests an analogy between the random interruption of a computation process at different phases of its execution and the quantum mechanical computation of probability. Quantum mechanical uncertainty thus opens the way to creativity.
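We can illustrate the Born rule numerically (the two-dimensional states and the phase angles below are arbitrary choices of our own): the probability of an event is the squared magnitude of the overlap of two normalized state vectors, and it depends only on their relative phase.

    import cmath, math

    # A normalized two-dimensional state whose second component carries
    # a phase theta (an arbitrary illustrative family of states).
    def state(theta):
        return [1 / math.sqrt(2), cmath.exp(1j * theta) / math.sqrt(2)]

    # Born rule: P = |<phi|psi>|^2, the squared overlap of the states.
    def born_probability(phi, psi):
        overlap = sum(p.conjugate() * q for p, q in zip(phi, psi))
        return abs(overlap) ** 2

    psi = state(0.0)
    for theta in (0.0, math.pi / 2, math.pi):
        print(round(born_probability(state(theta), psi), 3))
    # 1.0 when the phases coincide, 0.5 at a quarter turn, 0.0 at a half turn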

Natural selection: P vs NP

The Universe we inhabit seems very much simpler than the transfinite system we imagine mathematically: the local observable world is a finite subset of this infinite structure. We say that this subset has been selected. Among the infinity of possible species and genotypes, natural selection has chosen the species we observe, present and past. Using the symmetry of the transfinite model, the same can be said for any structure in the world.

Selection is induced by limitation. In the biological world, only a small fraction of the sperm, eggs, embryos and infants that come into existence are destined to find enough resources to reproduce themselves. Natural selection - Wikipedia, Jones: Almost Like a Whale: The Origin of Species Updated

The limiting feature of the transfinite network is computing power. Of the unlimited number of functions that can be represented in the Cantor Universe, only ℵ₀ (a countable infinity) are computable.

Let us assume that we can model each message in the network as a function, and that only computable functions are observable. All such functions are discrete, that is, they can be represented as mappings of the set of natural numbers into itself. This assumption is consistent with experience in physics, where we find that the only observations we can make are counts of discrete quantum events like the creation or annihilation of a particle.
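
The count of ℵ₀ computable functions follows because every computable function is defined by a finite program text, and the finite texts over a finite alphabet can be enumerated, as this sketch suggests:

    from itertools import count, product

    # Every computable function is given by some finite program text.
    # Finite texts over a finite alphabet can be listed in order of
    # length, pairing each with a natural number: there are therefore
    # only aleph-null of them, although the set of all functions from
    # the natural numbers to themselves is uncountable.
    ALPHABET = "01"                          # any finite alphabet will do

    def all_programs():
        for length in count(1):
            for chars in product(ALPHABET, repeat=length):
                yield "".join(chars)

    programs = all_programs()
    for n in range(6):
        print(n, next(programs))             # 0 '0', 1 '1', 2 '00', ...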

Evolution works by variation and selection. The variation is uncontrolled, that is, random rather than computed. Selection picks out the viable variations, those which are capable of maintaining and reproducing themselves. Our world is inhabited by discrete entities like ourselves, stars and grains of sand. Let us assume that each of these entities maintains its existence by a variety of feedback loops. When an organism feels itself being stressed, it reacts in a manner which reduces the stress: I eat when I am hungry; a steel girder resists when I try to bend it.
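
A negative feedback loop of this kind can be sketched in a few lines (the numbers are arbitrary): when stress rises above a comfortable level the entity acts to reduce it, and the system hovers near that level.

    SET_POINT = 2.0                     # the comfortable stress level

    def feedback_step(stress):
        if stress > SET_POINT:          # stress detected...
            stress -= 3.0               # ...act to reduce it (eat)
        return stress + 1.0             # living costs: stress creeps back up

    hunger = 10.0
    for _ in range(8):
        hunger = feedback_step(hunger)
        print(round(hunger, 1))         # falls, then hovers near the set point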

This leads us to assume that survival and reproduction depend upon deterministic and reproducible processes. On the other hand, the discovery of these processes is not deterministic, but depends on the random search of variation. Science works in a similar way: random insights substantiated by deterministic experimentation. Fortun & Bernstein: Muddling Through

Survival requires performing various tasks like obtaining food or a mate. Each task is a series of actions determined by an algorithm stored in the creature's memory. Different algorithms may be available to perform a particular task. Some of these algorithms may be more efficient than others, using fewer resources to perform the same task. We guess that in the long run the most productive algorithms are naturally selected. Algorithmic efficiency - Wikipedia
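
A familiar example of differing efficiency (our choice of task, not the page's): two algorithms that locate a value in a sorted list, where counting the steps makes the selective advantage of the more efficient one obvious.

    # Two algorithms for the same task; the step counts show why the
    # more efficient one would be favoured in the long run.
    def linear_search(items, target):
        steps = 0
        for i, x in enumerate(items):
            steps += 1
            if x == target:
                return i, steps
        return -1, steps

    def binary_search(items, target):
        steps, lo, hi = 0, 0, len(items) - 1
        while lo <= hi:
            steps += 1
            mid = (lo + hi) // 2
            if items[mid] == target:
                return mid, steps
            if items[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return -1, steps

    data = list(range(1_000_000))
    print(linear_search(data, 999_999)[1])   # 1,000,000 steps
    print(binary_search(data, 999_999)[1])   # about 20 steps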

Although most work on evolution is done in a biological context, the proposal here is that all of the observed structure and communication in the Universe has been refined by an evolutionary process. The observed world, that which is fit to survive, is thus an infinitesimal fraction of the possible worlds modelled by the transfinite network.

Mathematically, we can see variation and selection as an instance of the P vs NP problem in the theory of computation. Clay Mathematics Institute: P vs NP problem, Carlson: The Millennium Prize Problems

The class P contains those problems which can be solved by an algorithm (a Turing machine) in a number of steps bounded by a polynomial in the size of the input. Such processes are said to run in polynomial time.

NP stands for non-deterministic polynomial time. The idea here is that although an NP problem may not be solvable in polynomial time, an answer, once obtained by some method (such as random assortment), can be checked by a process in P. We then imagine that NP corresponds to the process of generating new solutions by variation, and P corresponds to the checking of these solutions by natural selection, that is, testing them deterministically to see whether they are capable of survival and reproduction. This is equivalent to determining whether they are consistent with the realistic requirements of survival.
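
Subset sum gives a compact illustration of this pairing (the weights and target below are our own example): finding a subset that hits the target may require blind search of an exponential space of variations, but checking any proposed subset is quick and deterministic.

    import random

    # Variation (NP-style generation) and selection (P-style checking),
    # illustrated with subset sum.
    weights = [3, 34, 4, 12, 5, 2]
    target = 9

    def check(subset):                   # selection: a fast, deterministic
        return sum(subset) == target     # polynomial-time test

    def random_variation():              # variation: blind, uncontrolled
        return [w for w in weights if random.random() < 0.5]

    attempts = 0
    candidate = random_variation()
    while not check(candidate):          # only viable variations survive
        attempts += 1
        candidate = random_variation()
    print(candidate, "found after", attempts, "failed attempts")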

Consistency

A constraint is something that limits or controls the motion of a system. Constraints may be external, like gravitation, or internal, resulting from the structure of the system in question. So the relationships between piston, cylinder, connecting rod and crankshaft in a reciprocating engine determine that the piston will move up and down the cylinder as the crankshaft rotates.

Traditionally, the only constraint on God is that it be consistent. When we ask ourselves questions like 'can an omnipotent God make a stone heavier than it can lift?', we see that even omnipotence has to respect consistency. Once the traditional properties of God, eternity, omniscience, omnipotence and so on, were believed to correspond to a static infinity defined by the state of medieval mathematics. Now, with the advent of indeterminacy and transfinite numbers, we are able to contemplate a much bigger dynamic model of God.

Science recognizes consistency at two levels, which we might call formal and empirical. Formal consistency requires that the formal mathematical and logical models used in science be internally consistent. It has been shown, however, that sufficiently large formal systems are both incomplete and incomputable, so that it is often difficult to tell whether the models used in science are consistent.

Empirical consistency requires that our formal models can be mapped consistently to observations, at least to the limits of observability. The fundamental article of scientific faith is that the scientific method is leading us toward a consistent union of a formal 'theory of everything' and all available observations. In the end we would like to see that ours is the only possible Universe, i.e. a Universe that embraces all possibilities.

(revised 8 January 2019)

Copyright:

You may copy this material freely provided only that you quote fairly and provide a link (or reference) to your source.

Further reading

Books


Carlson, James, and Arthur Jaffe and Andrew Wiles, The Millennium Prize Problems, Clay Mathematics Institute and American Mathematical Society 2006
1: The Birch and Swinnerton-Dyer Conjecture: Andrew Wiles
2: The Hodge Conjecture: Pierre Deligne
3: The Existence and Smoothness of the Navier-Stokes Equation: Charles L Fefferman
4: The Poincaré Conjecture: John Milnor
5: The P versus NP Problem: Stephen Cook
6: The Riemann Hypothesis: Enrico Bombieri
7: Quantum Yang-Mills Theory: Arthur Jaffe and Edward Witten

Fortun, Mike, and Herbert J Bernstein, Muddling Through: Pursuing Science and Truths in the Twenty-First Century, Counterpoint 1998 Amazon editorial review: 'Does science discover truths or create them? Does dioxin cause cancer or not? Is corporate-sponsored research valid or not? Although these questions reflect the way we're used to thinking, maybe they're not the best way to approach science and its place in our culture. Physicist Herbert J. Bernstein and science historian Mike Fortun, both of the Institute for Science and Interdisciplinary Studies (ISIS), suggest a third way of seeing, beyond taking one side or another, in Muddling Through: Pursuing Science and Truths in the 21st Century. While they deal with weighty issues and encourage us to completely rethink our beliefs about science and truth, they do so with such grace and humor that we follow with ease discussions of toxic-waste disposal, the Human Genome Project, and retooling our language to better fit the way science is actually done.' 

Jones, Steve, Almost Like a Whale: The Origin of Species Updated, Doubleday 1999 An Historical Sketch: 'The Origin of Species is, without doubt, the book of the millennium. ... [This book] is, as far as is possible, an attempt to rewrite the Origin of Species. I use its plan, developing as it does from farms to fossils, from beehives to islands, as a framework, but my own Grand Facts ... are set firmly in the late twentieth century. Almost Like a Whale tries to read Charles Darwin's mind with the benefit of scientific hindsight and to show how the theory of evolution unites biology as his millennium draws to an end.' (xix)

Links

Born rule - Wikipedia, Born rule - Wikipedia, the free encyclopedia, 'The Born rule (also called the Born law, Born's rule, or Born's law) is a law of quantum mechanics which gives the probability that a measurement on a quantum system will yield a given result. It is named after its originator, the physicist Max Born. The Born rule is one of the key principles of the Copenhagen interpretation of quantum mechanics. There have been many attempts to derive the Born rule from the other assumptions of quantum mechanics, with inconclusive results. . . . The Born rule states that if an observable corresponding to a Hermitian operator A with discrete spectrum is measured in a system with normalized wave function (see bra-ket notation), then the measured result will be one of the eigenvalues λ of A, and the probability of measuring a given eigenvalue λᵢ will equal ⟨ψ|Pᵢ|ψ⟩, where Pᵢ is the projection onto the eigenspace of A corresponding to λᵢ.'

Clay Mathematics Institute, P vs NP problem, 'Suppose that you are organizing housing accommodations for a group of four hundred university students. Space is limited and only one hundred of the students will receive places in the dormitory. To complicate matters, the Dean has provided you with a list of pairs of incompatible students, and requested that no pair from this list appear in your final choice. This is an example of what computer scientists call an NP-problem, since it is easy to check if a given choice of one hundred students proposed by a coworker is satisfactory (i.e., no pair taken from your coworker's list also appears on the list from the Dean's office), however the task of generating such a list from scratch seems to be so hard as to be completely impractical. Indeed, the total number of ways of choosing one hundred students from the four hundred applicants is greater than the number of atoms in the known universe! Thus no future civilization could ever hope to build a supercomputer capable of solving the problem by brute force; that is, by checking every possible combination of 100 students. However, this apparent difficulty may only reflect the lack of ingenuity of your programmer. In fact, one of the outstanding problems in computer science is determining whether questions exist whose answer can be quickly checked, but which require an impossibly long time to solve by any direct procedure. Problems like the one listed above certainly seem to be of this kind, but so far no one has managed to prove that any of them really are so hard as they appear, i.e., that there really is no feasible way to generate an answer with the help of a computer. Stephen Cook and Leonid Levin formulated the P (i.e., easy to find) versus NP (i.e., easy to check) problem independently in 1971.'

Laplace's demon - Wikipedia, Laplace's demon - Wikipedia, the free encyclopedia, 'We may regard the present state of the universe as the effect of its past and the cause of its future. An intellect which at a certain moment would know all forces that set nature in motion, and all positions of all items of which nature is composed, if this intellect were also vast enough to submit these data to analysis, it would embrace in a single formula the movements of the greatest bodies of the universe and those of the tiniest atom; for such an intellect nothing would be uncertain and the future just like the past would be present before its eyes.' A Philosophical Essay on Probabilities (Essai philosophique sur les probabilités), introduction to the second edition of Théorie analytique des probabilités, based on a lecture given in 1794.

Oracle machine - Wikipedia, Oracle machine - Wikipedia, the free encyclopedia, 'In complexity theory and computability theory, an oracle machine is an abstract machine used to study decision problems. It can be visualized as a Turing machine with a black box, called an oracle, which is able to decide certain decision problems in a single operation. The problem can be of any complexity class. Even undecidable problems, like the halting problem, can be used.'

P versus NP problem - Wikipedia, P versus NP problem - Wikipedia, the free encyclopedia, 'The P versus NP problem is a major unsolved problem in computer science. It asks whether every problem whose solution can be quickly verified (technically, verified in polynomial time) can also be solved quickly (again, in polynomial time). The underlying issues were first discussed in the 1950s, in letters from John Forbes Nash Jr. to the National Security Agency, and from Kurt Gödel to John von Neumann. The precise statement of the P versus NP problem was introduced in 1971 by Stephen Cook in his seminal paper "The complexity of theorem proving procedures" and is considered by many to be the most important open problem in the field.'

Sheffer stroke - Wikipedia, Sheffer stroke - Wikipedia, the free encyclopedia, 'In Boolean functions and propositional calculus, the Sheffer stroke, named after Henry M. Sheffer, written "|" . . . denotes a logical operation that is equivalent to the negation of the conjunction operation, expressed in ordinary language as "not both". It is also called nand ("not and") or the alternative denial, since it says in effect that at least one of its operands is false.'

www.naturaltheology.net is maintained by The Theology Company Proprietary Limited ACN 097 887 075 ABN 74 097 887 075 Copyright 2000-2019 © Jeffrey Nicholls