vol III Development:
Chapter 3: Cybernetics
page 4: Creation
We are here imagining the world as a transfinite communication network. We understand this as an infinite version of the finite engineered communication networks that we have created around the world: roads, railways, pipelines, powerlines, and information networks among many others.
We may think of a network as a set of nodes which we model as computers and a set of messages, which we model as ordered sets of symbols. The messages interact with one another in the nodes through the operation of the computers.
Classical physics sees the world as a deterministic automaton, a view first articulated by Laplace. The standard interpretation of quantum mechanics maintains that the wave function of a system evolves deterministically until the system is observed, at which point the system 'jumps' into one of the many states superposed in the wave function. This jump is random, with a probability distribution given by the Born rule. Laplace's demon - Wikipedia, Born rule - Wikipedia
We should note here the property of a perfectly coded message mentioned in the previous page:
3. The signals transmitted by an ideal system are indistinguishable from noise. The fact that a set of physical observations looks like a random sequence is not therefore evidence for meaninglessness. Until the algorithms used to encode and decode such a sequence are known, nothing can be said about its significance.
Determinism and creation
A Turing machine is deterministic. Once set in motion from an initial state, each step is a logical consequence of the step before it. It may or may not come to a conclusion, depending upon whether the function it is set to evaluate is computable or incomputable.
Computers in a network are 'oracle' machines: their processes can be interrupted to accept input which will change the course of their computation. Most computers these days are oracle machines. My computer sits here waiting for each keystroke, does what it is told to do and waits again. Oracle machine - Wikipedia
A deterministic machine is not creative: if we run it again with the same inputs we will get the same outputs, and every step along the way will be the same. An oracle machine in a network, however, will tend to be unpredictable because it is interrupted from outside with random inputs at random points in its process. My computer does not know what key I am going to hit next, but it does know how to react to each available keystroke.
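The contrast can be illustrated with a small sketch (my own illustration, with invented names, not part of the argument itself): a deterministic process replayed from the same seed always yields the same trace, while an interrupt-driven loop produces a trace that depends entirely on whatever arrives from outside.

```python
# Illustrative sketch: deterministic replay versus an interrupt-driven loop.

def deterministic_run(seed: int, steps: int) -> list[int]:
    """Each state is a pure function of the previous state, so replay is identical."""
    state, trace = seed, []
    for _ in range(steps):
        state = (state * 31 + 7) % 1000   # arbitrary fixed rule
        trace.append(state)
    return trace

def oracle_style_run(inputs) -> list[str]:
    """The trace depends on externally supplied events (here, keystrokes standing
    in for interrupts); the machine knows how to react to each possible event,
    but not which one will arrive next."""
    return [f"handled {key!r}" for key in inputs]

assert deterministic_run(42, 5) == deterministic_run(42, 5)  # always reproducible
print(oracle_style_run("qwe"))                               # depends on outside input
```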
The difference between a computer and a network
When we look into a typical computer, we see that it is in effect a network. From an abstract point of view, both a computer and a network can be modelled as a set of memories and a set of processors which are able to read the memories and write to them. The difference is that the operations in a computer are synchronous, usually driven by one clock. Each operation of the computer is initiated by a clock pulse that propagates throughout the machine. On receipt of the pulse, the physical elements of the computer change from one state to the next as defined by the software, and then settle down in their new state to await the next clock pulse. The effect of the clock is to hide the dynamic processes of the computer and reveal the sequence of static logical states which formally represent the computation. The synchronicity established by the clock enables the machine to act deterministically.
The computers in a network, on the other hand, may operate asynchronously, since each has its own clock, its own clock rate, and their operations may be started at relatively random times.
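A toy model may make the synchronous case concrete. In the sketch below (a hypothetical illustration with invented names), every register computes its next value from the current global state and all registers commit together on the clock tick, so only the sequence of settled states is visible; an asynchronous network would instead let each node commit on its own schedule.

```python
# Sketch of a synchronous (clocked) machine: on each tick, every element's next
# state is computed from the *current* global state, then all commit together.

def clock_tick(state: dict, rules: dict) -> dict:
    next_state = {name: rule(state) for name, rule in rules.items()}  # compute phase
    return next_state                                                 # commit phase

# Two 'registers': a counter and a flag that watches the counter.
rules = {
    "count": lambda s: (s["count"] + 1) % 4,
    "flag":  lambda s: int(s["count"] == 3),
}

state = {"count": 0, "flag": 0}
for _ in range(6):
    state = clock_tick(state, rules)
    print(state)   # a deterministic, reproducible sequence of static states
```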
Computation is a periodic process, involving the repetition of many simple operations. Given sufficient ingenuity, any computation can be represented by a spacetime network of NAND gates, which implement the logical Sheffer stroke. This function, defined by a truth table, has two inputs and one output. Sheffer stroke - Wikipedia
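Since the argument leans on the universality of the NAND operation, here is a minimal check (my own illustration) that the other basic connectives can be built from it alone.

```python
# The Sheffer stroke (NAND): false only when both inputs are true.
def nand(a: bool, b: bool) -> bool:
    return not (a and b)

# The other basic connectives built from NAND alone, showing its universality.
def not_(a):    return nand(a, a)
def and_(a, b): return nand(nand(a, b), nand(a, b))
def or_(a, b):  return nand(nand(a, a), nand(b, b))

# Print the truth tables to confirm the constructions.
for a in (False, True):
    for b in (False, True):
        print(a, b, nand(a, b), not_(a), and_(a, b), or_(a, b))
```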
Quantum mechanics describes the Universe by the relationships between the phases of vectors in a space of any dimension. The probability of an event is measured by the overlap of the phases of the two inputs to the event computed by the Born rule.
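For reference, the Born rule can be stated as follows: if a system is prepared in state |ψ⟩ and we measure an observable whose eigenstates are |a_i⟩, the probability of the outcome a_i is the squared magnitude of the overlap of the two vectors.

```latex
% Born rule: the probability of outcome a_i is the squared overlap of the
% prepared state with the corresponding eigenstate.
\[
  |\psi\rangle = \sum_i c_i \,|a_i\rangle
  \quad\Longrightarrow\quad
  P(a_i) = \bigl|\langle a_i \mid \psi \rangle\bigr|^2 = |c_i|^2,
  \qquad \sum_i |c_i|^2 = 1.
\]
```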
Let us assume that one complete revolution of phase is equivalent to the execution of one computation of a computable function. This suggests an analogy between the random interruption of a computation process at different phases of its execution and the quantum mechanical computation of probability. Quantum mechanical uncertainty thus opens the way to creativity.
Natural selection: P vs NP
The Universe we inhabit seems very much simpler than the transfinite system we imagine mathematically. In the local observable world, we inhabit a finite subset of this infinite structure. We say that this subset has been selected. Among the infinity of possible species and genotypes, natural selection has chosen the species we observe, present and past. Using the symmetry of the transfinite model, the same can be said for any structure in the world.
Selection is induced by limitation. In the biological world, only a small fraction of the sperm, eggs, embryos and infants that come into existence are destined to find enough resources to reproduce themselves. Natural selection - Wikipedia, Jones: Almost Like a Whale: The Origin of Species Updated
The limiting feature of the transfinite network is computing power. Of the unlimited number of functions that can be represented in the Cantor Universe, only ℵ0 are computable.
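The counting argument behind this claim is standard: Turing machines can be enumerated, so the computable functions are countable, while the set of all functions from the natural numbers to themselves has the cardinality of the continuum.

```latex
% Countably many Turing machines, hence countably many computable functions,
% against uncountably many functions from N to N.
\[
  \bigl|\{\text{computable } f:\mathbb{N}\to\mathbb{N}\}\bigr| = \aleph_0,
  \qquad
  \bigl|\{f:\mathbb{N}\to\mathbb{N}\}\bigr| = \aleph_0^{\aleph_0} = 2^{\aleph_0} > \aleph_0 .
\]
```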
Let us assume that we can model each message in the network as a function, and that only computable functions are observable. All such functions are discrete, that is, they can be represented by mappings of the set of natural numbers onto itself. This assumption is consistent with experience in physics, where we find that the only observations we can make are counts of discrete quantum events like the creation or annihilation of a particle.
Evolution works by variation and selection. The variation is uncontrolled, that is random, not computed. Selection picks out the viable variations, those which are capable of maintaining and reproducing themselves. Our world is inhabited by discrete entities like ourselves, stars and grains of sand. Let us assume that each of these entities maintains its existence by a variety of feedback loops. When an organism feels itself being stressed, it reacts in a manner which reduces the stress: I eat when I am hungry; a steel girder resists when I try to bend it.
This leads us to assume that survival and reproduction depend upon deterministic and reproducible processes. On the other hand, the discovery of these processes is not deterministic, but depends on the random search of variation. Science works in a similar way: random insights substantiated by deterministic experimentation. Fortun & Bernstein: Muddling Through
Survival requires performing various tasks like obtaining food or a mate. Each task is a series of actions determined by an algorithm stored in the creature's memory. Different algorithms may be available to perform a particular task. Some of these algorithms may be more efficient than others, using fewer resources to perform the same task. We guess that in the long run the most productive algorithms are naturally selected. Algorithmic efficiency - Wikipedia
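As a simple illustration of the point (an invented example, not from the text above), two algorithms for the same task can differ enormously in the resources they consume: a naive recursive Fibonacci takes exponentially many steps, while an iterative one performs the same task in linear time.

```python
# Two algorithms for the same task with very different resource use.

def fib_naive(n: int) -> int:
    """Exponential time: recomputes the same subproblems over and over."""
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

def fib_iterative(n: int) -> int:
    """Linear time: each value is computed exactly once."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

assert fib_naive(20) == fib_iterative(20)   # same task, same answer
# fib_iterative(200) returns instantly; fib_naive(200) would effectively never finish.
```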
Although most work on evolution is done in a biological context, the proposal here is that all of the observed structure and communication in the Universe has been refined by evolutionary processes. The observed world, that which is fit to survive, is thus an infinitesimal fraction of the possible worlds modelled by the transfinite network.
Mathematically, we can see variation and selection as an instance of the P vs NP problem in the theory of computation. Clay Mathematics Institute: P vs NP problem, Carlson: The Millennium Prize Problems
The class P contains those problems which can be solved by an algorithm (a Turing machine) in a number of steps bounded by a polynomial in the size of the input. Such processes are said to run in polynomial time.
NP stands for non-deterministic polynomial time. The idea here is that although a problem in NP may not be solvable by a process in P, an answer, once obtained by some method (such as random assortment), can be checked by a process in P. We then imagine that NP corresponds to the process of generating new solutions by variation, and that P corresponds to the checking of these solutions by natural selection, that is, testing them deterministically to see if they are capable of survival and reproduction. This is equivalent to determining whether they are consistent with the realistic requirements of survival.
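The variation-and-selection reading of this asymmetry can be sketched in code (a toy illustration with invented details): finding a subset of numbers with a given sum may require blind search over exponentially many candidates, but checking any proposed subset is quick, in direct analogy to random variation followed by deterministic selection.

```python
import random

# Subset-sum as a toy NP-style problem: generating a solution may require blind
# search, but verifying a proposed solution is fast (polynomial time).

def verify(numbers: list[int], subset: list[int], target: int) -> bool:
    """Selection: a cheap deterministic check of a proposed solution."""
    return sum(subset) == target and all(x in numbers for x in subset)

def random_variation(numbers: list[int]) -> list[int]:
    """Variation: propose a candidate subset at random, with no guidance."""
    return [x for x in numbers if random.random() < 0.5]

numbers, target = [3, 34, 4, 12, 5, 2], 9
for _ in range(10_000):                      # blind generate-and-test
    candidate = random_variation(numbers)
    if verify(numbers, candidate, target):
        print("survivor:", candidate)        # e.g. [4, 5] or [3, 4, 2]
        break
```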
Consistency
A constraint is something that limits or controls the motion of a system. Constraints may be external, like gravitation, or internal, resulting from the structure of the system in question. So the relationships between piston, cylinder, connecting rod and crankshaft in a reciprocating engine determine that the piston will move up and down the cylinder as the crankshaft rotates.
Traditionally, the only constraint on God is that it be consistent. When we ask ourselves questions like 'can an omnipotent God make a stone bigger than it can lift?' we see that even omnipotence has to respect consistency. The traditional properties of God (eternity, omniscience, omnipotence and so on) were once believed to correspond to a static infinity defined by the state of medieval mathematics. Now, with the advent of indeterminacy and transfinite numbers, we are able to contemplate a much bigger dynamic model of God.
Science recognizes consistency at two levels, which we might call formal and empirical. Formal consistency requires that the formal mathematical and logical models used in science be internally consistent. It has been shown that large formal systems are both incomplete and incomputable so that it is often difficult to tell whether the models used in science are consistent.
Empirical consistency requires that our formal models can be mapped consistently to observations, at least to the limits of observability. The fundamental article of scientific faith is that the scientific method is leading us toward a consistent union of a formal 'theory of everything' and all available observations. In the end we would like to see that ours is the only possible Universe, that is, a Universe that embraces all possibilities.
(revised 8 January 2019)