Essay 30: Cognitive cosmology
In conclusion one has to recall that one reason why the ontological interpretation of [Quantum Field Theory] is so difficult is the fact that it is exceptionally unclear which parts of the formalism should be taken to represent anything physical in the first place. And it looks as if that problem will persist for quite some time. Meinard Kuhlmann (Stanford Encyclopedia of Philosophy): Quantum Field Theory
A physical understanding is completely unmathematical, imprecise, an inexact thing but absolutely necessary to a physicist. Richard Feynman: Lectures on Physics II Chapter 2: Differential Calculus of Vector Fields
Research is to see what everybody has seen and think what nobody has thought. Albert Szent-Györgyi (1957): Bioenergetics
1: Abstract
2: Introduction: Top down or bottom up?
3: Action: from unmoved mover to the quantum
4: Classical gravitation and singularities
5: God's ideas, cybernetics and singularity
6: Evolution, variation and selection
7: Networks, brains and intelligence
8: The classical trinitarian theology of action
9: The active creation of Hilbert space
10: The emergence of quantum mechanics
11: Quantization: the mathematical theory of communication
12: The quantum creation of Minkowski space
13: Is Hilbert space independent of Minkowski space?
14: The measurement problem
15: Gravitation
16: Why are quantum amplitudes and logical processes invisible?
17: Energy bifurcates into potential and kinetic, enabling a zero energy universe
18: A model: classical and quantum information processing networks
19: Space-time is the operating system of the universe
20: Dynamics, fixed points and particles
21: Quantum electrodynamics: QED
22: Quantum chromodynamics: QCD
23: Methodology: the mathematical community
24: Some principles
25: Some conclusions
1: Abstract
Theology is the ancient traditional theory of everything. If the Universe is to be divine, physics and theology must be mutually consistent, and this constraint requires that both disciplines be radically revised.
The modern search for a theory of everything has run aground on gravitation. We are pretty sure that quantum mechanics is the true theory of the world, and we have a comprehensive quantum mechanical explanation for everything from the fundamental particles to the planets, stars and galaxies that dot the vast spaces of the universe, but a quantum mechanical explanation of gravitation still eludes us, casting some doubt on progress so far.
Einstein's classical gravitation, in the hands of Penrose, Hawking and Ellis, suggests that the universe began as a structureless initial singularity which exploded to give us the universe we observe. We imagine that both gravitation and quantum mechanics are connected with the emergence of the universe within the initial singularity, so it is a perfect starting point for the development of a comprehensive model of creation.
Von Neumann shows that the universe increases its entropy (that is, creates itself) by observing itself. This implies that physics is a logical or cognitive process, akin to human self-awareness, that we might best understand by thinking of the universe as a mind. John von Neumann (2014): Mathematical Foundations of Quantum Mechanics, Chapter 5
A path to cognitive cosmology is relatively clear: we begin with the initial singularity and identify it as the primordial quantum of action, identical to the traditional God described by Aristotle and Aquinas. Unmoved mover - Wikipedia, Thomas Aquinas, Summa, I, 2, 3: Does God exist?
We then examine the ancient Christian theological and psychological doctrine of the Trinity devised by Augustine and Aquinas to explain how one God can become three. This doctrine exhibits features similar to superposition in quantum mechanics and leads us to an understanding of divine creativity. The next step is to identify the features of this theological structure that are present in the initial singularity. At this point the traditional divinity and the initial singularity are both formally identical quanta of action and the source of the universe. The final step is to trace the emergence of quantum mechanics, space-time, gravitation and particles, providing material to construct enormously complex intelligent beings such as our planet and ourselves. The known history of the Universe adds empirical support to this hypothetical picture of an intelligent divine world. Chronology of the universe - Wikipedia
2: Introduction: Top down or bottom up?
In the beginning God created the heaven and the earth. Genesis, from the Holy Bible, King James Version
In the traditional creation story God created the world according to a plan which already existed in their mind. The medieval theologian Thomas Aquinas calls this plan ideas, a nod to the Platonic notion that an invisible heaven of ideas or forms defines the structure of the world. Plato thought that our world is a poor reflection of these perfect ideas, which is why it was, to his mind, rather shoddy. Only philosophers (like himself) could perceive the true beauty of these invisible forms. Aquinas, Summa I, 15, 1: Are there ideas in God?, Form of the Good - Wikipedia, Allegory of the cave - Wikipedia
Aquinas made a direct connection between the ideas and the design of the created world. God, by analogy with a human builder, has architectural plans in mind as they make the world. We might ask how long it took God to design the world. The standard answer is that since God is eternal, "how long" has no meaning. God just is, and the contents of God's mind are just as eternal as they are. God could have made a different world but it is what it is. Aquinas, Summa, I, 14, 8: Is the knowledge of God the cause of things?
The Christian tradition still runs deep in the scientific world. Given that the aim of science is to understand the world, we usually take the nature of the world as given and seek a scientific understanding of what the creator had in mind when they made it. Paul Davies (1992): The Mind of God: Science and the Search for Ultimate Meaning, Michio Kaku (2021): The God Equation: The Quest for a Theory of Everything
Here I wish to break from this tradition, following a consequence of the general theory of relativity.
Penrose, Hawking and Ellis deduced that Einstein's general theory of relativity predicts the existence of singularities at the boundaries of the universe. There is now strong astronomical evidence for the existence of black holes, and Hawking and Ellis speculated that the big bang that initiated the emergence of the universe within the initial singularity might be understood as a "time reversed black hole". Hawking & Ellis (1975): The Large Scale Structure of Space-Time, Big Bang - Wikipedia
These singularities are considered to lie outside the laws of nature. Our experience with black holes, however, suggests that they contain energy and mass which shape the space around them, so controlling the motions of nearby visible structures. This suggests that the pointlike singularity at the root of the universe may contain all the energy of the universe.
This seems to me hard to imagine. From a formal point of view, this initial singularity is indistinguishable from the traditional God of Aquinas: it is outside space and time, and so eternal; it has no structure, so it is absolutely simple; it is beyond the laws of nature, so it can give no meaning to energy; and it is the source or creator of the universe. Tradition, dating from Aristotle 2350 years ago and descended to us through Aquinas and Catholic theology, holds that God is pure action, actus purus. So here I will assume that the initial singularity is action rather than energy. Aquinas, Summa, I, 3, 7: Is God altogether Simple?
The absolute simplicity of the initial singularity, in the light of the cybernetic principle of requisite variety, precludes the existence of any plan of the universe within it (§5 below). All we have is unbounded reproductive activity controlled by local consistency. What we need to look for is a mechanism for the emergence of the universe as we know it within the initial singularity.
What I want to create is a model of the universe that embraces not only gravitation and the fundamental particles, but all structures in the universe including ourselves, no matter how complex, and shows that the universe, like the traditional God, touches the bounds of possibility.
The ancients, like Plato, Aristotle and Aquinas, divided the world into material and immaterial or spiritual. They thought that knowledge and intelligence are correlated with immateriality. Aquinas argued that God is maximally intelligent because they are maximally spiritual. Since that time, we have come to see information as a physical entity and intelligence, that is information processing, as a physical process. Intelligence is represented technologically by the work of computing machinery, biologically as a process in complex neural networks like our brains, and physically by the power of computation and communication embedded in the quantum world. Aquinas, Summa: I, 14, 1: Is there knowledge in God?, Rolf Landauer (1999): Information is a Physical Entity
Quantum theory began as a physical theory, but since the 80s we have learnt to see it as the description of computation and communication in the universe. It is now accepted as the foundation of our understanding of the world and opens the way to a vision of the world as an intelligent system responsible for its own creation. This idea first achieved widespread acceptance in biology, where the theory of biological evolution explains the enormous spectrum of living species that has arisen on Earth over the last four billion years. Nielsen & Chuang (2000): Quantum Computation and Quantum Information, Evolution - Wikipedia
We may think of quantum theory as a modern description of the ancient idea of spirit. It is rather diaphanous, invisible and exists in an environment of pure action and energy that precedes spacetime and classical physics. It is very similar to music, in perpetual motion, feeding the visible universe with the possibilities which serve as the foundation of evolution. Our "classical" world is selected by the system becoming aware of itself through self-interaction (aka "measurement"). The interaction between the Hilbert space of quantum theory and the Minkowski space of everyday life sets the scene for the measurement problem which has been with us almost since the beginning. Interpretations of quantum mechanics - Wikipedia
Quantum theory is often hamstrung by its derivation from classical physics. Each new breakthrough in quantum mechanics appears to have arisen from the rejection of a classical concept. Planck's discovery put us on notice that continuous mathematics in physics is not everything. The universe is radically discrete or integral, its life measured in discrete quanta of action.
Next was a qualification of the classical concept of determinism. While the need for consistency in quantum mechanics led us to the "uncertainty principle", deeper results were revealed by Gödel and Turing: a large enough consistent system cannot be deterministic. Around the edges at least, both mathematics and the world appear to be incomplete and incomputable. Uncertainty principle - Wikipedia, Gödel's incompleteness theorems - Wikipedia, Turing's Proof - Wikipedia
A next step seems to go deeper. Martinus Veltman writes:
In general, if we do a Lorentz or Poincare transformation then a state in Hilbert space will transform to another state . . . Thus, corresponding to a Lorentz transformation there is a complicated big transformation in Hilbert space. (page 20)
To sum up:
To every Lorentz transformation, or more generally a Poincaré transformation, corresponds a transformation in Hilbert space. Martinus Veltman (1994): Diagrammatica: The Path to the Feynman Rules, page 21
This is (I think) the only boldface statement in the book. Is it true? It is based on the assumption that Minkowski space is the domain of Hilbert space. Is this true? Hilbert space is built on the domain of complex numbers which have no order but the fundamental theorem of algebra holds there. Minkowski space is built on the domain of real numbers which have an order but are limited by the theorems of Gödel and Turing. In this essay I will explore the idea that Hilbert space is both the predecessor and the quantum source of classical Minkowski space. I feel that these spaces are two distinct layers in the emergence of the universal structure from the initial singularity.
This idea introduces a new degree of freedom into quantum field theory which may solve many problems. Perhaps the most important difference is that the vectors of Hilbert space are normalized rays, and the operators are elements of a ring which, like the integers, is naturally quantized, since it embodies addition, subtraction and multiplication, but has no fractions resulting from division.
Let us therefore propose that the initial singularity is a vector in a zero dimensional Hilbert space, best represented by the empty set, ∅. To delve deeper into how this singularity reproduces itself, we take another look at the cognitive explanation of the procession of the Son from the Father in theology of the Trinity. Here we look at the thirteenth century work of Aquinas as revised and expanded by Bernard Lonergan in the twentieth century.
In Roman Catholic theology God is pure act, a fact first explained by Aristotle. The term "act" has evolved rapidly through the eras of classical and quantum physics, but the term as used in Catholic theology has experienced little change in more than a thousand years. Pius X tried to contain the philosophy and theology of Aquinas in 24 theses. The motivation is obvious: Aquinas is still the standard; the "modernists" are heretics. The 24 theses of Pope Pius X
Now it is time for physics and theology to meet by agreeing on the meaning of the term action. We therefore begin with a history of action from the time it was coined by Aristotle about 2350 years ago until the present.
The modern version of this idea is quantum field theory, which proposes a space of invisible fields to guide the behaviour of the observable world. This theory is beset by serious problems. Practically, the most acute is the 'cosmological constant problem'. One interpretation of quantum field theory predicts results that differ from observation by about 100 orders of magnitude, ie by a factor of 10^100. One point of this essay is to re-interpret the relationship between mathematical theory and reality in a way that points to a solution to this problem. Quantum field theory - Wikipedia, Cosmological constant problem - Wikipedia
To explain this I follow in some detail the long and winding trail from the absolute simplicity of the initial singularity to the majestic complexity of the present universe. This all happens inside God, not outside as the traditional story tells us. We are part of the divine world, owe our existence to it, share its conscious intelligence, created, as the ancients said, in the image of God. The most powerful product of this intelligence is reflected in the mathematical formalism that shapes our selves and our world. Mathematics as we know it is embodied in the mathematical community and its literature. The fact that mathematics serves as the skeleton of science implies that it is also represented in the Universe. Eugene Wigner (1960): The Unreasonable Effectiveness of Mathematics in the Natural Sciences
3: Action: from unmoved mover to the quantum
Theology and astronomy have a very long history. There are close cultural connections between Earth and the Heavens. Plato produced the classic description of a spiritual heaven of eternal, perfect forms which shaped the Earth and our knowledge of Earth. For Plato, however, the Earth and our knowledge are very poor copies of their heavenly paradigms.
An ancient Greek interface between poets like Homer, who described the behaviour of the gods, and later more philosophically oriented writers is found in the work of Parmenides (6th to 5th century BCE). We have fragments of a didactic poem describing his education by a Goddess. In this poem he developed a durable foundation for science, based on the proposition that permanently true knowledge is only possible if its subject is immutable or invariant.
Like many philosophers since, Parmenides sought to use the possibility of knowledge to constrain the nature of the world, an application of the anthropic principle. His "way of truth" is that we can only have true knowledge of eternal realities. Knowledge of ephemera, the way of opinion, is useless because as soon as we have it the subject changes, invalidating the new knowledge. Anthropic principle - Wikipedia
Parmenides' Goddess announces the core of her teaching:
You must needs learn all things, both the unshaken heart of well-rounded reality and the notions of mortals, in which there is no genuine trustworthiness. (Fr. 1.28b-32)
The Goddess dismisses the way of opinion and turns to describe true reality, "what is". First, it is "ungenerated and deathless". Further, it is "whole and uniform", "motionless", and "perfect":
But since there is a furthest limit, it is perfected from every side, like the bulk of a well-rounded globe, from the middle equal every way: for that it be neither any greater nor any smaller in this place or in that is necessary; for neither is there non-being, which would stop it reaching to its like, nor is What Is such that it might be more than What Is here and less there. Since it is all inviolate, for it is equal to itself from every side, it extends uniformly in limits.
This suite of attributes has remained central to the description of what is for 2500 years. Parmenides' student Zeno produced a series of arguments designed to show that motion is impossible, in support of Parmenides' position.
This old idea is invalidated by the more modern Nyquist-Shannon sampling theorem: we can have true knowledge of changing situations if we update fast enough. The key to catching a moving ball is to watch it closely and move accordingly. Perhaps Parmenides did not play ball games. Nyquist-Shannon sampling theorem - Wikipedia
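A minimal numerical sketch of the sampling theorem (in Python, assuming an idealized band-limited signal) shows the point: samples taken faster than twice the signal frequency determine the signal exactly, even between the samples:

    import numpy as np

    f = 3.0                                   # signal frequency, Hz
    fs = 10.0                                 # sampling rate, Hz: faster than the Nyquist rate 2f
    n = np.arange(-200, 200)                  # sample indices
    samples = np.sin(2 * np.pi * f * n / fs)  # discrete observations of the moving signal

    t = 0.123                                 # an instant between samples
    # Whittaker-Shannon interpolation reconstructs the signal from its samples
    estimate = np.sum(samples * np.sinc(fs * t - n))
    print(estimate, np.sin(2 * np.pi * f * t))  # the two values agree closely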
Parmenides' work was taken up by Plato, which greatly increased its visibility. Plato guessed that the structure of the observable world is shaped by invisible, eternal forms. Our world, he thought, is just a pale shadow of these forms. Plato's student Aristotle brought these forms down to Earth with his theory of hylomorphism. Theory of Forms - Wikipedia, Allegory of the cave - Wikipedia, Hylomorphism - Wikipedia
Hylomorphism may be seen as a description of the static structure of the world which nevertheless enables change. The same matter may take different forms: a mass of bronze may be formed into a sword or a ploughshare. Thomas Ainsworth: Form vs. matter
Aristotle saw matter as potentially something and form as the element that made it actually something. He was the first to make a careful study of action. He coined two words for it, energeia (ενεργεια) and entelecheia (εντελεχεια). Energeia may be translated as being-at-work and entelecheia as completeness, the end of work. Both these active terms are contrasted to potentiality dynamis (δυναμις), which can mean either active power or passivity. These two ideas comprise the essence of his doctrine of potency and actuality with the addition of an axiom: no potentiality can actualize itself. Using this axiom Aristotle deduced the existence of an unmoved mover which is pure actuality and the driver of the world. Unmoved mover - Wikipedia
Aristotle's works entered the newly formed Christian Universities of Europe when they were brought back from the Muslim East by the Crusaders. Albert the Great and Thomas Aquinas used Aristotle's work to build a new philosophical foundation for Christian theology. Aquinas developed a new Catholic model of God (which has since become standard) from Aristotle's theological treatment of the first unmoved mover in his Metaphysics:
But if there is something which is capable of moving things or acting on them, but is not actually doing so, there will not necessarily be movement; for that which has a potency need not exercise it. Nothing, then, is gained even if we suppose eternal substances, as the believers in the Forms do [ie Plato], unless there is to be in them some principle which can cause change; nay, even this is not enough, nor is another substance besides the Forms enough; for if it is not to act, there will be no movement. Further even if it acts, this will not be enough, if its essence is potency; for there will not be eternal movement, since that which is potentially may possibly not be. There must, then, be such a principle, whose very essence is actuality. Further, then, these substances must be without matter; for they must be eternal, if anything is eternal. Therefore they must be actuality. Aristotle Metaphysics XII, vi, 2
Aristotle thought that the unmoved mover was an integral part of the Cosmos and that the world is eternal, so it had no need for a creator. Aquinas used Aristotle's argument to establish the existence of God but, faithful to his religion, placed this creator outside the Universe. The doctrine of the Summa has never been superseded in the Church. It remains officially endorsed in Canon Law. Aristotle, Metaphysics 1072b3 sqq., Aquinas Summa I, 2, 3: Does God exist?, Holy See: Code of Canon Law: Canon 252 § 3
The Catholic theologian Bernard Lonergan set out, in his treatise Insight, to reconceive Aquinas' arguments for the existence of God in epistemological rather than physical terms. Lonergan's story follows the time honoured path. We all agree that the world exists, but we can see (they say) that it cannot account for its own existence. There must therefore be a Creator to explain the existence of the world, which we might all agree to call God. Lonergan: Insight, A Study of Human Understanding
Lonergan set out to argue that God is other than the Universe by following the epistemological path pioneered by Parmenides, using the act of human understanding, insight, as his starting point. God, he said, must be perfectly intelligible. But the world is not perfectly intelligible. It contains meaningless data, empirical residue, so it cannot be divine. I think the weak spot in this argument is the idea that the world contains meaningless data. The theory of evolution suggests that there is a reason for every detail. The world is dense with meaning.
Although Lonergan fails to prove that God is not the Universe, his work led me to think of the universe in epistemological terms. The story presented here is intended to consolidate the view that the Universe plays all the roles attributed to traditional Gods.
Implicit in the ancient views is the idea that matter is dead and inert and cannot move itself. It cannot be the seat of understanding. It cannot be creative. Since the advent of modern physics, founded on relativity and quantum theory, these ideas are untenable. Physics based on quantum theory describes a universe in perpetual motion as a gigantic network of communication equivalent to a mind.
The Medieval universities gradually developed the notion that ancient texts and pure reason could not fully explain the world. This attitude was strongly supported by astronomy, a science based on observation. Galileo's telescope led to radical developments in astronomy, and some conflict with ancient religious beliefs. Galileo's opinion that mathematics is the language of the universe reached a high point in Isaac Newton's description of gravitation, which showed that the Moon in the heavens and apples on Earth are moved by the same force. Galileo affair - Wikipedia, Isaac Newton (1972): Philosophiae Naturalis Principia Mathematica
This new picture of the world derives from a new understanding of action which emerged when Joseph-Louis Lagrange sought a new and more comprehensive statement of classical Newtonian mechanics to make it easier to study many body problems like the solar system. His work, Mecanique Analytique placed mechanics on an algebraic rather than a geometric foundation. Mécanique analytique Volume 1
In the Lagrangian method the action S associated with an event x that takes place between times t1 and t2 is expressed by the action functional
S(x) = ∫L dt.
The Lagrangian L = (T(t) −V(t)), where T and V are functions of the kinetic and potential energy of the system. Lagrangian mechanics postulates Hamilton's principle that the actual trajectory taken by a particle whose motion is constrained by T and V coincides with a stationary value of S (a fixed point in the action) which may be found using Euler's calculus of variations. Lagrangian mechanics - Wikipedia, Hamilton's principle - Wikipedia, Calculus of variations - Wikipedia
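Hamilton's principle is easy to see numerically. A minimal Python sketch (assuming a free particle of unit mass, so V = 0): the straight-line path between fixed endpoints gives the stationary (here minimal) value of the discretized action, and any perturbation that respects the endpoints increases it:

    import numpy as np

    t = np.linspace(0.0, 1.0, 101)   # time grid between t1 = 0 and t2 = 1
    dt = t[1] - t[0]

    def action(x):
        # Discretized S = integral of L dt, with L = T - V, V = 0 and m = 1
        v = np.diff(x) / dt           # velocity on each segment
        return np.sum(0.5 * v**2) * dt

    straight = t.copy()               # classical path from x(t1) = 0 to x(t2) = 1
    wiggle = 0.1 * np.sin(np.pi * t)  # perturbation vanishing at both endpoints
    print(action(straight), action(straight + wiggle))  # the straight path has smaller S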
Lagrangian mechanics has been found to be very versatile and serves as a bridge between classical and quantum mechanics, quantum field theory and physical problems in general. On this basis, we might understand mechanics in spacetime as the study of action in the relationship between kinetic and potential energy.
Quantum mechanics began with Planck's discovery, in 1900, that action is quantized and that the quantum of action, h, is the constant of proportionality between the energy of radiation and its frequency. This is now a fundamental equation of quantum theory, E = ℏω, where ℏ is the reduced Planck constant, h / 2π, and ω is the frequency expressed in radians per second, understood as the time rate of change of the phase of a quantum state, ∂|φ> / ∂t. Max Planck: On the Law of Distribution of Energy in the Normal Spectrum
The quantum of action is very small and has the same dimensions as angular momentum in classical physics, ML²T⁻¹, since energy has the dimension ML²T⁻² and frequency the dimension of inverse time, T⁻¹. In classical mechanics the Lagrangian action S(x) is a continuous variable, whereas in quantum theory it is discrete, so that every event is associated with an integral number of quanta, nh.
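As a small arithmetic check (a Python sketch using the CODATA value of ℏ), the Planck-Einstein relation turns a frequency into an energy, and the dimensions balance:

    import math

    hbar = 1.054571817e-34        # reduced Planck constant, J s (dimensions ML²T⁻¹)
    omega = 2 * math.pi * 5.6e14  # angular frequency of green light, rad/s (dimension T⁻¹)
    E = hbar * omega              # E = ℏω has dimensions ML²T⁻²: an energy
    print(E, E / 1.602e-19)       # about 3.7e-19 J, roughly 2.3 electronvolts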
The quantum of action is now a precisely fixed natural constant which is used as a foundation for a natural set of units. We might look upon it as an invariant solution to a Lagrangian variation problem somehow established at the very root of universal structure. NIST: Kilogram, Mass and Planck's Constant
For Aristotle and Aquinas action is a metaphysical term, but here we see that it has a physical realization, providing a bridge between physics and metaphysics in a way analogous to its role in coupling classical and quantum mechanics. Dirac found that this role goes deeper and Feynman used it to create a new expression of quantum mechanics, the path integral formulation.
Quantum mechanics came of age in the 1920s in two versions known as wave mechanics (the Schrödinger equation, see above) and matrix mechanics. These were shown to be equivalent by Schrödinger, given a clear abstract symbolic expression by Dirac and a sound mathematical foundation by von Neumann using linear operators in abstract Hilbert space. Dirac notes that the major features of quantum mechanics are linearity and superposition. Matrix mechanics - Wikipedia, Paul Dirac (1983): The Principles of Quantum Mechanics, chapter 1, John von Neumann (2014): Mathematical Foundations of Quantum Mechanics
Feynman introduced a third approach to quantum mechanics which has since found favour because it provides a more direct route to quantum field theory and string theory. His path integral formulation seeks a stationary superposition of the contributions of all possible space-time paths between an initial and a final state. In principle, this set of paths spans the whole classical universe so the formulation depends implicitly on the idea discussed below in section 11, that the quantum world is prior to and independent of Minkowski space. Feynman & Hibbs (1965): Quantum Mechanics and Path Integrals, Path integral formulation - Wikipedia
Feynman began with Dirac looking for a feature of quantum theory corresponding to classical Lagrangian mechanics. Dirac found that the classical action could be used in a complex exponential to describe the evolution of a quantum state. Feynman imagined, by analogy with the two slit model, that the actual path taken by a particle is a stationary superposition of all possible paths where the contribution from a particular path is postulated to be an exponential whose (imaginary) phase is the classical action, in units of h. P. A. M. Dirac (1933): The Lagrangian in Quantum Mechanics, Richard P. Feynman (1948): Space-Time Approach to Non-Relativistic Quantum Mechanics
The path integral relies on the three general principles of quantum mechanics formulated by Feynman:
Feynman lectures on physics III Chapter 3: Probability Amplitudes
1. The probability that a particle will arrive at x, when let out at the source s, can be represented quantitatively by the absolute square of a complex number called a probability amplitude — in this case, the “amplitude that a particle from s will arrive at x.”
2. When a particle can reach a given state by two possible routes, the total amplitude for the process is the sum of the amplitudes for the two routes considered separately. There is interference.
3. When a particle goes by some particular route the amplitude for that route can be written as the product of the amplitude to go part way with the amplitude to go the rest of the way.
The path integral computes the probability amplitude for a particle to go from s to x by dividing every possible path into infinitesimal segments and multiplying the amplitudes for the particle to cross each segment according to principle 3 to get the amplitude for the whole path, adding these amplitudes as required by principle 2 and computing the probability represented by the resulting amplitude by principle 1. The process works and contributes to computations in quantum field theory which precisely match observation. But, we might ask, does nature really work this way? If we are to consider the quantum of action as an atomic event, can we trust the mathematical fiction that a quantum path can be sliced into an infinity of infinitesimal events?
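These three principles can be played out numerically. The Python sketch below is an illustrative toy (a two-route, two-segment arrangement with made-up action values, not a real path integral): each route's amplitude is the product of its segment amplitudes exp(iS/ℏ), the routes are summed, and the probability is the absolute square:

    import numpy as np

    hbar = 1.0  # natural units for this toy model

    def segment_amplitude(action):
        # Principle 3 (in part): each segment contributes the phase factor exp(iS/hbar)
        return np.exp(1j * action / hbar)

    def detection_probability(route_1_actions, route_2_actions):
        a1 = np.prod([segment_amplitude(s) for s in route_1_actions])  # principle 3
        a2 = np.prod([segment_amplitude(s) for s in route_2_actions])
        total = a1 + a2                                                # principle 2
        return abs(total) ** 2                                         # principle 1

    # Equal actions interfere constructively; a half-cycle difference destructively
    print(detection_probability([1.0, 2.0], [1.0, 2.0]))           # 4.0
    print(detection_probability([1.0, 2.0], [1.0, 2.0 + np.pi]))   # ~0.0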
4: Classical gravitation and singularities
Penrose, Hawking and Ellis showed that Einstein's classical general theory of relativity predicts the existence of singularities at the boundary of spacetime. Beyond such boundaries, spacetime structure ceases to exist. Such boundaries within the universe are known to be surrounded by event horizons which give them the name black holes. Many have been observed. These authors also speculated that the Universe began from such a singularity, expanding from it like a time reversed black hole. Hawking & Ellis (1975): The Large Scale Structure of Space-Time
This raises two problems. First, we know from the behaviour of the material orbiting them that black holes, and therefore the singularities within them, have mass and energy. This might lead us to suspect that the classical initial singularity is a massive and energetic particle containing the energy of the universe, a condition which is difficult to reconcile with the absence of any space-time structure associated with it.
The second is that, as Hawking has suggested, quantum processes can lead to the "evaporation" of black holes, but it is an exceedingly slow process. Hawking radiation - Wikipedia
Since we now generally understand that classical physics is the result of underlying quantum processes, we will proceed on the basis that the initial singularity is a quantum of action, and so formally identical to the God of Aristotle and Aquinas, which they define as actus purus.
5: God's ideas, cybernetics and singularity
The traditional story of creation starts with the idea that God had a plan for the world they were about to create. The modern idea, however, is that it began from a structureless initial singularity.
Aristotle and his contemporaries considered that intelligence is associated with immateriality, and Aquinas argued that since God is maximally immaterial, they are maximally intelligent. Since the development of computation and information theory, however, this idea has fallen out of favour. Instead we understand that information is carried by marks, like the printed letters that constitute this essay. Information stored in computers, discs and solid state memories is written into billions of memory locations, each of which has a specific address and carries one bit of information. Each of these locations can be assigned one of two states, represented by either a 1 or a 0, called a bit, and the amount of information a memory can hold is exactly equal to the number of locations it has. Another name for this number is the entropy of the memory. Entropy is the simplest measure in science: it is nothing other than a count of states, and may be applied to any entity that has discrete states, from a mob of sheep to a boiler full of steam.
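The count-of-states definition is immediate in a small Python sketch (assuming, for simplicity, equally likely states):

    import math

    # A system with N two-state components has 2^N distinguishable states,
    # so its entropy is log2(2^N) = N bits: capacity equals component count.
    mob = 5                      # five sheep, each either standing or lying
    states = 2 ** mob            # 32 distinguishable configurations of the mob
    print(math.log2(states))     # 5.0 bits of entropy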
A second modern development about which the ancients knew nothing is cybernetics, defined by one of its founders, Norbert Wiener, as the science of control and communication in the animal and the machine. Norbert Wiener (1996): Cybernetics or Control and Communication in the Animal and the Machine, Cybernetics - Wikipedia
Following Galileo, mathematical modelling has become a primary tool of modern physics, and mathematics has progressed well beyond what was available to Galileo. Aquinas believed that an omniscient and omnipotent God has total deterministic control of every event in the world. Gödel found that logically consistent formal systems are not completely determined, and Chaitin interpreted Gödel's work as an expression of the limits to control known as the cybernetic principle of requisite variety: one system can control another only if it has entropy equal to or greater than that of the system to be controlled. This principle suggests that a completely structureless initial singularity has no power to control its behaviour, so that its primordial acts are random events. Gregory J. Chaitin (1982): Gödel's Theorem and Information, W Ross Ashby (1964): An Introduction to Cybernetics
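Requisite variety is at bottom a counting argument, as a toy Python sketch shows (the disturbance and response names are invented for illustration):

    # To hold an outcome constant, a regulator needs a distinct response for
    # each disturbance: its variety (entropy) must match that of its environment.
    disturbances = ["heat", "cold", "wind", "rain"]   # 4 states: 2 bits of variety
    responses = ["warm", "cool"]                      # 2 states: 1 bit of variety

    countered = min(len(responses), len(disturbances))
    print(len(disturbances) - countered, "disturbances cannot be regulated")  # 2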
This same principle invalidates the idea that the traditional God would have planned the universe from the beginning. One of the first attributes of God that Aquinas studied was their simplicity. According to Aquinas (and long mystical history) God is absolutely simple. Since there are no structures or marks in this God, it cannot store information in the modern sense. Aquinas, Summa, I, 3, 7: Is God altogether simple?
As we shall see, the divine initial singularity gradually builds up its structure, becoming more and more complex by the process of trial and error we call evolution. Our universe was not planned beforehand, therefore, but has evolved through its own unlimited random activity. Some products of this activity do not last, others are consistent and durable, and these are the ones selected to exist, to give us the more or less permanent features of the world we see.
6: Evolution, variation and selection
Random reproduction is the first step in the evolutionary process. On the whole plants and animals do not breed true, meaning that children are very rarely precisely identical to their parents. The sources of this variation are to be found deep in the biological mechanisms of reproduction. Darwin was aware of this, but he was also aware, from his own experience and that of other breeders of plants and animals, that it is possible, by carefully selecting breeding stock, to exercise some control over the characteristics of the offspring, so that over thousands of years of domestication people have developed a very wide variety of flowers, vegetables, dogs, cats, pigeons, sheep and every other species of human interest. Ewen Callaway: Big dog, little dog: mutation explains range of canine sizes
Darwin also hypothesised that there could be natural selection. Some of the children of any parents would have a better chance of surviving and breeding in the environment in which they found themselves. These characteristics would be passed on to their children and possibly eventually become established as new species.
We now have detailed information about the genetic processes that lie behind the evolution of life. Below we suggest that a similar process of natural selection, building on the random generation of quanta of action, could be the source of the fundamental particles which constitute our universe. We now turn to a more detailed exploration of the emergence of the universe driven by the unlimited power of the initial quantum of action, which we take to be analogous to the traditional divine creator.
7: Networks, brains and intelligence
The ideas proposed in this essay have been developed to shed light on some problems inherent in the theology developed by Aquinas derived from Christian doctrine and the picture of God he found in Aristotle's Physics and Metaphysics. Physics (Aristotle) - Wikipedia, Aristotle: Metaphysics
Aristotle traced a path from the physical world of everyday experience to a divine unmoved mover which drives the world. Aquinas followed Aristotle's path to produce a model of the Christian God. Aristotle believed that the world is eternal. Christianity, in contrast, believes that the world was created by an eternal God other than the world. The ancient source of this belief is Genesis, the first book of the Hebrew Torah, which the writers of the Christian New Testament subsumed as their Old Testament, the ancient forerunner of the theology they built around the life of Jesus of Nazareth.
Three problems I see with the Christian story are:
1. If God is the realization of all possibility, how can they create a Universe other than themselves?
2. How can we reconcile the eternity of God with the life of God, which we understand to be self motion?
3. How can we reconcile the omniscience and omnipotence of the creator with their absolute simplicity?
The answer I propose is to identify the creator with the universe we have revealed to ourselves by modern astronomy and cosmology. There now appears to be a strong consensus that the general theory of relativity combined with particle physics suggests that the Universe originated from an eternal initial singularity formally identical to the Christian God within which the universe as we know it has emerged.
This emergence is imagined to have begun with a big bang about fourteen billion years ago, followed by a relatively well understood series of events which have brought us to our current condition. The big bang theory assumes that all the energy of the Universe is concentrated in the initial singularity. If the initial singularity is action, however, we may see the first step in the development of the universe as the creation of energy itself by the repeated action of action, suggested by the fundamental equation of quantum mechanics, E = hf. Planck-Einstein relation - Wikipedia
This identification solves all three of my problems:
1. There is but one world, and it is divine.
The initial singularity shares the properties of the traditional God: both are eternal; both are absolutely simple; and both are the source of the world. The universe as we know it exists inside the initial singularity, that is inside God. Its complexity does not therefore compromise the unity and simplicity of the world, which provides us with a starting point from which to understand the universe as a single entity. The omniscience and omnipotence of God is the omniscience and omnipotence of the universe.
2. Eternity and motion are logically connected by fixed point theory
A plausible next step in the creation of a stable universe like the one we experience within the initial singularity may be motivated by fixed point theory, which entered the world of topology with Brouwer's theorem in 1910. Brouwer fixed point theorem - Wikipedia
Suppose X is a topological space that is both compact and convex and let f be a continuous map of X to itself. Then f has a fixed point, that is, there is a point x* such that f(x*) = x*. John Casti (1996): Five Golden Rules (page 71)
Does the initial singularity explained above fulfil the conditions of this theorem? It is a space, and because it is generated randomly it does not have a metric, so we might say that it is topological. Is it convex? Since it is structureless it is unlikely to have "holes" so we might assume convexity. Is it compact? We define the initial singularity to be all that there is, with nothing outside it and so it must contain its own boundary. From a logical point of view, this is to say that the initial singularity is internally consistent, like mathematics. "Outside" it is inconsistency, which an application of the principle of non-contradiction assures us cannot exist.
Brouwer's theorem is non-constructive, which presents both a problem and a useful feature. The problem is that the theorem does not provide a means to find a fixed point, although subsequent mathematical developments have dealt with this. The useful feature of non-constructivity is that it respects the ancient belief that we cannot say what God is, only what God is not, the via negativa of Dionysius (§12).
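One such constructive development is simple iteration: when f is a contraction, repeated application converges to the fixed point that Brouwer's theorem guarantees. A minimal Python sketch, using cos on the compact convex interval [0, 1]:

    import math

    f = math.cos           # a continuous map of [0, 1] into itself
    x = 0.5                # any starting point in the interval
    for _ in range(100):
        x = f(x)           # iterate: the contraction pulls every start to the fixed point
    print(x, f(x))         # both ~0.7391, so f(x*) = x*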
Motion and eternity are therefore to be understood as two sides of the one coin. The quantum layer of the universe is in perpetual motion whose fixed points appear to us as the classical world in which we live.
3. The omniscience and omnipotence of the creator may emerge by local evolution in a network
Action creates energy, energy facilitates quantum theory which describes a universe of perpetual motion. From here the mathematical theory of fixed points takes us to the generation of observable particles which are able to communicate and bond to form the universe we see.
The overall framework for this picture is a communication network, and I see much value in the fact that the network model, like quantum mechanics, is symmetrical with respect to complexity. The fundamental properties of quantum mechanics are the same whether we are working in a Hilbert space of 2 dimensions or of a countable infinity of dimensions. The fundamental properties of a communication network are the same whether we are considering the "network atom" of two sources communicating through a channel, or a network of a countable infinity of sources communicating through a set of channels that connects each one to all the rest.
High energy physicists have found that by accelerating pieces of matter to very high energies and making them collide they can create small bubbles of almost pure and structureless energy which then rapidly materialize into a spectrum of particles. Our knowledge of the fundamental physics of the universe comes from comparing the properties of the particles input to a bubble with the particles that come out, and trying to understand the transformation that links output to input. Martinus Veltman (2003): Facts and Mysteries in Elementary Particle Physics chapter 6
We might understand this process by comparing a fertilized human egg to the information processing system that grows out of it. I am much more complex than the egg I grew from. Although this is consistent with the second law of thermodynamics, that entropy increases, it seems to contradict the idea that nothing comes from nothing.
The information in my egg is encoded in my genome, a DNA string of some three billion symbols (A, T, G, C), each representing 2 bits of information, for a total of about 10^10 bits, approximately 1 gigabyte.
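The arithmetic, as a quick Python check (order of magnitude only):

    symbols = 3_000_000_000      # base pairs in the human genome
    bits = symbols * 2           # each of A, T, G, C carries log2(4) = 2 bits
    gigabytes = bits / 8 / 1e9   # 8 bits per byte
    print(bits, gigabytes)       # 6e9 bits (order 10^10), about 0.75 gigabytes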
Life is an electrochemical process based on insulating membranes and ionic motors which create and utilize electrical potentials across the membranes. This system is closely analogous to the techniques of electrical engineering. Multicellular plants rely on electro-chemical signalling to coordinate the operations of individual cells. All but the simplest of animals use neural networks, both for internal housekeeping and for interaction with the world around them.
Neural networks are constructed from neurons, cells (sources) adapted to receive, process and transmit electrical signals. The connectivity in the network is high. A neuron may have many inputs and output to many other neurons or motor cells. Neurons fall into three classes, sensory neurons which provide input to the neural network, motor neurons which convey the output of the network to effectors, and interneurons, which transform sensory input into motor output. Neuron - Wikipedia
Signals are transmitted along the fibres in the neural network by a discrete voltage spike known as an action potential which propagates along the fibre at quite high velocity. All these action potentials are effectively identical, like the digits in a computer network. Their information content is a function of their timing and frequency.
The principal functional connections between neurons and the fibres connecting them are synapses. A synapse is a complex structure which, on receiving an input from a connected fibre, releases neurotransmitters which interact with the membrane of the neuron. This interaction may be excitatory or inhibitory. The neuron algebraically integrates this input over time, taking into account the “weight” associated with each synapse. When it reaches a certain threshold it “fires”, sending an action potential along its axon.
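A minimal integrate-and-fire sketch in Python (the constants are illustrative, not physiological) captures this description: weighted synaptic input accumulates, with a leak, until a threshold triggers an action potential and the neuron resets:

    def neuron_step(potential, inputs, weights, threshold=1.0, leak=0.9):
        # Integrate weighted synaptic input; fire and reset on reaching threshold
        potential = leak * potential + sum(w * x for w, x in zip(weights, inputs))
        if potential >= threshold:
            return 0.0, True       # an action potential is emitted
        return potential, False

    weights = [0.4, 0.3, -0.2]     # excitatory (+) and inhibitory (-) synapses
    v, spikes = 0.0, []
    for step in range(10):
        v, fired = neuron_step(v, [1, 1, 1], weights)
        spikes.append(fired)
    print(spikes)                  # the neuron fires periodically under steady input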
Processing and memory in a neural network are modulated by synaptic weights which are a measure of the level of influence, positive or negative, a particular synapse may have on the neuron to which it is connected. The details of a neural network may be extraordinarily complex, there are many different neurotransmitters and many varieties of cells which perform auxiliary functions associated with the neural network.
Among the principal research tools used to understand the functions of neural networks are computer systems which model the synaptic connections in a network and enable the adjustment of synaptic weights by various training algorithms, in attempts to model the behaviour of various subsets of neural networks. This work is of great interest to the artificial intelligence community, but is far from achieving equivalence to the human brain.
The ontogenetic development of an individual human brain poses an interesting problem in network creation. An important source of formal guidance in the development of any living creature is the genome. The expression of the genome occurs in a living cell, and depends on transformations executed by the physical and chemical processes embodied in the cell.
Formally, programmed deterministic development is subject to the cybernetic principle of requisite variety. This law establishes the conditions for completeness and computability that render a process deterministic enough to have a high probability of success.
The human nervous system comprises some 100 billion neurons each with possibly 1000 connections to other neurons.
In the specification of a standard engineered computer network, every physical connection is precisely specified by source and destination. Measured in bits of information this is, at a minimum, twice the logarithm to base 2 of the number of connections. Such precise specification in the case of the n connections of the human nervous system requires n log n bits, where n = 100 billion (neurons) × 1000 (connections per neuron), ie 10^14. n log n is therefore about 10^16 bits, approximately a million times greater than the information content of the genome.
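A Python check of this estimate (all counts rounded to orders of magnitude):

    import math

    neurons = 1e11                           # ~100 billion neurons
    connections = neurons * 1000             # ~1000 synapses each: n = 1e14
    wiring_bits = connections * math.log2(connections)   # n log2 n
    genome_bits = 1e10                       # genome capacity estimated above
    print(f"{wiring_bits:.1e}", wiring_bits / genome_bits)  # ~5e15 bits, ~10^6 genomes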
It is necessary, therefore, that some other mechanism must account for the connective structure of the brain, which is to say that to a large degree this system must define itself. The human brain must have a self-structuring property.
The explanation appears to be a form of evolution by natural selection. The neurons in an infant brain seek out synaptic connections with one another, a process which is to a large degree random, creating an excessive number of connections. There follows a process of pruning which continues through the teenage years, eliminating little used connections.
As well as determining the wiring of the brain over a period of years, experience determines the synaptic weights connecting neurons. Changes in weight may occur in milliseconds during the real time processing of speech, and over a lifetime during the acquisition of knowledge and experience. The physical development of a brain is thus closely related to the reception of information from the environment via the senses and feedback from the results of actions (like learning to walk). It serves as a microcosm of the development of the universe. Our minds are the product not just of our genes, but of the environment in which we find ourselves.
Mental evolution provides us with an enormous advantage, since thought is usually much cheaper than action. In the natural world of evolution by natural selection many newborns fail to reproduce for one reason or another. In some species this failure rate may be very high, thousands being born for every one that survives and reproduces. In more complex species like ourselves most children are carefully nurtured by their parents, leading to a high rate of survival.
The relationship between the quantum world and the real world is rather like the relationship between mental modelling and actual construction. We see this at work in the phenomenon called the collapse of the wave function (§10). When quantum systems observe one another, only one of a large number of possibilities is physically realized in each case. From this point of view, the quantum world works like the mind or imagination of the universe, thinking many things but only investing real resources in constructing a few. Here we see a cosmic process which brings an advantage analogous to thought, education and imagination in the development of human society and technology.
The standard Christian God of Aquinas is supreme in every particular: supremely intelligent, omniscient, knowing every detail of everything, past, present and future. On the other hand this god is absolutely simple, so that it has no means of representing all this information.
The ancients equated knowledge with spirituality and spirituality with simplicity, an impossible situation from the point of view of a modern understanding of information, and the point at which it becomes necessary to reconceive the divinity. God remains omniscient and omnipotent, but now their omniscience and omnipotence is the omniscience and omnipotence of the universe itself.
Cognitive cosmology sees the universe as a mind, a creative mind, and we are the ideas in that mind, created over many billions of years by a long and complex process of evolution that we have really only become aware of in the last two centuries.
Human cultural evolution seems slow; we have found that a century is a short time in the development of theology. But compared to the biological evolution of the world it is fast: cultural, scientific and technological changes occur in centuries, where biological changes require thousands or millions of years.
On the other hand, we can imagine a very fast process of evolution in a high energy particle collision. In general there is not enough information in the input to determine the output, but the output, although random, comprises a spectrum of precisely defined and well known particles that have perhaps been selected out of a very wide spectrum of possibilities. We return to this discussion below with the help of quantum field theory.
8: The classical trinitarian theology of action
The existence of the modern initial singularity is in the first instance a consequence of the general theory of relativity, but it has very ancient roots in the Hebrew notion of the one God, monotheism, which was carried over into Christianity. A remarkable development in Christian New Testament theology is the transformation of the Hebrew God Yahweh into the Christian God, and the emergence of the New Testament Trinity of three divine persons, Father, Son and Spirit.
Here I pause for a moment to review the doctrine of the Trinity for some clues about the development of a quantum initial singularity into the modern Universe. Many might feel that ancient theologians did not have a clue about reality but we have to face the fact that intelligence in Homo sapiens (as estimated by crude cranial capacity) has been a persistent quality since we first evolved and historians and archaeologists are often finding that the ancients knew more than we might have suspected. Brain size - Wikipedia
A first step in the study of the emergence of the universe within the initial singularity is suggested by the ancient reconciliation of the unity and simplicity of the Hebrew God with the Christian Trinity. The Christian doctrine was explicitly enunciated in the Nicene Creed based on the authority of the New Testament. The reconciliation began with Augustine and was further developed by Aquinas and Lonergan. This work was intended to explain how three could be seamlessly combined into one, but it seems that some of the basic ideas could be extended to any number of persons or (in communication theoretical terms) sources. Nicene Creed - Wikipedia, Augustine (419, 1991): The Trinity
Aquinas derived all the traditional properties of God from the conclusion that God is pure act, actus purus, a consequence of the proof for God's existence which he received from Aristotle. How could the unity of God be reconciled with the triplicity of the Trinity? Initially, this was just considered one of the many mysteries associated with the gods, but explanations slowly emerged. Trinity - Wikipedia, Hindu - Wikipedia, Hebrew Bible - Wikipedia, New Testament - Wikipedia
The first clue may be in John's gospel, which begins: "In the beginning was the Word, and the Word was with God, and the Word was God." (John 1:1). This sentence may allude to the ancient psychological belief that the source of the words that we speak is the mental "words" (ideas, forms) that enter our consciousness as we think about what we are going to say. Because God is absolutely simple, theologians hold that attributes which are accidental in the created world are substantial in God. God's concept of themself, God's word (Latin verbum), is therefore considered identical to God. The author of the Gospel identifies this word with Jesus, the Son of God, the second person of the Trinity, who "was made flesh and dwelt among us". The relationship of this Word of God to God has been discussed at length by the twentieth century theologian Bernard Lonergan in his book Verbum. John the Evangelist: The Gospel of John (KJV), Bernard Lonergan (1997): Verbum: Word and Idea in Aquinas
The human psychological foundation of the procession of the Son from the Father is therefore the mental image each one of us has of ourselves; this emergence of the Son from the Father is called procession. The love of self that arises in us as a mental accident when we contemplate our self image becomes, in the Trinity, a real divinity, the Holy Spirit, understood to be the real manifestation of the love of the Father for the Son. Aquinas, Summa, I, 27, 1: Is there procession in God?
So far we have three identical gods generated within the initial divinity of pure action. The next step in the model is the idea that the distinctions of the persons are maintained by the relations between them. Once again the principle is invoked that while relationships between created beings are accidents, in God they are substantial. Aquinas, Summa, I, 40, 2: Do the relations distinguish and constitute the persons?
In each case, the person is truly divine, that is truly pure act, so the processions of the persons may be conceived as pure actions producing pure actions. This process is limited to the production of three persons by the Christian dogma of the Nicene Creed. If, however, we accept that every action may produce action, there is no limit to the process. With this historical, psychological and metaphysical picture of the Trinity in mind, we can now turn to a more quantum mechanically oriented discussion of the multiplication of the initial quantum of action into the universe.
9: The active creation of Hilbert space
It is difficult to associate energy with a hypothetical structureless entity prior to the emergence of spacetime. One approach may be through Landauer's claim that information is physical. We can extend this claim to embrace the idea that logic, the means of computationally processing information, is also physical, and conversely, that physics is an instance of information processing. This is consistent with current ideas of quantum computation and quantum information. Lo, Spiller & Popescu (1998): Introduction to Quantum Computation and Information
In its simplest incarnation, we may consider the quantum of action as a not operator. This operator changes some situation p into some not-p. In the binary realm of propositional logic we understand that not-not-p = p, but in the wider world we can see, for instance, that there are about seven billion instances of not-me. The effect of this definition is to interpret action in terms of logic, which may be understood to be purely formal, like Hilbert, Whitehead and Russell's mathematics, needing no spacetime support. With the emergence within the initial singularity of energy, spacetime and momentum, action obtains the dimensions of angular momentum. This transformation may provide insight into the role of logic in physics. Hilbert's program - Wikipedia, Whitehead (1910, 1962): Principia Mathematica
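To make the logical picture concrete, here is a minimal sketch (assuming Python with numpy, purely for illustration) of the classical not operator beside its simplest quantum analogue, the Pauli-X gate, together with the observation that in a larger space a state has many orthogonal "nots":

```python
import numpy as np

# Classical binary NOT: not-not-p = p
p = True
assert (not (not p)) == p

# A quantum analogue: the Pauli-X ("not") operator on a qubit
X = np.array([[0, 1],
              [1, 0]])
ket0 = np.array([1, 0])            # |0>
ket1 = X @ ket0                    # X|0> = |1>, a state orthogonal to |0>
assert np.allclose(X @ ket1, ket0) # X is self-inverse: not-not-p = p

# In a larger space there are many states orthogonal to a given one:
# every basis vector of C^n other than e_0 is a distinct "not-e_0".
n = 7
basis = np.eye(n)
not_e0 = [v for v in basis if np.dot(v, basis[0]) == 0]
print(len(not_e0))                 # n - 1 distinct orthogonal alternatives
```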
This logical approach suggests that action in its primordial form is inherently dimensionless. This may account for the fact that quantum coupling constants are dimensionless, making renormalization possible. At present a fundamental barrier to a quantum theory of gravitation is that it is not renormalizable because the gravitational constant has dimensions of M⁻¹L³T⁻². We will return to this in §10. Renormalization - Wikipedia: Renormalizability
Let us assume, then, that the quantum initial singularity comprises action, identical to the traditional God, and that it has the power to reproduce itself indefinitely, free of the dogmatic limitations of any religion and without concern for the conservation of energy, which has yet to emerge. We may guess that this action creates Hilbert space, dimension by dimension, the orthogonality of the dimensions being guaranteed by the no-cloning theorem. No-cloning theorem - Wikipedia
Von Neumann showed that quantum mechanics is best described using an abstract Hilbert space, a complex linear vector space analogous to Cartesian space with a metric defined by an inner product. We assume for the time being that this space may have at most a countably infinite number of dimensions, ℵ₀. Physical states are represented by rays in this Hilbert space, and we assume that the initial state of the quantum initial singularity has one mathematical dimension, represented by the complex plane. Inner product space - Wikipedia
Von Neumann defines abstract n dimensional Hilbert space with three axioms:
α) A “scalar product,” i.e., the product of a (complex) number a with an element f of Hilbert space: af;
β) Addition and subtraction of two elements f, g of Hilbert space: f ± g;
γ) An “inner product” of two elements f, g in Hilbert space. Unlike α) and β), this operation produces a complex number, not an element of Hilbert space: (f, g).
Each element f, g, . . . defines the orientation of a complex plane in Hilbert space. It is called a ray, with the property that e^iθ f = f, so that the orientation of an element within its associated plane is only relevant when vectors associated with the same plane are added (superposed).
Basis elements of a Hilbert space f and g are normalized by the property (f, f) = 1 and are said to be orthogonal when (f, g) = 0.
Each vector in a Hilbert space represents a quantum of action, and we assume that since the specific dynamic property of a quantum of action is to act, the initial singularity will eventually become populated with ℵ₀ states. The result is an ℵ₀ dimensional Hilbert space of orthonormal basis vectors analogous to the vacuum of quantum field theory.
Von Neumann points out that:
The noteworthy feature of the operations af, f ± g, (f, g) is that they are exactly the basic operations of the vector calculus: those which make possible the establishment of length and angle calculations in Euclidean geometry or the calculations with force and work in the mechanics of particles.
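Von Neumann's three operations are easy to exhibit concretely. The following sketch (an illustration only, assuming numpy) implements the scalar product, addition and inner product in a finite dimensional complex space, and checks the ray property that a phase factor leaves a state physically unchanged:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two elements (vectors) of a finite-dimensional complex Hilbert space
f = rng.normal(size=4) + 1j * rng.normal(size=4)
g = rng.normal(size=4) + 1j * rng.normal(size=4)

a = 2 - 3j
af = a * f                # (α) scalar product: a number times an element
s = f + g                 # (β) addition of two elements
ip = np.vdot(f, g)        # (γ) inner product: a complex number, not an element

# Normalization and the ray property: e^{iθ}f represents the same state as f
f = f / np.sqrt(np.vdot(f, f).real)
theta = 0.7
f_phase = np.exp(1j * theta) * f
assert np.isclose(abs(np.vdot(f_phase, f_phase)), 1.0)  # same norm
assert np.isclose(abs(np.vdot(f_phase, f))**2, 1.0)     # same ray: |<f'|f>|² = 1
```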
10: The emergence of quantum mechanics
We are attempting to construct a universe from a primordial quantum of action. In broadest terms we might imagine that the mechanism for this construction has two stages familiar to us from Darwinian evolution: variation and selection.
The action of a quantum of action is to act, reproducing itself. Since it is completely simple it cannot control itself and so its actions are both random and in some way connected by their common source to form complex structures. Like the traditional God, the primordial quantum has the power to explore all consistent structures.
These structures are subject to selection since no self contradictory structure can exist. From this point of view, the space available to the universe is formally equivalent to the space available to mathematics. The only constraint we place on mathematical structures is that they be consistent. Hilbert thought that mathematics so understood would be able to solve all problems, but Gödel and Turing showed that this is not the case. Consistent mathematics embraces both incompleteness and incomputability. Complete theory - Wikipedia, Computability theory - Wikipedia
There is a problem with the traditional notion that the universe exists outside God, since God is understood to be the fullness of being, so that there can be nothing outside them. In the previous section we explained the idea that there is a trinity of divine persons inside God. We define "person" in abstract communication terms as a source, that is an entity capable of sending and receiving messages. To maintain a connection with traditional theology, I want to develop an analogy with the Trinity which envisages the procession of a transfinite number of sources rather than just three, each a quantum of action proceeding within the initial singularity, our analogue of the traditional God.
Like the action of the traditional god, the action of the initial singularity is to reproduce itself within itself. The difference is that while the Trinity stops at three, the procession of action has no limit.
In both classical and quantum physics energy is created by repeated action. A new layer of structure, energy and time thus emerges from the natural behaviour of the primordial divine action. Energy and time are the ingredients for the Hamiltonian of non-relativistic quantum mechanics. Hamiltonian (quantum mechanics) - Wikipedia, Feynman Lectures on Physics III: Chapter 8: The Hamiltonian Matrix
This scenario suggests a source for the vacuum of quantum field theory. Since the initial singularity is acting at random, we can imagine that there are random intervals between events; these intervals correspond inversely to frequencies and hence energies, so we may imagine the initial singularity as the source of an unlimited spectrum of random energies which looks like the conventional vacuum.
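As a toy illustration of this idea (a modelling assumption, not a derivation), we can sample random intervals between events and read each interval inversely as a frequency and hence an energy, yielding an unbounded random spectrum:

```python
import numpy as np

rng = np.random.default_rng(1)
hbar = 1.0                           # natural units; illustrative only

# Toy model: random intervals tau between successive acts of the singularity
tau = rng.exponential(scale=1.0, size=100_000)

# Each interval corresponds inversely to a frequency and hence an energy
E = hbar / tau                       # E = ħω with ω ~ 1/τ (a modelling assumption)

# The result is a broad, unbounded spectrum of random energies
print(E.min(), np.median(E), E.max())
```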
In the case of the Trinity, the persons are considered to be distinguished by their relationships to one another. We might understand the distinctions between the quanta of action proceeding from the initial singularity to be explained by the quantum no-cloning theorem. From this point of view the action of a quantum of action is identical to the universal logical operator NAND (not-and), producing a new state orthogonal to the original state. No-cloning theorem - Wikipedia
The creation of new orthogonal quantum states in the initial singularity is in effect the creation of new dimensions of a Hilbert space. We may imagine each new quantum of action as a pure quantum state, a basis vector of the space, represented by a ray in the space. This structure sets up a foundation for the operation of quantum mechanics as described in §9. Quantum state - Wikipedia, Projective Hilbert space - Wikipedia
At this point the interior of the initial singularity stands as a one dimensional spectrum of energy, a harmonic oscillator whose states are represented by orthogonal rays in a Hilbert space of countable dimension, the orthogonalities dictated by the no-cloning theorem. We may see this as an alphabet, like the notes available to write music, analogous to the vacuum of conventional quantum mechanics, which provides an alphabet with which to write a universe. Quantum harmonic oscillator - Wikipedia
The action of action is to create action, and the fundamental equation of quantum mechanics, E = hf, relates energy to the frequency of action, so that repeated actions in the initial singularity create energy, the basic input into quantum mechanics through the action of the Hamiltonian or energy operator.
A serious difficulty arising in quantum field theory is the cosmological constant problem, a consequence of the idea that the universe is built on a vacuum with an infinite number of degrees of freedom, each with a finite zero point energy. The standard exposition of the theory yields a total energy about 100 orders of magnitude greater than that actually observed. The approach taken here, generating the vacuum "in situ", may provide a solution to this problem. We will return to this question below.
Near the beginning of his classic exposition of quantum theory, Dirac notes that one of the principal peculiarities of the new theory is superposition or interference. Superposition is a feature of waves, and we can see it and hear it. If we throw two stones into a smooth pond we will see that when the circles of waves spreading from each impact meet, they add to and subtract from one another to form a complex pattern of waves which seem to pass through one another unaffected. The phenomenon is also clear in sound: we can usually distinguish the voices of different instruments or people sounding together. Jessica Haines: Two stones wave patterns, Catherine & Johnathan Karoly: Heitor Villa-Lobos: The Jet Whistle
In his lectures on physics Feynman uses the double slit experiment to demonstrate the radical difference between the quantum mechanics of fundamental particles and the behaviour of classical particles like bullets. He summarizes it in three simple propositions:
1. The probability of an event in an ideal experiment is given by the square of the absolute value of a complex number φ which is called the probability amplitude: P = probability, φ = probability amplitude, P = |φ|².
2. When an event can occur in several alternative ways, the probability amplitude for the event is the sum of the probability amplitudes for each way considered separately. There is interference: φ = φ₁ + φ₂, P = |φ₁ + φ₂|².
3. If an experiment is performed which is capable of determining whether one or another alternative is actually taken, the probability of the event is the sum of the probabilities for each alternative. The interference is lost: P = P₁ + P₂.
One might still like to ask: “How does it work? What is the machinery behind the law?” No one has found any machinery behind the law. No one can “explain” any more than we have just “explained.” No one will give you any deeper representation of the situation. We have no ideas about a more basic mechanism from which these results can be deduced. Feynman lectures on physics III:01: Quantum behaviour
The quantum amplitudes referred to here are invisible and undetectable. They are assumed to exist, since quantum mechanical computations based on these ideas invariably match our experience, and we assume that the invisible amplitudes of quantum theory behave mathematically just like the visible and audible interference of real physical waves of water and sound. In the early days of wave mechanics, physicists often found themselves studying sound waves to gain insight into quantum waves.
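These propositions are easy to verify numerically. The sketch below assumes arbitrary toy amplitudes for the two paths (an illustration only); it shows the oscillating interference pattern P = |φ₁ + φ₂|² collapsing to the flat classical sum P₁ + P₂ when the alternatives are distinguished:

```python
import numpy as np

# Toy amplitudes for the two paths to a point x on the screen (assumed form)
def phi1(x):
    return np.exp(1j * 20 * np.sqrt(1 + (x - 0.5)**2))

def phi2(x):
    return np.exp(1j * 20 * np.sqrt(1 + (x + 0.5)**2))

x = np.linspace(-2, 2, 9)
P_interfering = np.abs(phi1(x) + phi2(x))**2            # both ways open: P = |φ1 + φ2|²
P_which_way = np.abs(phi1(x))**2 + np.abs(phi2(x))**2   # alternative known: P = P1 + P2

# The interfering probability oscillates between 0 and 4; the classical sum is flat at 2
print(np.round(P_interfering, 2))
print(np.round(P_which_way, 2))
```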
11: Quantization: the mathematical theory of communication
We assume that the structure of the universe is maintained by communications between its various components. If the universe is to be stable, we further assume that these communications are at least to some degree error free. The mathematical theory of communication developed by Shannon establishes that quantization and error prevention are very closely related.
From a mathematical point of view, a message is an ordered set of symbols. In practical networks, such messages are usually transmitted serially over physical channels. The purpose of error control technology is to make certain that the receiver receives a string comprising the same symbols in the same order as that transmitted. The enemy of error free transmission is confusion of symbols and scrambling of their order. The fidelity of a channel can be checked by the receiver transmitting the message received back to the sender, who can compare the original with the retransmitted version.
Shannon's mathematical theory of communication shows that by encoding messages into discrete packets, we can maximize the distance between different signals in signal space, and so minimize the probability of their confusion. This idea enables us to send gigabytes of information error free over noisy channels. The specific constraint is that the ratio of signal to noise in a channel governs the rate of error free transmission. Claude E Shannon: A Mathematical Theory of Communication, Alexandr Khinchin (1957): Mathematical Foundations of Information Theory
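The constraint is the Shannon-Hartley theorem, C = B log₂(1 + S/N). A one-line illustration:

```python
import math

def shannon_capacity(bandwidth_hz: float, snr: float) -> float:
    """Shannon-Hartley limit: error-free bits/s over a noisy analogue channel."""
    return bandwidth_hz * math.log2(1 + snr)

# e.g. a 3 kHz telephone-grade channel with a signal-to-noise ratio of 1000 (30 dB)
print(shannon_capacity(3000, 1000))   # ~29,900 bit/s, close to old modem rates
```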
Shannon's theory relies on the classical statistics of distinguishable and countable events and is an application of the mathematics of function space.
A system that transmits without error at the limiting rate C predicted by Shannon’s theorems is called an ideal system. Some features of an ideal system are embodied in quantum mechanics, particularly quantization.
1. To avoid error there must be no overlap between signals representing different messages. They must, in other words, be orthogonal, as with the eigenfunctions of a quantum observable. Observable - Wikipedia
2. The basis signals or letters of the source alphabet may be chosen at random in the signal space, provided only that they are orthogonal. The same message may be encoded into any satisfactory basis provided that the transformations (the codec) used by the transmitter to encode the message into the signal and by the receiver to decode the signal back to the message are inverses of one another (see the sketch after this list). Quantum processes are reversible in the sense that the unitary evolution of an isolated quantum system acts like a lossless codec. Codec - Wikipedia, Unitary operator - Wikipedia
3. The signals transmitted by an ideal system have maximum entropy and so are indistinguishable from random noise. The fact that a set of physical observations looks like a random sequence is not therefore evidence for meaninglessness. Until the algorithms used to encode and decode such a sequence are known, nothing can be said about its significance.
4. Only in the simplest cases are the mappings used to encode and decode messages linear and topological. For practical purposes, however, they must all be computable with available machines. How this applies in quantum theory is closely related to the measurement problem and the collapse of the wave function (§14).
5. As a system approaches the ideal, the length of the transmitted packets, the delay at the transmitter while it takes in a chunk of message for encoding, and the corresponding delay at the receiver while the message is decoded, increase indefinitely. Claude Shannon (1949): Communication in the presence of noise
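As promised at point 2, here is a minimal sketch of a lossless codec (illustration only, assuming numpy): a random unitary matrix serves as an orthogonal alphabet, the receiver decodes with its inverse, and the roundtrip is exact:

```python
import numpy as np

rng = np.random.default_rng(2)

# Pick a random orthonormal basis as the "alphabet" (point 2): any unitary will do
n = 8
U, _ = np.linalg.qr(rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))

message = rng.normal(size=n) + 1j * rng.normal(size=n)
signal = U @ message               # encode: transmitter applies the codec
decoded = U.conj().T @ signal      # decode: receiver applies the inverse (U†)

assert np.allclose(decoded, message)   # lossless: the codec pair are inverses
# Basis signals do not overlap (point 1): U†U = I, i.e. (f_i, f_j) = δ_ij
assert np.allclose(U.conj().T @ U, np.eye(n))
```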
In computer networks the implementation of error free communication requires coding for transmission by the sending source which is decoded by the receiving source to reveal the original message. Entropy must be conserved and the codec must be complete if the transmission process is to be lossless.
Quantum systems described by algorithms such as the Schrödinger equation are constrained to evolve through time in a unitary and reversible manner. The Schrödinger equation defines an error free communication channel which is nevertheless invisible to us. This process is interrupted when systems interact, just as computers in a network are interrupted when they are required to deal with an incoming message. Unitarity (physics) - Wikipedia
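A small sketch of this constraint (illustrative, assuming numpy and scipy): any Hermitian Hamiltonian generates an evolution operator U = e^(-iHt/ℏ) which is unitary and hence reversible:

```python
import numpy as np
from scipy.linalg import expm

hbar = 1.0
# Any Hermitian matrix can serve as a toy Hamiltonian
rng = np.random.default_rng(3)
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
H = (A + A.conj().T) / 2                    # Hermitian: H = H†

t = 1.3
U = expm(-1j * H * t / hbar)                # solution of iħ ∂|ψ>/∂t = H|ψ>

assert np.allclose(U.conj().T @ U, np.eye(4))     # unitary: probability conserved
assert np.allclose(np.linalg.inv(U), U.conj().T)  # reversible: U⁻¹ = U†
```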
12: The quantum creation of Minkowski space
We are inclined to take spacetime as given and see it as somehow emerging (like a time reversed black hole?) from the initial singularity. On the other hand we are convinced that the proper explanation of the nature of the world is quantum mechanics, so it seems reasonable to expect spacetime to be a layer of reality constructed from elements provided by quantum mechanics. This is consistent with the idea that the quantum world precedes the spacetime world. We suppose that quantum mechanics is the underlying symmetry that is applied (and broken) to create the transition from energy-time to spacetime, accompanied by the parallel emergence of momentum-distance and gravitation.
Here we come to the most interesting part of this story, the interface between the abstract invisible quantum world that drives the universe from behind the scenes and the world of space and time in which we live, which serves as the stable screen upon which we exist and observe everything that happens. Quantum mechanics works in the world of energy-time; we live in the world of energy-momentum-time-space-gravitation. How does structural information flow from Hilbert space to Minkowski space and back? This question, often known as the measurement problem, has been debated since the beginning of quantum mechanics.
The key idea here is tautological or zero-sum complexification. This has already appeared unremarked in §10: since it is of the nature of action to act, and we define energy as the rate of action, we may see that action of its nature (i.e. tautologically) creates energy. From this point of view the emergence of energy has added nothing new to the initial singularity. The idea here is that each step in the complexification of the universe passes through a phase of randomness and uncertainty to create new entities that add up to nothing, like potential and kinetic energy or positive and negative charge. Primordial symmetries are broken to create new features of the world.
The principle at issue here is that causality requires contact. Isaac Newton was forced by circumstances to admit that gravitation was some sort of "action at a distance", which we understand to be impossible in ordinary space. Quantum entanglement in Hilbert space (a mathematical space), however, led Einstein to imagine "spooky action at a distance". We shall suggest in §13 that this is possible because quantum mechanics occupies a world where there is no spatial distance in the Newtonian sense. In the space-time world contact is maintained by particles moving at the speed of light which follow "null geodesics" whose beginnings and ends coincide in spacetime (see §16). Geodesics in general relativity - Wikipedia
The most peculiar feature of Minkowski spacetime is its metric ημν, which is diagonal (1, 1, 1, -1). This suggests that zero bifurcation is at work, so that in some sense space + time = 0. The principal ingredients of a model of the emergence of spacetime are therefore symmetry, zero bifurcation and the speed of light. The null geodesic, made possible by the Minkowski metric, is the accommodation made in spacetime to maintain contact after the emergence of space. The velocity of light is an artefact of this accommodation and enables contact in the quantum world to continue uninterrupted despite the emergence of space. How can this happen? We invoke the evolutionary principle that uncontrolled action can try everything, and that those consequences of these trials that are self sustaining are selected and can become fixed.
Interaction is local. Before space enters the world, contact is inevitable and quantum systems can evolve unitarily without interruption. To correlate their evolution, spatially separated systems must communicate to maintain contact. The metric of Minkowski space enables the existence of null geodesics whose endpoints are in contact because the observable space-time interval between them is zero. The unitary contact of spatially separated systems can thus be maintained if the messenger travelling between them proceeds at the speed of light in Minkowski space. In other words the speed of light makes space possible by maintaining the integrity of the contact and unitarity that is essential to the work of quantum mechanics, and this "trick" explains the Minkowski metric. Kevin Brown (2018): Reflections on Relativity, page 693.
It has been generally assumed that Minkowski space is the domain of Hilbert space, so that it is necessary to apply Lorentz transformations to both Hilbert spaces and particles in quantum field theory. I have suggested above that this may not be necessary because Hilbert space is prior to, independent of and the source of Minkowski space. Martinus Veltman (1994): op. cit. page 20
13: Is Hilbert space independent of Minkowski space?
Since the advent of special relativity, classical physical field theories are usually described in flat Minkowski space-time. The special principle of relativity holds that every observer sees the same laws of physics, including the same speed of light, in their own rest frame. This defines the Lorentz transformation which enables each observer to compute their local equivalent of the spacetime intervals observed between events in moving frames. This transformation is expressed succinctly by the (1, 1, 1, -1) metric of Minkowski space, so that if we set the speed of light c to 1, all observers see an invariant interval ds² = dx² + dy² + dz² - dt². Minkowski space - Wikipedia, Tests of special relativity - Wikipedia
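Both claims, the invariance of the interval and the nullity of light paths, can be checked directly. The following sketch (illustration only) applies Lorentz boosts of various speeds to an arbitrary event and to a light signal:

```python
import numpy as np

def boost_x(v):
    """Lorentz boost along x with speed v (units where c = 1)."""
    g = 1 / np.sqrt(1 - v**2)
    return np.array([[g, 0, 0, -g * v],
                     [0, 1, 0, 0],
                     [0, 0, 1, 0],
                     [-g * v, 0, 0, g]])

eta = np.diag([1.0, 1.0, 1.0, -1.0])     # Minkowski metric, ordering (x, y, z, t)

def interval(e):
    return e @ eta @ e                    # ds² = dx² + dy² + dz² - dt²

event = np.array([2.0, 1.0, 0.5, 3.0])   # an arbitrary (dx, dy, dz, dt)
light = np.array([3.0, 0.0, 0.0, 3.0])   # a light signal: |dx| = dt

for v in (0.0, 0.5, 0.9):
    L = boost_x(v)
    assert np.isclose(interval(L @ event), interval(event))  # invariant for all observers
    assert np.isclose(interval(L @ light), 0.0)              # null geodesics stay null
```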
It seems to be generally accepted in quantum field theory that the Lorentz transformation applies equally to states in Hilbert space and particles in Minkowski space. This implies that the domain of Hilbert space is Minkowski space. Martinus Veltman (1994): Diagrammatica: The Path to the Feynman Rules page 20.
If, however, the quantum world constitutes a layer of the universe built on the initial singularity before the emergence of energy, time, space and momentum, this convention may need revision. The phenomenon of entanglement suggests that the Hilbert quantum world exists prior to and independent of the Minkowski classical world. It seems more reasonable to attribute the apparent propagation of correlations associated with entanglement to the absence of space than to the propagation of these correlations at infinite velocity.
If this is the case, it opens up a new degree of freedom lying between quantum and classical dynamics which may make it possible to remove some of the confusion in quantum field theory noted by Kuhlmann in the epigraph to this essay.
Einstein appears never to have been truly happy with quantum mechanics, and often sought to demonstrate its weaknesses. This may have been because he felt that nature should go its own way independently of any observers. In the quantum world, however, observation is part of the physics. Although some have felt that 'observer' implied a conscious being, we can equally well imagine the universe observing itself to create real events. In the course of a paper on what is now known as the EPR paradox the authors identified 'spooky action at a distance', which is now seen as a consequence of entanglement. Einstein, Podolsky & Rosen: Can the Quantum Mechanical Description of Physical Reality be Considered Complete?, Quantum entanglement - Wikipedia, Gabriel Popkin (2018): Einstein's 'spooky action at a distance' spotted in objects almost big enough to see
EPR equate reality with predictability: If, without in any way disturbing a system, we can predict with certainty (i.e. with probability equal to unity) the value of a physical quantity, then there exists an element of physical reality corresponding to this physical quantity.
Experimental tests of entanglement have gradually improved to the point where it is now widely believed that spooky action at a distance is real. It has been shown that this correlation at a distance operates at many times the velocity of light. Pan, Bouwmeester, Daniell, Weinfurter & Zeilinger: Experimental test of quantum nonlocality in three-photon Greenberger–Horne–Zeilinger entanglement, Salart, Baas, Branciard, Gisin, & Zbinden: Testing spooky action at a distance, Juan Yin et al: Lower Bound on the Speed of Nonlocal Correlations without Locality and Measurement Choice Loopholes
EPR argue from the quantum formalism that a measurement on one part of an entangled system enables us to predict with certainty the outcome of a measurement on the other system even though they are spacelike separated. They concluded that 'no reasonable definition of reality could be expected to permit this.' It turns out, however, that the quantum mechanical description has become established as the new reasonable definition of reality.
Here we exemplify entanglement and its consequences using a two state system. Electrons have two spin states which are often called up and down. In the singlet state, one electron has spin up, the other down, so that the total spin of the singlet is zero. Singlet - Wikipedia
Entanglement establishes that when these electrons are spatially separated they behave, when observed, as though they are still in contact. If one electron is observed to be spin up, the other will be observed to be spin down no matter how far apart they are. This is 'spooky action at a distance', but the fact that this correlation appears to be instantaneous suggests that although the electrons are distant in Minkowski space, they are still in contact in Hilbert space. If this is the case, it is a major break from conventional wisdom and opens the way for a pleasing new approach to theology, spirituality and quantum field theory.
Although the biggest shock that came with quantum mechanics is the inability to predict the precise timing of events, entanglement gives us a very definite method to predict both the nature and timing of entangled events, since the observation of one half of an entangled system appears to have an immediate consequence in the other half.
Although the traditional observers Alice and Bob share immediate and definite correlations via entanglement, entanglement cannot be used to communicate information faster than the speed of light. This is because Alice cannot control what she is going to observe, and therefore cannot control what Bob receives, even though it is certain, in the binary case, that if Alice observes 0 Bob will observe 1. In other words the pure quantum world is, as Einstein felt, incomplete. It is completed by observation.
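A minimal numerical sketch of these two facts, perfect anticorrelation together with no signalling, using the standard singlet formalism (illustration only):

```python
import numpy as np

# Singlet state of two spin-1/2 particles: (|01> - |10>)/√2
singlet = np.array([0, 1, -1, 0]) / np.sqrt(2)

def spin_op(theta):
    """Spin measurement along an axis at angle theta in the x-z plane."""
    return np.array([[np.cos(theta), np.sin(theta)],
                     [np.sin(theta), -np.cos(theta)]])

def correlation(a, b):
    """Quantum expectation E(a,b) = <ψ| A⊗B |ψ>."""
    AB = np.kron(spin_op(a), spin_op(b))
    return singlet @ AB @ singlet

print(correlation(0.0, 0.0))        # -1: same axis, perfectly anti-correlated
print(correlation(0.0, np.pi))      # +1: opposite axes
print(correlation(0.0, np.pi / 3))  # -cos(θ) = -0.5 in general

# Alice's local statistics are 50/50 whatever Bob does: no signalling
P_plus = (np.eye(2) + spin_op(0.0)) / 2
print(singlet @ np.kron(P_plus, np.eye(2)) @ singlet)   # 0.5
```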
The experiments of Pan, Salart and Yin referred to above have demonstrated that entangled particles could act upon one another at a distance even though their separation was spacelike, requiring something greater than the velocity of light to account for their correlation. This is called quantum non-locality. Quantum nonlocality - Wikipedia
Classical physics is founded on a belief in local realism. This has three features:
1. regularities in observed phenomena point to the existence of physical reality independent of human observers;
2. consistent sets of observations underlie 'inductive inference', the notion that we can use them to devise models of what is going on behind the scenes; and
3. causal influences cannot travel faster than the velocity of light.
Long experience and detailed argument have shown that quantum mechanics is not a local realistic theory. Bernard d'Espagnat (1979): The Quantum Theory and Reality
The EPR argument was perhaps the first hint that local realism is false. John Bell studied EPR and formulated the first version of Bell's theorem, which shows that quantum mechanics is not a local realistic theory. John Bell (1987): Speakable and Unspeakable in Quantum Mechanics, Myrvold, Genovese & Shimony (Stanford Encyclopedia of Philosophy): Bell's Theorem
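Using the singlet correlation E(a, b) = -cos(a - b) from the sketch above, the CHSH form of Bell's theorem can be checked in a few lines; quantum mechanics reaches |S| = 2√2, beyond the local realist bound of 2:

```python
import numpy as np

# CHSH combination S = E(a,b) - E(a,b') + E(a',b) + E(a',b'),
# using the singlet correlation E(a,b) = -cos(a - b) derived above.
def E(a, b):
    return -np.cos(a - b)

a, a_, b, b_ = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S = E(a, b) - E(a, b_) + E(a_, b) + E(a_, b_)
print(S, abs(S) <= 2)   # S = -2√2 ≈ -2.83; abs(S) <= 2 is False: bound violated
```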
14: The measurement problem
The Hilbert space representation of a quantum state is a vector which may be the superposition of a number of orthonormal basis states corresponding to the dimension of a Hilbert space. In physical applications such spaces commonly have a countable infinity of dimensions but the simple two dimensional Hilbert space called the qubit tells us most of what we want to know because quantum mechanics, like the theory of networks, is symmetrical with respect to complexity. Orthonormal basis - Wikipedia, Qubit - Wikipedia
We cannot see the vectors in Hilbert space. Our conjectures about this hidden quantum mechanical structure are based on observations of the interactions of visible particles. What we do see are eigenvalues which correspond to the eigenfunctions of the operator we use to measure an unknown state. The theory predicts that there are as many possible eigenvalues as the dimension of the measurement operator. The terms collapse or reduction of the wave function refer to the fact that observations only ever reveal just one of the possible states of the unknown system. Measurement problem - Wikipedia
This situation is radically different from the superposition of real waves. It is as if we were to listen to The Jet Whistle (§10) and at different times hear the flute and no cello, or the cello and no flute. It seems that we can only see fragments of the information encoded in the amplitude representation of a quantum event, yet we believe it is all there. When we make quantum computations we have to take all the possible invisible amplitudes into account if we are to get the right answer.
The radical problem facing the development of quantum computation is illustrated by the difference between a classical bit (binary digit) and its quantum analogue, the qubit. A classical bit has just two states, usually represented 0 and 1. These states are orthogonal: one is not the other. A qubit, on the other hand, is a vector formed in a two dimensional Hilbert space by adding the orthogonal basis states |0> and |1>. This vector has a transfinite number of states represented by the equation |qubit> = a|0> + b|1>, where a and b are complex numbers subject to the constraint |a|² + |b|² = 1. When we observe a qubit, however, all we ever see is |0> or |1>, with probabilities P(|0>) = |a|², P(|1>) = 1 - |a|². The infinite amount of information which we suppose to be represented by the qubit turns out to be at best just 1 classical bit. It has collapsed. People designing quantum computers must try to devise some way to take advantage of this (allegedly) hidden information. Nielsen and Chuang write:
Understanding the hidden quantum information is a question that we grapple with for much of this book and which lies at the heart of what makes quantum mechanics a powerful tool for information processing. Nielsen & Chuang (2000): Quantum Computation and Quantum Information, page 16 (ref link above)
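The collapse from continuum to bit is easy to simulate. In the sketch below (with illustrative values of a and b) ten thousand observations of identically prepared qubits recover only the relative frequency |a|², never the amplitudes themselves:

```python
import numpy as np

rng = np.random.default_rng(4)

# An arbitrary qubit |q> = a|0> + b|1> with |a|² + |b|² = 1
a, b = 0.6, 0.8j
assert np.isclose(abs(a)**2 + abs(b)**2, 1.0)

# Observation never reveals a and b; each shot yields one classical bit
shots = rng.random(10_000) < abs(a)**2
print(shots.mean())   # ≈ |a|² = 0.36: the relative frequency, not the amplitudes
```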
A possible answer is provided by Wojciech Zurek who takes the view that this collapse is a necessary consequence of the transmission of information between two quantum systems. He writes:
The quantum principle of superposition applies to isolated systems, but is famously violated in the course of measurements: A quantum system can exist in any superposition, but a measurement forces it to choose from a limited set of outcomes represented by an orthogonal set of states. . . . I show – using ideas that parallel the no-cloning theorem – that this restriction (usually imposed “by decree”, by the collapse postulate) can be derived when a transfer of information essential for both measurement and decoherence is modeled as a unitary quantum process that leads to records with predictive significance. Wojciech Hubert Zurek (2008): Quantum origin of quantum jumps: breaking of unitary symmetry induced by information transfer and the transition from quantum to classical
He begins with a concise definition of standard quantum mechanics in six propositions:
(1) the quantum state of a system is represented by a vector in its Hilbert space;
(2) a complex system is represented by a vector in the tensor product of the Hilbert spaces of the constituent systems;
(3) the evolution of isolated quantum systems is unitary, governed by the Schrödinger equation: iℏ ∂|ψ>/∂t = H|ψ>, where H is the energy (or Hamiltonian) operator.
The other three show how the mathematical formalism couples to the observed world:
(4) immediate repetition of a measurement yields the same outcome;
(5) measurement outcomes are restricted to an orthonormal set {|sk>} of eigenstates of the measured observable;
(6) the probability of finding a given outcome is pk = |<sk|ψ>|², where |ψ> is the preexisting state of the system.
Schrödinger equation - Wikipedia, Born rule - Wikipedia
Historically, the first three postulates of quantum mechanics have been considered uncontroversial, but there has been endless debate about the interpretation of the mathematical formalism encapsulated in postulates (4) - (6).
Zurek examines a system in 2D Hilbert space, noting that the complexity invariance of quantum mechanics enables an extension of the argument to a space of any dimension.
In the Hilbert space HS the state vector |ψS> is the superposition of two states: |ψS> = α|v> + β|w> (1).
Apparatus A0 measures state S:
|ψS>|A0> = (α|v> + β|w>)|A0> → α|v>|Av> + β|w>|Aw> = |ΦSA> (2).
|ΦSA> is a vector in the tensor product of the constituent Hilbert spaces.
The composite system is normed and linear, so <A0|A0> = <Av|Av> = <Aw|Aw> = 1.
Since the evolution (2) must be unitary, the norm is preserved:
<ψS|ψS> - <ΦSA|ΦSA> = 2 Re[α*β <v|w>(1 - <Av|Aw>)] = 0,
and since this must hold for arbitrary α and β, <v|w>(1 - <Av|Aw>) = 0.
Then either <v|w> ≠ 0, in which case <Av|Aw> = 1 so that 1 - <Av|Aw> = 0 and the information transfer must have failed;
or else <v|w> = 0, and <Av|Aw> may take any value. So |v>, |w> must be orthogonal for information transfer to succeed.
Conclusion: 'The overlap <v|w> must be exactly 0 for <Av|Aw> to differ from unity.'
'Selection of an orthonormal basis induced by information transfer – the need for spontaneous symmetry breaking that arises from the unitary axioms of quantum mechanics (i, iii) – is a general and intriguing result.'
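Zurek's key identity can be verified numerically. The sketch below (illustration only) builds random non-orthogonal system states with random apparatus records and confirms that the norm deficit equals 2Re[α*β<v|w>(1 - <Av|Aw>)]:

```python
import numpy as np

rng = np.random.default_rng(5)

def rand_state(n):
    v = rng.normal(size=n) + 1j * rng.normal(size=n)
    return v / np.linalg.norm(v)

# System states |v>, |w> (deliberately non-orthogonal) and apparatus records
v, w = rand_state(2), rand_state(2)
Av, Aw = rand_state(3), rand_state(3)
alpha, beta = 0.6, 0.8

psi_S = alpha * v + beta * w
Phi_SA = alpha * np.kron(v, Av) + beta * np.kron(w, Aw)

lhs = np.vdot(psi_S, psi_S) - np.vdot(Phi_SA, Phi_SA)
rhs = 2 * (np.conj(alpha) * beta * np.vdot(v, w) * (1 - np.vdot(Av, Aw))).real
assert np.isclose(lhs.real, rhs)   # norms agree only if <v|w>(1 - <Av|Aw>) vanishes
```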
Von Neumann shows that quantum mechanical measurement creates entropy. This may seem counterintuitive: the annihilation of quantum states implicit in the measurement process leads to the selection of a real state, the outcome of the measurement. We return to this below in a discussion of evolution. John von Neumann (2014): (ref link above)
15: Gravitation
Gravitation and quantum theory are both primordial and they are also distinct, so we would like to see them as two sides of a zero-sum bifurcation. How would this work? Quantum theory and gravitation are both closely related to energy. Energy, via phase and superposition, is the source of structure in quantum processes. Energy is also both the source and the subject of gravitation, which couples energy to energy, and determines the overall structure of the universe.
The quantum mechanical explanation for the differentiation of fermions is that, because they have half integral spin, the probability of the superposition of two fermion states is zero. If we assume that Hilbert space is the primordial space, and use the principle of simplicity to argue that in the beginning there are only two opposite phases of recursive function at any energy / frequency (a form of digitization), we can account for the primordial existence of spacetime and of bosons and fermions. But where does gravitation come in?
The answer that appeals most to me is that the general theory is a measure of the 'ignorance' of the initial singularity. This is because all information and communication is digital, as shown by Shannon's theory of communication, Turing's theory of computation and Gödel's theory of logic discussed above (§§8, 6 & 7). Therefore, insofar as gravitation is described by the continuous functions of differential geometry, it carries no information and is therefore consistent with the primordial simplicity of the universe. It is, in other words, a perfect description of a system of zero entropy which is subsequently populated by quantum theory, describing the quantum of action which underlies all communication and the magnificent complexity of the universe which has grown within the gravitational shell. This ignorance is implicit in the notion of general covariance, that all Gaussian coordinate systems are equivalent for the description of physical 'law', which is the foundation of Einstein's work:
The following statement corresponds to the fundamental idea of the general principle of relativity: "All Gaussian coordinate systems are essentially equivalent for the formulation of the general laws of nature." Einstein (2005): Relativity: The Special and General Theory, page 123
Gaussian coordinates do not provide a metric, which is why Einstein's equation does not provide a measure of the size of the universe, and therefore applies from the initial singularity to the Universe at any subsequent stage of its expansion. Einstein's equation provides a transformation, analogous to the Lorentz transformation, from coordinates supplied by the observer to an estimate of the coordinates of the object being observed. The details of these coordinates must be provided by the observer, and apply to all observers and all objects to be observed. This is the point of general covariance. Gaussian curvature - Wikipedia
The details are provided by quantum mechanics, which explains the creation of both observers and the objects to be observed. Initially we are dealing only with fundamental particles. Later gravitation causes many of these particles to coalesce into stars, whose high temperatures result from the conversion of gravitational potential into kinetic energy. This kinetic energy drives the high energy quantum mechanical processes that create more complex particles, which are redistributed when some such stars explode and scatter their material through space. This material may be collected once more by gravitation into another generation of stars and planets which eventually sustain the evolution of astronomers and other thoughtful particles interested in the structure of the Universe.
We take the fundamental equation of quantum mechanics E = ℏω to be primordial in that the action of action is to act and energy is repeated action. This, combined with the no cloning theorem, establishes each of these actions as equivalent to the universal logical nand gate. At this point quantum theory sees no distinction between potential and kinetic energy. Only energy differences feature in quantum mechanics and we can set the zero of energy wherever we wish.
In section 12 we introduced space as the dual manifestation of time transformed by the speed of light. Now we introduce momentum as the dual manifestation of energy, transformed once again by the velocity of light. For the massless photon, the transformer between time and space, energy is identical to momentum. For massive particles, the same idea is expressed by Einstein's equation E = mc2. Although mass and energy are numerically equivalent, they are conceptually quite different.
The most obvious feature of gravitation is the gravitational potential that holds us on the planet and kills us when we fall from significant heights. We may look upon potential energy as kinetic energy travelling in the form of a massless particle at the speed of light. This is the nominal speed for gauge particles like photons, gravitons (if they exist) and gluons. We may understand the potential energy of massive particles as arising from their internal motions at light speed, so that their interior world comprises null geodesics which account for their apparent zero size in spacetime. This seems consistent with Wilczek's idea, proposed above, that the mass of baryons is produced by the kinetic motions of their internal components, which are generally much lighter than the baryons themselves. Mass, we might say, is energy trapped in a null geodesic. Potential energy - Wikipedia, Wilczek (2002) op. cit. chapter 10 sqq.
We can understand the 3D component of Minkowski space by thinking of the differences between wired and wireless communication in practical networks. Wireless communication is serial (one dimensional) and channels are distinguished by frequency or energy, as we find in quantum mechanics. Wired networks, on the other hand, need three dimensions in order to prevent wires intersecting in the same plane. We may consider the case of moving fermions by analogy with city traffic. In a two dimensional road system, time division multiplexing introduced by traffic lights enables traffic streams to cross. Three dimensional structures like overpasses and tunnels enable uninterrupted two dimensional traffic flow, and separation of air traffic in three dimensional space is established by requiring vehicles travelling in different directions to operate at different altitudes. If we understand the emergence of new features in the universe as a matter of random variation and controlled selection, we may see that 3D space is adequate for complete wired connection, so spaces with 4 or more dimensions have no evolutionary raison d'être and may be selected out.
Wired networks are therefore more like plumbing or electrical power networks. Tuning is not required to discriminate sources, but switching may be necessary for one source to connect to many others. A wired network transmitting pure unmodulated power shares three properties with gravitation: it exists in four dimensions, three of space and one of time; it can deal indiscriminately with energy in any form; and the direction of motion of signals is determined by potentials.
From an abstract point of view a fixed point is the dual of the compact and convex nature of a set, and we can expect to find a different fixed point corresponding to every continuous mapping of the set onto itself: fixed point theory is symmetrical with respect to the set of all continuous functions. In the case of quantum mechanics these fixed points are the solutions to the eigenvalue equation. Their existence is guaranteed by the Hermitian nature of the operators that represent quantum observables. Agarwal, Meehan & O'Regan (2009): Fixed Point Theory and Applications
An important feature of the network model is that it is symmetric with respect to complexity. The atom of a network is two sources and a channel, which we may think of quantum mechanically as two bosons and a fermion. Sources and connections can exist at any scale. Communications between discrete sources become possible when they share a language or codec, that is a symmetry. Since gravitation is the universal codec which couples all sources without discrimination so long as they have energy, we can imagine that it emerges at about the same epoch of the evolution of the universe as quantum mechanics. Unlike quantum mechanics, however, where connection is established by specific values of energy which we compute using the eigenfunctions of specific operators, gravitational connections are indiscriminate. This suggests that they represent a symmetry deeper and simpler than quantum mechanics, one which reflects the consistent unity of the initial singularity. We would expect gravitation to respond to all the energy levels present in a vacuum, for instance, which is why the cosmological constant problem is so troubling in a universe with infinite degrees of freedom, each with an attached zero point energy.
Consequently we might conjecture that gravitation is not quantized. In §8 above we used Shannon's communication theory to connect error free communication with quantization. If gravitation is a universal code which cannot go wrong, there is no ground for quantization. Nevertheless gravitation imposes precise structure on the universe.
The classical general theory of relativity predicts classical gravitational waves which have now been observed, giving us information about large cosmic events occurring at great distances. Great distance and the overall weakness of gravity mean that these waves require very large high precision interferometers for their detection. Gravitational-wave observatory - Wikipedia
In short, perhaps we may see gravitation as a hybrid of classical and quantum reality before they became differentiated. Our cities are festooned with power cables and telephone lines and surrounded by clouds of wireless. Our bodies, on the other hand, are largely wired. But in all cases communication requires contact, and a symmetry that unites Hilbert and Minkowski space.
Given this picture, we might understand that energy attracts energy because it is all the same energy created by action, subsequently mapped into fixed potential energy equal and opposite to the kinetic energy from which it came. Lagrangian mechanics - Wikipedia (ref link above)
We might imagine that the coupling between the two spaces, Hilbert and Minkowski, which we ascribe to observation, describes the inner life of particles; i.e. it is a story of consciousness, in the same way as my conscious awareness is related to my physical actions. So I imagine that quantum computations in Hilbert spaces are to be found inside particles as my consciousness is to be found inside me. What I hope is that a clear distinction between Hilbert and Minkowski removes many of the difficulties in quantum field theory that arise from ignoring this distinction. So we think that every particle is a computer and the relationships between particles are networks.
This, and the idea that gravitation is not quantized, suggests that gravitation must be some primordial quality of the interior space of the initial singularity described by the classical general theory of relativity. This would be consistent with the idea that it is present from the beginning and guides the growth of the universe as a whole. As Einstein noted when he published his field equation for gravitation, the general theory of relativity is a closed logical structure and there is little room for an alternative. From this point of view, we might see the interior of the initial singularity as a continuous Lie group fulfilling the hypotheses of fixed point theory. General relativity - Wikipedia, Abraham Pais (1982): 'Subtle is the Lord...': The Science and Life of Albert Einstein page 256, Lie Group - Wikipedia
We have built a Hilbert space inside the initial singularity through its propensity to act, our starting point being mathematical fixed point theory. The topological barrier constraining the universe is the boundary between what is consistent inside and inconsistent outside.
In section 9 I proposed replacing the classical initial singularity derived from Einstein's theory of relativity with a quantum source which produces an unlimited random sequence of discrete quanta of action. This process can be understood both as the construction of a Hilbert space within the singularity and as the creation of energy. Energy is both the foundation of quantum mechanics and the material from which, through quantum mechanics, everything is constructed (section 10).
We study the quantum world by spectroscopy, stimulating it with means that range from bunsen burners to the Large Hadron Collider and measuring the species and energy of the particles that are emitted. Although physicists and machines in the real physical world are both the source of this stimulation and the observers, we know that what we are provoking is the interaction of invisible quantum states with one another, the process we call measurement. We assume that the world goes its own way in the absence of human observers, so that there is continual interaction in the Hilbert domain yielding events in the Minkowski domain which are separated by real spacetime distances which may be zero.
In the complex modern world we have considerable control over the measurement operators we use, but the outcomes of our observations nevertheless remain uncertain (section 14). In the primordial system both the measurement operators and the states measured are predominantly random. Nevertheless the theory shows that there will be a spectrum of eigenvectors and associated eigenvalues yielding a spectrum of real results, with probabilities predicted by the absolute square |ψ|² of the complex amplitude resulting from a computation in the Hilbert domain of the inner product of the interacting states.
We can imagine that as the number of states represented in the Hilbert space created within the singularity grows, the number of interactions will increase, so that the size of the space created and the spectrum of particles occupying that space will also increase, as described in sections 10 and 12.
We imagine that this space is locally Minkowski and the particles existing in it are a random mixture of massless bosons following null geodesics and massive fermions guided by their interactions with one another mediated by bosons.
We might guess that two features of gravitation arise from this situation.
Current approaches to gravitation see the stress energy tensor as the source of gravitons which are believed to account for gravitation. An alternative view is that gravitation is not quantized, since it is represented by the continuous mathematics of a differential manifold and carries no differentiated signal. The only observable features of gravitation are 4 dimensions, free fall and geodesic deviation. The geodesic deviation observable between particles in free fall is the source of all our knowledge of gravitation.
Why does Minkowski spacetime have 4 orthogonal dimensions, that is, four degrees of freedom? Does this property have anything to do with the creation of orthogonal dimensions in Hilbert space discussed in §9? This seems improbable if Hilbert space is independent of Minkowski space.
It seems more probable that 4 dimensions are a necessary prerequisite for the motion and communication of physical particles. Time is inherently connected with motion and energy. We can understand that three spatial dimensions are necessary and sufficient for the establishment of interference free point to point communication by considering the problems faced by the designers of electrical circuits. One dimension enables only serial connection on a single line. Two dimensions enable us to make direct connections between three points without "crossed wires". In three dimensions we can make direct connections between any number of points. A fourth spatial dimension introduces unnecessary redundancy, so one might expect that if space-time is the outcome of an evolutionary process, three dimensions of space are necessary and sufficient for universal connection.
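The circuit designer's problem can be stated in graph terms: the complete graph K₅ has no planar drawing, while every finite graph embeds without crossings in three dimensions. A sketch (assuming the networkx library is available):

```python
import networkx as nx

# In two dimensions, direct "wired" connection without crossings is limited:
# the complete graph on 5 nodes cannot be drawn in the plane.
planar, _ = nx.check_planarity(nx.complete_graph(5))
print(planar)   # False: K5 has no crossing-free 2D layout

# In three dimensions every graph can be embedded without crossings
# (e.g. place nodes on the curve (t, t², t³), where no two chords intersect),
# so 3D suffices for direct connection between any number of points.
```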
Einstein's gravitation in 4 dimensional spacetime is modelled as a continuous differentiable manifold, an instance of a continuous Lie group. His happiest thought, the starting point for this theory, was the realization that a person in free fall would not feel their own weight. They would be moving inertially in a Minkowski space, unaware that they were being imperceptibly accelerated by gravitation. They are moving on a geodesic, along which the spacetime geometry is locally Minkowski. They could learn that gravitation is present by observing other bodies in free fall and noting that although these too are in inertial motion, they are nevertheless accelerating relative to one another. So the Moon is freely falling in the space surrounding Earth and appears from Earth to be accelerating. Newton computed the orbit of the Moon on the assumption that the centrifugal force arising from its curved orbit is exactly balanced by the gravitational attraction between Earth and Moon. Lie Group - Wikipedia
Maybe we can say that inertial motion is nothing; 4 dimensions are required for point to point communication; and geodesic deviation is a consequence of the universe being logically closed. We wish to find these features already present in the initial singularity. An argument for this possibility is that the general theory allows for collapse to nothing, and since we are dealing with a deterministic mathematical structure it must be reversible, which suggests that the gravitational structure carries no entropy at all: it is nothing.
From Einstein's point of view, both these forces are fictitious, and we seek insight into this situation by considering the possibility of a particle orbiting inside the initial singularity after the emergence of spacetime. We begin with the observation that the "interior" of the initial singularity is all that exists, insofar as to be "outside" the initial singularity is to be in a region of logical inconsistency which cannot exist. Inside is a Hilbert space of orthogonal rays, differentiated, like angels in the Christian heaven, by the fact that they are different species or states. Aquinas, Summa I, 50, 4: Is every angel a different species?
Gravitation appears to be a geometric feature of Minkowski space, the shell or backbone of the Universe. Einstein's mollusc refers to the soft interior of a differentiable manifold, but this mollusc may also have a hard shell that gives overall structure to the animal and grows as the animal grows. The enormous energies we see in cosmic events may be fixed points induced by the gravitational shell. Einstein, Lawson & Penrose (1916, 2005): Relativity: The Special and General Theory
Zurek shows that the selection process for systems to move from imaginary to real is the condition that information can be transferred from Hilbert to Minkowski. Now we have the creation of spacetime by the random interactions of vectors and operators, and we imagine that two classes of particles are formed, bosons and fermions. We have two constraints: the system is closed and therefore the Minkowski space is curved; and the fermion network requires interference free communication, so we require three real dimensions. This, plus the continuity of spacetime, takes care of gravitation. Now we turn to the question of the zero energy universe and potential and kinetic energy and introduce the Lagrangian and the quantum of action. And then we introduce zero charge, positive and negative, magnetism and the vector potential, and give a final brief summary of QED. Then quantum chromodynamics. The mad explorer, crashing through the bush, hoping to discover something worthwhile and get through. Wojciech Hubert Zurek (2008): Quantum origin of quantum jumps: breaking of unitary symmetry induced by information transfer and the transition from quantum to classical
16: Quantum amplitudes and logical processes are invisible
The interiors of fundamental particles are invisible, but we can see inside the baryons although we cannot isolate the quarks and gluons. We see these invisible features as the Turing machines that give the particles their properties, i.e. mediate their output given a certain input. They are invisible because they are fully taken up with maintaining the integrity of the particle of which they are part.
page 113: 'Unfortunately it was found in the 1930s that the higher order corrections in the series for e and m are all infinite due to integrations over momentum that diverge in the large momentum (or small distance) limit' [maybe an artefact of the fact that the relevant distance is not the distance in Minkowski space but the "distance" in Hilbert space where the action is].
page 113: '. . . the natural scale of QED is the Compton wavelength of the electron, 10⁻¹¹' [or zero if we accept the idea (page 113) that the energy of an electron is infinite, part of the overall problem arising from using the real numbers to describe the quantized world].
What we are thinking is that every additional quantum of action adds a dimension, that is an oriented complex plane, to the Hilbert space, and we would like to correlate these with Turing machines of increasing complexity which, through superpositions, feed into more and more complex features of Minkowski space.
The machinery of quantum transformation through time is represented by the energy (or Hamiltonian) operator. This operator creates a time sequence of mappings of a Hilbert space onto itself. This mapping fulfills the hypotheses of mathematical fixed point theorems like that found by Brouwer: any continuous function f mapping a compact convex set into itself has a point x₀ such that f(x₀) = x₀. The results of quantum observations may be understood as the fixed points predicted both by this theorem in general and by the more specific theorems of quantum mechanics like the eigenvalue equation. Brouwer fixed point theorem - Wikipedia
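A concrete instance (illustration only, assuming numpy): a Hermitian operator maps state space onto itself, and its eigenvectors span exactly the rays the mapping leaves fixed:

```python
import numpy as np

rng = np.random.default_rng(6)

# A Hermitian "observable" maps Hilbert space onto itself
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
H = (A + A.conj().T) / 2

vals, vecs = np.linalg.eigh(H)

# Each eigenvector spans a ray the mapping leaves fixed: H|ψ> = λ|ψ>
for lam, psi in zip(vals, vecs.T):
    assert np.allclose(H @ psi, lam * psi)

print(np.round(vals, 3))   # the real eigenvalues: the observable fixed points
```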
The evolution of quantum wave functions is invisible. We see only the fixed points revealed by "measurement". We presume here that measurement is not something specific to physicists, but that elements of the Universe represented in Hilbert space continually interact with one another, that is measure one another, and so communicate with one another through the exchange of particles. The creation and annihilation of particles is a reflection of the evolution of wave functions and also controls this evolution, so that we consider the two processes to be duals of one another, carrying the same information in different forms across the boundary between the quantum and classical worlds.
A similar level of uncertainty exists at all other scales, greater and smaller than the human individual. One of the most surprising discoveries of twentieth century physics is the uncertainty principle, which holds at the quantum mechanical level in the physical network, our most fundamental theory of the Universe. The Born Rule is a quantum mechanical representation of this uncertainty. Born rule - Wikipedia (link ref above)
Until the advent of quantum mechanics, physicists were generally inclined to believe that the world was deterministic. They still attribute determinism to the invisible process that underlies quantum observations, but they now have to accept that even though this process may be deterministic, it does not determine the actual outcome of events, but rather the relative frequencies of the spectrum of outcomes that may result from a particular event. Laplace's demon - Wikipedia
Invisibility can arise from three sources. The first is limited resolution. When we measure the diagonal of a unit square with a relatively precise instrument like a micrometer, we might see that it is 1.4142 units, an approximation to √2. With a ruler graduated in whole units, on the other hand, the best we can say is that the length is somewhere between 1 and 2. What we can and cannot see depends on the instrument we use. Measurement uncertainty - Wikipedia
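The same point can be put in a few lines of Python (a sketch; the two graduation values are illustrative only):

import math

diagonal = math.sqrt(2)   # true length of the diagonal of a unit square

def measure(length, resolution):
    # Report the reading to the nearest graduation of the instrument.
    return round(length / resolution) * resolution

print(f"micrometer (resolution 0.0001): {measure(diagonal, 0.0001):.4f}")  # 1.4142
print(f"ruler (resolution 1):           {measure(diagonal, 1.0):.0f}")     # 1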
We know that the physical Universe is graduated or pixellated in units of Planck's constant, the measure of the smallest action, the minimum possible event. This is the limit of the resolution at which we can study action in the world; below this limit details are invisible, so that the prediction of the nature and occurrence of particular actions is uncertain. Planck constant - Wikipedia
What is not uncertain is the exact nature of the action, because it is coupled to Planck's constant. When an electron moves from one orbital to another in an atom it emits or absorbs a photon with one quantum of angular momentum, and the electron involved changes its orbital angular momentum by one unit in the opposite direction, because angular momentum, that is action, is conserved. Using quantum electrodynamics, we can sometimes compute the energy change associated with this transition to many decimal places, and there is no reason to suspect that it is not an exact constant of nature. It may be that in nature values like the Planck constant are implemented with unlimited precision. This is why we can construct atomic clocks accurate to one second in the age of the universe. Photon - Wikipedia, W. F. McGrew et al: Atomic clock performance enabling geodesy below the centimetre level (link ref above)
The second source of invisibility is that what we are looking at must cooperate in the process of being seen. The evolution of the wave function is invisible because it can only be seen if it is communicated, and the communication process also requires computation to encode and decode the message. If a computer were to stop to explain itself after every operation, it would be required to explain the operations of explaining itself and so would not be able to make any progress on the original computation. For this reason we can only see the results of halted processes.
There is, therefore, a logical limit to our knowledge of the world which is implicit in the scientific method of hypothesis and testing. We make hypotheses about what is happening in regions where we cannot see, and test them by exploring their consequences to see if they correctly predict phenomena that we can see. A successful test does not necessarily guarantee a correct hypothesis but a failed test tells us that the hypothesis must be revised.
The fixed points of a quantum dynamic system are revealed by the eigenvalue equation: MΨ = mΨ, where m is an eigenvalue corresponding to an eigenvector Ψ of the operator M, often called the measurement operator. The measurement operator models the extraction of information from the quantum system measured. Eigenvalues and eigenvectors - Wikipedia, Jim Branson: Eigenvalue Equations
It took physicists nearly thirty years, from 1900 to the late 1920s, to bring quantum mechanics to its definitive form. An important step forward was made by Heisenberg who pointed out that our only task is to explain the observable phenomena. We need not be bound by the classical historical picture of the world but are free to explore all possibilities to find satisfactory explanations. Werner Heisenberg: Quantum-theoretical re-interpretation of kinematic and mechanical relations
Why can't we see the mechanism that yields these results? Here we are proposing that the Universe is digital 'to the core'. We understand this by analogy with computer networks like the internet, where we find an explanation for the invisibility of processes and the visibility of their results. We assume that the observable fixed points in the Universe are the states of halted computers and that the invisible dynamics of the Universe are executed by invisible computers. We suspect the presence of deterministic digital computers because of the precision with which nature determines the eigenvalues of various observations. Bastin & Kilmister (1995): Combinatorial Physics
The third source of invisibility is symmetry. A snowflake is symmetrical, with six identical 'arms'. Because they are identical we cannot tell which is which. If we look away and someone turns the snowflake around, we have no way of telling how far it was turned, or whether it was turned at all.
Traditional theology holds that God is completely mysterious to us and beyond our ken.
Having established the existence of something, the next question is how it exists, in order that we learn its nature. But because we are unable to know the nature of God, but only what God is not, we are not able to study how God exists, but rather how God does not exist. . . . We can show how God does not exist by removing from him inappropriate features such as composition, motion and other similar things. . . . Thomas Aquinas, Summa, I, 3: Introduction, Thomas Aquinas, Summa I, 3 Proemium (Latin)
This is the famous via negativa.
Symmetries are situations where nothing observable happens. They are the practical boundaries of the dynamic Universe. We may picture this to a degree by imagining the string of a piano or guitar. When struck, the string vibrates at every point except at the ends, which are held still by the structure of the instrument, and the nodes, which are fixed by the symmetrical motion of the overtones. Symmetry - Wikipedia
When we consider the Universe as divine, we can imagine the symmetries discovered by physics as the boundaries of the divinity. From a logical point of view, the dynamics of the Universe is consistent. The boundaries of the dynamics are the points beyond which it would become inconsistent, that is, non-existent.
All our experience is experience of God, and all our experiences are in effect measurements of God, that is events that we see as fixed points in the divine dynamics. We can learn a lot more about the natural God than the Christians can learn about their god. The natural God is only partially visible, but since we are continually in contact with it, we have a good chance of learning how it works. We can know nothing about the completely invisible Christian God but what the Christian Churches choose to tell us. Nevertheless true knowledge of God is necessary for survival.
17: Energy bifurcates into potential and kinetic, enabling a zero energy universe.
E = mc²
18: Some models: transfinite classical and imaginary computer networks
Cantor's theory of transfinite numbers may be used to describe a transfinite computer network. The key to this description is the fact that the cardinal of the set of Turing machines is the same as the cardinal of the set of natural numbers, so that we can establish a one-to-one correspondence between the two sets. Here we concentrate on the development of the second transfinite cardinal ℵ1 from the first, ℵ0. Since, as Cantor explained, the generation of the transfinite cardinals is a recursive process driven by a "unitary law", this first step is easily extended to the generation of all subsequent transfinite numbers.
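The required one-to-one correspondence can be illustrated with the Cantor pairing function, the standard device for matching pairs of natural numbers (and hence any countable, indexed family such as the Turing machines) with the natural numbers themselves. A minimal sketch in Python:

import math

def pair(x, y):
    # Cantor pairing: a bijection from N x N to N, walking the diagonals.
    return (x + y) * (x + y + 1) // 2 + y

def unpair(z):
    # Inverse of the Cantor pairing function.
    w = (math.isqrt(8 * z + 1) - 1) // 2
    t = w * (w + 1) // 2
    y = z - t
    return (w - y, y)

for z in range(6):
    print(z, "<->", unpair(z))
assert all(pair(*unpair(z)) == z for z in range(10000))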
To each sequence of natural numbers in the set of permutations there corresponds a sequence of computers. Aristotle studied space and provided a definition of continuity in his Physics which supposes that two lengths are continuous if their ends overlap (network property b above). This definition is reminiscent of Aristotle’s syllogistic logic in which the premises leading to a conclusion overlap by a middle term. This suggests the name logical continuity.
Here I use the concept of logical continuity to apply Cantor's insights to the structure of the universe, using a transfinite computer network as a backbone for the idea that the universe is a divine mind.
Each of the systems outlined above contains an invisible isolated process: Aristotle's god enjoys the pleasure of thinking about itself [1072b15] while remaining completely independent of the world; to communicate with an inertial system would be to exert a force upon it so that it is no longer inertial; the initial singularity is isolated since it contains the Universe, outside which there is, by definition, nothing; according to the theory, isolated quantum systems cannot be observed without changing them; and one cannot observe the internal state of a Turing machine without breaking into its process and so destroying its determinism.
Here we propose that motions of these isolated systems are equivalent to Turing machines and therefore isomorphic to one another despite their differences in definition and their historical cultural roles. This proposition is based on the premise that the motion of a universal Turing machine embraces all computable functions, that is all classically observable transformations.
19: Space-time is the operating system of the universe
20: Dynamics, creation, fixed points and particles
In modern physics the deepest layers are understood to be one or more vacua, which are often explained as seething seas of energy perpetually creating and annihilating real and virtual fundamental particles. These vacua are the source of a serious problem. Calculations using the standard theory suggest that their energy density should be approximately one hundred orders of magnitude greater than that observed. No other physical theory has ever been so wrong! It might be claimed that the vacua are only virtual, but we also believe that gravitation sees energy in all its forms, so the calculated energy density of the vacuum would make the universe very different from the place we inhabit. Cosmological constant problem - Wikipedia
Frank Wilczek, one of the developers of quantum chromodynamics, has popularized his views in a book trying to explain what is going on in the depths of the universe. He proposes a new version of the classical aether (now called condensate) which many thought to have been slain by special relativity. His story seems plausible until we come to page 109, where he lists a few numbers suggesting that the condensate is denser than we actually measure by factors ranging from 10^44 to infinity. If the universe is to be divine, physics and theology must be mutually consistent. On the one hand Christian theology, with its angry and murderous God, serpents, demons and sin, is quite incredible; on the other we have the equally absurd dreams of highly respected physicists like Wilczek. It is clear that some revision is necessary in both fields. Standard model - Wikipedia, Frank Wilczek (2004): Nobel Lecture: Asymptotic Freedom: from Paradox to Paradigm, Wilczek (2008): The Lightness of Being: Mass, Ether, and the Unification of Forces
It may be that the approach outlined above sheds some light on the cosmological constant problem since the creation of new states is constrained by the no-cloning requirement that they all be distinct in a system that as yet does not allow the spatial distinction that we associate with fermions.
The culprit seems to be the zero point energy E = ½ℏω at very high frequencies ω. Wilczek explains: 'This so called zero point motion is a consequence of the uncertainty principle.' Obviously, given the cosmological constant problem, zero point energy as understood by standard theory needs correction. This revision might be consistent with the independence of Hilbert and Minkowski space discussed above. How does the uncertainty principle relate to Hilbert space? Frank Wilczek (1999): Quantum Field Theory (page 3)
In spacetime we have the uncertainty relations ΔE·Δt ≥ ℏ/2 and Δx·Δp ≥ ℏ/2, which relate the non-commuting pairs energy and time, momentum and position. It might be argued that the momentum-position relation would be violated by a massive particle sitting motionless at the bottom of a potential well, its momentum and spatial uncertainty thus both being zero. If we are to honour the uncertainty relation above, this situation is impossible. This relation may be used, for instance, to explain why an electron does not fall to the bottom of the potential well created by the proton in a hydrogen atom. We assume that the spatial relationship between the electron and the proton minimizes the sum of the potential and kinetic energy of the electron. Quantum mechanics explains that an electron confined close to the nucleus would necessarily have a large kinetic energy, so that the minimum total energy (kinetic plus potential) actually occurs at some positive separation. In this spacetime case, where energy is coupled to momentum, zero-point energy is essential for atomic stability. Zero-point energy - Wikipedia
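This minimization can be checked numerically. The sketch below (Python with numpy) uses the textbook order-of-magnitude estimate of the confinement energy, ℏ²/2mr², rather than a full solution of the Schrödinger equation; even so, the minimum of the total energy lands at the Bohr radius rather than at r = 0:

import numpy as np

hbar = 1.054571817e-34    # J s
m_e  = 9.1093837015e-31   # kg
e    = 1.602176634e-19    # C
eps0 = 8.8541878128e-12   # F/m

def total_energy(r):
    kinetic   = hbar**2 / (2 * m_e * r**2)       # confinement (zero-point) energy
    potential = -e**2 / (4 * np.pi * eps0 * r)   # Coulomb attraction
    return kinetic + potential

r = np.linspace(1e-12, 5e-10, 100_000)
r_min = r[np.argmin(total_energy(r))]
bohr = 4 * np.pi * eps0 * hbar**2 / (m_e * e**2)

print(f"numerical minimum: {r_min:.3e} m")      # ~5.29e-11 m
print(f"Bohr radius:       {bohr:.3e} m")
print(f"minimum energy:    {total_energy(r_min) / e:.2f} eV")  # ~ -13.6 eV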
The quantum of action is a very precisely defined natural constant which has recently been co-opted to place our system of physical measurements on a completely natural footing. We need no longer depend on a metallic object to define the kilogram, the unit of mass. In Hilbert space, the quantum of action is a measure of the distance between orthogonal base states whose inner product is zero. The transition from one state to another requires one quantum of action. It is, in effect, the opposite of the continuum. A continuum is a state where nothing happens, the foundation of symmetry. A quantum of action, as defined above, changes some p into some not-p; it is the generator of orthogonality and distinction (§8).
The current universe within the singularity emerges as a series of differentiations, the first being the differentiation of action into energy and time, bringing us to the fundamental equation of quantum theory, ℏ = E/(2πω), where ω is a measure of frequency, that is, inverse time. We follow this process up to the emergence of spacetime, guided by the set of principles listed at the beginning of this essay.
We have noted that the action of action is to create action, a process analogous to the traditional procession of God the Son from God the Father. The principle of requisite variety (§3.12 above) suggests that, because of its absolute simplicity, the initial singularity has no control over its action; that is, the procession of action is a random event insofar as the interval between events is unconstrained, although the outcome of each event, a new quantum of action, is precisely defined.
This is a general feature of quantum mechanics: the eigenvalue equation defines quantum states precisely, while the Born rule provides only a statistical measure of their occurrence. From a statistical point of view, the frequency of these random events is a measure of energy. We may imagine that the local interval (Δt) between events is a measure of local energy (ΔE), consistent with the "uncertainty" relation ΔE.Δt ≡ ℏ.
What are the fixed points in the initial singularity? The quantum theory of measurement provides a clue. A "measurement" is in fact a contact between two quantum systems, one the measurer, the other the measured. Even though some understand the act of measurement as the contact between a classical system (or a physicist) and a quantum system, here we understand that all systems in the universe are quantum systems.
Quantum measurement involves two equations. The first is the eigenvalue equation, which predicts that a measurement will yield one of a spectrum of possibilities corresponding to the fixed eigenvalues of the eigenvectors of the measurement operator.
The second is the Born rule, which predicts the probability of each of these results (§10). In a laboratory situation experimentalists may take some care to devise measurement systems that provide the answers that they want, but these answers are still subject to probabilistic distribution. In the case of the initial singularity, whose simplicity precludes control (§3, principle 12), we imagine that the system itself solves the eigenvalue equation by the random meeting of two states that share an eigenvector. This event may be exceedingly rare, but the theory suggests that the result will be a physically observable particle in a certain state. We have here a foundation for the evolution of the "particle zoo" which has revealed itself to the physics community over the last century or so. Particles are selected by the "mating" of randomly generated quantum states that happen to share stationary points, an amplitude φ such that |φ|² yields the probability of the emergence of an observable particle. Particle Data Group - U of California
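The division of labour between the two equations can be simulated directly. In this sketch (Python with numpy; the three-component state is an arbitrary example) the spectrum of possible outcomes is fixed in advance, while individual outcomes are random with relative frequencies given by |φ|²:

import numpy as np

rng = np.random.default_rng()

state = np.array([3 + 4j, 2 - 1j, 1 + 0j])   # an arbitrary state vector
state = state / np.linalg.norm(state)        # normalize

born_probabilities = np.abs(state) ** 2      # Born rule: p_i = |phi_i|^2
outcomes = rng.choice(len(state), size=100_000, p=born_probabilities)

observed = np.bincount(outcomes) / outcomes.size
print("Born-rule probabilities:", np.round(born_probabilities, 4))
print("observed frequencies:   ", np.round(observed, 4))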
Quantum theory envisages a half formed and unconstrained reality in perpetual motion which, like the primordial quantum of action, provides unlimited variety by contact superposition, a process analogous to Cantor's generation of transfinite numbers. Selection occurs by 'measurement', when quantum states meet, communicate and define one another, analogous to a roulette wheel stopping and the ball falling into a stationary slot. Roulette - Wikipedia
21: Quantum electrodynamics: QED
22: Quantum chromodynamics: QCD
23: Methodology: the mathematical community
Because the network model is symmetrical with respect to complexity we can use it to move back and forth from the featureless divine initial singularity to the enormously complex world of daily experience. From an abstract point of view a community is a communication network built from a threesome like the Trinity, the Father, the Son and the Spirit who binds them. We are enmeshed in a web of networks beginning with our relationships to ourselves and expanding through individual personal relationships to the huge array of long distance networks established by travel, postal services and telecommunications.
Following Hilbert, we understand mathematics to be a formal symbolic game whose only constraints are that it is interesting and consistent. The static formal mathematical relationships from which such models are constructed are brought to life in the minds of the mathematical community. The history of this community has been documented for about 5000 years, and we can see its footprints in older artefacts. George Gheverghese Joseph (2010): The Crest of the Peacock: Non-European Roots of Mathematics
We imagine that the creative force that drives mathematics to grow is the same as the creative force that drives the universe to create itself. Mathematics is cumulative, each new discovery or invention building on what is already known. Its formal structures, if correctly proven, do not decay and need no maintenance, apart from copying texts that deteriorate and the birth and education of new members of the mathematical community to carry on the work. Of course much may have been lost in times of disaster.

The source of this force is entropy, which we may see as the attraction of the future embodied in imagination. In a nutshell, wide open spaces are attractive. We are attracted to the future because of the second law of thermodynamics: entropy (that is, complexity) generally increases. Georg Cantor's theory of transfinite numbers shows us how big a wide open space can become. Transfinite numbers - Wikipedia
Throughout recorded history, mathematics has been an important tool not only for understanding our world but also for dealing with social issues like fair trading, distributive justice and the management of democracy. It begins with counting and arithmetic, accounting for discrete classes of objects like coins and sheep. It extends to the measurement of continuous quantities like land and fabric which inspired geometry. The relationship between physics and mathematics was sealed in Galileo's time when he claimed that mathematics was the language of the universe. Newton took a great step forward by inventing calculus to describe the motion of the Solar System. Gauss and Riemann extended calculus to define differentiable manifolds, which became the mathematical foundation upon which Einstein built the general theory of relativity and opened the way to a picture of the universe as a whole. Differentiable manifold - Wikipedia
Calculus has since become the backbone of mathematical physics. Weyl notes that it provides an exceedingly fruitful mathematical device of making problems "linear" by reverting to infinitely small quantities. If we take quantum theory seriously, the smallest practical quantity in the universe is the quantum of action. A recurrent theme in this essay is the idea that the use of continuous mathematics in quantum field theory could be the source of many of its problems. Hermann Weyl (1985): Space Time Matter
Newtonian physics ruled the world until the middle of the nineteenth century when electricity and magnetism opened up a new field of study. Maxwell's application of calculus showed that light is a form of electromagnetic radiation. About the same time spectroscopists were discovering the close relationships between light and matter and laying the foundations for quantum theory, which began to lead us down into the microscopic inner mechanisms of the universe.
An important development in twentieth century mathematics was the attempt by Whitehead and Russell to express mathematics in purely logical terms. Their approach laid the foundation for Gödel's discovery that formally consistent mathematics is incomplete. Turing's invention of the programmable computer and subsequent engineering developments helped to implement Whitehead and Russell's idea by showing that logic and arithmetic meet naturally at the level of binary arithmetic and binary logic. Whitehead & Russell (1910, 1962): Principia Mathematica
Here I wish to draw a methodological analogy between the mathematical community and the current state of quantum theory, based on the idea that the network model has a foot in both camps. The people in the community are structural particles and the sources of messages, the fermions. The space-time web of communications, conversations, conferences and the literature that connects the players are the bosons, the messengers, the public potential that motivates mathematicians. The interaction of fermions through bosons binds the community into a functioning whole. By being part of our own communities, we may get some feeling for how the system of the world works. Members of any community feel the community "field" through communication.
The Platonic view of mathematics is that its theorems have some sort of independent existence outside the human world so that mathematicians producing new proofs are not so much creating something completely new as discovering something previously unknown. Where these theorems come from we do not know, but in the Platonic world we may see them as ideas in the mind of an eternal God. The common Christian view is that the Universe was intelligently designed and created by an omniscient and omnipotent God. Here I like to think that the discovery of mathematical theorems, initially non-existent, is analogous to the world creating itself from the initial singularity by an evolutionary process of unlimited trial, occasional success and frequent failures. Sometimes we have waited millennia for particular mathematical discoveries like the complex numbers to emerge. Complex number - Wikipedia
We may say that the observable output of the mathematical community is published theorems. All the flow of education, exploration and discussion that leads to the theorems is from this point of view invisible, the symmetry which is eventually broken by the emergence of a theorem. We see a similar phenomenon in our parliaments. Their practical output is legislation. Behind the legislation is an invisible mountain of talk and dealings that feed the meat into legislative sausages, the dynamics behind the fixed points that appear as written laws, the political equivalent of theorems.
The mathematical field in the mathematical community exists in the minds of the mathematicians and their communications. How do we model this field as a local gauge symmetry? One point to note is that, given all the human languages and arbitrary symbolic systems that may be involved, each core mathematical idea has a large space of representations which can be translated or transformed into one another. Here I will assume, by analogy with the mathematical community, that the information attributed to fields is stored in or represented by particles. I presume that an electron, like a mathematician, has a personality that guides its interaction with other particles. Further, I assume that every particle is a dynamic sub-network of the universe represented by an integral number of atomic quanta of action.

The classical computer network model models reality in that each local component executes its operations in Minkowski space-time driven by inputs from other local components. These all operate on the local substratum of quantum process. We might say that a classical computation is pixellated by the work of the transistors, capacitors and resistors connected by classical communication channels, all of whose behaviour is explained by quantum mechanics. We can see this as a model of a classical world pixellated by fundamental particles whose underlying interactions we try to model by quantum field theory. A computer network is in effect a large scale model of a network of atoms.
24: Some principles
1: Nothing comes from nothing, which suggests that the world, or its source, must be eternal.
2. Everything comes from action.
This is the fundamental principle of this essay, derived from the fundamental Christian definition of God: actus purus derived by Aquinas from the work of Aristotle and endorsed by the Roman Catholic Church.
This lays the foundation for an evolutionary process. Because the initial singularity is without structure, cybernetics tells us that it has no control and so is able to try anything. The only constraint on this freedom is the physical implementation of the logical principle that it is impossible for a contradiction to exist.
All that follows in this essay is my own exploration of how this fundamental principle is worked out in physics to create the world we know.
3. Energy is an immediate consequence of action (section 9).
4. Everything is made of energy [section 15]
5. Imaginary systems are independent of real systems (section 13).
6. All systems may be modelled as trees, that is, as layered networks.
7. Imaginary trees are modelled by quantum networks which are in perpetual motion.
8. Real trees are an immediate consequence of interactions in invisible imaginary networks.
9. We model interactions in imaginary networks as logical processes implemented by the contacts between imaginary states, which we call observations.
2: The inevitable and irresistible passage of time represents the perpetual motion at the heart of the universe. Aristotle defined time as the number of motion with respect to before and after. We map time to numbers with clocks. Clocks based on quantized changes in atomic energy are our most precise scientific instruments. W. F. McGrew et al: Atomic clock performance enabling geodesy below the centimetre level
3: We can model the universe as a communication network whose lowest "hardware" layer is the initial singularity, the creator. Theology is the science of God, traditionally understood to be love. Love comprises communication and bonding, which I approximate with a computer network. In such networks more complex layers are built using the services provided by the simpler layers beneath them. Practical networks comprise many software layers built on physical layers that may themselves involve many layers of complexity. Tanenbaum (1996): Computer Networks, Computer network - Wikipedia
4: Communication requires contact. Isaac Newton was forced by circumstances to admit that gravitation was some sort of "action at a distance", which we understand to be impossible. Quantum entanglement led Einstein to imagine "spooky action at a distance", but we shall see in §11 that this happens because quantum mechanics occupies a world where there is no space or spatial distance. In the space-time world contact is maintained by particles moving at the speed of light which follow "null geodesics" whose beginnings and ends coincide in spacetime (see §16). Geodesics in general relativity - Wikipedia
5: Symmetry with respect to complexity. We may model the generation of complexity from simplicity using the representation of the transfinite numbers constructed by Cantor. Each new representation is constructed recursively using permutations and combinations of the elements of the previous representation. Cantor writes: 'out of ℵ0 proceeds, by a definite law, the next greater cardinal number ℵ1, out of this by the same law the next greater ℵ2, and so on.' Georg Cantor (1897, 1955): Contributions to the Founding of the Theory of Transfinite Numbers, page 109
6: Respect for our history requires that we construct our new pictures on a foundation of the old. We see the genesis of science as analogous to the genesis of the universe, a gradual progression from feeling, music, poetry and mythology toward evidence-based and economically valuable knowledge about how the world works. Our lives have been facilitated enormously through the provision of energy created by physics, health care and food production based on biology, and peace and cooperation through psychology and politics. In the time of Galileo empirical science began to gain ascendancy over dogmatic theology, but ancient theology has yet to escape the influence of politics and flourishes in many religious institutions. In this essay I wish to extend theology toward the realm of science. This opens the way for theology to bring the benefits of empirical knowledge of divinity into every aspect of our lives.
7: Given that the universe is divine and visible to everybody, theology can become a real science. Following Aristotle, Aquinas perceived theology as a deductive process working from obvious principles (per se nota) and the articles of faith developed by the Christian Churches through their imaginative interpretations of the Bible. The modern view of science includes careful testing and requires that our deductive insights are consistent with the Universe we observe. Aquinas: Summa I, 1, 2: Is sacred doctrine a science?, Fortun & Bernstein (1998): Muddling Through: Pursuing Science and Truths in the Twenty-First Century
8: God reveals themself through every human experience. Traditional Christianity understands revelation as the content of the Bible. The Bible is held to be the work of writers inspired by God to explain matters which are invisible to us. In a divine universe, every experience may be interpreted as revelation of the divinity both within and around ourselves, a vast trove of information exceeding all the literature of the world.
9: We identify the initial singularity predicted by relativity with the classical God. Like this God, the initial singularity is the source of unbounded action. Penrose, Hawking and Ellis understand Einstein's general theory of relativity to imply the existence of a structureless initial singularity as the source of the Universe. Astronomical observation of black holes provides evidence for the existence of such singularities. Hawking and Ellis speculate that the Universe originated as a "time reversed" black hole. The initial singularity and the traditional model of God developed by Aquinas share three properties: they exist; they are completely without structure; and they are the source of the universe. Aquinas maintains that despite their simplicity God possesses a complete plan for the creation of the universe. Here, following modern ideas of the representation of information, I identify the divine mind with the actual universe. We can say no more about the origin of the initial singularity than we can say about the traditional God, and so we assume the traditional position that both are eternal.
10: Given its simple origin, interpretation of our observations of the world may be guided by a 'heuristic of simplicity'. Modern physics attempts to discern the structure of the Universe by observing its current enormously complex state. Quantum entanglement arising from the initial state suggests that every event is influenced to some degree by every other event, so that precise computation of physical parameters requires the superposition of an unlimited number of state vectors. The layered network model, guided by the initial simplicity of the universe, may provide a clearer view of the basic structures established by quantum mechanics and relativity which persist in the present as observable symmetries.
11: The power of creation is limited only by physical instances of the logical principle of contradiction. Discussing the power of God, Aquinas pointed out that God is limited only by their inability to create logical contradictions, such as that Socrates should be both sitting and standing at the same moment. In a model of God, this condition places a first level of constraint on the emergence of a Universe within the initial singularity, which may thus be understood to house only locally consistent structures. Structures breaking this constraint cannot exist permanently. Nevertheless the uncertainty introduced by the quantized structure of the universe (at all scales) seems to allow temporary excursions into undefined situations which may enable the creative power of the world. From this point of view momentum, like jumping, can bridge otherwise impossible gaps. Aquinas, Summa I, 25, 3: Is God omnipotent?
12: Logically consistent mathematics leads, via Gödel's theorem, to incompleteness and to the cybernetic principle of requisite variety. Following Galileo, mathematical modelling has become a primary tool of modern physics. Mathematical insights and methods have progressed well beyond what was available to Galileo. Aquinas believed that an omniscient and omnipotent God has total deterministic control of every event in the world. Gödel found that logically consistent formal systems are not completely determined, and Chaitin interpreted Gödel's work as an expression of the limits to control known as the cybernetic principle of requisite variety. This principle suggests that a completely structureless initial singularity has no power to control its behaviour, so that insofar as it acts, it acts at random. Gregory J. Chaitin (1982): Gödel's Theorem and Information, W Ross Ashby (1964): An Introduction to Cybernetics
13: An entropic force attracts the emergent universe toward increasing complexity. The general theory of relativity shows that the expanding Universe is driven by a strong tendency to create space. This is consistent with the second law of thermodynamics, which tells us that on the whole entropy does not decrease, meaning that the Universe has a tendency to increase its degrees of freedom. We might see this reflected in human aspirations for freedom, and we can see a mathematical foundation for this tendency in Cantor's theory of transfinite numbers. Like gravitation and the traditional God, freedom and increasing entropy are fundamentally attractive, acting like the old Aristotelian final cause. Andrea Falcon (Stanford Encyclopedia of Philosophy): Aristotle on Causality, Cantor's theorem - Wikipedia
14: Zero sum complexification. It may be imagined that conservation of energy requires that all the energy in the current universe was present in the initial singularity to power the big bang. An alternative view is that the total energy of the universe is zero, the potential energy carried by gravitons and other bosons being exactly equal and opposite to the energy of the fermions that emerge as discrete entities within the initial quantum of action. The idea here is that each step in the complexification of the universe passes through a phase of uncertainty to create new entities that add up to nothing, like potential and kinetic energy or positive and negative charge. Zero-energy universe - Wikipedia
15: Information and logical operations are physical entities. Aristotle and Aquinas recognized two principles of being, matter and form. Form or idea was a key concept in Plato's picture of the world. Aristotle brought Plato's forms down to Earth to devise an explanation for change: change occurs when an element of matter assumes a new form. Matter thus constrains form to create a particular object. Quantum mechanics envisages an invisible world of quantum states analogous to Plato's forms. Physical particles are created when quantum states interact with one another in the process called (in laboratory situations) measurement. These particles are physical representations of the quantum processes that create them.
16: Only fixed points are observable. We cannot see photons because they have no rest frame. Their existence between their creation and their annihilation is invisible, and they are in effect a quantum mechanical entity whose existence is purely speculative. On the other hand, we can observe massive particles because they have rest frames. We may illustrate this with a classical analogy. While the roulette wheel is spinning, we cannot see which sector the ball occupies.
Only when the wheel comes to rest can we observe the number the ball has chosen. This principle establishes the difference between the quantum and classical worlds. The quantum world is invisible because it is in perpetual motion. The classical world represents the fixed points of the quantum world, many of which feature in our everyday experience, like houses and trees whose components, like atoms and molecules, are also massive and observable with suitable instruments.
25: Some conclusions
18: PCT and all that
The emergence of space-time introduces new degrees of freedom and new constraints into the system of the universe. The Minkowski metric and the Lorentz transformation are interpreted as consequences of the special principle of relativity: all observers in inertial frames of reference see the same physical symmetries, particularly the same velocity of light. Einstein arrived at this conclusion by studying Maxwell's equations. As we have seen above, the structure of space is intimately connected to the velocity of light, and the velocity of light is intimately connected to the electromagnetic properties of space-time. Special relativity - Wikipedia, Maxwell's equations - Wikipedia
As Einstein discovered, gravitation is the foundation of the large scale structure of the universe, controlling its structure from the first instant until (probably) the last. The very small scale structure of the universe is controlled by the strong and weak forces, but the intermediate structure, including human physiology, is controlled by electromagnetism. The foundation of electromagnetism is electric charge, which is quantized and comes in two equal and opposite varieties, positive and negative.
Einstein felt that gravitation is a logically complete phenomenon and that there is really only one possible gravitational field equation. A similar completeness is to be found in the world of electromagnetism, closely related to both the velocity of light and the structure of space, since magnetism is a relativistic effect of moving electric charge. It is not surprising that both Einstein and Weyl sought (in vain) for a unification of gravitation and electromagnetism. Is this possible if we consider Hilbert and Minkowski spaces as independent degrees of freedom? Hermann Weyl (1985): Space Time Matter (link ref §7)
The electrical force is roughly 10^40 times stronger than gravity, but we rarely feel it because positive and negative charges are exquisitely balanced on a microscopic scale, so that the world is mostly electrically neutral. This raises a question: is electromagnetism strong, or is gravitation weak? I will assume here that gravitation is weak and that the characteristic strength of the electrical force is closely connected to quantum contact interactions such as we find between the electrical and magnetic phases of the quantum harmonic oscillator which Einstein showed to be a physical particle, the photon.
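The figure of roughly 10^40 can be checked directly from the physical constants; the ratio of the Coulomb force to the gravitational force between an electron and a proton is independent of their separation. A minimal sketch in Python:

G   = 6.67430e-11        # gravitational constant, m^3 kg^-1 s^-2
k_e = 8.9875517923e9     # Coulomb constant, N m^2 C^-2
e   = 1.602176634e-19    # elementary charge, C
m_e = 9.1093837015e-31   # electron mass, kg
m_p = 1.67262192369e-27  # proton mass, kg

# The separation r cancels: (k e^2 / r^2) / (G m_e m_p / r^2)
ratio = (k_e * e**2) / (G * m_e * m_p)
print(f"F_electric / F_gravity = {ratio:.2e}")   # ~2.3e39, of the order of 10^40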
From a relativistic point of view, photons are outside space and time, since the Lorentz transformation shows that an observer would see a particle travelling at c to have zero spatial extension and zero time, the characteristics of a null geodesic, the geometric representation of massless bosons. The sources of photons are electrically charged particles, electrons and positrons, which are massive, observable fermions.
Nineteenth century physicists realized that if electromagnetic radiation is a wave in some ether, the ether must be exceedingly rigid to account for the velocity of light, and the forces between the electrical and magnetic components of the waves correspondingly high. We calculate the energy of a photon using the quantum mechanical formula E = ℏω, which suggests that the work done by each cycle of a photon is closely related to the quantum of action, which happens to have the dimensions of angular momentum.
For many, the discovery of the photon made the aether unnecessary, to be replaced by the notion of field, which some, like Auyang quoted above, see as the invisible ontological foundation of the world. Here we see the field as an abstract representation of the network of communication between fermions through the medium of bosons.
Historically, quantum field theory encountered many difficulties arising from mathematical beliefs in point particles and infinite degrees of freedom in spacetime carried across from classical physics. Feynman, Schwinger and Tomonaga solved this problem by substituting measured values of particle properties for mathematical assumptions, thereby suggesting that the infinities encountered in computations were artefacts of the theory. Richard P. Feynman: Nobel Lecture: The Development of the Space-Time View of Quantum Electrodynamics
The approach explored here is to restore the integrity of the quantum of action and give it a natural role in restoring finite integer values to the parameters of physics, leading to an understanding of the Hilbert world as a description of a communication process in perpetual motion constrained by logical consistency. This is based on the idea that a quantum of action is a physical representation of the universal logical operator not-and. Logically, a quantum of action transforms some p into some not-p which the no-cloning theorem requires to be orthogonal to the original p. This quantum mechanical process we take to be the source of spacetime, as described above.
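The universality of not-and, on which this identification rests, is easy to exhibit: negation, conjunction and disjunction can all be built from NAND alone. A minimal sketch:

def nand(p, q):
    return not (p and q)

def not_(p):
    return nand(p, p)              # NOT p  =  p NAND p

def and_(p, q):
    return not_(nand(p, q))        # AND    =  NOT(NAND)

def or_(p, q):
    return nand(not_(p), not_(q))  # OR     =  (NOT p) NAND (NOT q)

for p in (False, True):
    for q in (False, True):
        assert not_(p) == (not p)
        assert and_(p, q) == (p and q)
        assert or_(p, q) == (p or q)
print("NAND generates NOT, AND and OR")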
We explore this spacetime using the Lorentz transformation which unites the dualities known as parity (P), charge (C) and time (T) which taken together are a single symmetry, but each of which can have two states, odd and even parity, positive and negative charge, and forward and reverse time.
The description of the relativistic transformation laws in the first chapter of Streater and Wightman's book PCT, Spin, Statistics and All That is based on the Heisenberg picture which honours the requirements of special relativity by treating time and space on the same basis:
to each state of the system under consideration there corresponds a unit vector, say Φ, in a Hilbert space, H. The vector does not change with time, whereas the observables, represented by hermitian linear operators on H, in general do. The scalar product of two vectors Φ and Ψ is denoted by (Φ, Ψ), called the transition amplitude of the corresponding states.
Two vectors which differ only by multiplication by a complex number of modulus one describe the same state, because the results of physical experiments on a state described by Φ may be expressed in terms of the quantities |(Φ, Ψ)|²
which gives the probability of finding Φ if Ψ is what you have. The set of vectors e^{iα}Φ, where α varies over all the real numbers and the norm of Φ (written ||Φ|| and defined as [(Φ, Φ)]^½) is unity, is called a unit ray. . . . The preceding remarks can be summarized: states of a physical system are represented by unit rays.
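The defining property of a unit ray is easy to verify numerically: multiplying a vector by a phase e^{iα} changes the vector but leaves every transition probability unchanged. A minimal sketch (Python with numpy; the two-component vectors are arbitrary examples):

import numpy as np

rng = np.random.default_rng(0)

psi = rng.normal(size=2) + 1j * rng.normal(size=2)
phi = rng.normal(size=2) + 1j * rng.normal(size=2)
psi /= np.linalg.norm(psi)
phi /= np.linalg.norm(phi)

p = abs(np.vdot(phi, psi)) ** 2          # transition probability |(Phi, Psi)|^2
for alpha in (0.0, 0.7, np.pi):
    rotated = np.exp(1j * alpha) * phi   # same ray, different vector
    assert np.isclose(abs(np.vdot(rotated, psi)) ** 2, p)
print(f"|(Phi, Psi)|^2 = {p:.4f} for every representative of the ray")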
Although the mathematical forms of Hilbert space place little constraint on the possible vectors and operators apart from unitarity, observations in Minkowski space have led to the development of superselection rules, which are understood to mean that not every unit ray in Hilbert space represents an actual physical system. As discussed above, this suggests that Minkowski space places constraints on the physical realization of the possibilities implicit in Hilbert space, establishing a system analogous to evolution, where the survival of individual organisms places constraints on the possibilities of genetic variation, since many genomes are not viable enough to be reproduced.
In the realm of fundamental particles, viability appears to be controlled by group structures such as the Lorentz and Poincaré groups and their subgroups. The details of these constraints are to be found in the enormous literature of particle physics.
Although we can now see the speed of light as a universal constant, in the beginning we can imagine that it was an arbitrary evolutionary solution to the problem of maintaining quantum contact in a spatially extended system. As we have noted, the velocity of light and null geodesics are aspects of the same innovation. Before the advent of space quantum interactions were instantaneous, but as the discussion of Alice and Bob above explains, entanglement is not suited to transmitting defined information, which is only possible with the preparation and reception of bosons travelling at c.
19: Space-time is the operating system of the universe
Our interface with the quantum world is generally called measurement or observation, and we are inclined to think of it as a carefully designed experiment aimed at answering a particular question, like 'Does beta decay conserve parity?' This is a very narrow conception of our interface with the quantum world, since every event that occurs in the classical world, from kicking a football to baking a cake, is an input to the quantum world, which returns its answer in the form of the observable event.
Cooking a cake, for instance, requires the assembly and mixing of specific ingredients, often with specific techniques in a specific order, and the application of the right amount of heat for the right time. The outcome, if everything is done well, is a cake, as planned. The transformation from ingredients to cake is a very complex quantum mechanical event. Failures are possible if things are not done right. Beta decay - Wikipedia
A very common and easily understood interface between the classical and quantum worlds is a computer. I type on the keyboard, which initiates a complex sequence of simple binary logical operations that display my typing on the screen. Most of this transformation is performed by integrated logical circuits which contain millions of simple electronic components like transistors, resistors and capacitors. Each of these acts as an interface between the spacetime world of wires and electric currents and the quantum mechanical marshalling of the electrons which represent the logic of the computation as electric potentials. My own body works in a similar way, my mind controlling my muscles as I type, my eyes transmitting the results to my mind for further appraisal.
Quantum mechanics began life in 1900 as a physical theory, but by about 1980 people began to think of it as the computational basis for the life of the universe. Transistors, the principal functional units in modern classical computers, are quantum devices and they are designed using quantum mechanics. As components are made smaller we approach quantum limits on making them predictable and deterministic enough to be components of relatively error-free classical machines. David Deutsch (1997): The Fabric of Reality: The Science of Parallel Universes - and its Implications
The quantum world is inherently dynamic, in perpetual motion. This motion is considered to be continuous, deterministic and unitary (reversible) until it is interrupted by the interaction of two quantum states initiated by input from the classical world. Quantum mechanics has developed over the last century to understand this motion. The foundations of this picture of the world are quantum mechanical vacua, which are understood to be continuous fields of random motion from which various symmetries select the fundamental particles which construct the universe (§5.2). The vacua are seen as the ground states of the universe, the lowest possible energy states, often represented by the symbol |0⟩.
In modern physics vacua are home to two sorts of energetic excitations which represent real and virtual particles. A real particle is an energetic excitation of the vacuum like an electron or a proton which can be observed in the spacetime world. Virtual particles are not observable but are believed to exist in the quantum world. They have a sort of half reality as concentrations of energy which mediate interactions between real particles. A real photon is a particle of light which can be observed and measured. A virtual photon is a representative of the electromagnetic field that binds electrons and protons together to form the atoms of the elements of the periodic table.
Virtual particles are said to be 'off mass shell', which means that they have temporary lives made possible because quantum uncertainty enables brief evasions of the constraints of classical spacetime physics. They are in effect a sort of creative wild card in quantum field theories which enables systems to get to places they otherwise could not go. A common name for this phenomenon is 'tunnelling', which enables particles to cross barriers that are classically forbidden. The price paid for such events comes in the form of low probability. Despite the intense internal activity of a uranium-238 nucleus, measured by its mass, the probability of an alpha particle (helium nucleus) tunnelling out of this nucleus is so low that it takes 4.5 billion years for half of a mass of U-238 to decay by this route to thorium-234. On shell and off shell - Wikipedia
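The arithmetic of this improbability is simple exponential decay. A minimal sketch using the measured U-238 half-life of about 4.468 billion years:

import math

T_HALF = 4.468e9                # half-life of uranium-238, years
lam = math.log(2) / T_HALF      # decay probability per nucleus per year
print(f"decay probability per nucleus per year: {lam:.2e}")   # ~1.6e-10

def surviving_fraction(t_years):
    return 0.5 ** (t_years / T_HALF)

for t in (4.468e9, 8.936e9, 1.3404e10):
    print(f"after {t:.4g} years, {surviving_fraction(t):.3f} of the sample remains")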
A principle assumed for this essay is that only fixed points are observable (§2.15). This idea flows from the notion that the quantum world is invisible because it is in perpetual motion (§13). Mathematical fixed point theory tells us, however, that given certain conditions moving systems have fixed points, parts of the motion that do not move. These fixed points are the particles we can see in our world. Fixed points have a spectrum of lifetimes, since changes in motion lead to changes in fixed points. Protons live for a very long time, perhaps forever. Other real particles have fleeting lives, billionths of a second or less.
In the network model we understand particles not just as fixed points but as sources, entities capable of sending and receiving messages in the universal network. On this definition I am a particle with a lifetime of about a century, and you are reading some of my output right now. Particles are sources because they stand in the midst of a flow of information entering and leaving them. Food and air flow through me, sustaining my life. Information flows into me through my senses and out through my muscles.
The operating system of my computer handles input and output to the processor, connecting it to the world of data which flows through it. This data is transformed by the computational process defined by the software stored in the memory of my machine. Here, by analogy, we understand spacetime to play the role of an operating system with respect to events in the quantum world, handling input, output and memory. Operating system - Wikipedia
Fixed point theory reveals how fixed points appear in motion. The role of science is to identify fixed points in our moving world. This knowledge is valuable because the relationship between motion and fixed points works in both directions: changing motion gives us changing fixed points; changing fixed points changes motion. I manipulate my computer and my world by manipulating fixed points, seeking to achieve other fixed points through changing the invisible motions that generate fixed points. This is how I make cakes.
21: Conclusion
This essay began as a theory of peace published on 2BOB Public Radio Taree in 1987. From my point of view the downside of religion is the sanction of war. By invoking an omnipotent deity, theologians justify a "religious exemption" for the large-scale murder and genocide executed by political communities seeking hegemony. Jeffrey Nicholls (1987): A theory of Peace
History shows that the spread of religions has often been facilitated by imperial wars. In many religions, killing an unbeliever has been considered a virtuous act. The Catholic Church had its Crusades, and since that time Christian nations have been warring against one another and systematically murdering, raping and pillaging all the communities around the globe that lack the military forces to resist them. We might consider religion and mass murder as reciprocally related: wherever we see mass murder we might suspect a religious motivation, hidden perhaps by colour, race, caste, wealth, power or some other distinction between human groups.
It is unfortunate that this situation is to some extent guaranteed by the nature of evolution. Ultimately hegemony is gained by reproduction, and in zero-sum situations where survival of the fittest is played out, murder, rape and pillage are (at least temporarily) effective strategies. The message of my theory of peace was that the antidote to this terrible reality is to take ourselves out of zero-sum situations. We can do this by taking advantage of the transfinite creation of human spiritual space and the consequent creation of economic space which are facilitated by creative communication and cooperation.
This appears to be the mechanism, present in systems at all scales from fundamental particles to human communities, that has brought the universe to be from an almost infinitesimal beginning. This beginning, identical to the traditional Christian God, has unbounded power of action, limited only by local consistency.
Two ideas have made it possible for me to add some solid flesh to this abstract idea.
First, theology is the traditional theory of everything that we use to understand our position in the world. Ultimately theology is about love and bonding, that is, about the creation of networks. An important feature of networks is that they are scale invariant. Every network is built from an atomic structure which comprises two sources and a communication link between them. This structure is invariant, whether the sources are fundamental particles, people or galaxies.

Second, the scale invariance of networks means that we can identify two very common sets of events in our world which occur at very different scales. The first is the mental act of insight, which pivots between ignorance and understanding. It occurs randomly but often. At one moment we are faced by a situation we do not understand. Some time later we suddenly see what is going on. In the healing professions, for instance, a practitioner is faced by a series of symptoms. Sometimes the cause is obvious, like a broken bone. At other times considerable diagnostic effort may be required to reveal the cause.
The second set of events is quantum observations or measurements. This is the ubiquitous phenomenon studied in physical laboratories. Physicists set up situations where different pieces of matter can interact with one another. They control what is going in and observe what is coming out, and try to understand the invisible quantum mechanical processes that join output to input. Here quantum mechanics plays the role of the mind of the universe, and the work of the physicist is to discover what the universe is thinking. Michio Kaku (2021): The God Equation: The Quest for a Theory of Everything (link ref §2)
All this is discussed in some detail in this essay and provides us with glimpses of a way to understand our world as the mind of God, a foundation for a scientific theology to put our religious beliefs on a realistic basis.
I am getting old now and my days are numbered. I hand these ideas over to others to develop as a step toward world peace. I am not the first nor the last to have this ambition, but we must keep trying if we are to take advantage of the five billion years of usable sunshine that lie ahead of us. I am already wealthy enough, so I don't need to profit from my ideas. They are yours to develop so long as you approach them with scientific honesty.
Theology, the theory of everything, tells us what we are as a function of our environment. Politics, the source of collective action, seeks to determine how we should act, given what we know about ourselves. Together they represent the confluence of knowledge and power whose principal historical manifestations have been the wars that have shaped human history. Margaret MacMillan (2020): War: How Conflict Shaped Us
In the context of evolution, war is an aspect of the selection processes that sift through the possible human cultural futures at the imperial scale of politics. We have seen a history of collapsing empires from ancient times to the destruction of the British and American empires in the last few centuries. In almost every case, vast military supremacy has failed in the face of theological and religious unity. The most recent examples are the failures of the military resources of the now extinct Soviet Union and of the United States in Afghanistan.
Unfortunately power in all its forms, theological and physical, corrupts. The military aspects of both theology and physics have perverted them. Both the Catholic Church and the nuclear weapons establishment think that the raw power they wield is a good thing, to be developed without limit. In fact it is largely useless: it makes no contribution to humanity, and indeed degrades it. Christianity died when it sold out to Constantine. Physics died when it sold out to nuclear weapons. Jeff Tollefson: US achieves laser-fusion record: what it means for nuclear-weapons research
By uniting physics and theology I hope to bring them back to life by showing how creation creates peace. We are living on a benign planet in the midst of a universe of enormous violence. If we can understand how this happened, we can understand how to create peace for ourselves. The simple answer is to create space using sunshine, a boundless source of energy. Jeffrey Nicholls (July 2019): Entropy and metaethics
What is the difference between Christianity and a fairy tale? The stories are equally fantastic. The difference is a political institution asserting the truth of the Christian fairy tale, historically on pain of death, so that people accept it as real and live by it, even though there is no evidence whatever for the story as it applies to us before we are born and after we die. In effect we come from nowhere, go nowhere, and our lives are governed by fictions about these two nowheres. The scientific story begins with an initial singularity, big bang and evolution, bringing us to be in an observable context into which we are absorbed and recycled when we die. One may say that all indigenous stories follow a similar pattern, the only difference being how we deal with the here and now bracketed between the two mysteries. Grace James (1912): Green Willow & Other Japanese Fairy Tales
Since time immemorial it seems that people have imagined that invisible forces control the world. Over the same period, some people have claimed the ability to control these invisible forces and developed businesses based on their claims. Traditionally these businesses often trade under the name religion, justified by a body of theory called theology. The picture of natural theology presented here broadens this picture of agency by providing a clear picture of how we, and every other agent in the world, handle the invisible computational side of the world in order to achieve our desires. This is the central idea of both cognitive cosmology and the technologies that derive from it, covering every aspect of human culture.
Religions are a bit like Star Wars and similar movie series, which eventually expand the narrative to produce a prehistory and a posthistory of an original central story. In real life, we only have access to the central story. The prehistory and posthistory are largely invisible to us, imaginative mythology rather than imaginative science built on measured experience.
Insofar as we are guided by some reality while we are alive, we can take a scientific attitude to life and manage our fantasies of before and after in order to optimize our lives, a sort of Lagrangian approach, which forms, for me, the essence of the scientific theology of which I wish to become a doctor before I die.
Christianity is built around the life of Jesus, who spun a story about his origin as the Son of the Creator and his end sitting at the right hand of his Father, judging the living and the dead. From a personal point of view he was unlucky to fall foul of the Roman forces occupying Jerusalem, but his gruesome death added weight and credibility to his story, so that it has entrained billions over millennia. Now that I have reached a preachable story I wish to take a similar path to glory, leaving a legacy of peace without unnecessary pain, by pointing out that in a well planned life pain is not particularly necessary and is in many cases due to unscientific beliefs about reality. This is the deepest message of cognitive cosmology, my conclusion.
4: Bugs and patches I: Theology
I wish to identify the universe with the creative mind of God. Practically this requires a union of physics and theology in a cognitive cosmology. Here I identify some theological impediments to this union. §5 deals with problems on the physical side. With the decks cleared we can then proceed to construct a picture of the world. Worldview - Wikipedia
The principal question for theology is "what does it mean to be human". I was brought up in the Irish Roman Catholic tradition in a small town in Australia. I first faced this question explicitly when I started school at the age of four. The nuns got us to learn the Catechism by recitation: Q. Why did God make me? A. God made me to know Him, to love Him, and to serve Him here on Earth, and to be happy with Him for ever in heaven. Until my 40s my humanity was defined by this and other doctrines of the Catholic Church. Now approaching my 80s, I have different ideas.
What do the Taliban, the Communist Party of China and the Catholic Church have in common? The simple answer is the enforced indoctrination of children. I speak here of the Roman Catholic Church because I am a baptized member. I am also an ex-cleric well educated in its ways. One may see analogous positions in many other religious institutions. I see the chief conflicts between the Church and its human environment in the following positions:
1: The Roman Catholic Church's claim of divine right, infallibility and papal supremacy:
As well as claiming infallibility, the Pope enjoys supreme, full, immediate and universal ordinary power in the Church, which he can always freely exercise. Such power has often enabled the Church to ignore human rights and evade justice. Not only has it been responsible for widespread sexual abuse of children, it has frequently attempted to pervert the course of justice to hide these crimes. This evil has cost the Church its claim to moral and ethical superiority in the human community, and civil claims against it continue to absorb a large fraction of the funds donated by the faithful for its upkeep.
The alternative to this approach is a scientifically based theology which provides evidence based foundations for human rights, equality, the rule of law and human agency through democratic politics. John Paul II (1983): Code of Canon Law: §331: Papal Power, Kieran Tapsell (2014): Potiphar's Wife: The Vatican's Secret and Child Sexual Abuse, Papal supremacy - Wikipedia
2: The distinction between matter and spirit:
The Church depends for its livelihood on a claimed monopoly on communication with its God. Part of the cosmology that goes with this claim is that human spirits are specially created by God and placed in each child during gestation. This leads it to claim that the theory of evolution does not fully explain human nature and that neither we nor the Church are really indigenes in this world, but alien pilgrims destined for a post mortem life in a different state. The Church claims, in effect, that there are two truths: the truth of science and its own picture of the world, which it considers superior to science.
This position is not tolerable in a scientific community which seeks evidence to justify public opinion. Insofar as spirit is real, it must be observable and open to study. If the universe is divine there is no reason to demand special creation for the human spirit. Pope Paul VI (1964): Dogmatic Constitution Lumen Gentium § 48 sqq., Pope John Paul II (1996): Truth Cannot Contradict Truth
3: Deprecation of the world:
The Church holds that the God of the Old Testament created a perfect world and then, angered by the disobedience of the first people (the Original Sin), punished us all for the duration of human existence by subjecting us to death, pain, the need to work for a living and the domination of reason by passion. In addition we no longer enjoy the supernatural grace of God considered necessary to admit us to our eternal post mortem reward in heaven. Original sin - Wikipedia
The Christian New Testament attempts to provide a limited happy ending to this divinely dictated disaster. It claims that the Father's murder of his divine Son, Jesus of Nazareth, "saved" us by giving them satisfaction for our crime. Those who are baptized in the Christian rite may now reach heaven. None of the other damage that God did to us and our world will be repaired until the end. We continue to live lives of pain and work in the face of death. At the end of the world God the Father will repair the damage they did to punish us. Those judged to have lived in the required manner will enjoy an eternal life of the blissful vision of God. The rest are damned to an eternity of suffering in Hell.
This whole scenario is purely fictitious and is in effect a fraud on the human race. It is of great value to the Church, since we pay for our salvation by supporting this flawed institution. Insofar as the Church promotes social welfare it is valuable, but it must base its claims on demonstrable truth and become a law abiding corporate citizen of the human world.
4: Misunderstanding of pain:
In a similar vein, the Church holds both that pain is punishment for sin, and that endurance of pain, even self inflicted pain, is a source of merit. It overlooks the fact that pain is in general an error signal that enables us to diagnose and treat errors, diseases, corruption and other malfunctions that impair our lives. Included here is the unnecessary pain caused by the false doctrines of the Church.
Often it is necessary to suffer a certain amount of pain to achieve an objective, such as having a baby, which means going beyond our comfortable limits, but there is little real value in pain for its own sake apart from its diagnostic role.
5: Absolutism:
From a scientific point of view, the Catholic model of God and the world is an hypothesis, to be accepted or rejected on the evidence. From the Church's point of view, the fundamentals of its model are not negotiable, and anybody who chooses to disagree with them is ultimately a heretic to be excommunicated from the Church. Historically the Church has tortured or murdered dissidents. The principal sanction in modern times is dismissal. There is no room in the Church for the normal scientific evolution of our understanding of our place in the Universe.
6: The exclusion of women:
Within the Roman Catholic Church, the glass ceiling for women is at ground level: women are excluded from all positions of significant power and expected to play traditional subordinate roles. Even in recent times the Papacy has emphasized that, for reasons probably based on a misunderstanding of history, women must still be barred from the priesthood. Pope John Paul II (1994): Ordinatio Sacerdotalis: Apostolic Letter to the Bishops of the Catholic Church on Reserving Priestly Ordination to Men Alone.
7: Belief in active evil agents, aka Satan and other demons:
The Church claims to save us both from the original sin and from the dangerous activities of an evil being, Satan, whom it claims to have been responsible for the Fall. This is a purely fictitious position that falls under the political ploy often known as a "paper tiger", a claim to protect us from a non-existent threat. Catholic Catechism: §§ 385-412: Satan
This is not to deny that there is evil in the world, much of which arises from the nature of evolution. Evolutionary success is achieved by reproduction. In the zero-sum situations where survival of the fittest is played out, evils like murder, rape and pillage are (at least temporarily) effective strategies. The antidote to this terrible reality is to take ourselves out of these situations. We can do this by taking advantage of the expansion of human spiritual and economic space which results from a realistic global theology. This task is closely related to promoting systems of governance that enable individual agency and control the corrupting tendencies of uncontrolled power.
The New Testament constitution of the Catholic Church claims a right and a duty to induce everyone to hear and accept its version of human existence. This is a natural policy for any organization whose size and power increase in proportion to its membership. The modern world, however, expects any corporation promoting itself in the marketplace to deliver value for value. People contributing to the sustenance of the Church and following its beliefs and practices need to be assured that they will indeed receive the eternal life promised to them. The only potential evidence for this is miracles attributed to saints. Saint - Wikipedia, Matthew Schmaltz: What is the Great Commission and why is it so controversial?
In addition, we need to be assured that the information provided to us by the Church is true judged by modern standards. It has been traditional to exempt churches from the usual requirements of consumer protection legislation but this is inconsistent with modern good marketing and advertising practice. The Church needs to be particularly careful that it is not passing on unverifiable information to children and others with limited critical ability. Ad Gentes (Vatican II): Decree on the Mission Activity of the Church, Lumen Gentium (Vatican II): Dogmatic Constitution on the Church
5: Bugs and patches II: Physics
1: Observers have no role in true science?
A fundamental lesson of quantum theory is that what we see depends on how we look. One of the saddest stories in science is that of Albert Einstein, whose total dedication to the notion of objective reality was perhaps the source of a lifetime of scepticism about quantum theory. The root of the belief in objective reality lies in the idea that the frames of reference that we use to measure nature are not part of nature. Therefore, all the measurements we take of a particular system using different frames of reference must be identical once we adjust for the differences between reference frames.
We might think that the claim that what we see depends on how we look destroys the concept of certain knowledge. The answer to this objection is the central point in Einstein's general theory and all his relativistic work. In order to get an arithmetic grip on the geometry of nature Einstein used Gaussian coordinates, which are to a large extent arbitrary, compared for instance to the Cartesian coordinates which impose a fixed metric structure on a Euclidean space. The key to the general theory is, in Einstein's words: The Gauss co-ordinate system has to take the place of a body of reference. The following statement corresponds to the fundamental idea of the general principle of relativity: "All Gaussian co-ordinate systems are essentially equivalent for the formulation of the general laws of nature". Gaussian curvature - Wikipedia, Albert Einstein (2005): Relativity: The Special and General Theory, page 123
The key to getting a deterministic mathematical theory out of this somewhat arbitrary coordinate system is that the only observable points in nature are events and the space-time intervals between them. Whatever coordinate systems we choose must be constrained to give a one to one correspondence between identical intervals and identical differences in Gaussian coordinates. An observation is an event, so the foundation of science is equivalent to the foundation of general relativity: all observers, no matter what their state of motion, must agree on what they actually see when it is transformed to the rest frame of the observed system.
The difference between classical and quantum physics is that while a classical human observer may be considered to be completely independent of the phenomenon measured, quantum observations are in effect the universe observing itself. In the early days of quantum mechanics quantum measurement was taken to be the interaction between a classical observer and a quantum system, but the fact is that all systems in the universe are quantum systems, so all events in the universe (which include "measurements") are quantum interactions.
Beneath the real space layer of certainty reflected in classical relativity we have the quantum mechanical layer which introduces uncertainty because it encompasses invisible states which can only be brought into the realm of certainty by measurement, which means a somewhat unpredictable interaction between two invisible quantum states. In a laboratory situation the state to be measured is unknown; the state used to measure it is to some degree under the control of the experimenter. Because of this uncertainty we cannot predict what a measurement will reveal as we can in a macroscopic classical situation. This, I feel, is why Einstein was convinced that quantum mechanics is incomplete. Every event is in fact completed by the information obtained by a particular observer.
We will encapsulate this idea in a layered network model, the layer of identifiable observations being the outcome of interactions within a layer of invisible states which yield an amplitude ψ. What we observe in the classical layer are particles whose nature is determined by the interacting quantum states and whose probability of appearance is equal to the absolute square of the amplitude: P = |ψ|². The interaction of invisible quantum events, ie an observation, gives rise to visible events, ie particles. What we require in a quantum theory is that linear transformations of the underlying states to different bases do not change P. This criterion is used to demonstrate the equivalence of the matrix, wave and path integral formulations of quantum mechanics.
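As a minimal numerical sketch of this criterion (an illustration only, assuming standard Python with numpy), the following fragment applies the same random unitary change of basis to a state and to a measurement basis and confirms that the Born probabilities P = |ψ|² are unchanged:

```python
import numpy as np

# A sketch: Born probabilities P = |<e_i|psi>|^2 are unchanged when the
# state and the measurement basis are rotated by the same unitary U.
rng = np.random.default_rng(0)
n = 4

psi = rng.normal(size=n) + 1j * rng.normal(size=n)
psi /= np.linalg.norm(psi)              # normalized state vector

# A random unitary from the QR decomposition of a complex Gaussian matrix
U, _ = np.linalg.qr(rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))

basis = np.eye(n)                       # original measurement basis
p_before = np.abs(basis.conj().T @ psi) ** 2
p_after = np.abs((U @ basis).conj().T @ (U @ psi)) ** 2

print(p_before.round(6))
print(p_after.round(6))                 # identical within rounding
assert np.allclose(p_before, p_after)
```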
2: Particles or fields?
Einstein derived the general theory of relativity from the assumption that reality remains the same no matter how we look at it, so long as we have an algorithm to convert between different points of view. In general relativity this algorithm is the Einstein field equation, and the general process of changing points of view is known as a "symmetry transformation": a symmetry transformation is a change of point of view which has no effect on the underlying reality. No matter how we look at a perfect sphere it remains a perfect sphere: walking around it does not change how it looks, any more than travelling around in the universe changes its reality.
Einstein was concerned with the universe as a whole. Fundamental physics is concerned with "fundamental particles", the simplest observable entities in the universe, from which all the large visible objects like ourselves and planets are made. In the 1930s Eugene Wigner developed "Wigner's theorem", which used Einstein's idea to define a particle as something that remains the same no matter how we look at it. Wigner's theorem - Wikipedia
The philosopher Auyang writes:
"According to the current standard model of elementary particle physics based on quantum field theory, the fundamental ontology of the world is a set of interacting fields. Two types of fields are distinguished: matter fields [fermions] and interaction fields [bosons]. . . . In fully interactive field theories, the interaction fields are permanently coupled to the matter fields, whose charges are their sources. Fundamental interactions occur only between matter and interaction fields and they occur at a point. . . ." : How is Quantum Field Theory Possible? pp 45-46.
"Field has at least two senses in the physical literature. A field is a continuous dynamical system, or a system with infinite degrees of freedom. A field is also a dynamical variable characterizing such a system or an aspect of the system. Fields are continuous but not amorphous: a field comprises discrete and concrete point entities each indivisible but each having an intrinsic characterization. The description of field properties is local, concentrating on a point entity and its infinitesimal displacement. Physical effects propagate continuously from one point to another and with finite velocity. The world of fields is full, in contrast to the mechanistic world, in which particles are separated by empty space across which forces act instantaneously at a distance." (Auyang page 47)
We see the particles. The fields of quantum field theory are understood to be entities represented by mathematical functions ψ(x) at every point x in space-time, which create and annihilate particles at x. As we see from the article by Kuhlmann quoted at the beginning of this essay, quantum field theory is a vast and difficult labyrinth of theory intended to explain the simplest entities in our world. Can it be simplified? Steven Weinberg (1995): The Quantum Theory of Fields Volume I: Foundations, page 31.
Simplification is achieved by symmetry. All electrons, for instance, are identical: if you have seen one, you have seen them all. Physicists would say that there is just one electron field throughout the universe which creates and annihilates electrons. This essay is built around a much broader symmetry which we call action. The idea is that every discrete entity in the universe is a source, able to act by sending and receiving messages. Every particle, whether it be an electron, a person or a galaxy, has properties which determine how it interacts with its neighbours.
A source is an element of a network which speaks and listens to other sources. Here we proceed on the assumption that the fundamental source of the universe is the quantum of action, identical to the classical divinity. The multiplication and differentiation of the initial singularity is modelled initially on the Christian doctrine of the Trinity, to be extended to the transfinite domain in both the classical and quantum theoretical worlds by the network model introduced in §6.
3: The anthropic principle?
When we study the evolution of the universe from a gas of hydrogen and helium to its present state embracing carbon based life forms, we sometimes encounter points at which very improbable events are required for further development. In the theoretical world, we may see something analogous in Einstein's formulation of the general theory of relativity, which finally opened our eyes to the universe as a whole. Einstein crossed an abyss. If he had not lived, would we now be in possession of this theory? The history of science points to long, relatively arid periods between the paradigm changes that mark major scientific steps forward. Thomas Kuhn (1996): The Structure of Scientific Revolutions
One way to understand the past through the empirical present is the anthropic cosmological principle. The idea here is that the Universe was deliberately constructed by a designing creator to allow for our evolution. This conclusion arises because some see evolutionary bottlenecks which require precise tuning of various physical parameters to arrive at conditions conducive to life. One of these concerns is the creation of carbon itself. We understand that heavier elements are synthesized by fusion of lighter ones. It turns out that there is no way to make carbon except by the fusion of three helium nuclei. Nuclear physics suggests that this is at first sight a very improbable event, which may nevertheless have been made possible by a couple of coincidences, seen by some as designed in by a creator. Anthropic principle - Wikipedia, John Barrow & Frank Tipler (1996): The Anthropic Cosmological Principle
The first of these is a resonance of beryllium-8 which increases the probability of fusion of two helium-4 nuclei. The second is the existence, predicted by Hoyle, of an excited state of carbon-12 which encourages the fusion of beryllium-8 and helium-4. Without these resonances it may be that the formation of carbon and carbon based life would be impossible. There are other scenarios where different universes with different fundamental physical constants would not have enabled the existence of life, and so prevented the existence of curious people asking questions like these. Triple-alpha process - Wikipedia
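To make the arithmetic concrete, here is a rough worked sketch in Python, assuming standard tabulated atomic masses and the measured 7.654 MeV excitation energy of the Hoyle state; the numbers are indicative only:

```python
# A sketch of the triple-alpha energy bookkeeping, assuming standard
# atomic masses in unified atomic mass units (u), with 1 u = 931.494 MeV.
U_TO_MEV = 931.494

m_he4 = 4.002602   # helium-4
m_be8 = 8.005305   # beryllium-8 (an unbound resonance)
m_c12 = 12.0       # carbon-12 (defines the mass scale)

# Step 1: 2 He-4 -> Be-8 is slightly endothermic, so Be-8 is unbound
q1 = (2 * m_he4 - m_be8) * U_TO_MEV
print(f"2 He-4 -> Be-8: Q = {q1:+.3f} MeV")   # about -0.09 MeV

# Step 2: the Be-8 + He-4 threshold lies well above the C-12 ground
# state, just below the excited state that Hoyle predicted.
threshold = (m_be8 + m_he4 - m_c12) * U_TO_MEV
hoyle = 7.654      # measured excitation energy of the Hoyle state, MeV
print(f"Be-8 + He-4 threshold: {threshold:.3f} MeV above C-12 ground state")
print(f"Hoyle state lies {hoyle - threshold:.3f} MeV above that threshold")
```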
Here I assume that the intelligent universe I am trying to describe has overcome these apparent obstacles by its own ingenuity, without the help of a pre-existing creator. Here we meet the ancient theological problem: is the creator identical to the universe, or other than it? As we go along I shall try to point out that the universe as we know it is capable of exploring the same space of consistent possibility as is open to any divinity. As Aquinas notes, the only bound on the power of God is their inability to create inconsistency.
4: Does the world comprise discrete events or continuous processes?
Measurement is basically a matter of counting. We measure sets of discrete objects like trees, people and beans by bringing them into correspondence with the natural numbers. We measure continuous quantities by counting appropriate standard units of length, time, mass and so on. Here we may encounter fractional units which we usually express as decimal fractions carried to a precision appropriate to the task in hand. It was long ago recognised that there are formal quantities like the length of the diagonal of a unit square that can only be represented precisely by real numbers which can in theory be represented by decimals of countably infinite length. Completeness of the real numbers - Wikipedia
Real numbers and real and complex vectors constructed from real numbers are essential components of mathematical physics. This leads us to think that the physical world uses real and complex numbers to represent itself and that it is real in the sense of continuous. We feel free to use differential and integral calculus in our computations and to linearize complex phenomena by imagining them at an infinitesimal scale and then integrating the result to get the bigger picture.
Although complex functions are valuable intermediaries in modelling and computation, we feel that realities, eigenvalues for instance, must be represented by real numbers. Everything we actually observe in detail comes in discrete units, ranging from galaxies and stars, trees, people, atoms and fundamental particles to, ultimately, quanta of action. Insofar as quanta of action are atomic, their fragmentation into infinitesimals and integration into continuous measures may be convenient, but may misrepresent reality. Logical processes, which I propose here as the foundation of cognitive cosmology, are sets of discrete actions executed as time ordered sequences which we model mathematically as computing machines. I assume therefore that all events represent integral numbers of quanta of action. The probabilities of events, on the other hand, may be represented by real numbers which are the squared absolute values of complex amplitudes, which may themselves be sums of the amplitudes of a large number of superposed subsidiary events.
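A tiny sketch of this last point (an illustration in Python): the probability of an event is the squared modulus of a sum of complex amplitudes, so superposed contributions can reinforce or cancel:

```python
import numpy as np

# A sketch: probability as the squared modulus of a sum of amplitudes.
# Two superposed contributions of equal magnitude, relative phase delta.
for delta in (0.0, np.pi / 2, np.pi):
    amplitude = 0.5 + 0.5 * np.exp(1j * delta)
    p = abs(amplitude) ** 2
    print(f"relative phase {delta:.2f} rad -> P = {p:.3f}")
# In phase the contributions reinforce (P = 1.0);
# in opposite phase they cancel exactly (P = 0.0).
```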
5: Are the boundaries of physics the same as the boundaries of mathematics?
The formalist approach to mathematics promoted by Hilbert sees mathematics as a symbolic game divorced in principle from physical reality and constrained only by the need for internal consistency. He felt that consistent formal mathematics would be complete and computable, and was surprised when Gödel and Turing showed that this is not so. Formalism (mathematics) - Wikipedia, Gödel's incompleteness theorems - Wikipedia, Turing machine - Wikipedia
If we assume that mathematics is capable of a faithful representation of the physical, computational and communication structure of the universe, this leads us to suspect that these theorems also account for uncertainties in the universe, and further suggests that a God limited by consistency alone is not capable of complete deterministic knowledge and control of the world.
Uncertainty opens two ways to understand infinity. Literally it means without boundary, unfinished: a constraint not on size but on definiteness. Colloquially, it also means very big, as we might say the universe is infinite. We can also imagine another understanding of infinity which relates to the theological concept of omnipotence. Aquinas asks if God is omnipotent. Yes, he says, God can do anything possible. The only limit is that they cannot establish the existence of inherent contradictions, such as that Socrates should be simultaneously sitting and standing.
Let us assume that mathematics and physics are both equally omnipotent in the theological sense, constrained only by the non-existence of actual local contradiction (principle §3.11 above).
6: What is the value of Planck units and the Planck scale: why should ℏ, c, G and kB be 1?
The Planck units are a set of physical units devised by setting Planck's constant ℏ, the speed of light c, the gravitational constant G and the Boltzmann constant kB to the scalar value 1. This convention establishes a Planck time (5.4 × 10⁻⁴⁴ second), Planck length (1.6 × 10⁻³⁵ metre) and Planck mass (2.2 × 10⁻⁸ kilogram) corresponding to the conventional dimensions T, L and M. Apart from their elegance, these units are relatively impractical. Planck units - Wikipedia
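These magnitudes can be checked directly, since the Planck units are dimensional combinations of ℏ, G and c. A short sketch, assuming CODATA SI values for the constants:

```python
import math

# A sketch: Planck units as dimensional combinations of hbar, G and c,
# assuming CODATA SI values for the constants.
hbar = 1.054571817e-34   # J s
G = 6.67430e-11          # m^3 kg^-1 s^-2
c = 299792458.0          # m s^-1

t_planck = math.sqrt(hbar * G / c**5)   # ~5.39e-44 s
l_planck = math.sqrt(hbar * G / c**3)   # ~1.62e-35 m
m_planck = math.sqrt(hbar * c / G)      # ~2.18e-8 kg

print(f"Planck time   {t_planck:.3e} s")
print(f"Planck length {l_planck:.3e} m")
print(f"Planck mass   {m_planck:.3e} kg")
```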
Some expect that gravitation, macroscopically quite a weak force, will become significant in a quantum theory of gravitation expressed at the Planck scale, but since we have no quantum theory of gravitation, there is little evidence for this expectation. The discussion of the evolution of the physical universe below will contain some suggestions for the origin of these and other constants of nature.
Like the reference frames with which they may be used, units are simply conventions that establish the numerical correspondence between continuous quantities and their measures. From a theoretical point of view, it is very satisfying to use natural constants such as the velocity of light and Planck's constant as units for measurement, although their actual values may make them inconvenient, and secondary standards are usually established for practical use.
7: Increases in entropy do not just happen, they must be constructed
Entropy is one of the simplest and most useful measurable quantities in the world, since it is intimately connected to communication, which is the foundation of universal structure. It is simply a count of states, and a state, in Cartesian terms, can be any definite and separate entity, ranging from a quantum state to a sheep to a galaxy.
On the whole entropy has received a bad press in engineering thermodynamics, since it appears to place limitations on the efficiency of heat engines. Entropy is not responsible for this: it is conserved in an ideal Carnot engine. The problem lies with the temperature differences available between the hot and cold sources and the fact that the zero of temperature is so far below the normal exhaust temperature of the average heat engine. On the other hand, once Boltzmann showed that entropy could be understood as a count of the states of a system, its reputation was revised in the light of Shannon's communication theory and it has become a standard measure of information. Entropy - Wikipedia
While entropy was considered a "bad thing" the second law which tells us that entropy generally increases was not felt to require explanation. From the information theoretical point of view, however, entropy is a scarce and valuable resource which can only be increased by constructing more states, a very important long term goal in the design of integrated circuits and image sensors. Von Neumann has pointed out that quantum measurement increases entropy so that it is effectively creation of new states. The assumption here is that the entropy of the structureless initial singularity is zero, and progress in constructing the universe can be measured by the increase of entropy produced by the quantum universe observing itself. Second law of thermodynamics - Wikipedia, John von Neumann (2014): op. cit
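A small sketch of entropy as a count of states (an illustration, assuming the standard Boltzmann and Shannon formulas S = k ln W and H = −Σ p log₂ p):

```python
import math

# A sketch: entropy as a count of states.
# Boltzmann: S = k ln W for W equiprobable microstates.
k_B = 1.380649e-23            # J/K (exact SI value)
W = 2 ** 50                   # e.g. 50 independent binary degrees of freedom
print(f"S = {k_B * math.log(W):.3e} J/K for W = 2^50 states")

# Shannon: H = -sum p log2 p bits, maximal when states are equiprobable.
def shannon_bits(p):
    return -sum(x * math.log2(x) for x in p if x > 0)

print(shannon_bits([0.5, 0.5]))    # 1.0 bit
print(shannon_bits([0.9, 0.1]))    # ~0.47 bits: less uncertainty
print(shannon_bits([0.25] * 4))    # 2.0 bits: four equiprobable states
```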
8: Are the infinities appearing in quantum field theory real features of nature or are they artefacts suggesting the need for revision of the theory?
The success of the Standard Model has led the idea that the universe is built on a quantum mechanical vacuum to become a settled doctrine. It has had a chequered but triumphant career, from Dirac's equation and his sea of positrons through the discovery of renormalization, beta functions and running couplings to its present supremacy, marred, perhaps, by some difficult questions: where did the vacuum come from; how does it relate to the initial singularity predicted by the general theory of relativity; why is the cosmological constant computed using the standard model about 100 orders of magnitude distant from measured reality; why can't we renormalize gravitation? This has led to a situation reminiscent of the tough times between Dirac and Shelter Island, when vacuum polarization was finally brought under control. Kerson Huang (2013): A Critical History of Renormalization, Beta function (physics) - Wikipedia, Shelter Island Conference - Wikipedia
These problems have a history going back to the apparently infinite electromagnetic self mass of the classical point electron. Its reappearance in quantum electrodynamics has been to some extent removed from the spotlight by renormalization group theory. Nevertheless, the unification of the standard model and gravitation is still blocked by the non-renormalizability of possible theories of quantum gravity. Now we are engaged in the desperate speculation we call string theory which, if nothing else, seems to violate principle 10 above, the heuristic of simplicity implicit in the notion that the universe is derived from an initial singularity. Renormalization group - Wikipedia, Michio Kaku (1998): Introduction to Superstrings and M-Theory
The fundamental problem, it seems to me, is the attempt to describe a universe built from inherently discrete atomic quanta of action using continuous mathematics. This leads, one way or another, to integrals which have zero in the denominator. Much of this essay is devoted to discussing this problem and proposing a solution which suits my desire to unify theology and physics: that the fundamental mechanism of the universe is best described using the logical network of communication and computation implicit in quantum theory.
9: Where does the vacuum come from?:
The quantum field theoretical vacuum is considered to be the base state of the universe from which all the details of particle physics are selected by the operation of various symmetries. Anthony Zee (2010): Quantum Field Theory in a Nutshell
Here we assume that the initial singularity is a quantum of action and that the action of action is to act. From the fundamental equation of quantum mechanics, E = ℏ ∂φ / ∂t we assume that such actions create energy. The absolute simplicity of the initial singularity, coupled with the principle of requisite variety suggests that the initial singularity has no control over its action, so repeated actions lead to a random spectrum of energies, which we may identify with the quantum mechanical vacuum.
This feature of quantum mechanics carries through to all scales. There is no mechanism available for us to predict exactly when a quantum event will occur. At best, through the Born rule, we can estimate the probability of an event occurring in a given time interval, often measured by a half-life.
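As a small illustration of this style of prediction (a sketch assuming standard radioactive decay statistics): we cannot say when a given nucleus will decay, only the probability that it decays within a given interval.

```python
import random

# A sketch: individual quantum events are unpredictable; only the
# probability per interval is fixed. For a half-life T_half,
# P(decay within t) = 1 - 2**(-t / T_half).
def p_decay(t, t_half):
    return 1.0 - 2.0 ** (-t / t_half)

t_half = 1.0                      # arbitrary units
print(p_decay(1.0, t_half))       # 0.5, by definition of half-life
print(p_decay(2.0, t_half))       # 0.75

# Simulating many identical nuclei recovers the predicted fraction.
random.seed(0)
n, t = 100_000, 2.0
decayed = sum(random.random() < p_decay(t, t_half) for _ in range(n))
print(decayed / n)                # close to 0.75
```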
Feynman's method establishes that the true path x(t) will be the one where the phase does not change as a result of small variations of the path. Classical Lagrangian mechanics uses the calculus of variations to identify stationary paths and finds that these coincide with the paths predicted by Newton's method. As we get further from the true path the phases become relatively random and so cancel one another in the superposition. We might assume from this that the true path is equivalent to one complete cycle of phase (φ = 2π), equivalent to precisely one quantum of action. This idea may serve as a connection between physical and logical events. An advantage of the Lagrangian approach (noted by Dirac) is that it is relativistically invariant. Given Feynman's description of the quantum world, we can ask: how does this structure come to be within an initial singularity which is effectively a quantum of action?
Before Michelson and Morley and Einstein, Maxwell and others saw the ether as the representative vehicle of electromagnetic radiation. Now the spirit of formalist mathematics may have penetrated physics, and people seem to have become relaxed about formal structures with no representative vehicles. In the realm of field theory many see pure mathematical fields as the ontological foundation of reality. Michelson and Morley: On the relative motion of the earth and the luminiferous ether
The ancient doctrine of immaterial spirits seems to have taken root here, although its justification in terms of the need for intellect to be unhindered by matter has been overridden by neurophysiology, and the quantum foundation of mental process is tens of orders of magnitude finer than neural synapses and action potentials. Human brain - Wikipedia
The root of the problem here seems to be the set of mathematical problems which began with Zeno and came into the mathematical and physical mainstream two thousand years later with Newton's invention and application of calculus to describe the heavens. The core issue is the attempt to represent a continuum by an ordered sequence of infinitesimal but discrete points. This approach seems implicitly self-contradictory, as Zeno clearly illustrated. How does nature represent an infinitesimal point? How does a set of discrete points make a continuum? Zeno's paradoxes - Wikipedia
Aristotle, ever practical, defined continuity as having endpoints in common, rather like the way the links in a chain embrace one another. This deals with Zeno's paradoxes and suggests that there is contradiction in Auyang's claim that "Fields are continuous but not amorphous: a field comprises discrete and concrete point entities . . ." Auyang op cit page 47, Aristotle (continuity): Physics V, iii, 227a10 sqq (ref link above)
The physical reality appears to be that every physical interaction involves at least one quantum of action, a real event of finite size. Fractional quanta, particularly infinitesimal fractions, do not exist. The world proceeds stepwise, like logic. Mathematical continuity is a handy symmetry for dealing with probabilistic issues where the law of large numbers is applicable, but logical processes, such as those involved in cognition, are inevitably discrete, represented by physical entities like electrons, photons and action potentials. Action potential - Wikipedia, Andrey Kolmogorov (1956): Foundations of the Theory of Probability
In quantum mechanics Planck, Dirac and Feynman arrived at a much subtler approach. We begin with Planck's discovery that while the quantum of action is a real physical quantity, it behaves, from a physical point of view, like an integer: it has the dimensions of angular momentum, and all real physical events comprise an integral number of quanta of action.
In 1933 Dirac set out to understand the role of action in quantum mechanics. Quantum mechanics began from the Hamiltonian formulation of classical mechanics. Lagrange provided an alternative formulation which has two advantages: it enables one to collect all the equations of motion and express them as the stationary point of a certain action function; and it is relativistically invariant. Since there is no direct route from the Lagrangian to quantum mechanics, Dirac sees that we must take over the ideas of classical Lagrangian theory, not the equations. Paul Dirac (1933): The Lagrangian in Quantum Mechanics
Dirac chooses a route through the theory of contact transformations and finds that the action S, the time integral of the Lagrangian, plays the role of an exponent in a unitary transformation between two diagonal representations of a quantum state. In passing he finds that the classical definition of S does the job.
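A minimal numerical sketch of the stationary phase mechanism underlying both Feynman's path integral and Dirac's exponent (an illustration using a one-parameter family of paths with action quadratic about the true path): contributions far from the stationary point oscillate rapidly and cancel, leaving the neighbourhood of the classical path dominant.

```python
import numpy as np

# A sketch of stationary phase: sum exp(i S(a) / hbar) over a family of
# paths labelled by a, with the action quadratic about the true path a = 0.
hbar = 1.0
a = np.linspace(-10.0, 10.0, 20001)
S = 0.5 * a**2                    # S is stationary at a = 0
weights = np.exp(1j * S / hbar)
da = a[1] - a[0]

# Contributions near the stationary point add coherently...
near = abs(weights[np.abs(a) < 1.0].sum()) * da
# ...while those far from it oscillate rapidly and largely cancel.
far = abs(weights[np.abs(a) > 5.0].sum()) * da

print(f"|sum near a = 0| ~ {near:.3f}")   # of order the interval width
print(f"|sum far away|  ~ {far:.3f}")     # much smaller
```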
Generation of the universe from the initial singularity
We have guessed that massless bosons travelling at the speed of light on null geodesics enable the creation of space, that is, a venue in which discrete massive entities can exist independently of one another through spacelike separation. At the fundamental level, we see such entities as fermions and the exclusion principle as constitutive of the spatial metric. The spacetime approach to this structure is the spin statistics theorem, whose proof derives from the existence of spacelike separation in Minkowski space. In the underlying Hilbert space, bosons are distinguished by their energy or frequency. The emergence of space is accompanied by an increase in the potential entropy of the universe, since energy differences are no longer required to differentiate particles and spatial separation makes room for large numbers of identical fermions, like electrons. Streater & Wightman (2000): PCT, Spin, Statistics and All That
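A small numerical sketch of the exclusion mechanism (an illustration of standard antisymmetrization, not of the spin statistics proof itself): a two-fermion state is an antisymmetrized product of one-particle states, and it vanishes identically when the two states coincide.

```python
import numpy as np

# A sketch: two-fermion states are antisymmetrized products, so the
# amplitude for two fermions to occupy the same state is exactly zero.
def two_fermion_state(phi, chi):
    # |phi>|chi> - |chi>|phi>, unnormalized, as an outer-product tensor
    return np.outer(phi, chi) - np.outer(chi, phi)

phi = np.array([1.0, 0.0, 0.0])
chi = np.array([0.0, 1.0, 0.0])

print(np.linalg.norm(two_fermion_state(phi, chi)))  # nonzero: allowed
print(np.linalg.norm(two_fermion_state(phi, phi)))  # zero: excluded
```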
The Hilbert space explanation for the differentiation of fermions is that, because they have half integral spin, the amplitude for the superposition of two identical fermion states is zero. If we assume that Hilbert space is the primordial space and use the principle of simplicity to argue that in the beginning there are only two opposite phases of recursive function at any energy / frequency (a form of digitization), we can account for the primordial existence of spacetime, bosons and fermions. But where does gravitation come in?
The answer that appeals most to me is that the general theory is a measure of the 'ignorance' of the initial singularity. This is because all information and communication is digital, as shown by Shannon's theory of communication, Turing's theory of computation and Gödel's theory of logic discussed above (§§8, 6 & 7). Therefore, insofar as gravitation is described by the continuous functions of differential geometry, it carries no information and is therefore consistent with the primordial simplicity of the universe. It is, in other words, a perfect description of a system of zero entropy, which is subsequently populated by quantum theory, describing the quantum of action which underlies all communication and the magnificent complexity of the universe which has grown within the gravitational shell. This ignorance is implicit in the notion of general covariance, that all Gaussian coordinate systems are equivalent for the description of physical 'law', which is the foundation of Einstein's work:
The following statement corresponds to the fundamental idea of the general principle of relativity: "All Gaussian coordinate systems are essentially equivalent for the formulation of the general laws of nature." Einstein (2005): Relativity: The Special and General Theory, page 123
We take the fundamental equation of quantum mechanics E = ℏω to be primordial in that the action of action is to act and energy is repeated action. This, combined with the no cloning theorem, establishes each of these actions as equivalent to the universal logical nand gate. At this point quantum theory sees no distinction between potential and kinetic energy. Only energy differences feature in quantum mechanics and we can set the zero of energy wherever we wish.
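The logical universality of the nand gate is standard, whatever one makes of the quantum identification conjectured here; a short sketch shows not, and and or recovered from nand alone:

```python
# A sketch: nand is logically universal; not, and, or can all be
# composed from nand alone.
def nand(a, b):
    return not (a and b)

def not_(a):     return nand(a, a)
def and_(a, b):  return nand(nand(a, b), nand(a, b))
def or_(a, b):   return nand(nand(a, a), nand(b, b))

for a in (False, True):
    assert not_(a) == (not a)
    for b in (False, True):
        assert and_(a, b) == (a and b)
        assert or_(a, b) == (a or b)
print("not, and, or recovered from nand alone")
```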
The modern approach to the "laws of nature" is to see them as symmetries. The modern view of symmetry was pioneered by Emmy Noether, based on the properties of continuous groups. The network model accommodates symmetry naturally by seeing each symmetry as a feature of a particular layer in the network. We take the Universe to be pure action. Symmetry, on the other hand, is absence of action, so that we understand symmetries to be the boundaries of the Universe, the borderlines between action within the Universe and inaction (ie nothing) outside it. Noether's theorem - Wikipedia
The origin of energy through repeated action seems to violate the conservation of energy, the action of action apparently creating energy without bound, the source perhaps of the big bang. To prevent the violation of the conservation of energy and to honour the principle of zero sum complexification (§3, principle 14) we need an inverse to the kinetic energy of the big bang which we understand to be potential energy, manifest as the gravitational potential. Potential energy, we guess, is the fixed point dual to kinetic energy, identical to mass.
As a consequence, the Lagrangian of the universe as a whole, ∫ (KE - PE) dt approximates, within one quantum of action, to 0. In Newtonian terms for [almost] every action there is an equal and opposite reaction.
In the previous section we introduced space as the dual manifestation of time, transformed by the speed of light. Now we introduce momentum as the dual manifestation of energy, transformed once again by the velocity of light. For the massless photon, the transformer between time and space, energy is identical to momentum. For massive particles, the same idea is expressed by Einstein's equation E = mc². Although mass and energy are numerically equivalent, they are conceptually quite different.
The most obvious feature of gravitation is the gravitational potential that holds us on the planet and kills us when we fall from significant heights. We may look upon potential energy as kinetic energy travelling in the form of a massless particle at the speed of light. This is the nominal speed for gauge particles like photons, gravitons (if they exist) and gluons. We may understand the potential energy of massive particles as arising from their internal motions at light speed, so that their interior world comprises null geodesics which account for their apparent zero size in spacetime. This seems consistent with Wilczek's idea, proposed above, that the mass of baryons is produced by the kinetic motions of their internal components, which are generally much lighter than the baryons themselves. Mass, we might say, is energy trapped in a null geodesic. Potential energy - Wikipedia, Wilczek (2002) op. cit. chapter 10 sqq.
We can understand the 3D component of Minkowski space by thinking of the differences between wired and wireless communication in practical networks. Wireless communication is serial (one dimensional) and channels are distinguished by frequency or energy, as we find in quantum mechanics. Wired networks, on the other hand, need three dimensions for their existence in order to prevent wires intersecting in the same plane. We may consider the case of moving fermions by analogy with city traffic. In a two dimensional road system, time division multiplexing introduced by traffic lights enables traffic streams to cross. Three dimensional structures like overpasses and tunnels enable uninterrupted two dimensional traffic flow, and separation of air traffic in three dimensional space is established by requiring vehicles travelling in different directions to operate at different altitudes. If we understand the emergence of new features in the universe as a matter of random variation and controlled selection, we may see that 3D space is adequate for complete wired connection, so spaces with 4 or more dimensions have no evolutionary raison d'etre and may be selected out.
Wired networks are therefore more like plumbing or electrical power networks. Tuning is not required to discriminate sources, but switching may be necessary for one source to connect to many others. A wired network transmitting pure unmodulated power shares three properties with gravitation: it exists in four dimensions, three of space and one of time; it can deal indiscriminately with energy in any form; and the direction of motion of signals is determined by potentials.
From an abstract point of view the existence of a fixed point is dual to the compact and convex nature of a set: Brouwer's theorem and its descendants guarantee a fixed point for every continuous mapping of such a set into itself, so fixed point theory is symmetrical with respect to the set of all continuous functions. In the case of quantum mechanics these fixed points are the solutions to the eigenvalue equation. Their existence is guaranteed by the Hermitian operators associated with the unitary dynamics of quantum mechanics. Agarwal, Meehan & O'Regan (2009): Fixed Point Theory and Applications
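A small numerical sketch connecting the two ideas (an illustration using power iteration, which is not the general fixed point theorem itself): repeatedly applying a Hermitian matrix and renormalizing drives an arbitrary vector toward an eigenvector, a fixed point (up to scale) of the mapping.

```python
import numpy as np

# A sketch: an eigenvector is a fixed point (up to normalization) of
# the map v -> A v / |A v|. Power iteration finds one for Hermitian A.
rng = np.random.default_rng(1)
A = rng.normal(size=(4, 4))
A = (A + A.T) / 2                 # symmetric: the real Hermitian case

v = rng.normal(size=4)
v /= np.linalg.norm(v)
for _ in range(500):
    v = A @ v
    v /= np.linalg.norm(v)

eigenvalue = v @ A @ v            # Rayleigh quotient at the fixed point
print(eigenvalue)
print(np.linalg.norm(A @ v - eigenvalue * v))   # ~0, so A v = lambda v
```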
An important feature of the network model is that it is symmetric with respect to complexity. The atom of a network is two sources and a channel, which we may think of quantum mechanically as two fermions and a boson. Sources and connections can exist at any scale. Communications between discrete sources become possible when they share a language or codec, that is a symmetry. Since gravitation is the universal codec which couples all sources without discrimination so long as they have energy, we can imagine that it emerges in about the same epoch of the evolution of the universe as quantum mechanics. Unlike quantum mechanics, however, where connection is established by specific values of energy which we compute using the eigenfunctions of specific operators, gravitational connections are indiscriminate. This suggests that they represent a symmetry deeper and simpler than quantum mechanics which reflects the consistent unity of the initial singularity. We would expect gravitation to respond to all the energy levels present in a vacuum, for instance, which is why the cosmological constant problem is so troubling in a universe with infinite degrees of freedom, each with attached zero point energy.
Consequently we might conjecture that gravitation is not quantized. In §8 above we used Shannon's communication theory to connect error free communication with quantization. If gravitation is a universal code which cannot go wrong, there is no ground for quantization. Nevertheless gravitation imposes precise structure on the universe.
The classical general theory of relativity predicts classical gravitational waves which have now been observed, giving us information about large cosmic events occurring at great distances. Great distance and the overall weakness of gravity mean that these waves require very large high precision interferometers for their detection. Gravitational-wave observatory - Wikipedia
In short perhaps we may see gravitation as a hybrid between classical and quantum reality before they became differentiated. Our cities are festooned with power cables, telephone lines and surrounded by clouds of wireless. Our bodies, on the other hand, are largely wired. But in all cases communication requires contact, and a symmetry that unites Hilbert and Minkowski space.
Given this picture, we might understand that energy attracts energy because it is all the same energy, created by action and subsequently mapped into fixed potential energy equal and opposite to the kinetic energy from which it came. Lagrangian mechanics - Wikipedia (ref link above)
We might imagine that the coupling between the two spaces Hilbert and Minkowski which we ascribe to observation describes the inner life of particles, ie it is a story of consciousness in the same way as my conscious awareness is related to my physical actions. So I imagine that quantum computations in Hilbert spaces are to be found inside particles as my consciousness is to be found inside me. What I hope is that a clear distinction between Hilbert and Minkowski removes many of the difficulties in quantum field theory that arise from ignoring this distinction. So we think that every particle is a computer and the relationships between particles are networks.
This, and the idea that gravitation is not quantized, suggests that gravitation must be some primordial quality of the interior space of the initial singularity described by the classical general theory of relativity. This would be consistent with the idea that it is present from the beginning and guides the growth of the universe as a whole as Einstein found. As Einstein noted when he published his field equation for gravitation, the general theory of relativity is a closed logical structure and there is little choice for an alternative. From this point of view, we might see the interior of the initial singularity as a continuous Lie group fulfilling the hypotheses of fixed point theory. General relativity - Wikipedia, Abraham Pais (1982): 'Subtle is the Lord...': The Science and Life of Albert Einstein page 256, Lie Group - Wikipedia
We have built a Hilbert space inside the initial singularity through its propensity to act, our starting point being mathematical fixed point theory. The topological barrier constraining the universe is the boundary between consistency inside and inconsistency outside.