Essay 30: Cognitive cosmology: a prelude to cognitive cosmogenesis
In conclusion one has to recall that one reason why the ontological interpretation of [Quantum Field Theory] is so difficult is the fact that it is exceptionally unclear which parts of the formalism should be taken to represent anything physical in the first place. And it looks as if that problem will persist for quite some time. Meinard Kuhlmann (Stanford Encyclopedia of Philosophy): Quantum Field Theory
A physical understanding is completely unmathematical, imprecise, an inexact thing but absolutely necessary to a physicist. Richard Feynman: Lectures on Physics II Chapter 2: Differential Calculus of Vector Fields
Research is to see what everybody has seen and think what nobody has thought. Albert Szent-Györgyi (1957): Bioenergetics
Table of Contents
§1: Abstract
§2: Introduction — top down or bottom up?
§3: Action from the unmoved mover to the quantum
§4: Theology: a new paradigm?
§5: God's ideas, cybernetics and singularities
§6: Evolution, genetic memory, variation and selection
§7: Networks, bodies and brains
§8: The theology of the Trinity
§9: The active creation of Hilbert space
§10: The emergence of quantum mechanics
§11: Quantization—the mathematical theory of communication
§12: The quantum creation of Minkowski space
§13: Quantum amplitudes and logical processes are invisible
§14: Measurement—the interface between Hilbert and Minkowski
§15: Quantum amplitudes and logical processes are invisible
§16: Network I: Cooperation
§17: Network II: transfinite logical space
§18: Transfinite Minkowski space
§19: Space-time—cosmic memory and operating system
§20: Fixed points, laws and symmetries
§21: Potential + kinetic = zero energy universe
§22: Gravitation and quantum theory—in the beginning
§23: Quantum field theory
§24: Quantum chromodynamics: QCD
§25: Conclusion
§26: Some principles
§1: Abstract
Quantum mechanics began as a physical theory to explain the spectra of atoms, but it is now understood as a theory of computation and communication. Theology is the ancient and traditional theory of everything. If the Universe is to be divine, physics and theology must share the same space, which means that they must speak the same language. This language I understand to be the power of communication and computation implicit in the formalism of quantum theory.
What follows is my attempt to unite physics and theology in a Universe understood, by analogy with our own minds, to be the mind of God.
(revised Sunday 29 January 2023)
§2: Introduction — top down or bottom up?
In the beginning God created the heaven and the earth. Genesis I:1-4 (Genesis, from the Holy Bible, King James Version)
My theological goal is to describe a divine universe, which means that theology and physics, both disciplines seeking a theory of everything in their own domains, must become mutually consistent.
The key to my story is symmetry with respect to complexity, the idea that the universe grows and thinks as I do, so that a theory of everything and a theory of my own life are essentially isomorphic.
I began as an unthinking embryo constrained by my genome and fed with material and energy by my mother. Now I am an old person approaching 80. Throughout this history I have been following the same strategy, building on what I have to make the next step.
The primordial step in my journey was taken by the Christian God with the procession of the second person of the Trinity, truly God, but not God the Father. Dogma restricts the Trinity to three persons. In my story there is no limit to the fecundity of the divine action so the Universe is full of Gods as Thales realized, a huge history of quanta of action. Aquinas, Summa, I, 27, 1: Is there procession in God?, Thales of Miletus - Wikipedia
My starting points for this work are Thomas Aquinas and Bernard Lonergan. I read Lonergan's Insight after three years studying Aquinas as a young monk, still a firm Catholic believer. Lonergan broke the mould. His work is an attempt by a Jesuit priest to bring the Dominican Aquinas into the twentieth century. For me he failed. His attempt to produce a new proof for the idea that the world is not God made it very clear to me that any consistent approach to theology must identify God and the world. My first attempt to express this view led to my expulsion from the Order. Bernard Lonergan (1992): Insight: A Study of Human Understanding, Jeffrey Nicholls (1967): How Universal is the Universe?
My program, initially motivated by Lonergan's Insight, is to work in the domain of logic and computation. Both these disciplines have the same unique constraint: only those systems will last that do not involve a contradiction.
The modern model of physical change is quantum field theory, which proposes a space of invisible fields to guide the behaviour of the observable world. This theory has problems. Practically, the most acute is the 'cosmological constant problem': one interpretation of quantum field theory predicts results that differ from observation by about 100 orders of magnitude, i.e. a factor of 10^100. One point of this essay is to re-interpret the relationship between mathematical theory and reality in a way that points to a solution to this problem. Quantum field theory - Wikipedia, Cosmological constant problem - Wikipedia
A second problem is that the project to unify the four physical communication channels, gravitation, electromagnetism, the weak force and the strong force, has also run into trouble: the Standard Model embraces the last three but has so far proved unable to accommodate gravitation.
In Christianity and many other religions, power comes from above, from the Heavens, the blessed realm of pure divine spirit. The lower levels of creation, in contrast, tend toward weakness, evil, and corruption: the underworld. The reality is exactly the opposite. The world is built on action which creates energy and drives perpetual motion at the creative foundation of the world.
In the traditional creation story God created the world according to a plan which already existed in their mind. Thomas Aquinas calls this plan ideas, a nod to the Platonic notion that an invisible heaven of ideas or forms defines the structure of the world. Plato thought that our world is a poor reflection of these perfect ideas, which is why it was, to his mind, rather shoddy. Only philosophers (like himself) could perceive the true beauty of the invisible forms. Aquinas, Summa I, 15, 1: Are there ideas in God?, Form of the Good - Wikipedia, Allegory of the cave - Wikipedia
Here I wish to break from this heavenly tradition, following a consequence of the general theory of relativity.
Penrose, Hawking and Ellis deduced that Einstein's general theory of relativity predicts the incidence of singularities at the boundaries of the universe. There is now strong astronomical evidence for the existence of black holes, and Hawking and Ellis speculate that the big bang that initiated the emergence of the universe within the initial singularity might be understood as a "time reversed black hole". Hawking & Ellis (1975): The Large Scale Structure of Space-Time, Big Bang - Wikipedia
These singularities are considered to lie outside the laws of nature. Our experience with black holes shows that they contain energy and mass which shape the space-time around them, so controlling the motions of nearby visible structures. This suggests that the pointlike singularity at the root of the universe may contain all the energy of the universe.
This seems to me hard to imagine. From a formal point of view, this initial singularity is indistinguishable from the traditional God of Aquinas: it is outside space and time, and so eternal; it has no structure, so it is absolutely simple; it is beyond the laws of nature, so can give no meaning to energy; and it is the source or creator of the universe. Tradition, dating from Aristotle 2350 years ago and descended to us through Aquinas and Catholic theology, holds that God is pure action, actus purus. So here I will assume that the initial singularity is action rather than energy. Aquinas, Summa, I, 3, 7: Is God altogether Simple?
The absolute simplicity of the initial singularity, in the light of the cybernetic principle of requisite variety, precludes the existence of any plan of the universe within it. All we have is unbounded reproductive activity controlled by local consistency. What we need to look for is a mechanism for the emergence of the universe as we know it within the initial singularity. W. Ross Ashby (1964): An Introduction to Cybernetics
Charles Darwin has already provided us with the answer to this problem. Evolution has two facets, random variation (the source of creativity) and deterministic selection, which picks out the logically consistent survivors from the myriad of possibilities created by variation.
The ancients, like Plato, Aristotle and Aquinas divided the world into material and immaterial or spiritual. They thought that knowledge and intelligence are correlated with immateriality. Aquinas argued that God is maximally intelligent because they are maximally spiritual. Since that time, we have come to see information as a physical entity and intelligence, that is information processing, as a physical process. Intelligence is represented technologically by the work of computing machinery, biologically as processes in complex neural networks like our brains, and physically by the power of computation and communication embedded in the quantum world. Aquinas, Summa: I, 14, 1: Is there knowledge in God?, Rolf Landauer (1999): Information is a Physical Entity, Nielsen & Chuang (2016): Quantum Computation and Quantum Information
Quantum mechanics is worked out in Hilbert space, an invisible abstract linear vector space which works with complex numbers. We may think of quantum theory as a modern description of the ancient idea of spirit. It is rather diaphanous, it is invisible and exists in an environment of pure action that precedes spacetime and classical physics. It is very similar to music, in perpetual motion, feeding the visible universe with the possibilities which serve as the foundation of evolution. Interpretation of quantum mechanics - Wikipedia
We are therefore led to imagine that one of the first things to emerge within the initial singularity is spacetime. Isaac Newton imagined that space and time are completely distinct, but Einstein's special relativity shows us that they are two aspects of the same reality. One of the remarkable features of Minkowski space is the existence of null geodesics, the pathways through spacetime which are followed by massless particles like the photon. Photons are in effect eternal. We can now observe the photons of the cosmic background radiation that was emitted less than a million years after the Universe began. These photons have spent fourteen billion years travelling a distance of fourteen billion light years, yet no proper time has elapsed along their paths.
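The photon's zero proper time follows from the Minkowski interval s² = c²Δt² − Δx², which vanishes along a null geodesic. A minimal numerical sketch (my own illustration, not from the text):

```python
# Sketch: the Minkowski interval s^2 = c^2 * dt^2 - dx^2 along a straight path.
# For a photon (a null geodesic) the interval, and hence the elapsed proper
# time, is zero, no matter how far it travels in coordinate time.

C = 299_792_458.0  # speed of light, m/s

def interval_squared(dt, dx):
    """Squared spacetime interval for a displacement (dt, dx)."""
    return (C * dt) ** 2 - dx ** 2

def proper_time(dt, dx):
    """Proper time experienced along a straight timelike or null path."""
    s2 = interval_squared(dt, dx)
    assert s2 >= 0, "spacelike separation: no proper time defined"
    return s2 ** 0.5 / C

one_year = 365.25 * 24 * 3600   # seconds
light_year = C * one_year       # metres

# A photon crossing one light year: null interval, zero proper time.
print(proper_time(one_year, light_year))                  # 0.0

# A rocket at half light speed for a year ages less than a year.
print(proper_time(one_year, 0.5 * light_year) / one_year) # ≈ 0.866
```

The same arithmetic, scaled up to fourteen billion years, is why the cosmic background photons have aged not at all on their journey to us.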
This peculiar property of Minkowski space is the key to Einstein's theory of general relativity. Most of the problems of modern physics arise from the attempt to reconcile quantum mechanics and relativity. From my point of view this problem arises from the decision to build quantum mechanics on Minkowski space. It seems to me that physics has put the cart before the horse: Hilbert space is the source of Minkowski space. My feeling about the creation of spacetime appears (after a few preliminaries) on page 9: The active creation of Hilbert space.
Von Neumann shows that the universe increases its entropy (that is, creates itself) by observing itself. This implies that physics is a logical or cognitive process, akin to human self-awareness, that we might best understand by thinking of the universe as a mind. John von Neumann (2014): Mathematical Foundations of Quantum Mechanics, Chapter 5
I see gravitation as our primordial image of God. It dates from day 0 of creation, before the initial singularity differentiated into spacetime and quantum mechanics.
This essay is a condensed version of my website Cognitive cosmology.com. There is a one-to-one correspondence between the sections of this essay and the pages of that site. If you want more detail about the content of any of these sections, please click on the link at the end of the section like this one: Cognitive cosmology page 2: Introduction — top down or bottom up?
(revised 23 January 2023)
§3: Action from the unmoved mover to the quantum
Action is what makes something happen. The actors are set to move and the director calls action. Action releases a potential that changes the state of a system.
Aristotle devised a simple theory of everything with three elements: potency, action, and an axiom connecting them, namely that no potency can actualize itself. This led him to the notion of an unmoved mover as the source of all motion. Motion is a transition from potency to actuality. If nothing can move itself, everything is moved by something else. But there must be an initial source of motion, otherwise there would be no motion at all. This is Aristotle's unmoved mover which is, by definition, pure actuality. In it and through it, all potentials are actualized. Unmoved mover - Wikipedia
Aristotle's works entered the newly formed Christian universities of Europe when they were brought back from the Muslim East by the Crusaders. Albert the Great and Thomas Aquinas used Aristotle's work to build a new philosophical foundation for Christian theology. Aquinas developed a new Catholic model of God (which has since become standard) from Aristotle's theological treatment of the first unmoved mover. Thomas Aquinas, Summa, I, 2, 3: Does God exist?
Implicit in these ancient ideas are the dichotomies of God and the World, Heaven and Earth, Matter and Spirit. The leading idea is that matter is dead and inert. It cannot move itself. It cannot be the seat of understanding. It cannot be creative. Since the advent of modern physics, founded on relativity and quantum theory, these ideas are exactly wrong. Physics based on quantum theory describes a universe in perpetual motion as a gigantic network of communication (which is a form of computation) equivalent to a mind.
Medieval scholars, engineers, farmers and parents gradually learnt that ancient texts and pure reason could not fully explain the world. This attitude was strongly supported by astronomy, a science based on observation. Galileo's telescope led to radical developments in astronomy, and some conflict with ancient religious beliefs. Galileo's opinion that mathematics is the language of the universe reached a high point in Isaac Newton's description of gravitation which shows that the Moon in the heavens and apples on Earth were guided by the same structure. Galileo affair - Wikipedia, Isaac Newton (1972): Philosophiae Naturalis Principia Mathematica
Newton's work sparked huge developments in mathematics and physics which still continue. On a more philosophical level, Maupertuis (1698-1759) speculated that a wise creator would have made the world as efficient as possible, starting a train of thought that led to Hamilton's principle, the amalgamation of the calculus of variations with Lagrangian mechanics.
Joseph-Louis Lagrange sought to express classical Newtonian mechanics in a form that would make it easier to study many body problems like the solar system. His work, Mecanique Analytique placed mechanics on an algebraic rather than a geometric foundation. Joseph-Louis Lagrange (1811): Mécanique analytique Volume 1
In the Lagrangian approach the action S associated with an event x that takes place between times t1 and t2 is expressed by the action functional
S(x) = ∫ L dt.
The Lagrangian L = (T(t) −V(t)), where T and V are functions of the kinetic and potential energy of the system. Lagrangian mechanics postulates Hamilton's principle that the actual trajectory taken by a particle whose motion is constrained by T and V coincides with a stationary value of S (a fixed point in the action) which may be found using Euler's calculus of variations. Lagrangian mechanics - Wikipedia, Hamilton's principle - Wikipedia, Calculus of variations - Wikipedia
Lagrangian mechanics has been found to be very versatile and serves as a bridge between classical and quantum mechanics, quantum field theory and physical problems in general. On this basis, we might understand mechanics in spacetime as the study of action in the relationship between kinetic and potential energy.
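As a concrete sketch of Hamilton's principle, consider a free particle (V = 0) travelling from x = 0 at t = 0 to x = 1 at t = 1. Discretizing candidate paths and summing L dt shows that the straight-line path makes the action stationary (here, minimal). The example and its numbers are my own illustration, not drawn from the text:

```python
# Hedged numerical sketch of Hamilton's principle for a free particle (V = 0).
# Discretize a path x(t) into N segments, compute the action S = sum of T * dt,
# and check that the straight-line (classical) path has a smaller action than
# a wiggled variation with the same endpoints.

import math

def action(xs, dt, mass=1.0):
    """Discretized action S = sum of 0.5 * m * v^2 * dt for a free particle."""
    S = 0.0
    for a, b in zip(xs, xs[1:]):
        v = (b - a) / dt
        S += 0.5 * mass * v * v * dt
    return S

N, T = 100, 1.0
dt = T / N
ts = [i * dt for i in range(N + 1)]

straight = [t for t in ts]  # classical path x(t) = t, from x(0)=0 to x(1)=1
wiggled = [t + 0.1 * math.sin(math.pi * t) for t in ts]  # same endpoints

print(action(straight, dt) < action(wiggled, dt))  # True: stationary (minimal) action
```

Any other variation with the same endpoints gives the same result: the classical trajectory is the fixed point of the action functional.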
Quantum mechanics began with Planck's discovery, in 1900, that action is quantized and that the quantum of action h is the constant of proportionality between the energy of radiation and its frequency. This is now the fundamental equation of quantum theory, E = ℏω where ℏ is the reduced Planck constant, h / 2π and the frequency is expressed in radians per second, ω, understood as the rate of change of the phase of a quantum state ∂ |φ> / ∂ t. Max Planck: On the Law of Distribution of Energy in the Normal Spectrum
For Aristotle and Aquinas action is a metaphysical term, but here we see that it has a physical realization, providing a bridge between physics and metaphysics in a way analogous to its role in coupling classical and quantum mechanics. Dirac found that this role goes deeper, and Feynman used it to create a new representation of quantum mechanics, the path integral formulation.
Quantum mechanics came of age in the 1920's in two versions known as wave mechanics (the Schrödinger equation) and matrix mechanics. These were shown to be equivalent by Schrödinger, given a clear abstract symbolic expression by Dirac and a sound mathematical foundation by von Neumann using linear operators in abstract Hilbert space. Dirac notes that the major features of quantum mechanics are linearity and superposition. Schrödinger equation - Wikipedia, Matrix mechanics - Wikipedia, Paul Dirac (1983): The Principles of Quantum Mechanics, chapter 1, John von Neumann (2014): Mathematical Foundations of Quantum Mechanics [ref above]
Feynman introduced a third approach to quantum mechanics which has since found favour because it provides a more direct route to quantum field theory and string theory. His path integral formulation seeks a stationary superposition of the contributions of all possible space-time paths between an initial and a final state. In principle, this set of paths spans the whole classical universe so the formulation depends implicitly on the idea, discussed in §13, that the quantum world is prior to and independent of spacetime. Feynman & Hibbs (1965): Quantum Mechanics and Path Integrals, Path integral formulation - Wikipedia
The path integral relies on the three general principles of quantum mechanics formulated by Feynman:
Feynman lectures on physics III Chapter 3: Probability Amplitudes
1. The probability that a particle will arrive at x, when let out at the source s, can be represented quantitatively by the absolute square of a complex number called a probability amplitude — in this case, the “amplitude that a particle from s will arrive at x.”
2. When a particle can reach a given state by two possible routes, the total amplitude for the process is the sum of the amplitudes for the two routes considered separately. There is interference.
3. When a particle goes by some particular route the amplitude for that route can be written as the product of the amplitude to go part way with the amplitude to go the rest of the way.
The path integral computes the probability amplitude for a particle to go from s to x by dividing every possible path into infinitesimal segments and multiplying the amplitudes for the particle to cross each segment according to principle 3 to get the amplitude for the whole path, adding these amplitudes as required by principle 2 and computing the probability represented by the resulting amplitude by principle 1. The process works and contributes to computations in quantum field theory which precisely match observation. But, we might ask, does nature really work this way? If we are to consider the quantum of action as an atomic event, can we trust the mathematical fiction that a quantum path can be sliced into an infinity of infinitesimal events?
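Feynman's three principles can be illustrated with a toy two-route (two-slit) calculation. The setup and numbers below are my own hedged example, not Feynman's: each route contributes a unit-magnitude complex amplitude whose phase grows with path length (principle 3 collapses the product of segment amplitudes into a single phase per route), the two routes are added (principle 2), and the probability is the absolute square (principle 1):

```python
# Toy illustration of Feynman's three principles with two interfering routes.

import cmath

def route_amplitude(path_length, wavelength):
    """Amplitude for one route: unit magnitude, phase 2*pi*L/lambda (principle 3)."""
    return cmath.exp(2j * cmath.pi * path_length / wavelength)

def detection_probability(l1, l2, wavelength):
    """Two routes interfere: |A1 + A2|^2 (principles 2 and 1), scaled to max 1."""
    total = route_amplitude(l1, wavelength) + route_amplitude(l2, wavelength)
    return abs(total) ** 2 / 4

wl = 1.0
print(detection_probability(10.0, 10.0, wl))  # equal paths: constructive, ≈ 1
print(detection_probability(10.0, 10.5, wl))  # differ by half a wavelength: ≈ 0
```

Varying the path difference sweeps the probability between these extremes, producing the familiar interference fringes; the full path integral does the same sum over infinitely many routes.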
A more detailed discussion of the material covered in this section is available at cognitive cosmology.com page 3: Action from the unmoved mover to the quantum.
(reviewed Sunday 29 January 2023)
§4: Theology: a new paradigm?
Thomas Kuhn, a physicist, historian of science and philosopher, proposed that scientific revolutions are caused by changes in the "disciplinary matrix" or paradigm shared by people working in a particular field. The paradigm represents "business as usual", dealing with small problems as they arise. The time comes, however, when a large accumulation of problems in a discipline forces people to take an entirely different view of things. Kuhn noted that this view may be so different that it is difficult to see its connection with the old system. Thomas Kuhn (1996) The Structure of Scientific Revolutions, Alexander Bird (Stanford Encyclopedia of Philosophy): Thomas Kuhn
Einstein's theory of gravitation itself marked a paradigm change. Newton had a theological bent, and he took a "God's eye" view of the solar system as he developed his theory of universal gravitation. He envisaged an absolute space and an independent time and saw the Solar System as created and sustained by God.
Einstein, on the other hand, worked from the point of view of a person within the Universe. Since the time of Descartes it had become customary in physics to establish coordinate frames of reference from which to measure the positions and motions of physical particles. From earliest times astronomers have sought to establish such frames to understand motions in the heavens. This is not simple. We inhabit an orbiting and revolving planet, and all the other planets and moons are also in motion.
In classical physics it is axiomatic that physical phenomena are totally independent of the choice of reference frame used to measure them. This requires a mathematical relationship of covariance and contravariance between frames and reality. If we rotate the frame of reference clockwise, we must rotate the measurements taken from this frame anticlockwise so as to maintain consistency with the measurements taken before the frame is moved.
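The covariance and contravariance relationship described above can be checked in two dimensions. In this sketch (my own example, not from the text) the frame is rotated clockwise by 30°, the components of a fixed vector are rotated counterclockwise by the same angle, and reconstructing the vector from the new components and the new basis recovers the original:

```python
# If the basis vectors of a frame rotate one way, the components of a fixed
# physical vector rotate the opposite way, leaving the vector itself unchanged.

import math

def rotate(v, angle):
    """Rotate a 2-D vector counterclockwise by angle (radians)."""
    c, s = math.cos(angle), math.sin(angle)
    return (c * v[0] - s * v[1], s * v[0] + c * v[1])

theta = math.pi / 6           # rotate the frame 30 degrees clockwise ...
v_old = (3.0, 4.0)            # components measured in the old frame
v_new = rotate(v_old, theta)  # ... so components rotate 30 degrees counterclockwise

# Reconstruct the physical vector from the new components and the rotated
# basis vectors (expressed in the old frame): it matches the original.
e1 = rotate((1.0, 0.0), -theta)
e2 = rotate((0.0, 1.0), -theta)
reconstructed = (v_new[0] * e1[0] + v_new[1] * e2[0],
                 v_new[0] * e1[1] + v_new[1] * e2[1])
print(reconstructed)  # ≈ (3.0, 4.0)
```

The opposite rotations cancel exactly, which is the point of the covariance requirement: the physics does not depend on the frame.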
What Einstein sought was general covariance, covariance extended to frames in accelerated motion. His work had the remarkable result that the influence of gravitation, which is present in every moment of our lives, is caused by the geometry of spacetime. Newton knew that there was a force acting between the Earth and the Moon that explained the Moon's orbit, but all he could say was that this force is something established by God, about which he could say nothing more. He might have been amazed had Einstein told him that in the new geometric paradigm of general covariance gravitation is a consequence of the "shape" of spacetime, which is itself a consequence of the distribution of energy throughout spacetime. Gravitation is treated in detail (after pages 8 to 21 on quantum theory) on page 22: Gravitation and quantum theory—in the beginning
As well as gravitation, Newton also sought to understand the nature of light. He discovered that white light comprises the colours of the rainbow, laying the foundation for the spectroscopic studies that led to quantum mechanics. Opticks - Wikipedia
Much of Newton's work used sunlight, but spectroscopists found that artificial light made by candles, lamps and other hot bodies also exhibited a spectrum of colours. In the middle of the nineteenth century Gustav Kirchhoff proposed that there was a relationship between the temperature and the spectrum of a hot body:
For a body of any arbitrary material emitting and absorbing thermal electromagnetic radiation at every wavelength in thermodynamic equilibrium, the ratio of its emissive power to its dimensionless coefficient of absorption is equal to a universal function only of radiative wavelength and temperature. That universal function describes the perfect black-body emissive power. Kirchhoff's law of thermal radiation - Wikipedia
Kirchhoff's proposition set off a search for this universal function. Success finally came with Max Planck's discovery in 1900 of a function that accurately modelled the relationship between wavelength and temperature. He achieved this by rejecting the classical idea that the emitted radiation formed a continuum and proposing instead a radical new assumption: that thermal radiation is emitted in discrete packets. Each packet carries energy proportional to its frequency, giving the fundamental equation of quantum mechanics E = hf, where h is a new universal constant, now known as Planck's constant, and f is the frequency of the radiation measured by spectroscopists. The universal spectral function Planck derived from this assumption is known as Planck's law. Max Planck (1901): On the Law of Distribution of Energy in the Normal Spectrum, Planck constant - Wikipedia, Planck-Einstein relation - Wikipedia
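Planck's relation E = hf is easy to evaluate numerically. The following sketch (my own illustration, using the standard SI values of h and c) computes the energy of a single quantum of green light:

```python
# Numerical illustration of Planck's relation E = h * f.

H = 6.62607015e-34   # Planck's constant, J·s (exact by SI definition)
C = 299_792_458.0    # speed of light, m/s

def photon_energy(wavelength_m):
    """Energy of one quantum of radiation: E = h * f = h * c / wavelength."""
    frequency = C / wavelength_m
    return H * frequency

green = 500e-9  # 500 nm, green light
print(photon_energy(green))        # ≈ 3.97e-19 J per photon
print(1.0 / photon_energy(green))  # ≈ 2.5e18 photons per joule of green light
```

The tiny size of h is why the quantization of light escaped notice for so long: everyday light sources emit so many quanta that the radiation appears continuous.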
It took another thirty years for quantum mechanics to become a fully fledged theory. Since then it has been engaged in a quest, now nearly a century long, to come to terms with relativity. Nevertheless the technological fruits of quantum theory, ranging from thermonuclear weapons to extraordinary astronomical instrumentation, have completely changed many aspects of modern life.
Kuhn's theory suggests that paradigm changes like quantum mechanics are forced upon us when scientific business as usual begins to feel that it has hit a wall. Do we need a paradigm change in theology and religion?
Christianity brought a theological paradigm change. Jesus of Nazareth changed the divine attitude. He was nothing like God the Father at Sinai who announced "I am the Lord your God and you shall have no other Gods before me". This God then ordered his servant Moses to murder all the people who were following an ancient tradition and worshipping a Golden Calf. Exodus 32: Moses slaughters the worshippers of the Golden Calf
Jesus had a temper, but he directed it at the oppressors of his people. He did not use his divine power to kill anyone. Instead he healed people sick with both physical and psychological disease, fed the hungry and preached the love of all, even Samaritans. The Roman occupiers of his homeland, apparently abetted by the local ruling class, identified him as a dissident and had him tortured to death. Tradition has it that he came alive again, founded a church to carry his message to the world and ascended into heaven, sending the Holy Spirit to nurture his new followers.
The Church is now approximately 2000 years old and has the status of a sovereign nation, established by the Lateran Treaty in 1929. It is a huge organization, an absolute, allegedly infallible autocracy with approximately 1.5 billion members and approximately a million staff. Carol Glatz (2021): Vatican statistics show continued growth in number of Catholics worldwide
The Church is based on faith. According to the definition given by Aquinas, taken from Paul's Epistle to the Hebrews, faith explicitly rejects evidence, replacing it with hope. Experience shows that it rests on a long series of unverifiable hypotheses, which are nevertheless claimed to be true. It is therefore unscientific and the Church has no means to establish that it can deliver the heavenly goods it promises to its followers. Aquinas, Summa: II, II, 4, 1: Is this a fitting definition of faith: "Faith is the substance of things hoped for, the evidence of things that appear not?" (Hebrews 11:1)
There are many features of the Catholic Church which are repugnant to modern ideas of corporate governance and social licence. Because it is such a huge and ancient organization it has enormous social momentum and will not be easy to turn around.
I imagine that the starting point for this reformation is the establishment of a scientific theology, that is a set of beliefs about the nature of the Universe and ourselves that are consistent with reality. The following pages are my attempt to articulate the position I have reached over a long lifetime. I began with absolute fealty to the Catholic Church and slowly moved to a strong feeling that it is a massive burden on the human spirit, actively propagating false hypotheses about ourselves and our world.
It is a common ploy for politicians wishing to cast themselves as saviours of the people to construct a paper tiger to represent the danger they are defending against. The story of the Fall plays this role in Christianity. It is a story with no factual or scientific basis whatever. The current Pope Francis has surprised us with his frequent mentions of Satan, a mythical being. Anthony Faiola: A modern Pope goes old school on the Devil
It is true that there is evil in the world, but it does not arise from the anger of an omnipotent and omniscient creator; it arises from the nature of the evolutionary process that brought the universe to be. The fundamental problem is that while living creatures can reproduce exponentially, the resources available to support them are both limited and variable, so that times will come when there is not enough to go around. In the human world war and pillage then become to some degree rational: for many it is better to die fighting than to die of starvation. I return to this issue on page 6: Evolution: genetic memory, variation and selection
The Catholic Church, having comprehensively established the human species as a vast multitude of sinners, has then taken upon itself to be our saviour. The salvation on offer, however, is not of this world but in an afterlife.
Given that there is no more evidence for an eternal afterlife than there is for an original sin, this whole program is impossible to believe. If Churches were subject to the normal consumer protection that operates in civilized democracies, such a Church should not be permitted to collect funds on the claim that it is capable of rendering the services it offers.
Of course it may be considered that I am bitter because as a youth I fell for this whole story hook, line and sinker. I joined the Dominican Order (encouraged by my teachers) and took vows of poverty, chastity and obedience in the belief that I was securing my salvation. Later I found that the story I had been told was a dream, a castle in the air.
I am not bitter about my past. I am excited. I was saved by writing an essay undermining the Catholic business plan and was expelled from the Order. During my time in the Order I read the work of Aquinas continually and slowly formed the opinion that if one could replace the rather rudimentary Aristotelian science available to Aquinas with modern science, one could make a good case for establishing a scientific theology in the modern sense of science. That is my objective here. Jeffrey Nicholls (1967): How universal is the universe?, Aquinas, Summa, I, 1, 2: Is sacred doctrine a science?, Fortun & Bernstein (1998): Muddling Through: Pursuing Science and Truths in the Twenty-First Century
(revised Sunday 29 January 2023)
§5: God's ideas, cybernetics and singularities
Plato was among the first to take the idea of disembodied form seriously and it has remained central to theology ever since, particularly in the common idea that we have an immaterial spiritual soul which is eternal because it has no material parts which can come apart.
Since the development of computation and information theory, however, the idea that immaterial forms have independent existence has fallen out of favour. Instead we understand that information is carried by marks, like the printed symbols that constitute this page. We have found that matter is enormously complex, right down to the level of atoms and fundamental particles, and so capable of representing huge amounts of information.
Although we no longer think that disembodied forms have independent existence, the idea found a new life in the study of infinity. The study of infinity was motivated in the nineteenth century by the need to put the ideas of differential and integral calculus on a firm logical footing. The Pythagorean theorem showed that many geometric objects, like the diagonal of a unit square, cannot be represented by rational numbers. This led to the development of real numbers, whose role is to provide a name for every point in a geometric continuum. Square root of 2 - Wikipedia, Real number - Wikipedia, Differentiable function - Wikipedia, Mathematical analysis - Wikipedia
Georg Cantor worked in a mathematical milieu where it was believed meaningful to create a continuum from closely spaced discrete points. Cantor's theorem - Wikipedia
Cantor's work led to a revival of formalism. Some theologians felt that Cantor's work verged on pantheism, but he was defended by Hilbert, who introduced an explicitly formalist approach to mathematics and declared Aus dem Paradies, das Cantor uns geschaffen, soll uns niemand vertreiben können. (From the paradise, that Cantor created for us, no-one shall be able to expel us.) Cantor's paradise - Wikipedia, Joseph Dauben (1990): Georg Cantor: His Mathematics and Philosophy of the Infinite, page 144 sqq.
Cantor's introduction of set theory not only clarified the foundations of mathematics, but also introduced paradoxes that led to careful reevaluation of mathematical proofs. Hilbert believed that a proper application of formalist methods would eventually solve all these problems. This was not to be.
Hilbert defined what he considered to be the ideal features of formal mathematics. It should be consistent, complete and computable.
Consistency means that no valid argument can arrive at the conclusion P ≡ not-P. Completeness means that every properly formed mathematical statement can be proved either true or false. Computability means that definite algorithms exist to decide the truth or falsity of any such statement.
Kurt Gödel upset Hilbert's expectation of completeness by establishing that consistent mathematics is incomplete. Gödel's incompleteness theorems - Wikipedia
Soon afterwards Alan Turing established the existence of incomputable functions. Alan Turing (1936): On Computable Numbers, with an Application to the Entscheidungsproblem
There is a second modern development about which the ancients knew very little: cybernetics. One of its founders, Norbert Wiener, defined it as the science of control and communication in the animal and the machine. Norbert Wiener (1996): Cybernetics or Control and Communication in the Animal and the Machine, Cybernetics - Wikipedia
Following Galileo, mathematical modelling has become a primary tool of modern physics. Aquinas believed that an omniscient and omnipotent God has total deterministic control of every event in the world. Gödel found that logically consistent formal systems are not completely determined, and Chaitin interpreted Gödel's work as an expression of the limits to control known as the cybernetic principle of requisite variety: one system can control another only if its entropy is equal to or greater than that of the system to be controlled. This principle suggests that a completely structureless initial singularity has no power to control anything, even itself. Insofar as the initial singularity acts, its action must be a random event. Only later, as the universe grows in structure and entropy, does the possibility of control emerge, but it can never be perfect. The entropy of the future is almost always greater than the entropy of the past. Aquinas, Summa, I, 22, 3: Does God have immediate providence over everything?, Gregory J. Chaitin (1982): Gödel's Theorem and Information
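The requisite-variety inequality can be put in numerical form. This is a toy sketch, not a physical model: the controller and system distributions are hypothetical, and variety is measured as Shannon entropy.

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# Hypothetical distributions: a controller with 2 equiprobable states
# (1 bit of variety) facing a system with 8 equiprobable states (3 bits).
controller = [0.5, 0.5]
system = [1 / 8] * 8

# Requisite variety: control requires the controller's entropy to be
# at least that of the system to be controlled.
can_control = entropy(controller) >= entropy(system)
print(can_control)  # False: 1 bit of variety cannot steer 3 bits

# A structureless singularity has a single state and zero entropy,
# so on this principle it cannot control anything, even itself.
print(entropy([1.0]))  # 0.0
```

On this picture, the growth of entropy in the universe is precisely the growth of the possibility (though never the perfection) of control.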
This same principle invalidates the idea that the traditional God would have planned the universe from the beginning.
Things go wrong when we lose control, but absolute control forecloses creativity. The inability of the initial singularity to control itself opens the way for the creation of new structures in the universe, making emergence and evolution possible.
As we shall see, our universe was not planned beforehand but has evolved through its own unlimited random activity. Some products of this activity do not last; others are consistent and durable, and these are the ones selected to exist, giving us the more or less permanent features of the world we see.
(Revised Sunday 29 January 2023)
§6: Evolution, genetic memory, variation and selection
Charles Darwin is best remembered for explaining the origin of species by evolution through variation and selection. Charles Darwin (1859): The Origin of Species: By Means of Natural Selection, or the Preservation of Favoured Races in the Struggle for Life
Biological evolution is just part of the story of creation. Life began on Earth about 4 billion years ago, but the creation of an Earth capable of sustaining life took the ten billion years before that. This began with the emergence of spacetime and the generations of stars and supernovae that created the chemical elements that constitute our planet. Stellar nucleosynthesis - Wikipedia
Every living creature carries two copies of itself: one a formal abstract representation in its DNA genetic code, the other its full physical reality. Genes provide a link between generations, and the evolution of species arises from genetic changes from generation to generation. Nucleic acid - Wikipedia, Human Genome - Wikipedia
While the function of genetics in biology is now well understood, it remains an open question whether analogous processes can be invoked to explain the development of the structure of the universe from the initial singularity to the origin of life and the evolution of human intelligence. The remainder of this site examines this question in detail.
Basic living metabolism has changed little since the emergence of life. This structure has been reproduced trillions of times over billions of years. Here we see an important clue to the way the world worked before life came along. Life is a periodic process, or should we say an almost periodic function, since it changes from moment to moment.
Quantum mechanics, our fundamental theory of the universe, also deals in periodic functions. In humans the period between generations is about 25 years. In the quantum mechanical world the period between events varies from infinitesimal fractions of a second to billions of years. Periodicity, an event that repeats itself time after time, points to the existence of some form of memory.
Random variation is the first requirement for evolution and the source of its creativity. The key to variation and probability is lack of control, discussed on page 5: God's ideas, cybernetics and singularity in sections 5.8 to 5.10.
Selection, on the other hand, depends on controlled activity. Those creatures that survive depend on both the memory embedded in their genes and things they have learnt since birth. This is particularly important in our own survival, which depends not only on our genes and upbringing but also on the collective memory embedded in our cultures.
The big bang hypothesis imagines that the initial singularity is enormously energetic at a very high temperature, although neither energy nor temperature have meaning in the absence of spacetime structure. Our experience with high energy particle accelerators has shown us that we can produce large numbers of particles from the collision of massive bubbles of almost pure energy. This suggests that very fast evolutionary processes can build a spectrum of particles from a structure very close to the singularity.
Events in the universe cover a wide range of time scales. It is about 14 billion years since the big bang, and there is no reason to suggest that the universe will not last for a long time yet. Some particles, like the proton, are believed to have a lifetime many orders of magnitude greater than the life of the universe. At the other end of the scale is the Planck time, perhaps the smallest physically relevant period. Heat death of the universe - Wikipedia, Proton decay - Wikipedia, Planck units - Wikipedia
If the universe begins as a zero energy quantum of action, we can imagine the initial processes proceeding at a relatively leisurely rate, so that such structures as come to be at an early time have a long lifetime and thus provide a selective environment for shorter lived structures that appear later in the emergence of the universe. This timing replaces genetic memory in the process of biological evolution and may enable physical evolution in the absence of life and genes.
All of this discussion arises from the fact that we are conscious, intelligent organisms seeking to understand ourselves and our place in the world. From our point of view, our mental capacity is the ultimate product of the evolution of the universe. By studying ourselves we get clues to the nature of the system that created us.
The problem then becomes how can something that is absolutely simple develop complexity. The ancient answer lies in the Christian doctrine of the Trinity to be explained in §8: The theology of the Trinity.
Our minds are a product of our brains. This system is a vast communication network. Our minds operate on a scale of complexity comparable to the internet which connects billions of computers into a worldwide network. Our next question therefore is Are networks intelligent? The model of the world developing here is derived from a model of the mind presented in the next section §7. Central nervous system - Wikipedia
(revised 30 January 2023)
§7: Networks, bodies and brains
God created mankind in his own image;
in the image of God he created them; male and female he created them. Genesis 1:27: God creates humans
Aristotle was one of the first to make a detailed study of human psychology in his De Anima. He applied his ideas of matter and form and potency and act to sensation and understanding. Aristotle: On the Soul - Wikipedia
If the universe is divine, we can study it scientifically and learn the real mind of God. One way to approach the divine mind, made possible by the fact that we are created in the image of God, is to think about our own minds.
Since we became conscious and began to talk to one another about our feelings and experiences we have found that nothing interests us as much as each other. This discussion covers a very wide spectrum from the language of love and cooperation to the language of criticism, abuse and violence. The superpower that we share with the universe is creative imagination. Nowhere do we exercise this power more vividly than in the category of discourse known as gossip, sharing hypotheses about what the people around us are really doing and thinking.
The overall framework for this picture is a communication network. One beauty of networks is that they are, like quantum theory, symmetrical with respect to complexity. The fundamental properties of quantum mechanics are the same in a Hilbert space of 2 dimensions or of a countable infinity of dimensions. The fundamental properties of a communication network are the same whether we are considering the "network atom" of two sources communicating through a channel, or a network like the internet, in which any number of sources communicate through a set of channels connecting each one to all the rest.
We might understand this process by comparing a fertilized human egg to the information processing system that grows out of it. I am much more complex than the egg I grew from. Although this is consistent with the second law of thermodynamics, that entropy increases, it seems to contradict the idea that nothing comes from nothing.
The information in my egg is encoded in my genome, a DNA string of some three billion symbols (A, T, G, C), each representing 2 bits of information, for a total of about 6 × 10^9 bits, roughly a gigabyte.
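This arithmetic can be checked on the back of an envelope, taking the round figure of three billion symbols as given:

```python
# Rough information capacity of the human genome, assuming ~3 billion
# bases, each one of four symbols (A, T, G, C), i.e. log2(4) = 2 bits.
symbols = 3_000_000_000
bits_per_symbol = 2
total_bits = symbols * bits_per_symbol
print(total_bits)              # 6000000000 bits, of order 10^10
print(total_bits / 8 / 2**30)  # ~0.7 gibibytes, i.e. roughly a gigabyte
```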
Life is an electrochemical process based on insulating membranes and ionic motors which create and utilize electrical potentials across the membranes. This system is closely analogous to the techniques of electrical engineering. Multicellular plants rely on electrochemical signalling to coordinate the operations of individual cells. All but the simplest of animals use neural networks, both for internal housekeeping and for interaction with the world around them. Cell signalling - Wikipedia
Neural networks are constructed from neurons, cells (sources) adapted to receive, process and transmit electrical signals. The connectivity in the network is high. Signals are transmitted along the fibres of the neural network by discrete voltage spikes known as action potentials, which propagate along the fibre at quite high velocity. All these action potentials are effectively identical, like the digits in a computer network. Their information content is a function of their timing.
The principal functional connections between neurons are synapses. Processing and memory in a neural network are modulated by synaptic weights, a measure of the level of influence, positive or negative, that a particular synapse has on the neuron to which it is connected. The details of a neural network are extraordinarily complex; there are many different neurotransmitters and many varieties of cells which perform auxiliary functions associated with the network.
The ontogenetic development of an individual human brain poses an interesting problem in network creation. An important source of formal guidance in the development of any living creature is the genome. Formally, programmed deterministic development is subject to the cybernetic principle of requisite variety, which sets the conditions of completeness and computability that a process must meet to be deterministic enough to have a high probability of success. Protein Biosynthesis - Wikipedia
The human central nervous system comprises some 100 billion neurons each with possibly 1000 connections to other neurons.
In the specification of a standard engineered computer network, every physical connection is precisely specified by source and destination. Calculation reveals that the information required to specify all the connections in a human brain is approximately a million times greater than the information content of the genome. Some other mechanism must therefore account for the connective structure of the brain, which is to say that to a large degree this system must define itself. Brains must have a self-structuring property.
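The calculation alluded to here can be sketched with round numbers. Every figure below is an order-of-magnitude assumption; with these inputs the ratio comes out in the hundreds of thousands, consistent with the "approximately a million" cited above.

```python
import math

# Assume ~10^11 neurons, each with ~10^3 synaptic connections.
neurons = 10**11
connections = neurons * 10**3            # ~10^14 synapses in total

# Naming one neuron among 10^11 takes ceil(log2(10^11)) = 37 bits,
# so an explicit wiring list would need roughly:
bits_per_address = math.ceil(math.log2(neurons))   # 37
wiring_bits = connections * bits_per_address       # ~3.7e15 bits

# Genome capacity: ~3 billion symbols at 2 bits each.
genome_bits = 6 * 10**9
ratio = wiring_bits / genome_bits
print(ratio)   # several hundred thousand: the wiring list dwarfs the genome
```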
The explanation appears to be a form of evolution by natural selection. The neurons in an infant brain seek out synaptic connections with one another, a process which is to a large degree random, creating an excessive number of connections. There follows a process of pruning which continues through the teenage years, eliminating little used connections. We might imagine that a similar process will eventually remove much of the rubbish on the internet, links withering for lack of use. Synaptic pruning - Wikipedia
As well as determining the wiring of the brain over a period of years, experience determines the synaptic weights connecting neurons. Changes in weight may occur in milliseconds during the real time processing of speech, or over a lifetime during the acquisition of knowledge and experience. The physical development of a brain is thus closely related to the reception of information from the environment via the senses and feedback from the results of actions (like learning to walk). It serves as a microcosm of the development of the universe. Our minds are the product not just of our genes, but of the environment in which we find ourselves.
Mental evolution provides us with an enormous advantage, since thought is cheaper than action. In the natural world of evolution by natural selection many newborns fail to reproduce for one reason or another. In some species this failure rate may be very high, thousands being born for every one that survives and reproduces. In more complex species like ourselves most children are carefully nurtured by their parents, leading to a high rate of survival.
Cognitive cosmology sees the universe as a mind, a creative mind, and we are the ideas in that mind, created over many billions of years by a long and complex process of evolution that we have really only become aware of in the last two centuries.
Human cultural evolution seems slow. In particular we have found that a century is a short time in the development of theology. But compared to the biological evolution of the world, we see cultural, scientific and technological changes occurring in centuries where evolutionary changes require thousands or millions of years.
(revised 30 January 2023)
§8: The theology of the Trinity
Hebrew monotheism was carried over into Christianity but was modified by a remarkable theological development: the sole Hebrew God Yahweh became a member of the Christian Trinity. There emerged, within the Christian God, three really distinct divine persons, Father (Yahweh), Son (Jesus of Nazareth) and Spirit. Dale Tuggy: History of Trinitarian Doctrines
The Christian doctrine of the Trinity was firmly established in the Nicene Creed based on the authority of the New Testament. The theological reconciliation began with Augustine and was further developed by Aquinas and Lonergan. Nicene Creed - Wikipedia, Augustine (419, 1991): The Trinity
Here I take the theological insights developed to explain how three could be seamlessly combined into one without breaking the unity of the one as a starting point for the creation of our immensely complex universe from an initial singularity. Two features of the traditional theology stand out.
First, Augustine based his model of the Trinity on an ancient picture of human psychology as a combination of intellect and will. This led me to suspect that a divine universe might best be understood through a cognitive model that combines physics and theology.
Second, although the Trinity is limited for dogmatic reasons to three, the idea that divinity proceeds from divinity has no logical limit. The notion of trinity could be extended to any number of persons: from trinity to transfinity. Unmoved mover - Wikipedia, Thomas Aquinas, Summa, I, 2, 3: Does God exist?, Aristotle: 1072b14 sqq: Metaphysics book XII; The life of God
Aquinas derived all the traditional properties of God from the conclusion that God is pure act, actus purus, a consequence of the proof for God's existence which he received from Aristotle.
The first clue to an explanation may be found in John's gospel, which begins: "In the beginning was the Word, and the Word was with God, and the Word was God." (John, 1:1). This sentence may allude to the ancient psychological belief that the source of the words that we speak is the mental "words" (ideas, forms) that enter our consciousness as we think about what we are going to say. Because their God is absolutely simple, traditional theologians hold that attributes which are accidental in the created world are substantial in God. God's concept of themself, God's word (Latin verbum, Greek logos), is therefore considered identical to God. The author of the Gospel identifies this word with Jesus, the Son of God, the second person of the Trinity, who "was made flesh and dwelt among us". The relationship of this Word of God to God has been discussed at length by the twentieth century theologian Bernard Lonergan in his book Verbum. John the Evangelist: The Gospel of John (KJV), Bernard Lonergan (1997): Verbum: Word and Idea in Aquinas, Lonergan (2007): The Triune God: Systematics, Lonergan (2009): The Triune God: Doctrines
The human psychological foundation of the procession of the Son from the Father is therefore the mental image each one of us has of ourselves; this emergence of the Son from the Father is called procession. The love of self that appears in us as a mental accident when we contemplate our self-image becomes, in the Trinity, a real divinity: the Holy Spirit, understood as the divine manifestation of the love of the Father for the Son. Aquinas, Summa, I, 27, 1: Is there procession in God?
In each case, the person is truly divine, that is truly pure act, so the processions of the persons may be conceived as pure actions producing pure actions. This process is limited to the production of three persons by the Christian dogma of the Nicene Creed. If, however, we accept that every action may produce action, there is no limit to the process.
The doctrine of the Trinity presents a logical problem insofar as it claims that the three persons are identically God while being clearly distinct from one another. This problem can be solved by assuming that the Trinity defines a space. The defining property of a space is that it enables identical elements to exist, while being distinguished by occupying different points in the space. From this point of view, the Trinity might be seen as a three dimensional space with three orthogonal dimensions, each of which is not the other but which are nevertheless one space. On page 9: The active creation of Hilbert space I suggest that the natural children of a quantum initial singularity are the orthonormal basis vectors of a complex Hilbert space, each of which represents a quantum of action.
It may seem counterintuitive to identify the quantum of action, the smallest possible event in the universe, with the traditional divinity, the largest imaginable being. The resolution of this conundrum lies in the idea that the Universe is best understood as a mind and the quantum of action not as a physical object in spacetime but as a logical operator, an element of a cosmic information processing system. This view is supported by the idea that quantum mechanics is best understood as a model of information processing in the Universe, that is computation and communication. Nielsen & Chuang (2016): Quantum Computation and Quantum Information
(Revised Tuesday 31 January 2023)
§9: The active creation of Hilbert space
In its simplest incarnation, we may consider the quantum of action as a not operator. This operator changes some situation p into some not-p. In the binary realm of propositional logic, we understand that not-not-p = p, but in the wider world I can see, for instance, that there are about seven billion instances of not-me, each one living in different circumstances.
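The binary case can be put in a line of code. This is a trivial sketch of the logical point, nothing more:

```python
# The quantum of action pictured as a logical "not" operator.
def not_op(p: bool) -> bool:
    """Turn a situation p into not-p."""
    return not p

p = True
print(not_op(p))          # False: p has become not-p
print(not_op(not_op(p)))  # True: in binary logic, not-not-p = p
```

Beyond the binary realm the picture widens: there are billions of distinct instances of not-me, so "not" need not return to its starting point.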
The effect of this definition is to interpret action in terms of logic which may be understood to be purely formal. Hilbert, Whitehead and Russell's formal mathematics was considered to exist outside spacetime. Nevertheless it does have spacetime support in the literature and in the brains of its authors and students. Hilbert's program - Wikipedia, Whitehead (1910, 1962): Principia Mathematica
Von Neumann showed that quantum mechanics is best described using an abstract Hilbert space. Hilbert space is a complex linear vector space analogous to Cartesian space, with a metric defined by an inner product. Von Neumann describes this space as complete and separable so it may have a countably infinite number of dimensions, ℵ0. Physical states are represented by rays in this Hilbert space, and we assume that the initial state of the quantum initial singularity has 1 mathematical dimension represented by the complex plane, the initial ray. John von Neumann (2014): Mathematical Foundations of Quantum Mechanics, Inner product space - Wikipedia, Complete metric space - Wikipedia, Separable space - Wikipedia
The theory of quantum computation and quantum information operates in an n-dimensional Hilbert space. Nielsen & Chuang (2016): Quantum Computation and Quantum Information page xxix
Each vector in a Hilbert space may represent a quantum of action, and we assume that since the specific dynamic property of a quantum of action is to act, the initial singularity will eventually become populated with ℵ0 states. The final result is an ℵ0 dimensional Hilbert space of orthonormal basis vectors. On page 23: Quantum field theory I consider using a Fourier transform to map the ℵ0 quantum states onto the set of ℵ0 Turing machines to make a quantum computer network and to generate what I call a Turing vacuum, a toolkit of processes for constructing a universal network.
The abstract Hilbert space defined by von Neumann as the domain of quantum mechanics is also a formal immaterial structure. The orthogonal basis vectors of this space have zero pairwise inner products, like the basis vectors of Euclidean space. Here we imagine the ultimate creation of a series of Hilbert spaces of up to ℵ0 dimensions which constitute the formal foundation of a quantum mechanical vacuum. The orthogonality of this set of basis vectors is guaranteed by the quantum no cloning theorem. All Hilbert spaces are essentially identical, differentiated only by their number of dimensions. Hilbert space - Wikipedia, No-cloning theorem - Wikipedia
Let us assume that the initial singularity is pure action identical to the traditional God, and that it has the power to reproduce itself indefinitely free of the dogmatic limitations imposed upon the Trinity. We may guess that a sequence of actions creates Hilbert space, dimension by dimension.
We identify the simplest Hilbert space with the initial singularity. We can represent the first new vector created by the action of the initial singularity by the symbol |0> and the second by |1>. This vector is orthogonal to |0>. The superposition of |0> and |1> gives the qubit, represented by the symbol |qubit> = a|0> + b|1>, where a and b are complex numbers. Qubit - Wikipedia
The qubit and all subsequent superpositions of newly created vectors a|0> + b|1> + c|2> . . . are normalized by the requirement that |a|² + |b|² + |c|² . . . = 1. This normalization, which is maintained by the unitary operators of quantum systems, establishes that any quantum system obeys the probabilistic structure of a communication source, to be described on page 11: Quantization: the mathematical theory of communication. Normalization, Unitary operator - Wikipedia
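The normalization condition can be illustrated with ordinary complex arithmetic. The amplitudes below are arbitrary hypothetical values, scaled so their squared magnitudes sum to one like the probabilities of a communication source:

```python
import math

# Arbitrary complex amplitudes for |0> and |1>.
a, b = complex(3, 1), complex(1, -2)

# Normalize so that |a|^2 + |b|^2 = 1.
norm = math.sqrt(abs(a)**2 + abs(b)**2)
a, b = a / norm, b / norm

# The squared magnitudes now behave as probabilities of a source.
p0, p1 = abs(a)**2, abs(b)**2
print(round(p0 + p1, 12))  # 1.0
```

Unitary evolution preserves this norm, which is why an undisturbed quantum system keeps the probabilistic structure of a source through time.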
Here we assume that the initial singularity is action and that the action of action is to act. From the fundamental equation of quantum mechanics, E = ℏ ∂φ / ∂t we assume that such sequences of actions are equivalent to energy. The absolute simplicity of the initial singularity, coupled with the principle of requisite variety suggests that the initial singularity has no control over its action, so repeated actions lead to a random spectrum of energies, which we may identify with the quantum mechanical vacuum. W. Ross Ashby (1964): An Introduction to Cybernetics
The traditional God is invisible, and this is also true of the quantum operations that underlie the observable features of the universe. This is because in a relativistic universe to observe and to be observed both require action. As a consequence, the principle of general covariance, as understood by Einstein, does not hold. We will discuss this in more detail on page 14: Measurement: the interface between Hilbert and Minkowski and page 15: Quantum amplitudes and logical processes are invisible.
(revised Tuesday 31 January, 2023)
§10: The emergence of quantum mechanics
We are attempting to construct a universe from a primordial quantum of action. Let us imagine that the mechanism for this construction has the same two elements as Darwinian evolution: variation and selection. We find both these elements in quantum mechanics.
Quantum mechanics involves two distinct and apparently incompatible processes. The first, described by the Schrödinger equation, is believed to be a continuous, error free, deterministic process which describes the evolution of undisturbed quantum systems through time. This is described on this page and leads to the discussion of error free communication on page 11: Quantization: the mathematical theory of communication. Schrödinger equation - Wikipedia
The second describes the interruption of this process by observation which leads, after some preparation on page 12: The quantum creation of Minkowski space and page 13: Is Hilbert space independent of Minkowski space?, to a discussion of the so called measurement problem on page 14: Measurement: the interface between Hilbert and Minkowski spaces.
A common simple illustration of the difference between the behaviour of microscopic quantum particles and macroscopic classical particles is provided by the double slit experiment. If we spray real bullets at random on a barrier with two holes, we see that some bullets go through one or other hole and strike a target behind the hole.
Quantum particles, on the other hand, appear to go through both holes and generate an interference pattern on a screen behind the holes. Double-slit experiment - Wikipedia
In his lectures on the double slit experiment Richard Feynman summarizes the quantum experiment in three propositions:
1. The probability of an event in an ideal experiment is given by the square of the absolute value of a complex number φ which is called the probability amplitude:
P = probability, φ = probability amplitude, P = |φ|².
2. When an event can occur in several alternative ways, the probability amplitude for the event is the sum of the probability amplitudes for each way considered separately. There is interference:
φ = φ1 + φ2, P = |φ1 + φ2|².
3. If an experiment is performed which is capable of determining whether one or another alternative is actually taken, the probability of the event is the sum of the probabilities for each alternative. The interference is lost:
P = P1 + P2.
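Feynman's three propositions can be checked numerically with two hypothetical amplitudes (0.6 and 0.8 are chosen only for arithmetic convenience, and the squared magnitudes are treated as relative intensities rather than normalized probabilities):

```python
# Two probability amplitudes, one per slit (hypothetical values).
phi1, phi2 = 0.6 + 0j, 0.8 + 0j

# Proposition 2: amplitudes add before squaring -- interference.
p_constructive = abs(phi1 + phi2)**2   # ~1.96, more than the parts
p_destructive = abs(phi1 - phi2)**2    # ~0.04, almost cancelled

# Proposition 3: once the path is known, probabilities add instead.
p_no_interference = abs(phi1)**2 + abs(phi2)**2   # ~1.00

print(p_constructive, p_destructive, p_no_interference)
```

The gap between 1.96 (or 0.04) and 1.00 is exactly the interference term that disappears when the experiment determines which slit was taken.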
The explanation of this behaviour is the key idea, identified by Paul Dirac in his treatise on quantum mechanics, of superposition, which simply means the addition of vectors in a Hilbert space representing physical states of a system. Paul Dirac (1930,1983): The Principles of Quantum Mechanics (4th ed)
It seems intuitively obvious that a real particle would go through one slit or the other. If we block one slit, or devise a way to decide which slit the particle goes through, however, the interference pattern is lost. (Feynman's proposition 3). How can this be?
The answer proposed here, to be explained on page 12: The quantum creation of Minkowski space and page 13: Is Hilbert space independent of Minkowski space?, is that the formalism of quantum mechanics emerges in an epoch when the growing universe has not yet acquired the geometric spacetime familiar to us in everyday life.
From an intuitive point of view, we can say that Hilbert space is the realm of the imagination of the universe, just as our own minds are the realm of our imagination. Imagination has a significant input into what we do just as quantum theory has a significant input into what the universe does. In the case of quantum mechanics, this input is not deterministic, the heart of the measurement problem discussed on page 14: Measurement: the interface between Hilbert and Minkowski. Nor is our imagination deterministic. It provides us with many options which change as our experience changes.
Here we touch on the heart of Einstein's difficulty with quantum mechanics. Classical relativity is based on the notion that the physical world is independent of observers. This idea is implemented mathematically by general covariance which maintains the independence of phenomena as the frames of reference used by observers change. In the quantum world interactions are more like conversations where two sources of communication interact with one another, sharing messages that change the state of both. In effect, to observe is to be observed, and the universe is in effect conscious because it observes itself. General covariance - Wikipedia
This idea becomes more plausible as we progress. It lies at the heart of both quantum theory and cognitive cosmology: the idea that we can produce a comprehensive theory of everything by seeing the universe as the divine mind.
I feel that there is a theological reason for the remarkable collusion between mathematics and physics identified by Wigner. It is implicit in Aquinas's explanation of the limits to God's omnipotence:
. . . God is called omnipotent because He can do all things that are possible absolutely; . . . For a thing is said to be possible or impossible absolutely, according to the relation in which the very terms stand to one another, possible if the predicate is not incompatible with the subject, as that Socrates sits; and absolutely impossible when the predicate is altogether incompatible with the subject, as, for instance, that a man is a donkey. Aquinas, Summa I, 25, 3
The formalist approach to mathematics proposed by Hilbert, which justifies the existence of Cantor's Paradise, puts a similar bound on the "omnipotence" of mathematics: every mathematical statement is acceptable as long as it does not involve a contradiction. God and mathematics are playing the same game, and this may be why, in a divine universe, mathematics, physics and theology have a lot in common. Formalism (mathematics) - Wikipedia, Cantor's paradise - Wikipedia
(revised on Wednesday 1 February, 2023)
§11: Quantization: the mathematical theory of communication
The standard model of quantum mechanics comes in two parts. The first deals with the evolution through time of undisturbed quantum systems. This process is essentially invisible, so our knowledge of it is speculative. The second describes the evolution of systems when they are disturbed by observation or measurement. We will explore this second mode on page 14: Measurement: the interface between Hilbert and Minkowski spaces.
We assume that the structure of the universe is maintained by relatively error free communications between its components. The mathematical theory of communication developed by Shannon shows that quantization and error prevention are very closely related. Claude Shannon (1949): Communication in the Presence of Noise
The evolution of quantum states follows the Schrödinger or energy equation, which is an elaboration of the Planck-Einstein relation coupling energy to time frequency through the quantum of action. f = E / h becomes
i ℏ ∂ψ / ∂t = Hψ
Planck-Einstein relation - Wikipedia, Schrödinger equation - Wikipedia
In communication terms, the message to be sent is a point in message space and the signal transmitted is a point in signal space. The role of the transmitter is to encode the message into the signal. The receiver does the opposite. Together they form a coder-decoder or codec. The error control is embodied in this encoding. Codec - Wikipedia
Shannon describes the key to the distinction of signals:
. . . two signals can be reliably distinguished if they differ by only a small amount, provided this difference is sustained over a long period of time. Each sample of the received signal then gives a small amount of statistical information concerning the transmitted signal; in combination, these statistical indications result in near certainty.
The technique is to package or quantize the message and the signal to produce units extended in time which are clearly distinguishable:
The transmitter will take long sequences of binary digits and represent this entire sequence by a particular signal function of long duration. The delay is required because the transmitter must wait for the full sequence before the signal is determined. Similarly, the receiver must wait for the full signal function before decoding into binary digits.
Shannon goes on to prove this result using geometric methods in function space. He concludes with a summary of the properties of a system that transmits without error at the limiting rate C, called an ideal system. Some features of an ideal system are embodied in quantum mechanics, particularly quantization.
1. To avoid error there must be no overlap between signals representing different messages. They must, in other words, be orthogonal, as with the bases of a Hilbert space.
2. The basis signals or letters of the source alphabet may be chosen at random in the signal space, provided only that they are orthogonal. Quantum processes are reversible in time in the sense that the unitary evolution of an isolated quantum system acts as though it is processed by a lossless codec. Unitary operator - Wikipedia
3. Only in the simplest cases are the mappings used to encode and decode messages linear and topological. For practical purposes, however, they must all be computable with available machines. How this applies in quantum theory is closely related to the measurement problem and the so called collapse of the wave function (see page 14: Measurement: the interface between Hilbert and Minkowski).
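Shannon's packaging strategy can be illustrated with a toy codec. The sketch below is my own illustration in Python, with a simple repetition code standing in for Shannon's long random signal functions: spreading each message bit over a longer signal and deciding by majority vote drives the error rate down, at the cost of delay.

```python
import random

def encode(bits, n=5):
    """Package each message bit as a signal of n repeated samples (long duration)."""
    return [b for bit in bits for b in [bit] * n]

def noisy_channel(signal, flip_prob, rng):
    """Flip each transmitted sample independently with probability flip_prob."""
    return [b ^ 1 if rng.random() < flip_prob else b for b in signal]

def decode(signal, n=5):
    """Majority vote over each n-sample block recovers the message bit."""
    return [int(sum(signal[i:i + n]) > n // 2)
            for i in range(0, len(signal), n)]

rng = random.Random(42)
message = [rng.randint(0, 1) for _ in range(1000)]
received = decode(noisy_channel(encode(message), 0.1, rng))
errors = sum(m != r for m, r in zip(message, received))
# With 10% sample noise, raw transmission would corrupt roughly 100 of 1000 bits;
# the 5-sample packaging typically leaves only a handful of errors.
```

The repetition code is far from Shannon's ideal system, but it shows the essential tradeoff: reliability is bought with signal duration.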
In his Mathematical Theory of Communication Shannon framed the communication problem in terms of entropy, a concept derived from thermodynamics. Claude E Shannon (1948): A Mathematical Theory of Communication, Entropy in thermodynamics and information theory - Wikipedia
Shannon's work defines the entropy of a communication source. The entropy of a source A with an alphabet of symbols ai emitted with probabilities pi, such that Σi pi = 1, is
H = − Σi pi log2 pi.
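As a concrete sketch (the function name and example probabilities are my own), the entropy of a source can be computed directly from its probability spectrum. Note the conventional minus sign, which makes H non-negative.

```python
from math import log2

def source_entropy(probabilities):
    """Shannon entropy H = -sum p_i log2 p_i of a complete system of events."""
    assert abs(sum(probabilities) - 1.0) < 1e-9   # probabilities sum to 1
    return -sum(p * log2(p) for p in probabilities if p > 0)

fair = source_entropy([0.5, 0.5])      # 1 bit per symbol
biased = source_entropy([0.9, 0.1])    # about 0.47 bits per symbol
certain = source_entropy([1.0])        # 0 bits: nothing to learn
```

A fair binary source carries one full bit per symbol; the more predictable the source, the less information each symbol conveys.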
Quantum systems described by algorithms such as the Schrödinger equation are constrained to evolve through time in a unitary and reversible manner. The Schrödinger equation defines an error free communication channel which is nevertheless invisible to us. This process is interrupted when systems interact, just as computers in a network are interrupted when they are required to deal with an incoming message. Unitarity (physics) - Wikipedia
An important role of unitarity in quantum mechanics is to constrain the outcome of quantum measurements so that they are a complete system of events identical to the output of a communication source as defined above.
Hilbert space is a vector space. The orthonormalization of vectors in Hilbert space introduces quantization into this space and the effect of superposition is to add vectors changing their direction while maintaining orthonormality. Orthonormality - Wikipedia, Quantum superposition - Wikipedia
Information in Hilbert space is therefore carried by the direction of vectors. Since Hilbert space is linear, we can easily change the basis in which vectors are represented. This means that the directions of individual vectors are not significant but the angles between vectors are.
The eigenvalue equation extracts the directions of the components of superposed vectors (the eigenvectors) together with the eigenvalue associated with each. The probability amplitude attached to each eigenvector is converted to a probability by computing its absolute square, so yielding the statistics of the spectrum of the measurement operator. This computation is known as the Born Rule. Eigenvalues and eigenvectors - Wikipedia, Born rule - Wikipedia
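The Born rule computation can be sketched in a few lines. The operator chosen here (the Pauli X matrix) and the state are merely illustrative assumptions, not anything specific to this essay's argument.

```python
import numpy as np

# A Hermitian measurement operator; the Pauli X matrix is an illustrative choice.
operator = np.array([[0.0, 1.0],
                     [1.0, 0.0]])

# Eigen-decomposition: eigenvalues are the possible measurement outcomes,
# eigenvectors the orthonormal directions that carry the information.
eigenvalues, eigenvectors = np.linalg.eigh(operator)

# An illustrative normalized state to be measured: |psi> = |0>.
psi = np.array([1.0, 0.0])

# Born rule: project psi onto each eigenvector, then take the absolute square.
amplitudes = eigenvectors.conj().T @ psi
probabilities = np.abs(amplitudes) ** 2   # sums to 1 over the whole spectrum
```

For this operator and state the two outcomes ±1 are equally likely, and the probabilities sum to one, as a complete system of events must.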
The fact that quantum measurement yields only one eigenvalue at a time is called 'collapse' of the wave function. A more reasonable explanation is simply that measurement is an information source consistent with the demands of the theory of communication. Quantum mechanics, like the theory of communication, is not concerned with specific messages but rather with the constraints on error free communication of all possible messages established by the statistics of a particular source. Wave function collapse - Wikipedia
(revised Wednesday 1 February 2023)
§12: The quantum creation of Minkowski space
In conclusion one has to recall that one reason why the ontological interpretation of [Quantum field theory] is so difficult is the fact that it is exceptionally unclear which parts of the formalism should be taken to represent anything physical in the first place. And it looks as if that problem will persist for quite some time. Meinard Kuhlmann (Stanford Encyclopedia of Philosophy): Quantum Field Theory
We are inclined to take classical space-time as given and see it emerging fully formed from the initial singularity. This Minkowski spacetime is understood to be the domain of quantum field theory, so that the fields of quantum field theory described in Hilbert space are subject to the Lorentz transformations of the special theory of relativity. Martinus Veltman (1994): Diagrammatica: The Path to the Feynman Rules, page 20
I like to see the world as a layered computational network whose basic hardware layer comprises quanta of action whose invisible inner processes are the minimum required to maintain their existence. Like engineered networks, this network is layered, from hardware at the base to users at the top. Each layer acts as a symmetry, breaking the symmetry of the layer beneath it and applied to provide services to the layer above. Users may be people, corporations or machines. Overall the lowest "hardware" layer is the initial singularity and the topmost layer is the universe itself. Internet Protocol - Wikipedia
This model suggests that rather than consider spacetime to be the domain of quantum theory, it might be more reasonable to understand spacetime to be a layer of reality constructed from elements provided by quantum mechanics. We suppose that quantum mechanics is the underlying symmetry that is applied. The discussion of invisibility on page 15: Quantum amplitudes and logical processes are invisible explains why we cannot see what is going on in Hilbert space.
Here we come to an interesting part of this story, the interface between the abstract invisible quantum world that explores possible futures in the universe behind the scenes, and the world of space, time, momentum, energy and observable particles (at every scale, including ourselves) in which we live.
The key idea here is tautological or zero-sum complexification. For instance, since it is of the nature of action to act, and we define energy as the rate of action, we may see that action of its nature (ie tautologically and kinematically) creates energy.
Let us assume that communication and causality require contact. Isaac Newton was forced by circumstances to admit that gravitation was some sort of "action at a distance", which we understand to be impossible in ordinary space. Quantum entanglement in Hilbert space led Einstein to imagine "spooky action at a distance". We shall suggest on page 13: Is Hilbert space independent of Minkowski space? that this is possible because quantum mechanics works in the world before spatial distance in the Newtonian sense has emerged. Geodesics in general relativity - Wikipedia, Quantum entanglement - Wikipedia
The most peculiar feature of Minkowski spacetime is its metric ημν, which is diagonal 1, 1, 1, -1 (or equivalently 1, -1, -1, -1). This suggests that zero bifurcation is at work, so that in some sense space + time = 0. The principal ingredients of a model of the emergence of spacetime are therefore symmetry, zero bifurcation and the speed of light. The null geodesic, made possible by the Minkowski metric, is the accommodation made in spacetime to maintain contact after the emergence of space. The velocity of light is an artefact of this accommodation and enables contact in the quantum world to continue uninterrupted despite the emergence of space. How can this happen? We invoke the evolutionary principle that uncontrolled action can try everything, and that consequences of these trials that are self sustaining are selected and may become fixed with lives of varying length. Minkowski space - Wikipedia, Proton decay - Wikipedia
Interaction is local. Before space enters the world, contact is inevitable and quantum systems can evolve unitarily without interruption. To correlate their evolution, spatially separated systems must communicate to maintain contact. The metric of Minkowski space enables the existence of null geodesics whose endpoints are in contact because the space-time interval between them is zero. The unitary contact of spatially separated systems can thus be maintained if the messenger travelling between them proceeds at the speed of light in Minkowski space. In other words the speed of light makes space possible by maintaining the integrity of the contact and unitarity that is essential to the work of quantum mechanics, and this "trick" explains the Minkowski metric. Kevin Brown (2018): Reflections on Relativity, page 693.
It has been generally assumed that Minkowski space is the domain of Hilbert space so that it is necessary to apply Lorentz transformations to both Hilbert spaces and particles in quantum field theory. This may not be necessary if Hilbert space is prior to and independent of Minkowski space. On page 23: Quantum electrodynamics: QED we hope to find that this approach removes some of the ontological confusion that Kuhlmann identifies in quantum field theory. Martinus Veltman (1994): op cit page 20
This may seem too good to be true, but if the universe started as a quantum initial singularity prior to space-time, something like this event must have occurred on the way from then to now, and, given the layered network to be developed on page 16: Network I: Cooperation, it must still exist and be effective now.
(revised Thursday 2 February 2023)
§13: Is Hilbert space independent of Minkowski space?
The special principle of relativity holds that every observer sees the same laws of physics, including the same speed of light, in their own rest frame. This defines the Lorentz transformation. This transformation is expressed succinctly by the 1, 1, 1, -1 metric of Minkowski space, so that if we set the speed of light c to 1, all observers see an invariant interval ds² = dx² + dy² + dz² − dt². Minkowski space - Wikipedia, Tests of special relativity - Wikipedia
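This invariance is easy to check numerically. The sketch below (my own illustration, with arbitrary example events) applies a Lorentz boost along x and confirms that the interval is unchanged, and that a light-like (null) interval remains zero in every frame.

```python
import math

def interval(event):
    """Invariant interval ds^2 = dx^2 + dy^2 + dz^2 - dt^2 with c = 1."""
    x, y, z, t = event
    return x * x + y * y + z * z - t * t

def boost_x(event, v):
    """Lorentz boost along x with velocity v (|v| < 1, c = 1)."""
    x, y, z, t = event
    gamma = 1.0 / math.sqrt(1.0 - v * v)
    return (gamma * (x - v * t), y, z, gamma * (t - v * x))

event = (2.0, 1.0, 0.5, 3.0)        # an arbitrary illustrative event
boosted = boost_x(event, 0.6)       # the same event in a frame moving at 0.6c
# interval(event) and interval(boosted) agree to rounding error

light = (3.0, 0.0, 0.0, 3.0)        # a light-like separation: null geodesic
# interval(light) is zero, and remains zero in every boosted frame
```

The null case is the one that matters for the argument above: points connected at the speed of light remain at zero interval for every observer.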
It seems to be generally accepted in quantum field theory that the Lorentz transformation applies equally to states in Hilbert space and to particles in Minkowski space. This implies that the domain of Hilbert space is Minkowski space. Martinus Veltman (1994): Diagrammatica: The Path to the Feynman Rules page 20.
If, however, the quantum world constitutes a layer of the universe built on the initial singularity before the emergence of observable energy, time, space and momentum, this convention may need revision. The phenomenon of entanglement suggests that the Hilbert quantum world exists prior to and independent of the Minkowski classical world. The apparently instantaneous propagation of correlations associated with entanglement might better be attributed to the absence of space rather than to infinite velocity.
If this is the case, we find a new degree of freedom in the relationship between quantum and classical dynamics which may remove some of the confusion in quantum field theory noted by Kuhlmann in the epigraph to this site on page 1: Abstract.
In the course of a paper intended to show that quantum mechanics is incomplete, the authors identified 'spooky action at a distance', which is now understood as entanglement. Einstein, Podolsky & Rosen: Can the Quantum Mechanical Description of Physical Reality be Considered Complete?, Quantum entanglement - Wikipedia, Gabriel Popkin (2018): Einstein's 'spooky action at a distance' spotted in objects almost big enough to see
EPR equate reality with predictability: If, without in any way disturbing a system, we can predict with certainty (i.e. with probability equal to unity) the value of a physical quantity, then there exists an element of physical reality corresponding to this physical quantity.
Experimental tests of entanglement have gradually improved. It is now widely believed that entanglement is real. It has been shown that this correlation operates at many times the velocity of light. Pan, Bouwmeester, Daniell, Weinfurter & Zeilinger (2000): Experimental test of quantum nonlocality in three-photon Greenberger–Horne–Zeilinger entanglement, Salart, Baas, Branciard, Gisin & Zbinden (2008): Testing the speed of 'spooky action at a distance', Juan Yin et al: Lower Bound on the Speed of Nonlocal Correlations without Locality and Measurement Choice Loopholes
For instance, electrons have two spin states, up and down. In an electron singlet one spin is up and the other down, so that the total spin is zero. Singlet - Wikipedia
Entanglement establishes that when these electrons are spatially separated, they behave, when observed, as though they are still in contact. If one electron is observed to be spin up, the other will be observed to be spin down, no matter how far apart they are. The fact that this correlation appears to be instantaneous suggests that while the electrons are distant in Minkowski space, they are in contact in Hilbert space.
Although the correlation between the traditional observers Alice and Bob is established immediately and definitely, entanglement cannot be used to communicate information faster than the speed of light. This is because Alice cannot control what she is going to observe, and therefore cannot control what Bob receives.
This phenomenon is called quantum non-locality. Quantum nonlocality - Wikipedia, Spacetime - Wikipedia
Classical physics is founded on a belief in local realism, which has three features:
1. regularities in observed phenomena point to the existence of physical reality independent of human observers;
2. consistent sets of observations underlie 'inductive inference', the notion that we can use them to devise models of what is going on behind the scenes; and
3. causal influences cannot travel faster than the velocity of light.
Long experience and detailed argument have shown that quantum mechanics is not a local realistic theory. Bernard d'Espagnat (1979): The Quantum Theory and Reality
John Bell studied EPR and formulated a first version of Bell's theorem which would show that quantum mechanics is not a local realistic theory. The phenomena and theory both appear to point to the fact that the quantum Hilbert world is prior to and independent of the relativistic Minkowski world. John Bell (1987): Speakable and Unspeakable in Quantum Mechanics, Myrvold, Genovese & Shimony (Stanford Encyclopedia of Philosophy): Bell's Theorem
Research continues into the application of entanglement to quantum error correction. Nielsen & Chuang (2016): Quantum Computation and Quantum Information
(Revised Thursday 2 February 2023 )
§14: Measurement: the interface between Hilbert and Minkowski spaces
The interactions between the invisible processes in Hilbert space and the visible process in Minkowski space have been a perennial issue in quantum theory often known as the measurement problem. Measurement problem - Wikipedia
Our conjectures about this hidden quantum mechanical structure are based on observing the interactions of visible particles. What we see are eigenvalues, for instance frequencies or energies, which correspond to the eigenfunctions of the matrix operator we use for a measurement. The theory predicts that there are as many possible eigenvalues as the dimension of the measurement operator. The terms collapse or reduction of the wave function refer to the fact that individual observations only ever reveal just one of the possible states of an unknown system. In this respect, a quantum measurement is equivalent to the emission of one symbol from a communication source. The spectrum of a quantum measurement operator corresponds to the alphabet of a classical communication source.
The radical problem facing quantum mechanics and the development of quantum computation is illustrated by the difference between a classical bit (binary digit) and its quantum analogue, the qubit. A classical bit has just two states, usually represented 0 and 1. These states are orthogonal: one is not the other. A qubit on the other hand is a vector formed in a two dimensional Hilbert space by adding the orthogonal basis states |0> and |1>. This vector has a continuous spectrum of possible states represented by the equation |qubit> = a|0> + b|1>, where a and b are complex numbers subject to the constraint |a|² + |b|² = 1. When we observe a qubit, however, all we ever see is |0> or |1>, with frequencies P(|0>) = |a|², P(|1>) = |b|². The infinite amount of information which we suppose to be represented by the qubit turns out to be at best just 1 classical bit. It has collapsed.
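This collapse from continuous amplitudes to a single classical bit per observation can be simulated directly. The amplitudes a = 3/5, b = 4/5 below are my own illustrative choice.

```python
import random

def measure_qubit(a, b, shots, rng):
    """Sample Born-rule outcomes for |q> = a|0> + b|1>; each shot yields one bit."""
    p0 = abs(a) ** 2
    assert abs(p0 + abs(b) ** 2 - 1.0) < 1e-9   # normalization |a|^2 + |b|^2 = 1
    return sum(rng.random() >= p0 for _ in range(shots))   # count of |1> outcomes

rng = random.Random(0)
shots = 100_000
ones = measure_qubit(3 / 5, 4 / 5, shots, rng)   # P(|0>) = 0.36, P(|1>) = 0.64
# ones / shots converges on |b|^2 = 0.64: however rich the amplitudes,
# each observation delivers at most one classical bit.
```

Only the long-run statistics of many shots reveal the amplitudes; any single observation is just |0> or |1>.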
We often think of a measurement as an interaction between a classical and a quantum system, but in reality it is the interaction of two quantum systems. One classical system is the source of a state which we call the measurement operator. The operator interacts with the unknown state attached to another classical system, yielding a classically observable result, the outcome of this interaction. A measurement interrupts an isolated system by injecting another process, represented by a measurement operator, into the isolated system. This is analogous to one person interrupting another to start a conversation.
Zurek suggests that the alleged collapse of the wave function is necessary to transmit information between two quantum systems. This transmission occurs when the measuring state and the measured state share a common eigenvector. When we measure a two state system with a two state system, the only eigenvectors available are |0> and |1>, and the outcome of a series of measurements will be one or other of these eigenvectors, with probabilities corresponding to the distances between these eigenvectors in the measuring and the measured state.
The distinction between observer and observed is fictitious, in the sense that the quantum process is simply the communication channel in Hilbert space between two sources in Minkowski space. The mathematical theory of communication treats the space of all possible communications between two sources but its results apply to each particular communication. The mathematical expression of quantum mechanics may work in the same way. Wojciech Hubert Zurek (2008): Quantum origin of quantum jumps: breaking of unitary symmetry induced by information transfer and the transition from quantum to classical
Zurek begins with a concise definition of standard quantum mechanics in six propositions. The first three describe its mathematical mechanism:
(1) the quantum state of a system is represented by a vector in its Hilbert space;
(2) a complex system is represented by a vector in the tensor product of the Hilbert spaces of the constituent systems;
(3) the evolution of isolated quantum systems is unitary governed by the Schrödinger equation:
i ℏ ∂|ψ> / ∂t = H |ψ>, where H is the energy (or Hamiltonian) operator.
The other three show how the mathematical formalism in Hilbert space couples to the observed world:
(4) immediate repetition of a measurement yields the same outcome;
(5) measurement outcomes are restricted to an orthonormal set { | sk > } of eigenstates of the measured observable;
(6) the probability of finding a given outcome is pk = |<sk|ψ>|², where |ψ> is the preexisting state of the system.
Schrödinger equation - Wikipedia, Born rule - Wikipedia
Zurek examines a system in 2D Hilbert space, noting that the complexity invariance of quantum mechanics enables an extension of the argument to a space of any dimension. He writes:
The aim of this paper is to point out that already the (symmetric and uncontroversial) postulates (1) - (3) necessarily imply selection of some preferred set of orthogonal states – that they impose the broken symmetry that is at the heart of the collapse postulate (4).
He concludes:
Selection of an orthonormal basis induced by information transfer – the need for spontaneous symmetry breaking that arises from the unitary axioms of quantum mechanics (i, iii) is a general and intriguing result.
From this point of view, the so-called collapse of the wave function is a form of quantum natural selection which picks out a visible state, drawn from an orthogonal basis (<v|w> = 0 for distinct basis states), from a set of unknown states. It establishes that a classical communication source, the output of a measurement, only emits one symbol at a time.
The difference between classical and quantum physics is that all classical phenomena are considered to be completely independent of the fact that they are being observed. This is the foundation of the idea of general covariance which Einstein used to derive his field equation.
If we think of Einstein's general covariance in human terms, it is very like dictation. I dictate and you write, and you are not permitted to talk back to me. The quantum world is much more natural. It involves conversation. Every communication is a meeting. There are always two actors and they are generally changed by the meeting.
A successful meeting produces an agreed result when the people involved understand one another. In quantum mechanical terms this means sharing an eigenvector as Zurek shows in the calculation presented in his paper. Although both the vectors in a quantum meeting may be the superposition of a large number of states, information is only transferred when the same state is shared by both observer and observed.
The metric in Hilbert space measures the distance between states. The Born rule shows that the probability of observing states that are close together is higher than the probability for states further apart.
Von Neumann shows that quantum mechanical measurement creates entropy. This may seem counterintuitive: the alleged annihilation of quantum states implicit in the measurement process would seem to decrease the entropy of the system. Nevertheless observation leads to the selection of a real state, the outcome of the measurement. John von Neumann (2014): Mathematical Foundations of Quantum Mechanics, Chapter V §3 Reversibility and Equilibrium Problems
Everywhere, the universe is measuring itself, and at the basic level this is happening at the interface between quantum mechanics and spacetime, a conversation between the invisible world of Hilbert space and the visible world of Minkowski space.
The spacetime in which we live acts as our interface with the quantum world. Every move we make sends signals to this invisible world for processing and the answer comes back to us as the result of our action. Our personal actions follow a similar cycle. The decision to move my finger arises in my mind which is a complex information processing system whose every move is coupled to moves in my observable body. We will discuss this in terms of visibility and invisibility on the next page.
(Revised Friday 3 February 2023)
§15: Quantum amplitudes and logical processes are invisible
We hold out high hopes for quantum computation, but are somewhat stymied by the fact that quantum processes and details of their outcomes are invisible to us.
Understanding the hidden quantum information is a question that we grapple with for much of this book and which lies at the heart of what makes quantum mechanics a powerful tool for information processing. Nielsen & Chuang (2000): Quantum Computation and Quantum Information, page 16
Much of what goes on at the lowest levels in the universe is invisible to us for three main reasons:
1. Limited resolution - many features are too small to be seen;
2. Perfect symmetry means that there is nothing to be seen;
3. A system must act to be seen - it must transmit information to us.
When we measure the diagonal of a unit square with a relatively precise instrument like a micrometer, we may find that it is 1.4142 units, a rational approximation to the real number √2. With a ruler graduated in units, on the other hand, the best we can say is that it is somewhere between 1 and 2. What we can and cannot see depends partly on the resolution of the instrument we use. Measurement uncertainty - Wikipedia
The dynamic resolution of the universe is limited to one quantum of action. Precision in measurement of position is traded off against precision in the measurement of momentum, and precision in the measurement of time against precision in the measurement of energy, expressed by the equations:
Δx × Δp ≈ h
Δt × ΔE ≈ h
The evolution of quantum wave functions is invisible. We see only the fixed points revealed by "measurement". We presume here that measurement is not something specific to physicists, but that elements of the Universe represented in Hilbert space continually interact with one another, that is measure one another, and communicate with one another through the interaction of fields in Hilbert space representing particles in Minkowski space.
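The position-momentum tradeoff can be checked numerically for a Gaussian wave packet, which saturates the uncertainty bound: Δx × Δp ≈ h is an order-of-magnitude statement, and the exact lower bound is ℏ/2. The grid parameters below are my own illustrative choices.

```python
import numpy as np

# Discretized Gaussian wave packet on a grid (hbar = 1, illustrative parameters).
N, L, sigma = 4096, 40.0, 1.3
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]

psi = np.exp(-x**2 / (4 * sigma**2))
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)        # normalize

prob_x = np.abs(psi)**2 * dx
delta_x = np.sqrt(np.sum(x**2 * prob_x))           # mean position is zero

# Momentum distribution from the Fourier transform of psi.
p = 2 * np.pi * np.fft.fftfreq(N, d=dx)
prob_p = np.abs(np.fft.fft(psi))**2
prob_p /= prob_p.sum()
delta_p = np.sqrt(np.sum(p**2 * prob_p))           # mean momentum is zero

product = delta_x * delta_p    # close to 0.5, the minimum uncertainty hbar/2
```

Narrowing the packet in position (smaller sigma) broadens it in momentum and vice versa; the product of the spreads cannot fall below ℏ/2.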
A similar level of uncertainty exists at all other scales, greater and smaller than the human individual. An uncertainty principle operates, for instance, between what we think, what we say and what we do. This arises because there are many different ways of expressing a thought and many different ways of putting words into action. The uncertainty principle is one of the most surprising discoveries of twentieth century physics, although it has always been hiding in plain sight, like the quantum: everything we see is a discrete object. Born rule - Wikipedia
The second source of invisibility is symmetry. A snowflake is symmetrical, with six identical 'arms'. Because they are identical we cannot tell which is which. If we look away and someone turns the snowflake around, we have no way of telling how far it was turned or if it is was turned at all.
Traditional theology holds that God is completely mysterious to us and beyond our ken because they are perfect structureless symmetry.
Aquinas shows that God is completely simple, omnino simplex. Thomas Aquinas, Summa, I, 3: Introduction, Fundación Tomás de Aquino: Corpus Thomisticum: Summa I, 3 (Proemium)
This is the famous via negativa. There is nothing to be seen in the traditional God.
The third source of invisibility is that we can only see something if it is actively emitting a signal that we can receive. Attention seeking, from childhood to superstardom, is a consistent feature of human nature. We signal because we want to be seen.
Here we model all communication in the universe with a computer network. We cannot tell what a computer is doing unless it communicates with us, and communication is itself a computation. We cannot therefore see every move that a computer makes. It would have to stop what it was doing to explain itself after every operation, and since this explanation is also a computation it would also be required to explain the operations of explaining itself, creating an endless loop which would stop it from getting anywhere. For this reason we see only the results of halted processes. In effect, we can only read stopped clocks.
There is, therefore, a logical limit to our knowledge of the world and each other. The arts of science, fiction and evolution all explore this space. This creative exploration is ultimately made possible by the resulting uncertainty. If it weren't for this uncertainty the world could not have evolved from the initial singularity to its present state. Jules and Jim - Wikipedia, Adam Sisman (2015): John le Carré: The Biography
The clock in a computer has two complementary roles. First it determines the rate of processing in cycles per second, typically of the order of billions of cycles per second (gigahertz, GHz). In the physical world the rate of processing varies from zero (ie eternity) to an upper bound which represents the total kinetic energy of the universe.
Second, it hides the actual physical dynamics of processing. A clock signal has two 'edges' which we may call up and down. The up edge sets the process in motion, transistors switch and voltages change. The down edge shuts the process down, so everything halts and the machine rests in a stationary state until the next up edge. The effect of this is to make the computer behave like a purely kinetic formal logic machine. Turing machine - Wikipedia
The clock hides the physics to reveal the formal logic. The inverse process, performed by mathematical physicists, is to try to fit formal logical and mathematical ideas to the world. This leads to a number of extreme positions which will be implicitly criticized in the pages that follow. Carlo Rovelli (2017): Reality is Not What it Seems: The Journey to Quantum Gravity
(Revised Saturday 4 February 2023)
§16: Network I: Cooperation
Our story began with the definition of God developed by Aquinas who, following Aristotle, defined God as pure act: actus purus. This model of God still prevails in Catholic theology. It is a very abstract model which we have been interpreting in terms of quantum theory, a complex mathematical ideal.
Now I wish to develop this model in sufficient detail to clearly identify the simple God with the complex Universe within them. For this I turn from actus purus to another equally ancient but more personal definition: God is love. The key to love is communication, meeting and understanding. Now that we have got so far as endowing the fledgeling universe with spacetime, I wish to follow the emergence of the universe from this point to the present using networks as the milieux enabling practical love. There can be little doubt that all human language, science, art and culture arise from conscious communication, within and between ourselves. 1 John 4:7-21: God is love
Networks operate both within and between Hilbert and Minkowski space. We begin with the operation of classical networks in Minkowski space since they are well researched and widely implemented. They are based on Shannon's mathematical theory of communication and Turing's theory of computation. Claude Shannon (1949): Communication in the Presence of Noise, Alan Turing - Wikipedia, Andrew Tanenbaum (1996): Computer Networks, Codec - Wikipedia
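Shannon's result referenced here gives a definite number for what a noisy channel can carry: the capacity C = B log₂(1 + S/N) bits per second, for bandwidth B and signal-to-noise power ratio S/N. A quick calculation, with illustrative values only:

```python
from math import log2

def shannon_capacity(bandwidth_hz: float, snr: float) -> float:
    """Shannon-Hartley channel capacity in bits per second.

    bandwidth_hz: channel bandwidth B in hertz
    snr: linear signal-to-noise power ratio S/N
    """
    return bandwidth_hz * log2(1 + snr)

# e.g. a 3 kHz telephone-grade channel with 30 dB SNR (S/N = 1000)
c = shannon_capacity(3000, 1000)
print(round(c))  # about 29902 bits per second
```

Below this rate, Shannon showed, coding can make the error rate as small as we please; above it, reliable communication is impossible.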
Since I began this work in the 1960s, I have seen the emergence of the internet, a huge complex computer communication network which embraces about half the people in the world. This network serves as a paradigm of my model of God and explains some of its advantages. History of computing hardware - Wikipedia, History of the Internet - Wikipedia
From an abstract point of view, we may describe a computer network as a set: network = { processors, memories }. A modern computer is itself a network, connecting billions of elements of memory to one or more processors. The formal difference between a computer and a computer network is principally a matter of timing. All the processes in a stand-alone computer are synchronized by a single clock. Each computer in a network runs on its own clock at its own frequency, and communication protocols between computers must deal with this discrepancy.
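The timing discrepancy described above can be made concrete. In this sketch (the class and field names are my own, not standard networking terms) each machine carries its own clock frequency, so two machines that execute the same number of cycles disagree about how much time has passed, and can agree on the order of events only through an explicit protocol.

```python
from dataclasses import dataclass, field

@dataclass
class Machine:
    """A network node: processors plus memory, with its own clock."""
    name: str
    clock_hz: float                 # each machine runs at its own rate
    memory: dict = field(default_factory=dict)
    time: float = 0.0               # local clock reading, in seconds

    def step(self, cycles: int):
        # executing `cycles` cycles takes cycles / clock_hz local seconds
        self.time += cycles / self.clock_hz

a = Machine("a", clock_hz=2e9)   # a 2 GHz machine
b = Machine("b", clock_hz=3e9)   # a 3 GHz machine
a.step(6_000_000_000)            # six billion cycles on each...
b.step(6_000_000_000)
print(a.time, b.time)            # 3.0 vs 2.0 seconds: the clocks disagree
```

A stand-alone computer has one clock and no such problem; a network must carry the synchronization burden in its protocols.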
Computers and computer networks are built up in layers of complexity, starting with simple binary logical operations and building up step by step to systems like the internet with billions of machines and billions of users connected together by thousands of languages and protocols. The internet is self documenting in the sense that one can use the internet to search for and find documentation of almost every component of network technology. Internet - Wikipedia
The well developed formalism of computer networking throws light on all other forms of networking between people, businesses and nations. In every case successful communication depends on shared language and culture, often unspoken, a tacit dimension. The tacit dimension of interest throughout this site is theology, the traditional theory of everything. Michael Polanyi (1966, 2009): The Tacit Dimension, Theology - Wikipedia
Quantum theory divides all elementary particles into two classes, fermions and bosons. From a network point of view, fermions are sources and bosons are the messengers that carry information between these sources. Bosons are distinguished by having integral spin while fermion spins are half integral. Everything we can see is made of atoms, which are constructed from elementary particles. Our sight is mediated by the most primitive of elementary bosons, photons. Fermion - Wikipedia, Boson - Wikipedia, Photon - Wikipedia
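The spin rule stated here is easy to encode: a particle with integral spin (0, 1, 2, …) is a boson, one with half-integral spin (1/2, 3/2, …) is a fermion. A minimal classifier over a few familiar particles:

```python
from fractions import Fraction

def statistics(spin: Fraction) -> str:
    """Integral spin -> boson (messenger); half-integral -> fermion (source)."""
    return "boson" if spin.denominator == 1 else "fermion"

particles = {
    "electron": Fraction(1, 2),
    "proton":   Fraction(1, 2),
    "photon":   Fraction(1),
    "higgs":    Fraction(0),
}

for name, spin in particles.items():
    print(f"{name}: spin {spin} -> {statistics(spin)}")
# electron and proton are fermions; photon and higgs are bosons
```

In the network picture of the text, the fermions in this table are the sources and the bosons the messengers between them.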
My main interest here is in theology and religion. Since our modern species Homo sapiens evolved some 300 thousand years ago, we have spread around the planet to all inhabitable regions and developed a vast array of languages and cultures. Different human groups have often evolved in isolation for tens of thousands of years and remained largely ignorant of one another. In the last ten thousand years or so these diverse cultures have gradually become aware of one another through travel, empire building, and marine navigation. This often causes friction. People with guns pillage, rape and murder people with spears, and warlords have gradually expanded their powers into large empires, usually killing, raping and pillaging many of the people they conquer.
The fact that we share a global habitat must now be exploited to deal with global problems of physical disease and mental dissonance and the damage we are doing through uncontrolled industrial development to the global ecology that sustains us. We face common problems which must be solved by common knowledge and action.
Formally a network is a set of communication links. A communication link comprises two sources that are able to send and receive messages, and a channel between them that can carry information. Networks are ubiquitous and thus a promising structure to be developed as a theory of everything, that is a theology.
Networks provide a means to avoid evil. We feel the full and painful force of natural selection when "zero-sum" situations arise. When the lives of some depend upon the death of others, predation, pillage, rape and war become reasonable. The strategy required is to avoid such situations by sufficient effective cooperation, through planetary networks for the prudent and economical use and sharing of limited resources. Zero-sum game - Wikipedia, Malthusianism - Wikipedia, Jeffrey Nicholls (1987): A theory of Peace
On the next pages I offer a discussion of networks in real Minkowski and quantum Hilbert space to serve as a system of sufficient complexity to give an address and a meaning to every event in the universe.
(revised Sunday 5 February 2023)
§17: Network II: transfinite logical space
The cardinal of a set is the number of elements it contains. Cantor devised a second measure of a set S which he called its ordinal type:
Thus the ordinal type of S is itself an ordered set whose elements are units which have the same order of precedence amongst one another as the corresponding elements of S from which they are derived by abstraction. Georg Cantor (1897, 1955): Contributions to the Founding of the Theory of Transfinite Numbers, page 112
The concept of "ordinal type" developed here, when it is transferred in like manner to "multiply ordered aggregates" embraces, in conjunction with the concept of "cardinal number" or "power" . . . everything capable of being numbered that is thinkable, and in this sense cannot be further generalized. Contributions: page 117
Georg Cantor, working in the nineteenth century mathematical milieu which saw a continuum as a dense set of points, sought a representation of the cardinal of the continuum: how many points does it take to make a continuum? Cantor invented set theory for his study of this infinity. He found no clear answer to his question because an important feature of sets is that they place no constraint on the number of elements they contain. Nevertheless set theory took on a life of its own and became a foundation of mathematics. Georg Cantor - Wikipedia, Set theory - Wikipedia
Cantor took the formalist approach and saw that, even though it could not be realized, there was no logical inconsistency involved in imagining the set of all the natural numbers. This set is endless (infinite) since there is no last natural number. Since its cardinal cannot be any natural number, Cantor represented the cardinal of the set of natural numbers with the new symbol ℵ0. Formalism (mathematics) - Wikipedia
A proof of Cantor's theorem proceeds by forming the power set P(S) of a set S and establishing that the cardinal of the power set is greater than the cardinal of S. Cantor proved that card P(S) > card S even if S is infinite. If S is the set of natural numbers, card S = ℵ0, so (assuming the continuum hypothesis) card P(S) = ℵ1. If we think in terms of permutations, we may say that the cardinal of the set of permutations of the natural numbers, ℵ0!, is equivalent to the second transfinite cardinal ℵ1, and in the spirit of Cantor we see that ℵ1! is ℵ2 and so on. We can interpret this Cantor universe as a layered hierarchy of permutation groups, each constructed by permuting the elements of the group before it. The subscripts on the alephs number the layers in this transfinite universe. Since all groups are subgroups of the permutation (symmetric) groups, it is not surprising that when we study the world we find the theory of groups very useful. Cantor's theorem - Wikipedia, Axiom of power set - Wikipedia, Permutation group - Wikipedia
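For finite sets the engine of Cantor's argument can be checked directly: the power set of an n-element set has 2ⁿ elements, always strictly more than n. The transfinite case is of course the point of the theorem, but the finite pattern shows the machinery:

```python
from itertools import combinations

def power_set(s):
    """All subsets of s, returned as a list of tuples."""
    items = sorted(s)
    return [c for r in range(len(items) + 1)
              for c in combinations(items, r)]

s = {1, 2, 3}
ps = power_set(s)
print(len(s), len(ps))   # 3 vs 8: card P(S) = 2**card S > card S
```

Each step of iterating the power set jumps to a strictly larger cardinal, just as each aleph in the text is reached by permuting or powering the layer before it.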
It is generally agreed that the representation of continua by dense collections of points leads, through the notion of limits, to logically sound treatments of continuity for the purposes of calculus. Insofar as we understand a point as an isolated object, however, the construction of continua from points makes little sense.
Aristotle's notion of continuity seems more satisfactory: things are continuous if they have extremities in common. Aristotle: Physics V, iii: Continuity
Computer networks, and the elements of individual computers, honour Aristotle's definition of continuity because they communicate by sharing elements of memory. Processes in a Turing machine communicate with one another by reading from and writing to the same square on the computer tape. I communicate with you by writing things into your memory which you read to receive what I have written. This is a digital process, proceeding, in computing machinery, bit by bit, in print by strings of letters, in speech by sequences of phonemes.
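The shared-square picture of communication described here can be sketched directly: two processes are "continuous" in Aristotle's sense because they hold an extremity, a cell of memory, in common. A toy version with a shared tape:

```python
# Two processes sharing squares of "tape": Aristotle's continuity
# realized as common memory. The writer's output is the reader's input.

tape = [None] * 8          # the shared memory: the common extremity

def writer(message: str):
    """Write the message onto the tape, symbol by symbol."""
    for i, ch in enumerate(message):
        tape[i] = ch

def reader() -> str:
    """Read whatever symbols have been written to the shared tape."""
    return "".join(ch for ch in tape if ch is not None)

writer("hello")
print(reader())   # hello -- received by reading the shared squares
```

The digital character of the process is explicit: the message passes one discrete symbol at a time, as in print, speech or computing machinery.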
It may be that our universe is digital to the core, so that the only legitimate use of real numbers is in probability theory where they serve as a vehicle for the application of the law of large numbers which is central to the theory. Andrey Kolmogorov (1956): Foundations of the Theory of Probability, Law of large numbers - Wikipedia
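The role claimed here for real numbers in probability can be illustrated. By the law of large numbers, the empirical frequency of heads in a long run of fair coin flips converges toward the real number 1/2; the seed below is fixed only for reproducibility.

```python
import random

def head_frequency(n: int, seed: int = 0) -> float:
    """Empirical frequency of heads in n simulated fair coin flips."""
    rng = random.Random(seed)
    heads = sum(rng.random() < 0.5 for _ in range(n))
    return heads / n

for n in (100, 10_000, 1_000_000):
    print(n, head_frequency(n))   # drifts toward the ideal value 0.5
```

The real number 1/2 never appears in any finite run; it serves only as the limit that the digital process approaches, which is the sense in which the text confines real numbers to probability theory.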
We may imagine mathematics as a factory for producing formal symmetries based on proofs that can link certain inputs to certain outputs. Such proofs define symmetries which can then be applied to a spectrum of possible inputs (say income tax returns) to make decisions like the amount of tax payable in particular instances. The symmetries are broken by application, just as we fill different values into Newton's equation a = F / m .
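The idea of a symmetry broken by application can be shown with the example in the text: Newton's a = F / m is one proof-like form, and each application to particular values of F and m breaks the symmetry into a particular decision.

```python
def acceleration(force: float, mass: float) -> float:
    """Newton's second law, a = F / m, as a reusable 'symmetry'."""
    return force / mass

# the same formal rule, broken by application to particular cases
cases = [(10.0, 2.0), (10.0, 5.0), (9.8, 1.0)]
for F, m in cases:
    print(f"F={F} N, m={m} kg -> a={acceleration(F, m)} m/s^2")
```

The function is the proven link from inputs to outputs; the spectrum of cases, like the spectrum of income tax returns, is where the symmetry meets the world.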
Much of human creativity in every field arises from random personal meetings, or by broadcasting ideas in learned journals, newspapers, radio, television and the internet, taking the chance that, like pollen grains floating through the air, a copy of the idea will meet a receptive mind, creating a productive instance of natural selection.
This step appears to me to be analogous to Cantor's step from the set of natural numbers to the second transfinite number, which many assume to be the cardinal of the continuum. Here, as described on page 18: Transfinite Minkowski space, we imagine a space which does not so much represent the number of points on a continuous line as the number of different network structures that can be built with the set of ℵ0 computable functions that correspond to the set of halting Turing machines. Page 23: Quantum field theory provides more details of this "Turing vacuum".
(revised Monday 6 February 2023)
§18: Transfinite Minkowski space
A physical understanding is completely unmathematical, imprecise, an inexact thing but absolutely necessary to a physicist. Richard Feynman: Lectures on Physics II Chapter 2: Differential Calculus of Vector Fields
18.1: Minkowski space is bigger than Hilbert space
In the traditional approach, Minkowski space is taken to be the domain of Hilbert space, and so we can imagine that both spaces are of the same size, both having the cardinal of the continuum.
We have had quite a bit to say about the invisible quantum world. Now we turn to the classical Minkowski world and its role in the application of Hilbert space.
We might say that the theories of space and time developed by Newton and Einstein constitute half of the classical history of the spacetime in which we live. The other half came from the study of electrodynamics. Maxwell showed that electromagnetic radiation manifests as light. Hertz went on to identify it with radio waves. Maxwell's equations - Wikipedia, Feynman Lectures on Physics II:18: Maxwell's Equations, Heinrich Hertz - Wikipedia
Einstein's motivation for the special theory of relativity was an asymmetry in the contemporary understanding of electrodynamics. Introducing his paper he wrote:
It is known that Maxwell's electrodynamics—as usually understood at the present time—when applied to moving bodies, leads to asymmetries which do not appear to be inherent in the phenomena.
Einstein solved the electrodynamic problem not by making a modification to electrodynamics but by going much deeper and revising our understanding of spacetime. His new theory made the velocity of light a central actor in the structure of spacetime and led Hermann Minkowski to the peculiar metric of Minkowski space. This metric shows that although a particular photon may travel across the universe, the spacetime interval between its creation and annihilation is 0. This fact is the central point on page 12: The quantum creation of Minkowski space. Minkowski space - Wikipedia
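The null interval cited here follows directly from the Minkowski metric s² = (ct)² − x²: for a light ray x = ct, so the interval vanishes no matter how far the photon travels. A one-dimensional check:

```python
C = 299_792_458.0   # speed of light in vacuum, m/s

def interval_squared(t: float, x: float) -> float:
    """Minkowski interval s^2 = (ct)^2 - x^2
    (one spatial dimension, timelike-positive signature)."""
    return (C * t) ** 2 - x ** 2

t = 33.4                 # seconds of coordinate time (any value works)
x = C * t                # the distance a photon covers in that time
print(interval_squared(t, x))   # 0.0 -- creation and annihilation coincide

print(interval_squared(1.0, 0.0) > 0)   # a stationary clock: timelike interval
```

However large t becomes, the photon's emission and absorption remain at zero separation in the Minkowski metric, which is the peculiarity the text builds on.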
If Hilbert space is the source of Minkowski space, where is it? The answer is that it is nowhere. It exists in an epoch before the existence of classical space, so there is no where for it to be. We may imagine that Hilbert space and all the other spaces and algorithms we use to model the world are represented in the symmetries of the universe that we understand by the creation of mathematical models in our minds. This is my motivation for imagining that cosmology is cognitive. Quantum mechanics is essential to every process in classical space. Its effect is manifest, but the process is invisible, like all the computational activity in the neural networks that enable us to think, walk and talk.
The key to the existence of higher transfinite numbers in Minkowski space is that the no-cloning theorem is not operative. We may think of the ticking of a clock. Every tick is an instance of an identical event but each tick is also different, occurring at a different point in spacetime. Every one of these identical events may be associated with an execution of some quantum algorithm, but the no-cloning theorem is not operative because each execution of the quantum algorithm is instantiated by the tick to which it is attached.
This leads to a new question: how are the ticks differentiated in Minkowski space if they are identical? We learn the answer by counting angels.
Each of Plato's forms is unique. Aristotle, when he brought the forms down to Earth, enabled the existence of multiple individuals of each species by embodying the forms in matter, which became known as the principle of individuation. The old theologians faced a similar problem with the multiplicity of angels. Since angels are held to be immaterial there can be no material principle of individuation. There can therefore be only one angel in each species, since formal differences are needed to differentiate immaterial structures. From this point of view we may liken angels to orthogonal vectors in a Hilbert space. Principle of individuation - Wikipedia, Aquinas, Summa I, 50, 4: Is every angel a different species?
The modern principle of individuation is based on the existence of fermions, which, through the application of the spin-statistics theorem in Minkowski space, demand separation in spacetime. All fermions are massive, which we might understand to mean that they are material rather than immaterial, so implementing a modern version of the insight on individuation which Aquinas inherited from Aristotle. Fermion - Wikipedia, Spin-statistics theorem - Wikipedia
The complete description of the role of particles in Minkowski space requires two further discussions on page 19: Space-time—the cosmic memory and operating system and page 20: Fixed points and particles. Given the existence of Minkowski space and the particles (including ourselves) within it, we have enough to develop the idea of a classical transfinite computer network.
Minkowski space can differentiate transfinite logical structures by enabling the attachment of copies of the initial Hilbert space to each discrete particle or event. The cosmic discovery of a quantum process to create a spacetime containing discrete particles and processes thereby enabled an unlimited increase in the entropy of the universe.
We can now extend the one-to-one correspondence proposed on page 17.6 between the natural numbers and the elements of the set of Turing machines. We might then apply Cantor's argument in the way he applied it to the natural numbers to describe a transfinite space with the power of the continuum from the countably infinite basis states of the Hilbert space described on page 12.
Quantum dynamics is bounded by the no-cloning theorem. Like the angels, we must see each quantum basis state as a new species comprising the superposition of all the states so far generated. We might say the Hilbert space grows organically through the continued injection of action.
Like all the other features of a universe evolving by chance, each new development arrives as a self fulfilling prophecy, a system that can sustain itself given the available resources, a new stable product that can reproduce exponentially. So action bifurcated into energy and time, which form the backbone of quantum mechanics. The next step is for spacetime to emerge and sustain itself using the resources of quantum mechanics, as explained on page 12.
(revised Monday 6 February 2023)