
Essay 30: Cognitive cosmology

In conclusion one has to recall that one reason why the ontological interpretation of [Quantum Field Theory] is so difficult is the fact that it is exceptionally unclear which parts of the formalism should be taken to represent anything physical in the first place. And it looks as if that problem will persist for quite some time. Meinard Kuhlmann (Stanford Encyclopedia of Philosophy): Quantum Field Theory

A physical understanding is completely unmathematical, imprecise, an inexact thing but absolutely necessary to a physicist. Richard Feynman: Lectures on Physics II Chapter 2: Differential Calculus of Vector Fields
Research is to see what everybody has seen and think what nobody has thought. Albert Szent-Györgyi (1957): Bioenergetics
1: Abstract
2: Introduction: Top down or bottom up?
3: Action: from unmoved mover to the quantum
4: Classical gravitation and singularities
5: God's ideas, cybernetics and singularity
6: Evolution, variation and selection
7: Networks, brains and intelligence
8: The classical trinitarian theology of action
9: The active creation of Hilbert space
10: The emergence of quantum mechanics
11: Quantization: the mathematical theory of communication
12: The quantum creation of Minkowski space
13: Is Hilbert space independent of Minkowski space?
14: The measurement problem
15: Gravitation
16: Why are quantum amplitudes and logical processes invisible?
17: Energy bifurcates into potential and kinetic, enabling a zero energy universe
18: A model: classical and quantum information processing networks
19: Space-time is the operating system of the universe
20: Dynamics, fixed points and particles
21: Quantum electrodynamics: QED
22: Quantum chromodynamics: QCD
23: Methodology: the mathematical community
24: Some principles
25: Some conclusions
1: Abstract

Theology is the ancient traditional theory of everything. If the Universe is to be divine, physics and theology must be mutually consistent, and this constraint requires that both disciplines be radically revised.

The modern search for a theory of everything has run aground on gravitation. We are pretty sure that quantum mechanics is the true theory of the world, and we have a comprehensive quantum mechanical explanation for everything from the fundamental particles to the planets, stars and galaxies that dot the vast spaces of the universe, but a quantum mechanical explanation of gravitation still eludes us, casting some doubt on progress so far.

Einstein's classical gravitation, in the hands of Penrose, Hawking and Ellis, suggests that the universe began as a structureless initial singularity which exploded to give us the universe we observe. We imagine that both gravitation and quantum mechanics are connected with the emergence of the universe within the initial singularity, so it is a perfect starting point from which to develop a comprehensive model of creation.

Von Neumann shows that the universe increases its entropy (that is, creates itself) by observing itself. This implies that physics is a logical or cognitive process, akin to human self awareness, which we might best understand by thinking of the universe as a mind. John von Neumann (2014): Mathematical Foundations of Quantum Mechanics, Chapter 5

A path to cognitive cosmology is relatively clear: we begin with the initial singularity and identify it as the primordial quantum of action, identical to the traditional God described by Aristotle and Aquinas. Unmoved mover - Wikipedia, Thomas Aquinas, Summa, I, 2, 3: Does God exist?

We then examine the ancient Christian theological and psychological doctrine of the Trinity devised by Augustine and Aquinas to explain how one God can become three. This doctrine exhibits features similar to superposition in quantum mechanics and leads us to an understanding of divine creativity.

The next step is to identify the features of this theological structure that are present in the initial singularity. At this point the traditional divinity and the initial singularity are both formally identical quanta of action and the source of the universe.

The final step is to trace the emergence of quantum mechanics, space-time, gravitation and particles, providing material to construct enormously complex intelligent beings such as our planet and ourselves. The known history of the Universe adds empirical support to this hypothetical picture of an intelligent divine world. Chronology of the universe - Wikipedia


2: Introduction: Top down or bottom up?
In the beginning God created the heaven and the earth. Genesis, from the Holy Bible, King James Version

In the traditional creation story God created the world according to a plan which already existed in their mind. The medieval theologian Thomas Aquinas calls this plan ideas, a nod to the Platonic notion that an invisible heaven of ideas or forms defines the structure of the world. Plato thought that our world is a poor reflection of these perfect ideas, which is why it was, to his mind, rather shoddy. Only philosophers (like himself) could perceive the true beauty of these invisible forms. Aquinas, Summa I, 15, 1: Are there ideas in God?, Form of the Good - Wikipedia, Allegory of the cave - Wikipedia

Aquinas made a direct connection between the ideas and the design of the created world. God, by analogy with a human builder, has architectural plans in mind as they make the world. We might ask how long it took God to design the world. The standard answer is that since God is eternal, "how long" has no meaning. God just is, and the contents of God's mind are just as eternal as they are. God could have made a different world but it is what it is. Aquinas, Summa, I, 14, 8: Is the knowledge of God the cause of things?

The Christian tradition still runs deep in the scientific world. Given that the aim of science is to understand the world, we usually take the nature of the world as given and seek a scientific understanding of what the creator had in mind when they made it. Paul Davies (1992): The Mind of God: Science and the Search for Ultimate Meaning, Michio Kaku (2021): The God Equation: The Quest for a Theory of Everything

Here I wish to break from this tradition, following a consequence of the general theory of relativity.

Penrose, Hawking and Ellis deduced that Einstein's general theory of relativity predicts the incidence of singularities at the boundaries of the universe. There is now strong astronomical evidence for the existence of black holes, and Hawking and Ellis speculated that the big bang that initiated the emergence of the universe within the initial singularity might be understood as a "time reversed black hole". Hawking & Ellis (1975): The Large Scale Structure of Space-Time, Big Bang - Wikipedia

These singularities are considered to lie outside the laws of nature. Our experience with black holes, however, suggests that they contain energy and mass which shape the space around them, so controlling the motions of nearby visible structures. This suggests that the pointlike singularity at the root of the universe may contain all the energy of the universe.

This seems to me hard to imagine. From a formal point of view, this initial singularity is indistinguishable from the traditional God of Aquinas: it is outside space and time, and so eternal; it has no structure, so it is absolutely simple; it is beyond the laws of nature, so can give no meaning to energy; and it is the source or creator of the universe. Tradition, dating from Aristotle 2350 years ago and descended to us through Aquinas and Catholic theology, holds that God is pure action, actus purus. So here I will assume that the initial singularity is action rather than energy. Aquinas, Summa, I, 3, 7: Is God altogether Simple?

The absolute simplicity of the initial singularity, in the light of the cybernetic principle of requisite variety, precludes the existence of any plan of the universe within it (§5 below). All we have is unbounded reproductive activity controlled by local consistency. What we need to look for is a mechanism for the emergence of the universe as we know it within the initial singularity.

What I want to create is a model of the universe that embraces not only gravitation and the fundamental particles but also all structures in the universe, including ourselves, no matter how complex, and shows that the universe, like the traditional God, touches the bounds of possibility.

The ancients, like Plato, Aristotle and Aquinas, divided the world into material and immaterial or spiritual. They thought that knowledge and intelligence are correlated with immateriality. Aquinas argued that God is maximally intelligent because they are maximally spiritual. Since that time, we have come to see information as a physical entity and intelligence, that is information processing, as a physical process. Intelligence is represented technologically by the work of computing machinery, biologically by processes in complex neural networks like our brains, and physically by the power of computation and communication embedded in the quantum world. Aquinas, Summa: I, 14, 1: Is there knowledge in God?, Rolf Landauer (1999): Information is a Physical Entity

Quantum theory began as a physical theory, but since the 1980s we have learnt to see it as the description of computation and communication in the universe. It is now accepted as the foundation of our understanding of the world and opens the way to a vision of the world as an intelligent system responsible for its own creation. This idea first achieved widespread acceptance in biology, where the theory of biological evolution explains the enormous spectrum of living species that has arisen on Earth over the last four billion years. Nielsen & Chuang (2000): Quantum Computation and Quantum Information, Evolution - Wikipedia

We may think of quantum theory as a modern description of the ancient idea of spirit. It is rather diaphanous, invisible and exists in an environment of pure action and energy that precedes spacetime and classical physics. It is very similar to music, in perpetual motion, feeding the visible universe with the possibilities which serve as the foundation of evolution. Our "classical" world is selected by the system becoming aware of itself through self-interaction (aka "measurement"). The interaction between the Hilbert space of quantum theory and the Minkowski space of everyday life sets the scene for the measurement problem which has been with us almost since the beginning. Interpretations of quantum mechanics - Wikipedia

Quantum theory is often hamstrung by its derivation from classical physics. Each new breakthrough in quantum mechanics appears to have arisen from the rejection of a classical concept. Planck's discovery put us on notice that continuous mathematics in physics is not everything. The universe is radically discrete or integral, its life measured in discrete quanta of action.

Next was a qualification of the classical concept of determinism. While the need for consistency in quantum mechanics led us to the "uncertainty principle", deeper results were revealed by Gödel and Turing: a large enough consistent system cannot be deterministic. Around the edges at least, both mathematics and the world appear to be incomplete and incomputable. Uncertainty principle - Wikipedia, Gödel's incompleteness theorems - Wikipedia, Turing's Proof - Wikipedia

A next step seems to go deeper. Martinus Veltman writes:

In general, if we do a Lorentz or Poincare transformation then a state in Hilbert space will transform to another state . . . Thus, corresponding to a Lorentz transformation there is a complicated big transformation in Hilbert space. (page 20)

To sum up:

To every Lorentz transformation, or more generally a Poincaré transformation, corresponds a transformation in Hilbert space.
Martinus Veltman (1994): Diagrammatica: The Path to the Feynman Rules, page 21

This is (I think) the only boldface statement in the book. Is it true? It is based on the assumption that Minkowski space is the domain of Hilbert space. Is this true? Hilbert space is built on the domain of complex numbers, which have no order but where the fundamental theorem of algebra holds. Minkowski space is built on the domain of real numbers, which have an order but are limited by the theorems of Gödel and Turing. In this essay I will explore the idea that Hilbert space is both the predecessor and the quantum source of classical Minkowski space. I feel that these spaces are two distinct layers in the emergence of universal structure from the initial singularity.

This idea introduces a new degree of freedom into quantum field theory which may solve many problems. Perhaps the most important difference is that the vectors of Hilbert space are normalized rays, and the operators are elements of a ring which, like the integers, is naturally quantized, since it embodies addition, subtraction and multiplication but has no fractions resulting from division.

Let us therefore propose that the initial singularity is a vector in a zero dimensional Hilbert space, best represented by the empty set, ∅. To delve deeper into how this singularity reproduces itself, we take another look at the cognitive explanation of the procession of the Son from the Father in the theology of the Trinity. Here we look at the thirteenth century work of Aquinas as revised and expanded by Bernard Lonergan in the twentieth century.

In Roman Catholic theology God is pure act, a fact first explained by Aristotle. The term "act" has evolved rapidly through the eras of classical and quantum physics, but the term as used in Catholic theology has experienced little change in more than a thousand years. Pius X tried to contain the philosophy and theology of Aquinas in 24 theses. The motivation is obvious: Aquinas is still the standard; the "modernists" are heretics. The 24 theses of Pope Pius X

Now it is time for physics and theology to meet by agreeing on the meaning of the term action. We therefore begin with a history of action from the time it was coined by Aristotle about 2350 years ago until the present.

The modern version of this idea is quantum field theory, which proposes a space of invisible fields to guide the behaviour of the observable world. This theory is beset by serious problems. Practically, the most acute is the 'cosmological constant problem'. One interpretation of quantum field theory predicts results that differ from observation by about 100 orders of magnitude, that is a factor of 10¹⁰⁰. One point of this essay is to re-interpret the relationship between mathematical theory and reality in a way that points to a solution to this problem. Quantum field theory - Wikipedia, Cosmological constant problem - Wikipedia

To explain this I follow in some detail the long and winding trail from the absolute simplicity of the initial singularity to the majestic complexity of the present universe. This all happens inside God, not outside as the traditional story tells us. We are part of the divine world, owe our existence to it, share its conscious intelligence, created, as the ancients said, in the image of God. The most powerful product of this intelligence is reflected in the mathematical formalism that shapes our selves and our world. Mathematics as we know it is embodied in the mathematical community and its literature. The fact that mathematics serves as the skeleton of science implies that it is also represented in the Universe. Eugene Wigner (1960): The Unreasonable Effectiveness of Mathematics in the Natural Sciences


3: Action: from unmoved mover to the quantum

Theology and astronomy have a very long history. There are close cultural connections between Earth and the Heavens. Plato produced the classic description of a spiritual heaven of eternal, perfect forms which shaped the Earth and our knowledge of Earth. For Plato, however, the Earth and our knowledge are very poor copies of their heavenly paradigms.

An ancient Greek interface between poets like Homer, who described the behaviour of the gods, and later more philosophically oriented writers is found in the work of Parmenides (late 6th to early 5th century bce). We have fragments of a didactic poem describing his education by a Goddess. In this poem he developed a durable foundation for science, based on the proposition that permanently true knowledge is only possible if its subject is immutable or invariant.

Like many philosophers since, Parmenides sought to use the possibility of knowledge to constrain the nature of the world, an application of the anthropic principle. His "way of truth" is that we can only have true knowledge of eternal realities. Knowledge of ephemera, the way of opinion, is useless because as soon as we have it the subject changes, invalidating the new knowledge. Anthropic principle - Wikipedia

Parmenides' Goddess announces the core of her teaching:

You must needs learn all things, both the unshaken heart of well-rounded reality and the notions of mortals, in which there is no genuine trustworthiness. (Fr. 1.28b-32)

The Goddess dismisses the way of opinion and turns to describe true reality, "what is". First, it is "ungenerated and deathless". Further, it is "whole and uniform", "motionless", and "perfect":

But since there is a furthest limit, it is perfected from every side, like the bulk of a well-rounded globe, from the middle equal every way: for that it be neither any greater nor any smaller in this place or in that is necessary; for neither is there non-being, which would stop it reaching to its like, nor is What Is such that it might be more than What Is here and less there. Since it is all inviolate, for it is equal to itself from every side, it extends uniformly in limits.

This suite of attributes has remained central to the description of what is for 2500 years. Parmenides' student Zeno produced a series of arguments designed to show that motion is impossible, in support of Parmenides' position.

This old idea is invalidated by the more modern Nyquist-Shannon sampling theorem: we can have true knowledge of changing situations if we update fast enough. The key to catching a moving ball is to watch it closely and move accordingly. Perhaps Parmenides did not play ball games. Nyquist-Shannon sampling theorem - Wikipedia
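Stated explicitly (this is the standard form of the theorem, not a quotation from the sources above): a signal x(t) containing no frequencies above B can be perfectly reconstructed from samples taken at intervals T = 1/2B, since

    x(t) = Σn x(nT) sinc((t − nT)/T), where sinc(u) = sin(πu)/πu.

True knowledge of a changing subject is therefore possible provided we sample it at least twice as fast as its highest frequency of change.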

Parmenides' work was taken up by Plato, which greatly increased its visibility. Plato guessed that the structure of the observable world is shaped by invisible, eternal forms. Our world, he thought, is just a pale shadow of these forms. Plato's student Aristotle brought these forms down to Earth with his theory of hylomorphism. Theory of Forms - Wikipedia, Allegory of the cave - Wikipedia, Hylomorphism - Wikipedia

Hylomorphism may be seen as a description of the static structure of the world which nevertheless enables change. The same matter may take different forms: a mass of bronze may be formed into a sword or a ploughshare. Thomas Ainsworth: Form vs. matter

Aristotle saw matter as potentially something and form as the element that made it actually something. He was the first to make a careful study of action. He coined two words for it, energeia (ενεργεια) and entelecheia (εντελεχεια). Energeia may be translated as being-at-work and entelecheia as completeness, the end of work. Both these active terms are contrasted to potentiality dynamis (δυναμις), which can mean either active power or passivity. These two ideas comprise the essence of his doctrine of potency and actuality with the addition of an axiom: no potentiality can actualize itself. Using this axiom Aristotle deduced the existence of an unmoved mover which is pure actuality and the driver of the world. Unmoved mover - Wikipedia

Aristotle's works entered the newly formed Christian universities of Europe when they were brought back from the Muslim East by the Crusaders. Albert the Great and Thomas Aquinas used Aristotle's work to build a new philosophical foundation for Christian theology. Aquinas developed a new Catholic model of God (which has since become standard) from Aristotle's theological treatment of the first unmoved mover in his Metaphysics:

But if there is something which is capable of moving things or acting on them, but is not actually doing so, there will not necessarily be movement; for that which has a potency need not exercise it. Nothing, then, is gained even if we suppose eternal substances, as the believers in the Forms do [ie Plato], unless there is to be in them some principle which can cause change; nay, even this is not enough, nor is another substance besides the Forms enough; for if it is not to act, there will be no movement. Further even if it acts, this will not be enough, if its essence is potency; for there will not be eternal movement, since that which is potentially may possibly not be. There must, then, be such a principle, whose very essence is actuality. Further, then, these substances must be without matter; for they must be eternal, if anything is eternal. Therefore they must be actuality. Aristotle Metaphysics XII, vi, 2

Aristotle thought that the unmoved mover was an integral part of the Cosmos and that the world is eternal, so it had no need for a creator. Aquinas used Aristotle's argument to establish the existence of God but, faithful to his religion, placed this creator outside the Universe. The doctrine of the Summa has never been superseded in the Church. It remains officially endorsed in Canon Law. Aristotle, Metaphysics 1072b3 sqq., Aquinas Summa I, 2, 3: Does God exist?, Holy See: Code of Canon Law: Canon 252 § 3

The Catholic theologian Bernard Lonergan set out to reconceive Aquinas' arguments for the existence of God in epistemological rather than physical terms in his treatise on metaphysics, Insight. Lonergan's story follows the time honoured path. We all agree that the world exists, but we can see (they say) that it cannot account for its own existence. There must therefore be a Creator, which we might all agree to call God, to explain the existence of the world. Lonergan: Insight, A Study of Human Understanding

Lonergan set out to argue that God is other than the Universe by following the epistemological path pioneered by Parmenides, using the act of human understanding, insight, as his starting point. God, he said, must be perfectly intelligible. But the world is not perfectly intelligible. It contains meaningless data, empirical residue, so it cannot be divine. I think the weak spot in this argument is the idea that the world contains meaningless data. The theory of evolution suggests that there is a reason for every detail. The world is dense with meaning.

Although Lonergan fails to prove that God is not the Universe, his work led me to think of the universe in epistemological terms. The story presented here is intended to consolidate the view that the Universe plays all the roles attributed to traditional Gods.

Implicit in the ancient views is the idea that matter is dead and inert and cannot move itself. It cannot be the seat of understanding. It cannot be creative. Since the advent of modern physics, founded on relativity and quantum theory, these ideas are untenable. Physics based on quantum theory describes a universe in perpetual motion as a gigantic network of communication equivalent to a mind.

The Medieval universities gradually developed the notion that ancient texts and pure reason could not fully explain the world. This attitude was strongly supported by astronomy, a science based on observation. Galileo's telescope led to radical developments in astronomy, and some conflict with ancient religious beliefs. Galileo's opinion that mathematics is the language of the universe reached a high point in Isaac Newton's description of gravitation, which showed that the Moon in the heavens and apples on Earth are moved by the same force. Galileo affair - Wikipedia, Isaac Newton (1972): Philosophiae Naturalis Principia Mathematica

This new picture of the world derives from a new understanding of action which emerged when Joseph-Louis Lagrange sought a new and more comprehensive statement of classical Newtonian mechanics to make it easier to study many body problems like the solar system. His work, Mécanique Analytique, placed mechanics on an algebraic rather than a geometric foundation. Mécanique analytique Volume 1

In the Lagrangian method the action S associated with an event x that takes place between times t1 and t2 is expressed by the action functional

S(x) = ∫L dt.

The Lagrangian L = (T(t) −V(t)), where T and V are functions of the kinetic and potential energy of the system. Lagrangian mechanics postulates Hamilton's principle that the actual trajectory taken by a particle whose motion is constrained by T and V coincides with a stationary value of S (a fixed point in the action) which may be found using Euler's calculus of variations. Lagrangian mechanics - Wikipedia, Hamilton's principle - Wikipedia, Calculus of variations - Wikipedia

Lagrangian mechanics has been found to be very versatile and serves as a bridge between classical and quantum mechanics, quantum field theory and physical problems in general. On this basis, we might understand mechanics in spacetime as the study of action in the relationship between kinetic and potential energy.
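As a concrete illustration, here is a minimal numerical sketch (my construction, not drawn from the sources above) of Hamilton's principle for a harmonic oscillator: perturbing the classical path changes the discretized action only at second order, confirming that the classical trajectory is a stationary point of S.

    import numpy as np

    m, k = 1.0, 1.0                    # mass and spring constant in assumed natural units
    t = np.linspace(0.0, 1.0, 1001)    # time grid from t1 = 0 to t2 = 1
    dt = t[1] - t[0]
    omega = np.sqrt(k / m)

    def action(x):
        # discretized action functional S = integral of L dt, with L = T - V
        v = np.gradient(x, dt)
        return np.sum(0.5 * m * v**2 - 0.5 * k * x**2) * dt

    x_classical = np.sin(omega * t)    # classical trajectory between the fixed endpoints
    eta = np.sin(np.pi * t / t[-1])    # a perturbation vanishing at the endpoints

    for eps in (0.0, 0.01, 0.1):
        print(eps, action(x_classical + eps * eta))
    # The change in S is proportional to eps squared: the first order variation
    # vanishes, which is Hamilton's principle.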

Quantum mechanics began with Planck's discovery, in 1900, that action is quantized and that the quantum of action, h, is the constant of proportionality between the energy of radiation and its frequency. This is now a fundamental equation of quantum theory, E = ℏω, where ℏ is the reduced Planck constant h / 2π and the frequency ω is expressed in radians per second, understood as the time rate of change of the phase φ of a quantum state: ω = ∂φ / ∂t. Max Planck: On the Law of Distribution of Energy in the Normal Spectrum

The quantum of action is very small and has the same dimensions as angular momentum in classical physics, ML²T⁻¹, since energy has the dimension ML²T⁻² and frequency the dimension of inverse time, T⁻¹. In classical mechanics the Lagrangian action S(x) is a continuous variable, whereas in quantum theory it is discrete, so that every event is associated with an integral number of quanta, nh.

The quantum of action is now a precisely fixed natural constant which is used as a foundation for a natural set of units. We might look upon it as an invariant solution to a Lagrangian variation problem somehow established at the very root of universal structure. NIST: Kilogram, Mass and Planck's Constant

For Aristotle and Aquinas action is a metaphysical term, but here we see that it has a physical realization, providing a bridge between physics and metaphysics in a way analogous to its role in coupling classical and quantum mechanics. Dirac found that this role goes deeper and Feynman used it to create a new expression of quantum mechanics, the path integral formulation.

Quantum mechanics came of age in the 1920s in two versions known as wave mechanics (the Schrödinger equation, see above) and matrix mechanics. These were shown to be equivalent by Schrödinger, given a clear abstract symbolic expression by Dirac and a sound mathematical foundation by von Neumann using linear operators in abstract Hilbert space. Dirac notes that the major features of quantum mechanics are linearity and superposition. Matrix mechanics - Wikipedia, Paul Dirac (1983): The Principles of Quantum Mechanics, chapter 1, John von Neumann (2014): Mathematical Foundations of Quantum Mechanics

Feynman introduced a third approach to quantum mechanics which has since found favour because it provides a more direct route to quantum field theory and string theory. His path integral formulation seeks a stationary superposition of the contributions of all possible space-time paths between an initial and a final state. In principle, this set of paths spans the whole classical universe so the formulation depends implicitly on the idea discussed below in section 11, that the quantum world is prior to and independent of Minkowski space. Feynman & Hibbs (1965): Quantum Mechanics and Path Integrals, Path integral formulation - Wikipedia

Feynman began from Dirac's search for a feature of quantum theory corresponding to classical Lagrangian mechanics. Dirac found that the classical action could be used in a complex exponential to describe the evolution of a quantum state. Feynman imagined, by analogy with the two slit model, that the actual path taken by a particle is a stationary superposition of all possible paths, where the contribution from a particular path is postulated to be an exponential whose (imaginary) phase is the classical action in units of ℏ. P. A. M. Dirac (1933): The Lagrangian in Quantum Mechanics, Richard P. Feynman (1948): Space-Time Approach to Non-Relativistic Quantum Mechanics

The path integral relies on the three general principles of quantum mechanics formulated by Feynman:

1. The probability that a particle will arrive at x, when let out at the source s, can be represented quantitatively by the absolute square of a complex number called a probability amplitude — in this case, the “amplitude that a particle from s will arrive at x.”

2. When a particle can reach a given state by two possible routes, the total amplitude for the process is the sum of the amplitudes for the two routes considered separately. There is interference.

3. When a particle goes by some particular route the amplitude for that route can be written as the product of the amplitude to go part way with the amplitude to go the rest of the way.

Feynman lectures on physics III Chapter 3: Probability Amplitudes

The path integral computes the probability amplitude for a particle to go from s to x by dividing every possible path into infinitesimal segments and multiplying the amplitudes for the particle to cross each segment according to principle 3 to get the amplitude for the whole path, adding these amplitudes as required by principle 2 and computing the probability represented by the resulting amplitude by principle 1. The process works and contributes to computations in quantum field theory which precisely match observation. But, we might ask, does nature really work this way? If we are to consider the quantum of action as an atomic event, can we trust the mathematical fiction that a quantum path can be sliced into an infinity of infinitesimal events?
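A toy calculation (my sketch, with invented numbers for the actions) shows how the three principles combine in a two slit arrangement: amplitudes multiply along each route, add across alternative routes, and the probability is the absolute square of the total.

    import numpy as np

    hbar = 1.0                          # natural units, an assumption of this sketch

    def amplitude(S):
        # each route contributes exp(iS / hbar), where S is its classical action
        return np.exp(1j * S / hbar)

    S_source_to_slit = {"slit1": 4.0, "slit2": 4.0}   # hypothetical actions, first leg
    S_slit_to_x = {"slit1": 7.2, "slit2": 9.5}        # hypothetical actions, second leg

    total = 0
    for slit in ("slit1", "slit2"):
        route = amplitude(S_source_to_slit[slit]) * amplitude(S_slit_to_x[slit])  # principle 3
        total += route                                                            # principle 2

    print(abs(total)**2)                # principle 1: probability = |amplitude|^2
    # The result ranges from 0 (destructive) to 4 (constructive) depending on
    # the difference between the route actions: interference.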


4: Classical gravitation and singularities

Penrose, Hawking and Ellis showed that Einstein's classical general theory of relativity predicts the existence of singularities at the boundary of spacetime. Beyond such boundaries, spacetime structure ceases to exist. Such boundaries within the universe are known to be surrounded by event horizons which give them the name black holes. Many have been observed. These authors also speculated that the Universe began from such a singularity, expanding from it like a time reversed black hole. Hawking & Ellis (1975): The Large Scale Structure of Space-Time

This raises two problems. First, we know, from the behaviour of the material orbiting them, that black holes and therefore the singularities within them have mass and energy. This might lead us to suspect that the classical initial singularity is a massive and energetic particle containing the energy of the universe, a condition which is difficult to reconcile with the absence of any space-time structure associated with it.

The second is that, as Hawking has suggested, quantum processes can lead to the "evaporation" of black holes, but it is an exceedingly slow process. Hawking radiation - Wikipedia

Since we now generally understand that classical physics is the result of underlying quantum processes, we will proceed on the basis that the initial singularity is a quantum of action, and so formally identical to the God of Aristotle and Aquinas, which they define as actus purus.


5: God's ideas, cybernetics and singularity

The traditional story of creation starts with the idea that God had a plan for the world they were about to create. The modern idea, however, is that it began from a structureless initial singularity.

Aristotle and his contemporaries considered that intelligence is associated with immateriality, and Aquinas argued that since God is maximally immaterial, they are maximally intelligent. Since the development of computation and information theory, however, this idea has fallen out of favour. Instead we understand that information is carried by marks, like the printed letters that constitute this essay. Information stored in computers, discs and solid state memories is written into billions of memory locations, each of which has a specific address and carries one bit of information. Each one of these locations can be assigned one of two states, represented by either a 1 or a 0, called a bit. The amount of information a memory can hold is exactly equal to the number of locations it has; another name for this number is the entropy of the memory. Entropy is the simplest measure in science: it is nothing other than a count of states and may be applied to any entity that has discrete states, from a mob of sheep to a boiler full of steam.
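A minimal sketch of this counting notion of entropy (my illustration): the entropy in bits of a system of equiprobable discrete states is the base 2 logarithm of the number of states, so a memory of n binary locations, with 2^n possible states, has entropy exactly n bits.

    import math

    def entropy_bits(number_of_states):
        # entropy of a system of equiprobable discrete states, in bits
        return math.log2(number_of_states)

    print(entropy_bits(2**8))     # an 8 bit memory: 8.0 bits, one per location
    print(entropy_bits(1000))     # any system of 1000 states, sheep or steam: ~9.97 bits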

A second modern development about which the ancients knew nothing is cybernetics, defined by one of its founders, Norbert Wiener, as the science of control and communication in the animal and the machine. Norbert Wiener (1996): Cybernetics or Control and Communication in the Animal and the Machine, Cybernetics - Wikipedia

Following Galileo, mathematical modelling has become a primary tool of modern physics, and mathematics has progressed well beyond what was available to Galileo. Aquinas believed that an omniscient and omnipotent God has total deterministic control of every event in the world. Gödel found that logically consistent formal systems are not completely determined, and Chaitin interpreted Gödel's work as an expression of the limits to control known as the cybernetic principle of requisite variety: one system can control another only if it has equal or greater entropy than the system to be controlled. This principle suggests that a completely structureless initial singularity has no power to control its behaviour, so that its primordial acts are random events. Gregory J. Chaitin (1982): Gödel's Theorem and Information, W Ross Ashby (1964): An Introduction to Cybernetics
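In information-theoretic terms the principle can be stated as an inequality (a standard modern formulation, not a quotation from Ashby): if H(D) is the entropy of the disturbances, H(R) the entropy of the regulator and H(E) the entropy of the outcomes, then H(E) ≥ H(D) − H(R). A regulator can reduce the variety of outcomes only by as much entropy as it itself possesses, so a regulator with zero entropy, like a structureless singularity, leaves the outcomes as unconstrained as the disturbances.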

This same principle invalidates the idea that the traditional God would have planned the universe from the beginning. One of the first attributes of God that Aquinas studied was their simplicity. According to Aquinas (and long mystical history) God is absolutely simple. Since there are no structures or marks in this God, it cannot store information in the modern sense. Aquinas, Summa, I, 3, 7: Is God altogether simple?

As we shall see, the divine initial singularity gradually builds up its structure, becoming more and more complex by the process of trial and error we call evolution. Our universe was not planned beforehand, therefore, but has evolved through its own unlimited random activity. Some products of this activity do not last; others are consistent and durable, and these are the ones selected to exist, giving us the more or less permanent features of the world we see.


6: Evolution, variation and selection
The divine initial singularity gradually builds up its structure, becoming more and more complex by the process of trial and error we call evolution. Some products of this activity do not last; others are consistent and durable, and give us the permanent features of the world we see.

Random reproduction is the first step in the evolutionary process. On the whole plants and animals do not breed true, meaning that children are very rarely precisely identical to their parents. The sources of this variation are to be found deep in the biological mechanisms of reproduction. Darwin was aware of this, but he was also aware, from his own experience and that of other breeders of plants and animals, that it is possible, by carefully selecting breeding stock, to exert some control over the characteristics of the offspring, so that over thousands of years of domestication people have developed a very wide variety of flowers, vegetables, dogs, cats, pigeons, sheep and every other species of human interest. Ewen Callaway: Big dog, little dog: mutation explains range of canine sizes

Darwin also hypothesised that there could be natural selection. Some of the children of any parents would have a better chance of surviving and breeding in the environment in which they found themselves. These characteristics would be passed on to their children and possibly eventually become established as new species.

We now have detailed information about these genetic processes that lie behind the evolution of life. Below we suggest that a similar process of natural selection, building on the random generation of quanta of action, could be the source of the fundamental particles which constitute our universe. We now turn to a more detailed exploration of the emergence of the universe driven by the unlimited power of initial quantum of action, which we take to be analogous to the traditional divine creator.


7: Networks, brains and intelligence

The ideas proposed in this essay have been developed to shed light on some problems inherent in the theology developed by Aquinas derived from Christian doctrine and the picture of God he found in Aristotle's Physics and Metaphysics. Physics (Aristotle) - Wikipedia, Aristotle: Metaphysics

Aristotle traced a path from the physical world of everyday experience to a divine unmoved mover which drives the world. Aquinas followed Aristotle's path to produce a model of the Christian God. Aristotle believed that the world is eternal. Christianity, in contrast, believes that the world was created by an eternal God other than the world. The ancient source of this belief is Genesis, the first book of the Hebrew Torah, which the writers of the Christian New Testament subsumed as their Old Testament, the ancient forerunner of the theology they built around the life of Jesus of Nazareth.

Three problems I see with the Christian story are:

1. If God is the realization of all possibility, how can they create a Universe other than themselves?

2. How can we reconcile the eternity of God with the life of God, which we understand to be self motion?

3. How can we reconcile the omniscience and omnipotence of the creator with their absolute simplicity?

The answer I propose is to identify the creator with the universe we have revealed to ourselves by modern astronomy and cosmology. There now appears to be a strong consensus that the general theory of relativity combined with particle physics suggests that the Universe originated from an eternal initial singularity formally identical to the Christian God within which the universe as we know it has emerged.

This emergence is imagined to have begun with a big bang about fourteen billion years ago, followed by a relatively well understood series of events which have brought us to our current condition. The big bang theory assumes that all the energy of the Universe is concentrated in the initial singularity. If the initial singularity is action, however, we may see the first step in the development of the universe as the creation of energy itself by the repeated action of action, as suggested by the fundamental equation of quantum mechanics, E = hf. Planck-Einstein relation - Wikipedia

This identification solves all three of my problems:

1. There is but one world, and it is divine.

The initial singularity shares the properties of the traditional God: both are eternal; both are absolutely simple; and both are the source of the world. The universe as we know it exists inside the initial singularity, that is inside God. Its complexity does not therefore compromise the unity and simplicity of the world, which provides us with a starting point from which to understand the universe as a single entity. The omniscience and omnipotence of God is the omniscience and omnipotence of the universe.

2. Eternity and motion are logically connected by fixed point theory

A plausible next step in the creation of a stable universe like the one we experience within the initial singularity may be motivated by fixed point theory, which entered the world of topology with Brouwer's theorem in 1910. Brouwer fixed point theorem - Wikipedia

Suppose X is a topological space that is both compact and convex, and let f be a continuous map of X to itself. Then f has a fixed point, that is, there is a point x* such that f(x*) = x*. John Casti (1996): Five Golden Rules (page 71)

Does the initial singularity explained above fulfil the conditions of this theorem? It is a space, and because it is generated randomly it does not have a metric, so we might say that it is topological. Is it convex? Since it is structureless it is unlikely to have "holes" so we might assume convexity. Is it compact? We define the initial singularity to be all that there is, with nothing outside it and so it must contain its own boundary. From a logical point of view, this is to say that the initial singularity is internally consistent, like mathematics. "Outside" it is inconsistency, which an application of the principle of non-contradiction assures us cannot exist.

Brouwer's theorem is non-constructive, which raises both a problem and a useful feature. The problem arises because it does not provide a means to find a fixed point. Subsequent mathematical developments have dealt with this problem. The useful feature of non-constructivity is that it respects the ancient belief that we cannot say what God is, only what God is not, the via negativa of Dionysius (§12).
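Those subsequent constructive developments can be illustrated with a classic example (mine, not Casti's): the cosine function maps the interval [0, 1] into itself and shrinks distances there, so simple iteration converges to its unique fixed point.

    import math

    x = 0.5                      # any starting point in [0, 1]
    for _ in range(100):
        x = math.cos(x)          # iterate the map f(x) = cos(x)

    print(x)                     # ~0.739085, the point x* where cos(x*) = x*
    print(math.cos(x) - x)       # ~0.0, confirming f(x*) = x*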

Motion and eternity are therefore to be understood as two sides of the one coin. The quantum layer of the universe is in perpetual motion whose fixed points appear to us as the classical world in which we live.

3. The omniscience and omnipotence of the creator may emerge by local evolution in a network

Action creates energy, energy facilitates quantum theory which describes a universe of perpetual motion. From here the mathematical theory of fixed points takes us to the generation of observable particles which are able to communicate and bond to form the universe we see.

The overall framework for this picture is a communication network, and I see much value in the fact that the network model, like quantum mechanics, is symmetrical with respect to complexity. The fundamental properties of quantum mechanics are the same whether we are working in a Hilbert space of 2 dimensions or a countable infinity of dimensions. The fundamental properties of a communication network are the same whether we are considering the "network atom" of two sources communicating through a channel, or a network of a countable infinity of sources communicating through a set of channels that connects each one to all the rest.

High energy physicists have found that by accelerating pieces of matter to very high energies and making them collide they can create small bubbles of almost pure and structureless energy which then rapidly materialize into a spectrum of particles. Our knowledge of the fundamental physics of the universe comes from comparing the properties of the particles input to a bubble with the particles that come out and trying to understand the transformation that links output to input. Martinus Veltman (2003): Facts and Mysteries in Elementary Particle Physics chapter 6

We might understand this process by comparing a fertilized human egg to the information processing system that grows out of it. I am much more complex than the egg I grew from. Although this is consistent with the second law of thermodynamics, that entropy increases, it seems to contradict the idea that nothing comes from nothing.

The information in my egg is encoded in my genome, a DNA string of some three billion symbols (A, T, G, C), each representing 2 bits of information, for a total of approximately 1 gigabyte, about 10¹⁰ bits.

Life is an electrochemical process based on insulating membranes and ionic motors which create and utilize electrical potentials across the membranes. This system is closely analogous to the techniques of electrical engineering. Multicellular plants rely on electro-chemical signalling to coordinate the operations of individual cells. All but the simplest of animals use neural networks, both for internal housekeeping and for interaction with the world around them.

Neural networks are constructed from neurons, cells (sources) adapted to receive, process and transmit electrical signals. The connectivity of the network is high: a neuron may have many inputs and may output to many other neurons or motor cells. Neurons fall into three classes: sensory neurons, which provide input to the neural network; motor neurons, which convey the output of the network to effectors; and interneurons, which transform sensory input into motor output. Neuron - Wikipedia

Signals are transmitted along the fibres in the neural network by a discrete voltage spike known as an action potential which propagates along the fibre at quite high velocity. All these action potentials are effectively identical, like the digits in a computer network. Their information content is a function of their timing and frequency.

The principal functional connections between neurons and the fibres connecting them are synapses. A synapse is a complex structure which, on receiving an input from a connected fibre, releases neurotransmitters which interact with the membrane of the neuron. This interaction may be excitatory or inhibitory. The neuron algebraically integrates this input over time, taking into account the "weight" associated with each synapse. When the integrated signal reaches a certain threshold the neuron "fires", sending an action potential along its axon.
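The integrate and fire behaviour just described can be caricatured in a few lines (a deliberately crude sketch; a real neuron integrates over time and is far more intricate):

    def neuron_fires(inputs, weights, threshold=1.0):
        # weighted algebraic sum of synaptic inputs; fire if it reaches threshold
        total = sum(x * w for x, w in zip(inputs, weights))
        return total >= threshold

    spikes = [1, 1, 0, 1]              # incoming action potentials, all identical
    weights = [0.6, 0.5, 0.9, -0.4]    # synaptic weights; the last synapse is inhibitory

    print(neuron_fires(spikes, weights))                 # 0.7 < 1.0: no action potential
    print(neuron_fires(spikes, weights, threshold=0.5))  # True: the neuron fires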

Processing and memory in a neural network are modulated by synaptic weights, which are a measure of the level of influence, positive or negative, a particular synapse may have on the neuron to which it is connected. The details of a neural network may be extraordinarily complex: there are many different neurotransmitters and many varieties of cells which perform auxiliary functions associated with the neural network.

Among the principal research tools used to understand the functions of neural networks are computer systems which model the synaptic connections in a network and enable the adjustment of synaptic weights by various training algorithms in attempts to model the behaviour of various subsets of neural networks. This work is of great interest to the artificial intelligence community, but is far from achieving equivalence to the human brain.

The ontogenetic development of an individual human brain poses an interesting problem in network creation. An important source of formal guidance in the development of any living creature is the genome. The expression of the genome occurs in a living cell, and depends on transformations executed by the physical and chemical processes embodied in the cell.

Formally, programmed deterministic development is subject to the cybernetic principle of requisite variety. This law establishes the conditions of completeness and computability that render a process deterministic enough to have a high probability of success.

The human nervous system comprises some 100 billion neurons, each with possibly 1000 connections to other neurons.

In the specification of a standard engineered computer network, every physical connection is precisely specified by source and destination. Measured in bits of information this is at a minimum twice the logarithm to base 2 of the number of connections. Such precise specification in the case of the n connections of the human nervous system requires about n log n bits, where n = 100 billion (neurons) × 1000 (connections per neuron), ie 10¹⁴. n log n is therefore about 10¹⁶ bits, approximately a million times greater than the information content of the genome.
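Making the arithmetic explicit (a back-of-envelope sketch using the figures quoted above):

    import math

    neurons = 100e9                  # ~10¹¹ neurons
    connections = neurons * 1000     # n = 10¹⁴ synaptic connections
    genome_bits = 3e9 * 2            # three billion symbols at 2 bits each: ~10¹⁰ bits

    wiring_bits = connections * math.log2(connections)   # n log n specification cost
    print(f"{wiring_bits:.1e}")                 # ~4.7e+15, of order 10¹⁶ bits
    print(f"{wiring_bits / genome_bits:.0f}")   # ~775000: roughly a million-fold shortfall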

It follows that some other mechanism must account for the connective structure of the brain, which is to say that to a large degree this system must define itself. The human brain must have a self-structuring property.

The explanation appears to be a form of evolution by natural selection. The neurons in an infant brain seek out synaptic connections with one another, a process which is to a large degree random, creating an excessive number of connections. There follows a process of pruning which continues through the teenage years, eliminating little used connections.

As well as determining the wiring of the brain over a period of years, experience determines the synaptic weights connecting neurons. Changes in weight may occur in milliseconds during the real time processing of speech, and over a lifetime during the acquisition of knowledge and experience. The physical development of a brain is thus closely related to the reception of information from the environment via the senses and feedback from the results of actions (like learning to walk). It serves as a microcosm of the development of the universe. Our minds are the product not just of our genes, but of the environment in which we find ourselves.

Mental evolution provides us with an enormous advantage, since thought is usually much cheaper than action. In the natural world of evolution by natural selection many newborns fail to reproduce for one reason or another. In some species this failure rate may be very high, thousands being born for every one that survives and reproduces. In more complex species like ourselves most children are carefully nurtured by their parents, leading to a high rate of survival.

The relationship between the quantum world and the real world is rather like the relationship between mental modelling and actual construction. We see this at work in the phenomenon called the collapse of the wave function (§10). When quantum systems observe one another, only one of a large number of possibilities is physically realized in each case. From this point of view, the quantum world works like the mind or imagination of the universe, thinking many things but only investing real resources in constructing a few. Here we see a cosmic process which brings an advantage analogous to thought, education and imagination in the development of human society and technology.

The standard Christian God of Aquinas is supreme in every particular: supremely intelligent, omniscient, knowing every detail of everything, past, present and future. On the other hand this god is absolutely simple, and so has no means of representing all this information.

The ancients equated knowledge with spirituality and spirituality with simplicity, an impossible situation from the point of view of a modern understanding of information, and the point at which it becomes necessary to reconceive the divinity. God remains omniscient and omnipotent, but now their omniscience and omnipotence is the omniscience and omnipotence of the universe itself.

Cognitive cosmology sees the universe as a mind, a creative mind, and we are the ideas in that mind, created over many billions of years by a long and complex process of evolution that we have really only become aware of in the last two centuries.

Human cultural evolution seems slow. In particular we have found that a century is a short time in the development of theology. But compared to the biological evolution of the world it is fast: cultural, scientific and technological changes occur in centuries, whereas evolutionary changes require thousands or millions of years.

On the other hand, we can imagine a very fast process of evolution in a high energy particle collision. In general there is not enough information in the input to determine the output, but the output, although random, comprises a spectrum of precisely defined and well known particles that have perhaps been selected out of a very wide spectrum of possibilities. We return to this discussion below with the help of quantum field theory.


8: The classical trinitarian theology of action

The existence of the modern initial singularity is in the first instance a consequence of the general theory of relativity, but it has very ancient roots in the Hebrew notion of the one God, monotheism, which was carried over into Christianity. A remarkable development in Christian New Testament theology is the transformation of the Hebrew God Yahweh into the Christian God, and the emergence of the New Testament Trinity of three divine persons, Father, Son and Spirit.

Here I pause for a moment to review the doctrine of the Trinity for some clues about the development of a quantum initial singularity into the modern Universe. Many might feel that ancient theologians did not have a clue about reality, but we have to face the fact that intelligence in Homo sapiens (as estimated by crude cranial capacity) has been a persistent quality since we first evolved, and historians and archaeologists often find that the ancients knew more than we might have suspected. Brain size - Wikipedia

A first step in the study of the emergence of the universe within the initial singularity is suggested by the ancient reconciliation of the unity and simplicity of the Hebrew God with the Christian Trinity. The Christian doctrine was explicitly enunciated in the Nicene Creed based on the authority of the New Testament. The reconciliation began with Augustine and was further developed by Aquinas and Lonergan. This work was intended to explain how three could be seamlessly combined into one, but it seems that some of the basic ideas could be extended to any number of persons or (in communication theoretical terms) sources. Nicene Creed - Wikipedia, Augustine (419, 1991): The Trinity

Aquinas derived all the traditional properties of God from the conclusion that God is pure act, actus purus, a consequence of the proof for God's existence which he received from Aristotle. How could the unity of God be reconciled with the triplicity of the Trinity? Initially, this was just considered one of the many mysteries associated with the gods, but explanations slowly emerged. Trinity - Wikipedia, Hindu - Wikipedia, Hebrew Bible - Wikipedia, New Testament - Wikipedia

The first clue may be in John's gospel, which begins: "In the beginning was the Word, and the Word was with God, and the Word was God." (John, 1:1). This sentence may allude to the ancient psychological belief that the source of the words that we speak is the mental "words" (ideas, forms) that enter our consciousness as we think about what we are going to say. Because God is absolutely simple, theologians hold that attributes which are accidental in the created world are substantial in God. God's concept of themself, God's word (Latin verbum), is therefore considered identical to God. The author of the Gospel identifies this word with Jesus, the Son of God, the second person of the Trinity, who "was made flesh and dwelt among us". The relationship of this Word of God to God has been discussed at length by the twentieth century theologian Bernard Lonergan in his book Verbum. John the Evangelist: The Gospel of John (KJV), Bernard Lonergan (1997): Verbum: Word and Idea in Aquinas

The human psychological foundation of the procession of the Son from the Father is therefore the mental image each one of us has of ourselves; this emergence of the Son from the Father is called procession. The love of self manifested in ourselves as a mental accident when we contemplate our self image becomes, in the Trinity, a real divinity: the Holy Spirit, understood to be the real manifestation of the love of the Father for the Son. Aquinas, Summa, I, 27, 1: Is there procession in God?

So far we have three identical gods generated within the initial divinity of pure action. The next step in the model is the idea that the distinctions of the persons are maintained by the relations between them. Once again the principle is invoked that while relationships between created beings are accidents, in God they are substantial. Aquinas, Summa, I, 40, 2: Do the relations distinguish and constitute the persons?

In each case, the person is truly divine, that is truly pure act, so the processions of the persons may be conceived as pure actions producing pure actions. This process is limited to the production of three persons by the Christian dogma of the Nicene Creed. If, however, we accept that every action may produce action, there is no limit to the process. With this historical, psychological and metaphysical picture of the Trinity in mind, we can now turn to a more quantum mechanically oriented discussion of the multiplication of the initial quantum of action into the universe.

Back to toc

9: The active creation of Hilbert space

It is difficult to associate energy with a hypothetical structureless entity prior to the emergence of spacetime. One approach may be through Landauer's claim that information is physical. We can extend this claim: logic, the means by which information is computationally processed, is also physical, and conversely, physics is an instance of information processing. This is consistent with current ideas of quantum computation and quantum information. Lo, Spiller & Popescu (1998): Introduction to Quantum Computation and Information

In its simplest incarnation, we may consider the quantum of action as a not operator. This operator changes some situation p into some not-p. In the binary realm of propositional logic, we understand that not-not-p = p, but in the wider world we can see, for instance, that there are about seven billion instances of not-me. The effect of this definition is to interpret action in terms of logic, which may be understood to be purely formal, like Hilbert, Whitehead and Russell's mathematics, needing no spacetime support. With the emergence within the initial singularity of energy, spacetime and momentum, action obtains the dimensions of angular momentum. This transformation may provide insight into the role of logic in physics. Hilbert's program - Wikipedia, Whitehead (1910, 1962): Principia Mathematica
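As a toy illustration of action as a not-operator (a sketch only; the identification of action with logical negation is this essay's, not standard usage), the Pauli-X matrix acts as a logical not on the qubit basis states, and applying it twice restores the original, the analogue of not-not-p = p:

    import numpy as np

    # Pauli-X acts as a logical not on the qubit basis states |0> and |1>.
    X = np.array([[0, 1],
                  [1, 0]])

    p = np.array([1, 0])                 # |0>, standing in for the proposition p
    not_p = X @ p                        # |1>, i.e. not-p
    not_not_p = X @ not_p                # |0> again: not-not-p = p

    print(not_p, not_not_p)              # [0 1] [1 0]
    print(np.array_equal(not_not_p, p))  # True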

This logical approach suggests that action in its primordial form is inherently dimensionless. This may account for the fact that quantum coupling constants are dimensionless, making renormalization possible. At present a fundamental barrier to a quantum theory of gravitation is that it is not renormalizable because the gravitational constant has dimensions of M⁻¹L³T⁻². We will return to this in §10. Renormalization - Wikipedia: Renormalizability
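The contrast can be checked against tabulated constants. The fine-structure constant, the coupling of quantum electrodynamics, is a pure number, while Newton's G carries dimensions (a sketch using scipy's physical constants):

    from math import pi
    from scipy.constants import e, epsilon_0, hbar, c, G

    # Fine-structure constant: the dimensionless coupling of QED, roughly 1/137.
    alpha = e**2 / (4 * pi * epsilon_0 * hbar * c)
    print(alpha, 1 / alpha)      # ~0.0072973, ~137.036

    # Newton's constant, by contrast, carries dimensions M^-1 L^3 T^-2.
    print(G)                     # ~6.674e-11 m^3 kg^-1 s^-2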

Let us assume, then, that the quantum initial singularity comprises action, identical to the traditional God, and that it has the power to reproduce itself indefinitely, free of the dogmatic limitations of any religion and without concern for the conservation of energy, which has yet to emerge. We may guess that this action creates Hilbert space, dimension by dimension, the orthogonality of the dimensions being guaranteed by the no-cloning theorem. No-cloning theorem - Wikipedia

Von Neumann showed that quantum mechanics is best described using an abstract Hilbert space, a complex linear vector space analogous to Cartesian space with a metric defined by an inner product. We assume for the time being that this space may have at most a countably infinite number of dimensions, ℵ₀. Physical states are represented by rays in this Hilbert space, and we assume that the initial state of the quantum initial singularity has 1 mathematical dimension represented by the complex plane. Inner product space - Wikipedia

Von Neumann defines abstract n dimensional Hilbert space with three axioms:

α) A “scalar product,” i.e., the product of a (complex) number a with an element f of Hilbert space: af;
β) Addition and subtraction of two elements f, g of Hilbert space: f ± g;
γ) An “inner product” of two elements f, g in Hilbert space. Unlike α and β, this operation produces a complex number, and not an element of Hilbert space: (f, g).

Each element f, g, . . . defines the orientation of a complex plane in Hilbert space. It is called a ray, with the property that e^iθ f = f, so that the orientation of an element within its associated plane is only relevant when vectors associated with the same plane are added (superposed).

Basis elements of a Hilbert space f and g are normalized by the property (f, f) = 1 and are said to be orthogonal when (f, g) = 0.
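A minimal numerical sketch of these operations, using complex vectors in a two dimensional space as a stand-in for Hilbert space (the particular vectors and scalar are arbitrary illustrations):

    import numpy as np

    f = np.array([1, 0], dtype=complex)     # basis element f
    g = np.array([0, 1], dtype=complex)     # basis element g
    a = 0.5 + 0.5j                          # an arbitrary complex scalar

    af = a * f                   # (α) scalar product af
    fg = f + g                   # (β) addition of elements
    ip = np.vdot(f, g)           # (γ) inner product (f, g): a complex number

    print(np.vdot(f, f).real)    # 1.0: f is normalized, (f, f) = 1
    print(ip)                    # 0j: f and g are orthogonal, (f, g) = 0

    theta = 0.7                  # the ray property: a phase leaves the state unchanged
    print(abs(np.vdot(np.exp(1j * theta) * f, f)))   # 1.0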

Each vector in a Hilbert space represents a quantum of action, and we assume that, since the specific dynamic property of a quantum of action is to act, the initial singularity will eventually become populated with ℵ₀ states. The result is an ℵ₀-dimensional Hilbert space of orthonormal basis vectors analogous to the vacuum of quantum field theory.

Von Neumann points out that:

The noteworthy feature of the operations af, f ± g, (f, g) is that they are exactly the basic operations of the vector calculus: those which make possible the establishment of length and angle calculations in Euclidean geometry or the calculations with force and work in the mechanics of particles.

Back to toc

10: The emergence of quantum mechanics

We are attempting to construct a universe from a primordial quantum of action. In broadest terms we might imagine that the mechanism for this construction has two stages, familiar to us from Darwinian evolution, variation and selection.

The action of a quantum of action is to act, reproducing itself. Since it is completely simple it cannot control itself and so its actions are both random and in some way connected by their common source to form complex structures. Like the traditional God, the primordial quantum has the power to explore all consistent structures.

These structures are subject to selection, since no self-contradictory structure can exist. From this point of view, the space available to the universe is formally equivalent to the space available to mathematics. The only constraint we place on mathematical structures is that they be consistent. Hilbert thought that mathematics so understood would be able to solve all problems, but Gödel and Turing showed that this is not the case. Consistent mathematics embraces both incompleteness and incomputability. Complete theory - Wikipedia, Computability theory - Wikipedia

There is a problem with the traditional notion that the universe exists outside God, since God is understood to be the fullness of being, so that there can be nothing outside them. In the previous section we explained the idea that there is a trinity of divine persons inside God. We define "person" in abstract communication terms as a source, that is, an entity capable of sending and receiving messages. To maintain a connection with traditional theology, I want to develop an analogy with the Trinity which envisages the procession of a transfinite number of sources rather than just three, each a quantum of action proceeding within the initial singularity, our analogue of the traditional God.

Like the action of the traditional God, the action of the initial singularity is to reproduce itself within itself. The difference is that while the Trinity stops at three, the procession of action has no limit.

In both classical and quantum physics energy is created by repeated action. A new layer of structure, energy and time thus emerges from the natural behaviour of the primordial divine action. Energy and time are the ingredients for the Hamiltonian of non-relativistic quantum mechanics. Hamiltonian (quantum mechanics) - Wikipedia, Feynman Lectures on Physics III: Chapter 8: The Hamiltonian Matrix

This scenario suggests a source for the vacuum of quantum field theory. Since the initial singularity acts at random, we can imagine random intervals between events; these intervals correspond inversely to frequencies and energies, so we may imagine the initial singularity as the source of an unlimited spectrum of random energies which looks like the conventional vacuum.

In the case of the Trinity, the persons are considered to be distinguished by their relationships to one another. We might understand the distinctions between the quanta of action proceeding from the initial singularity to be explained by the quantum no-cloning theorem. From this point of view the action of a quantum of action is identical to the universal logical operator NAND (not-and), producing a new state orthogonal to the original state. No-cloning theorem - Wikipedia

The creation of new orthogonal quantum states in the initial singularity is in effect the creation of new dimensions of a Hilbert space. We may imagine each new quantum of action as a pure quantum state, a basis vector of the space, represented by a ray in the space. This structure sets up a foundation for the operation of quantum mechanics as described in §9. Quantum state - Wikipedia, Projective Hilbert space - Wikipedia

At this point the interior of the initial singularity stands as a one dimensional spectrum of energy, a harmonic oscillator whose states are represented by orthogonal rays in a Hilbert space of countable dimension, the orthogonalities dictated by the no-cloning theorem. We may see this as an alphabet, like the notes of music or the letters of speech, analogous to the vacuum of conventional quantum mechanics, which provides an alphabet with which to write a universe. Quantum harmonic oscillator - Wikipedia

The action of action is to create action, and the fundamental equation of quantum mechanics, E = hf, relates energy to the frequency of action, so that the repeated actions in the initial singularity create energy, the basic input into quantum mechanics through the action of the Hamiltonian or energy operator.

A serious difficulty arising in quantum field theory is the cosmological constant problem, which is a consequence of the idea that the universe is built on a vacuum with an infinity of degrees of freedom, each with a finite zero point energy. The standard exposition of the theory yields a total energy about 100 orders of magnitude greater than that actually observed. The approach taken here, generating the vacuum in situ, may provide a solution to this problem. We will return to this question below.

Near the beginning of his classic exposition of quantum theory, Dirac notes that one of the principal peculiarities of the new theory is superposition or interference. Superposition is a feature of waves, and we can see it and hear it. If we throw two stones into a smooth pond we will see that when the circles of waves spreading from each impact meet, they add and subtract from one another to form a complex pattern of waves which seem to pass through one another unaffected. The phenomenon is also clear in sound. We can usually distinguish the voices of different instruments or people sounding together. Jessica Haines: Two stones wave patterns, Catherine & Johnathan Karoly: Heitor Villa-Lobos: The Jet Whistle

In his lectures on physics Feynman uses the double slit experiment to demonstrate the radical difference between the quantum mechanics of fundamental particles and the behaviour of classical particles like bullets. He summarizes it in three simple propositions:

1. The probability of an event in an ideal experiment is given by the square of the absolute value of a complex number φ which is called the probability amplitude:

P = probability,
φ = probability amplitude,
P = |φ|².

2. When an event can occur in several alternative ways, the probability amplitude for the event is the sum of the probability amplitudes for each way considered separately. There is interference:

φ = φ₁ + φ₂,
P = |φ₁ + φ₂|²

3. If an experiment is performed which is capable of determining whether one or another alternative is actually taken, the probability of the event is the sum of the probabilities for each alternative. The interference is lost:

P = P₁ + P₂

One might still like to ask: “How does it work? What is the machinery behind the law?” No one has found any machinery behind the law. No one can “explain” any more than we have just “explained.” No one will give you any deeper representation of the situation. We have no ideas about a more basic mechanism from which these results can be deduced. Feynman lectures on physics III:01: Quantum behaviour
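These propositions are easy to check numerically. In the sketch below the two amplitudes and their phases are arbitrary illustrations; the cross term in |φ₁ + φ₂|² is the interference that vanishes when which-way information exists:

    import numpy as np

    phi1 = 0.6 * np.exp(1j * 0.0)            # amplitude for alternative 1
    phi2 = 0.6 * np.exp(1j * 2.5)            # amplitude for alternative 2

    P_interference = abs(phi1 + phi2) ** 2   # both ways open: P = |phi1 + phi2|^2
    P_classical = abs(phi1) ** 2 + abs(phi2) ** 2   # which way known: P = P1 + P2

    print(P_interference)   # ~0.143: includes the cross term 2 Re(phi1* phi2)
    print(P_classical)      # 0.72: the interference is lost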

The quantum amplitudes referred to here are invisible and undetectable. They are assumed to exist, since quantum mechanical computations based on these ideas invariably match our experience, and we assume that the invisible amplitudes of quantum theory behave mathematically just like the visible and audible interference of real physical waves of water and sound. In the early days of wave mechanics, physicists often found themselves studying sound waves to gain insight into quantum waves.

Back to toc

11: Quantization: the mathematical theory of communication

We assume that the structure of the universe is maintained by communications between its various components. If the universe is to be stable, we further assume that these communications are at least to some degree error free. The mathematical theory of communication developed by Shannon establishes that quantization and error prevention are very closely related.

From a mathematical point of view, a message is an ordered set of symbols. In practical networks, such messages are usually transmitted serially over physical channels. The purpose of error control technology is to make certain that the receiver receives a string comprising the same symbols in the same order as that transmitted. The enemy of error free transmission is confusion of symbols and scrambling of their order. The fidelity of a channel can be checked by the receiver transmitting the message received back to the sender, who can compare the original with the retransmitted version.

Shannon's mathematical theory of communication shows that by encoding messages into discrete packets, we can maximize the distance between different signals in signal space, and so minimize the probability of their confusion. This idea enables us to send gigabytes of information error free over noisy channels. The specific constraint is that the ratio of signal to noise in a channel governs the rate of error free transmission. Claude E Shannon: A Mathematical Theory of Communication, Alexandr Khinchin (1957): Mathematical Foundations of Information Theory
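The constraint referred to here is the Shannon-Hartley capacity C = B log₂(1 + S/N). A one-function sketch (the channel figures in the example are illustrative):

    from math import log2

    def capacity(bandwidth_hz: float, snr: float) -> float:
        """Shannon-Hartley limit on error free bits per second."""
        return bandwidth_hz * log2(1 + snr)

    # e.g. a 3 kHz telephone channel with a signal-to-noise ratio of 1000 (30 dB)
    print(capacity(3000, 1000))   # ~29,902 bits per second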

Shannon's theory relies on the classical statistics of distinguishable and countable events and is an application of the mathematics of function space.

A system that transmits without error at the limiting rate C predicted by Shannon’s theorems is called an ideal system. Some features of an ideal system are embodied in quantum mechanics, particularly quantization.

1. To avoid error there must be no overlap between signals representing different messages. They must, in other words, be orthogonal, as with the eigenfunctions of a quantum observable. Observable - Wikipedia

2. The basis signals or letters of the source alphabet may be chosen at random in the signal space, provided only that they are orthogonal. The same message may be encoded into any satisfactory basis provided that the transformations (the codec) used by the transmitter to encode the message into the signal and receiver to decode the signal back to the message are inverses of one another. Quantum processes are reversible in the sense that the unitary evolution of an isolated quantum system acts like a lossless codec. Codec - Wikipedia, Unitary operator - Wikipedia

3. The signals transmitted by an ideal system have maximum entropy and so are indistinguishable from random noise. The fact that a set of physical observations looks like a random sequence is not therefore evidence for meaninglessness. Until the algorithms used to encode and decode such a sequence are known, nothing can be said about its significance.

4. Only in the simplest cases are the mappings used to encode and decode messages linear and topological. For practical purposes, however, they must all be computable with available machines. How this applies in quantum theory is closely related to the measurement problem and the collapse of the wave function (§14).

5. As a system approaches the ideal, the length of the transmitted packets, the delay at the transmitter while it takes in a chunk of message for encoding, and the corresponding delay at the receiver while the message is decoded, all increase indefinitely. Claude Shannon (1949): Communication in the presence of noise

In computer networks the implementation of error free communication requires coding for transmission by the sending source which is decoded by the receiving source to reveal the original message. Entropy must be conserved and the codec must be complete if the transmission process is to be lossless.
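As an illustration of such a lossless codec (a toy sketch, not a practical error correcting code), any unitary matrix and its conjugate transpose form an encoder and decoder which exactly recover the message and conserve its norm:

    import numpy as np

    rng = np.random.default_rng(0)

    # Any unitary matrix is a lossless codec: encode with U, decode with its
    # conjugate transpose (its inverse), and the message is exactly recovered.
    A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
    U, _ = np.linalg.qr(A)                  # QR factorization yields a unitary U

    message = np.array([1, 2, 3, 4], dtype=complex)
    signal = U @ message                    # encode
    decoded = U.conj().T @ signal           # decode

    print(np.allclose(decoded, message))    # True: the codec is complete
    print(np.isclose(np.linalg.norm(signal), np.linalg.norm(message)))  # True: norm conserved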

Quantum systems described by algorithms such as the Schrödinger equation are constrained to evolve through time in a unitary and reversible manner. The Schrödinger equation defines an error free communication channel which is nevertheless invisible to us. This process is interrupted when systems interact, just as computers in a network are interrupted when they are required to deal with an incoming message. Unitarity (physics) - Wikipedia

Back to toc

12: The quantum creation of Minkowski space

We are inclined to take spacetime as given and see it as somehow emerging (like a time reversed black hole?) from the initial singularity. On the other hand we are convinced that the proper explanation of the nature of the world is quantum mechanics, so it seems reasonable to expect spacetime to be a layer of reality constructed from elements provided by quantum mechanics. This is consistent with the idea that the quantum world precedes the spacetime world. We suppose that quantum mechanics is the underlying symmetry that is applied (and broken) to create the transition from energy-time to spacetime, accompanied by the parallel emergence of momentum-distance and gravitation.

Here we come to the most interesting part of this story, the interface between the abstract invisible quantum world that drives the universe from behind the scenes and the world of space and time in which we live and which serves as the stable screen upon which we exist and observe everything that happens. Quantum mechanics works in the world of energy — time; we live in the world of energy — momentum — time — space — gravitation. How does structural information flow from Hilbert space to Minkowski space and back? This question, often known as the measurement problem, has been debated since the beginning of quantum mechanics.

The key idea here is tautological or zero-sum complexification. This has already appeared unremarked in section 10. There, since it is of the nature of action to act, and we define energy as the rate of action, we may see that action of its nature (ie tautologically) creates energy. From this point of view the emergence of energy has added nothing new to the initial singularity. The idea here is that each step in the complexification of the universe steps through a phase of randomness and uncertainty to create new entities that add up to nothing, like potential and kinetic energy or positive and negative charge. Primordial symmetries are broken to create new features of the world.

The principle at issue here is that causality requires contact. Isaac Newton was forced by circumstances to admit that gravitation was some sort of "action at a distance", which we understand to be impossible in ordinary space. Quantum entanglement in Hilbert space (a mathematical space), however, led Einstein to imagine "spooky action at a distance". We shall suggest in §13 that this is possible because quantum mechanics occupies a world where there is no spatial distance in the Newtonian sense. In the space-time world contact is maintained by particles moving at the speed of light which follow "null geodesics" whose beginnings and ends coincide in spacetime. (see §16) Geodesics in general relativity - Wikipedia

The most peculiar feature of Minkowski spacetime is its metric ημν, whose diagonal is (1, 1, 1, -1). This suggests that zero bifurcation is at work, so that in some sense space + time = 0. The principal ingredients of a model of the emergence of spacetime are therefore symmetry, zero bifurcation and the speed of light. The null geodesic, made possible by the Minkowski metric, is the accommodation made in spacetime to maintain contact after the emergence of space. The velocity of light is an artefact of this accommodation and enables contact in the quantum world to continue uninterrupted despite the emergence of space. How can this happen? We invoke the evolutionary principle that uncontrolled action can try everything, and that the consequences of these trials that are self-sustaining are selected and can become fixed.

Interaction is local. Before space enters the world, contact is inevitable and quantum systems can evolve unitarily without interruption. To correlate their evolution, spatially separated systems must communicate to maintain contact. The metric of Minkowski space enables the existence of null geodesics whose endpoints are in contact because the observable space-time interval between them is zero. The unitary contact of spatially separated systems can thus be maintained if the messenger travelling between them proceeds at the speed of light in Minkowski space. In other words the speed of light makes space possible by maintaining the integrity of the contact and unitarity that is essential to the work of quantum mechanics, and this "trick" explains the Minkowski metric. Kevin Brown (2018): Reflections on Relativity, page 693.

It has been generally assumed that Minkowski space is the domain of Hilbert space, so that it is necessary to apply Lorentz transformations to both Hilbert spaces and particles in quantum field theory. I have suggested above that this may not be necessary because Hilbert space is prior to, independent of and the source of Minkowski space. Martinus Veltman (1994): Diagrammatica: The Path to the Feynman Rules, page 20

Back to toc

13: Is Hilbert space independent of Minkowski space?

Since the advent of special relativity, classical physical field theories are usually described in flat Minkowski space-time. The special principle of relativity holds that every observer sees the same laws of physics, including the same speed of light, in their own rest frame. This defines the Lorentz transformation which enables each observer to compute their local equivalent of the spacetime intervals observed between events in moving frames. This transformation is expressed succinctly by the 1, 1, 1, -1 metric of Minkowski space, so that if we set the speed of light c to 1, all observers see an invariant interval ds² = dx² + dy² + dz² - dt². Minkowski space - Wikipedia, Tests of special relativity - Wikipedia
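This invariance is easy to verify numerically. The sketch below (an illustration, with an arbitrary event and boost speed) applies a Lorentz boost along x with c = 1 and checks that the interval is unchanged:

    import numpy as np

    def boost_x(v):
        """Lorentz boost along x at speed v (c = 1), acting on (x, y, z, t)."""
        g = 1 / np.sqrt(1 - v**2)
        return np.array([[g,      0, 0, -g * v],
                         [0,      1, 0,  0],
                         [0,      0, 1,  0],
                         [-g * v, 0, 0,  g]])

    def interval(event):
        x, y, z, t = event
        return x**2 + y**2 + z**2 - t**2    # the 1, 1, 1, -1 metric

    event = np.array([2.0, 1.0, 0.0, 3.0])  # an arbitrary event (x, y, z, t)
    print(interval(event))                  # -4.0
    print(interval(boost_x(0.6) @ event))   # -4.0: the same for the moving observer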

It seems to be generally accepted in quantum field theory that the Lorentz transformation applies equally to states in Hilbert space and particles in Minkowski space. This implies that the domain of Hilbert space is Minkowski space. Martinus Veltman (1994): Diagrammatica: The Path to the Feynman Rules page 20.

If, however, the quantum world constitutes a layer of the universe built on the initial singularity before the emergence of energy, time, space and momentum, this convention may need revision. The phenomenon of entanglement suggests that the Hilbert quantum world exists prior to and independent of the Minkowski classical world. It seems more reasonable to attribute the apparent propagation of correlations associated with entanglement to the absence of space than to the propagation of these correlations at infinite velocity.

If this is the case, it opens up a new degree of freedom lying between quantum and classical dynamics which may make it possible to remove some of the confusion in quantum field theory noted by Kuhlmann in the epigraph to this essay.

Einstein appears never to have been truly happy with quantum mechanics, and often sought to demonstrate its weaknesses. This may have been because he felt that nature should go its own way independently of any observers. In the quantum world, however, observation is part of the physics. Although some have felt that 'observer' implied a conscious being, we can equally well imagine the universe observing itself to create real events. In the course of a paper on what is now known as the EPR paradox the authors identified 'spooky action at a distance', which is now seen as a consequence of entanglement. Einstein, Podolsky & Rosen: Can the Quantum Mechanical Description of Physical Reality be Considered Complete?, Quantum entanglement - Wikipedia, Gabriel Popkin (2018): Einstein's 'spooky action at a distance' spotted in objects almost big enough to see

EPR equate reality with predictability: If, without in any way disturbing a system, we can predict with certainty (i.e. with probability equal to unity) the value of a physical quantity, then there exists an element of physical reality corresponding to this physical quantity.

Experimental tests of entanglement have gradually improved until now it is widely believed that spooky action at a distance is real. It has been shown that this correlation at a distance operates at many times the velocity of light. Pan, Bouwmeester, Daniell, Weinfurter & Zeilinger: Experimental test of quantum nonlocality in three-photon Greenberger–Horne–Zeilinger entanglement, Salart, Baas, Branciard, Gisin, & Zbinden: Testing spooky action at a distance, Juan Yin et al: Lower Bound on the Speed of Nonlocal Correlations without Locality and Measurement Choice Loopholes

EPR argue from the quantum formalism that a measurement on one part of an entangled system enables us to predict with certainty the outcome of a measurement on the other system even though they are spacelike separated. They concluded that 'no reasonable definition of reality could be expected to permit this.' It turns out, however, that the quantum mechanical description has become established as the new reasonable definition of reality.

Here we exemplify entanglement and its consequences using a two state system. Electrons have two spin states which are often called up and down. In the singlet state, one electron has spin up, the other down, so that the total spin of the singlet is zero. Singlet - Wikipedia

Entanglement establishes that when these electrons are spatially separated, they behave, when observed, as though they are still in contact. If one electron is observed to be spin up, the other will be observed to be spin down no matter how far apart they are. This is 'spooky action at a distance', but the fact that this correlation appears to be instantaneous suggests that although the electrons are distant in Minkowski space, they are still in contact in Hilbert space. If this is the case, it is a major break from conventional wisdom and opens the way for a pleasing new approach to theology, spirituality and quantum field theory.
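A small numerical sketch of these correlations, assuming the standard singlet state (|01> - |10>)/√2 and sampling joint outcomes with the Born rule (the code and variable names are illustrative only):

    import numpy as np

    rng = np.random.default_rng(1)

    # The singlet (|01> - |10>)/sqrt(2) in the joint basis |00>, |01>, |10>, |11>.
    singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)
    probs = np.abs(singlet) ** 2            # Born rule for the four joint outcomes

    for outcome in rng.choice(4, size=8, p=probs):
        alice, bob = divmod(outcome, 2)     # split the joint outcome into two bits
        print(alice, bob)                   # always opposite: 0 1 or 1 0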

Although the biggest shock that came with quantum mechanics is the inability to predict the precise timing of events, entanglement gives us a very definite method to predict both the nature and timing of entangled events, since the observation of one half of an entangled system appears to have an immediate consequence in the other half.

Although the correlation between the observations of the traditional observers Alice and Bob is immediate and definite, entanglement cannot be used to communicate information faster than the speed of light. This is because Alice cannot control what she is going to observe, and therefore cannot control what Bob receives, even though it is certain, in the binary case, that if Alice observes 0 Bob will observe 1. In other words, the pure quantum world is, as Einstein felt, incomplete. It is completed by observation.

The experiments of Pan, Salart and Yin referred to above have demonstrated that entangled particles act upon one another at a distance even though their separation is spacelike, so that any signal accounting for their correlation would have to travel faster than light. This is called quantum non-locality. Quantum nonlocality - Wikipedia

Classical physics is founded on a belief in local realism. This has three features:

1. regularities in observed phenomena point to the existence of physical reality independent of human observers;

2. consistent sets of observations underlie 'inductive inference', the notion that we can use them to devise models of what is going on behind the scenes; and

3. causal influences cannot travel faster than the velocity of light.

Long experience and detailed argument has shown that quantum mechanics is not a local realistic theory. Bernard d'Espagnat (1979): The Quantum Theory and Reality

The EPR argument was perhaps the first hint that local realism is false. John Bell studied EPR and formulated a first version of Bell's theorem, which would show that quantum mechanics is not a local realistic theory. John Bell (1987): Speakable and Unspeakable in Quantum Mechanics, Myrvold, Genovese & Shimony (Stanford Encyclopedia of Philosophy): Bell's Theorem

Back to toc

14: The measurement problem

The Hilbert space representation of a quantum state is a vector which may be the superposition of a number of orthonormal basis states corresponding to the dimension of a Hilbert space. In physical applications such spaces commonly have a countable infinity of dimensions but the simple two dimensional Hilbert space called the qubit tells us most of what we want to know because quantum mechanics, like the theory of networks, is symmetrical with respect to complexity. Orthonormal basis - Wikipedia, Qubit - Wikipedia

We cannot see the vectors in Hilbert space. Our conjectures about this hidden quantum mechanical structure are based on observations of the interactions of visible particles. What we do see are eigenvalues which correspond to the eigenfunctions of the operator we use to measure an unknown state. The theory predicts that there are as many possible eigenvalues as the dimension of the measurement operator. The terms collapse or reduction of the wave function refer to the fact that observations only ever reveal just one of the possible states of the unknown system. Measurement problem - Wikipedia

This situation is radically different from the superposition of real waves. It is as if we were to listen to The Jet Whistle (§10) and at different times hear the flute and no cello or the cello and no flute. It seems that we can only see fragments of the information encoded in the amplitude representation of a quantum event, yet we believe it is all there. When we make quantum computations we have to take all the possible invisible amplitudes into account if we are to get the right answer.

The radical problem facing the development of quantum computation is illustrated by the difference between a classical bit (binary digit) and its quantum analogue, the qubit. A classical bit has just two states, usually represented 0 and 1. These states are orthogonal, one is not the other. A qubit, on the other hand, is a vector formed in a two dimensional Hilbert space by adding the orthogonal basis states |0> and |1>. This vector has a transfinite number of states represented by the equation |qubit> = a|0> + b|1>, where a and b are complex numbers subject to the constraint that |a|² + |b|² = 1. When we observe a qubit, however, all we ever see is |0> or |1>, with frequency P(|0>) = |a|², P(|1>) = 1 - |a|². The infinite amount of information which we suppose to be represented by the qubit turns out to be at best just 1 classical bit. It has collapsed. People designing quantum computers must try to devise some way to take advantage of this (allegedly) hidden information. Nielsen and Chuang write:

Understanding the hidden quantum information is a question that we grapple with for much of this book and which lies at the heart of what makes quantum mechanics a powerful tool for information processing. Nielsen & Chuang (2000): Quantum Computation and Quantum Information, page 16 (ref link above)
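A minimal simulation of the collapse just described, with arbitrarily chosen amplitudes a and b: the qubit's continuum of possible descriptions shows up in observation only as Born-rule frequencies of single classical bits:

    import numpy as np

    rng = np.random.default_rng(2)

    a, b = 0.8, 0.6                         # amplitudes with |a|^2 + |b|^2 = 1
    # |qubit> = a|0> + b|1> holds a continuum of possible (a, b) descriptions,
    # but each observation yields a single classical bit.
    samples = rng.choice([0, 1], size=100_000, p=[abs(a)**2, abs(b)**2])

    print(samples[:10])                     # one bit per observation
    print(samples.mean())                   # ~0.36 = |b|^2: all that survives of a, b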

A possible answer is provided by Wojciech Zurek who takes the view that this collapse is a necessary consequence of the transmission of information between two quantum systems. He writes:

The quantum principle of superposition applies to isolated systems, but is famously violated in the course of measurements: A quantum system can exist in any superposition, but a measurement forces it to choose from a limited set of outcomes represented by an orthogonal set of states. . . . I show – using ideas that parallel the no-cloning theorem – that this restriction (usually imposed “by decree”, by the collapse postulate) can be derived when a transfer of information essential for both measurement and decoherence is modeled as a unitary quantum process that leads to records with predictive significance. Wojciech Hubert Zurek (2008): Quantum origin of quantum jumps: breaking of unitary symmetry induced by information transfer and the transition from quantum to classical

He begins with a concise definition of standard quantum mechanics in six propositions:

(1) the quantum state of a system is represented by a vector in its Hilbert space;

(2) a complex system is represented by a vector in the tensor product of the Hilbert spaces of the constituent systems;

(3) the evolution of isolated quantum systems is unitary, governed by the Schrödinger equation:

iℏ ∂|ψ> / ∂t = H |ψ>

where H is the energy (or Hamiltonian) operator.

The other three show how the mathematical formalism couples to the observed world:

(4) immediate repetition of a measurement yields the same outcome;

(5) measurement outcomes are restricted to an orthonormal set { | sk > } of eigenstates of the measured observable;

(6) the probability of finding a given outcome is pk = |<sk|ψ>|², where |ψ> is the preexisting state of the system.

Schrödinger equation - Wikipedia, Born rule - Wikipedia
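As a sketch of how postulates (1) and (3) work together (the two-level Hamiltonian below is an arbitrary illustration, with ℏ = 1), the Schrödinger equation generates a unitary evolution which preserves the norm of the state vector:

    import numpy as np
    from scipy.linalg import expm

    H = np.array([[1.0, 0.5],
                  [0.5, -1.0]])             # an arbitrary Hermitian Hamiltonian

    psi0 = np.array([1, 0], dtype=complex)  # initial state |psi(0)>

    for t in (0.5, 1.0, 2.0):
        U = expm(-1j * H * t)               # U(t) solves i d|psi>/dt = H |psi>
        psi_t = U @ psi0
        print(t, np.vdot(psi_t, psi_t).real)    # norm stays 1.0: evolution is unitary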

Historically, the first three postulates of quantum mechanics have been considered uncontroversial, but there has been endless debate about the interpretation of the mathematical formalism encapsulated in postulates (4) - (6).

Zurek examines a system in 2D Hilbert space, noting that the complexity invariance of quantum mechanics enables an extension of the argument to a space of any dimension.

In the Hilbert space HS the state vector |ψS> is the superposition of two states: |ψS> = α|v> + β|w> (1).

Apparatus A0 measures state S:
|ψS>|A0> = (α|v> + β|w>)|A0> → α|v>|Av> + β|w>|Aw> = |ΦSA>

|ΦSA> is a vector in the tensor product of the constituent Hilbert spaces – (2).

The composite system is normed and linear, so <A0|A0> = <Av|Av> = <Aw|Aw> = 1.

So

<ψS|ψS> - <ΦSA|ΦSA> = 2ℜ[α*β <v|w> (1 - <Av|Aw>)] = 0,

so <v|w> (1 - <Av|Aw>) = 0.

Then either <v|w> ≠ 0, in which case information transfer must have failed, since we need <Av|Aw> = 1 so that 1 - <Av|Aw> = 0; or else <v|w> = 0, in which case <Av|Aw> may take any value. So |v> and |w> must be orthogonal.

Conclusion: 'The overlap <v|w> must be exactly 0 for <Av|Aw> to differ from unity.'

'Selection of an orthonormal basis induced by information transfer – the need for spontaneous symmetry breaking that arises from the unitary axioms of quantum mechanics (i, iii) – is a general and intriguing result.'
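A hedged numerical check of this derivation (the vectors and amplitudes below are arbitrary choices, not Zurek's): the norm defect 2ℜ[α*β<v|w>(1 - <Av|Aw>)] vanishes when the system states are orthogonal, but not when they overlap while the apparatus records differ:

    import numpy as np

    def norm_defect(v, w, Av, Aw, alpha=0.6, beta=0.8):
        """2 Re[a* b <v|w> (1 - <Av|Aw>)]: the change in norm produced by the
        information transfer (a|v> + b|w>)|A0> -> a|v>|Av> + b|w>|Aw>."""
        term = np.conj(alpha) * beta * np.vdot(v, w) * (1 - np.vdot(Av, Aw))
        return 2 * term.real

    v, w = np.array([1.0, 0]), np.array([0, 1.0])       # orthogonal system states
    Av, Aw = np.array([1.0, 0]), np.array([0, 1.0])     # distinct apparatus records
    print(norm_defect(v, w, Av, Aw))        # 0.0: unitarity allows the transfer

    w2 = np.array([1.0, 1.0]) / np.sqrt(2)  # overlapping system states
    print(norm_defect(v, w2, Av, Aw))       # ~0.68: unitarity would be violated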

Von Neumann shows that quantum mechanical measurement creates entropy. This may seem counterintuitive: the annihilation of quantum states implicit in the measurement process leads to the selection of a real state, the outcome of the measurement. We return to this below in a discussion of evolution. John von Neumann (2014): (ref link above)

Back to toc

15: Gravitation

Gravitation and quantum theory are both primordial and they are also distinct, so we would like to see them as two sides of a zero-sum bifurcation. How would this work? Quantum theory and gravitation are both closely related to energy. Energy, via phase and superposition, is the source of structure in quantum processes. Energy is also both the source and the subject of gravitation, which couples energy to energy, and determines the overall structure of the universe.

The quantum mechanical explanation for the differentiation of fermions is that, because they have half-integral spin, the probability of the superposition of two identical fermion states is zero. If we assume that Hilbert space is primordial and use the principle of simplicity to argue that in the beginning there are only two opposite phases of recursive function at any energy / frequency (a form of digitization), we can account for the primordial existence of spacetime, bosons and fermions. But where does gravitation come in?

The answer that appeals most to me is that the general theory is a measure of the 'ignorance' of the initial singularity. This is because all information and communication is digital, as shown by Shannon's theory of communication, Turing's theory of computation and Gödel's theory of logic discussed above (§§8, 6 & 7). Therefore, insofar as gravitation is described by the continuous functions of differential geometry, it carries no information and is consistent with the primordial simplicity of the universe. It is, in other words, a perfect description of a system of zero entropy which is subsequently populated by quantum theory, which describes the quantum of action underlying all communication and the magnificent complexity of the universe that has grown within the gravitational shell. This ignorance is implicit in the notion of general covariance, that all Gaussian coordinate systems are equivalent for the description of physical 'law', which is the foundation of Einstein's work:

The following statement corresponds to the fundamental idea of the general principle of relativity: "All Gaussian coordinate systems are essentially equivalent for the formulation of the general laws of nature." Einstein (2005): Relativity: The Special and General Theory, page 123

Gaussian coordinates do not provide a metric, which is why Einstein's equation does not provide a measure of the size of the universe, and therefore applies from the initial singularity to the Universe at any subsequent stage of its expansion. Einstein's equation provides a transformation, analogous to the Lorentz transformation, from coordinates supplied by the observer to an estimate of the coordinates of the object being observed. The details of these coordinates must be provided by the observer, and apply to all observers and all objects to be observed. This is the point of general covariance. Gaussian curvature - Wikipedia

The details are provided by quantum mechanics, which explains the creation of both observers and the objects to be observed. Initially we are dealing only with fundamental particles. Later, gravitation causes many of these particles to coalesce into stars, whose high temperatures result from the conversion of gravitational potential into the kinetic energy that drives the high energy quantum mechanical processes leading to the creation of more complex particles. These particles are redistributed when some such stars explode and scatter their material through space. This material may be collected once more by gravitation into another generation of stars and planets, which eventually sustain the evolution of astronomers and other thoughtful particles interested in the structure of the Universe.

We take the fundamental equation of quantum mechanics, E = ℏω, to be primordial, in that the action of action is to act and energy is repeated action. This, combined with the no-cloning theorem, establishes each of these actions as equivalent to the universal logical NAND gate. At this point quantum theory sees no distinction between potential and kinetic energy. Only energy differences feature in quantum mechanics, and we can set the zero of energy wherever we wish.

In section 12 we introduced space as the dual manifestation of time transformed by the speed of light. Now we introduce momentum as the dual manifestation of energy, transformed once again by the velocity of light. For the massless photon, the transformer between time and space, energy is identical to momentum. For massive particles, the same idea is expressed by Einstein's equation E = mc2. Although mass and energy are numerically equivalent, they are conceptually quite different.


The most obvious feature of gravitation is the gravitational potential that holds us on the planet and kills us when we fall from significant heights. We may look upon potential energy as kinetic energy travelling in the form of a massless particle at the speed of light. This is the nominal speed for gauge particles like photons, gravitons (if they exist) and gluons. We may understand the potential energy of massive particles as arising from their internal motions at light speed, so that their interior world comprises null geodesics, which account for their apparent zero size in spacetime. This seems consistent with Wilczek's idea proposed above that the mass of baryons is produced by the kinetic motions of their internal components, which are generally much lighter than the baryons themselves. Mass, we might say, is energy trapped in a null geodesic. Potential energy - Wikipedia, Wilczek (2002) op. cit. chapter 10 sqq.

We can understand the 3D component of Minkowski space by thinking of the differences between wired and wireless communication in practical networks. Wireless communication is serial (one dimensional) and channels are distinguished by frequency or energy, as we find in quantum mechanics. Wired networks, on the other hand, need three dimensions so that wires need not intersect in the same plane. We may consider the case of moving fermions by analogy with city traffic. In a two dimensional road system, the time division multiplexing introduced by traffic lights enables traffic streams to cross. Three dimensional structures like overpasses and tunnels enable uninterrupted two dimensional traffic flow, and separation of air traffic in three dimensional space is established by requiring vehicles travelling in different directions to operate at different altitudes. If we understand the emergence of new features in the universe as a matter of random variation and controlled selection, we may see that 3D space is adequate for complete wired connection, so spaces with 4 or more dimensions have no evolutionary raison d'être and may be selected out.

Wired networks are therefore more like plumbing or electrical power networks. Tuning is not required to discriminate sources, but switching may be necessary for one source to connect to many others. A wired network transmitting pure unmodulated power shares three properties with gravitation: it exists in four dimensions, three of space and one of time; it can deal indiscriminately with energy in any form; and the direction of motion of signals is determined by potentials.

From an abstract point of view a fixed point is the dual of the compact and convex nature of a set, and we can expect to find a fixed point corresponding to every continuous mapping of such a set onto itself: fixed point theory is symmetrical with respect to the set of all continuous functions. In the case of quantum mechanics these fixed points are the solutions to the eigenvalue equation. Their existence is guaranteed by the Hermitian nature of the operators representing observables in quantum mechanics. Agarwal, Meehan & O'Regan (2009): Fixed Point Theory and Applications

An important feature of the network model is that it is symmetric with respect to complexity. The atom of a network is two sources and a channel, which we may think of quantum mechanically as two fermions and a boson. Sources and connections can exist at any scale. Communications between discrete sources become possible when they share a language or codec, that is, a symmetry. Since gravitation is the universal codec which couples all sources without discrimination so long as they have energy, we can imagine that it emerges in about the same epoch of the evolution of the universe as quantum mechanics. Unlike quantum mechanics, however, where connection is established by specific values of energy which we compute using the eigenfunctions of specific operators, gravitational connections are indiscriminate. This suggests that they represent a symmetry deeper and simpler than quantum mechanics which reflects the consistent unity of the initial singularity. We would expect gravitation to respond to all the energy levels present in a vacuum, for instance, which is why the cosmological constant problem is so troubling in a universe with infinite degrees of freedom, each with an attached zero point energy.

Consequently we might conjecture that gravitation is not quantized. In §11 above we used Shannon's communication theory to connect error free communication with quantization. If gravitation is a universal code which cannot go wrong, there is no ground for quantization. Nevertheless gravitation imposes a precise structure on the universe.

The classical general theory of relativity predicts classical gravitational waves which have now been observed, giving us information about large cosmic events occurring at great distances. Great distance and the overall weakness of gravity mean that these waves require very large high precision interferometers for their detection. Gravitational-wave observatory - Wikipedia

In short, perhaps we may see gravitation as a hybrid of classical and quantum reality dating from before they became differentiated. Our cities are festooned with power cables and telephone lines and surrounded by clouds of wireless. Our bodies, on the other hand, are largely wired. But in all cases communication requires contact, and a symmetry that unites Hilbert and Minkowski space.

Given this picture, we might understand that energy attracts energy because it is all the same energy, created by action and subsequently mapped into potential energy equal and opposite to the kinetic energy from which it came. Lagrangian mechanics - Wikipedia (ref link above)

We might imagine that the coupling between the Hilbert and Minkowski spaces which we ascribe to observation describes the inner life of particles, i.e. it is a story of consciousness, in the same way as my conscious awareness is related to my physical actions. So I imagine that quantum computations in Hilbert space are to be found inside particles as my consciousness is to be found inside me. What I hope is that a clear distinction between Hilbert and Minkowski removes many of the difficulties in quantum field theory that arise from ignoring this distinction. So we think that every particle is a computer and the relationships between particles are networks.

This, and the idea that gravitation is not quantized, suggests that gravitation must be some primordial quality of the interior space of the initial singularity described by the classical general theory of relativity. This would be consistent with the idea that it is present from the beginning and guides the growth of the universe as a whole. As Einstein noted when he published his field equation for gravitation, the general theory of relativity is a closed logical structure and there is little room for an alternative. From this point of view, we might see the interior of the initial singularity as a continuous Lie group fulfilling the hypotheses of fixed point theory. General relativity - Wikipedia, Abraham Pais (1982): 'Subtle is the Lord...': The Science and Life of Albert Einstein page 256, Lie Group - Wikipedia

We have built a Hilbert space inside the initial singularity through its propensity to act, our starting point being mathematical fixed point theory. The topological barrier constraining the universe is the boundary between consistency inside and inconsistency outside.

In section 9 I proposed replacing the classical initial singularity derived from Einstein's theory of relativity with a quantum source which produces an unlimited random sequence of discrete quanta of action. This process may be understood both as the construction of a Hilbert space within the singularity and as the creation of energy. Energy is both the foundation of quantum mechanics and the material from which, through quantum mechanics, everything is constructed (section 10).

We study the quantum world by spectroscopy, stimulating it with means that range from bunsen burners to the Large Hadron Collider and measuring the species and energy of the particles that are emitted. Although physicists and machines in the real physical world are both the source of this stimulation and the observers, we know that what we are provoking is the interaction of invisible quantum states with one another, the process we call measurement. We assume that the world goes its own way in the absence of human observers, so that there is continual interaction in the Hilbert domain yielding events in the Minkowski domain which are separated by real spacetime distances which may be zero.

In the complex modern world we have considerable control over the measurement operators we use, but the outcomes of our observations nevertheless remain uncertain (section 14). In the primordial system both the measurement operators and the states measured are predominantly random. Nevertheless the theory shows that there will be a spectrum of eigenvectors and associated eigenvalues, yielding a spectrum of real results with probabilities predicted by the absolute square |ψ|² of the complex amplitude resulting from a computation in the Hilbert domain of the inner product of the interacting states.

We can imagine that as the number of states represented in the Hilbert space created within the singularity grows, the number of interactions will increase, so that the size of the space created and the spectrum of particles occupying that space will also increase, as described in sections 10 and 12.

We imagine that this space is locally Minkowski and the particles existing in it are a random mixture of massless bosons following null geodesics and massive fermions guided by their interactions with one another mediated by bosons.

We might guess that two features of gravitation arise from this situation.

Current approaches to gravitation see the stress-energy tensor as the source of gravitons, which are believed to account for gravitation. An alternative view is that gravitation is not quantized, since it is represented by the continuous mathematics of a differential manifold and carries no differentiated signal. The only observable features of gravitation are its 4 dimensions, free fall and geodesic deviation. The geodesic deviation observable between particles in free fall is the source of all our knowledge of gravitation.

Why does Minkowski spacetime have 4 orthogonal dimensions, that is four degrees of freedom? Does this property have anything to do with the creation of orthogonal dimensions in Hilbert space discussed above? This seems improbable if Hilbert space is independent of Minkowski space.

It seems more probable that 4 dimensions are a necessary prerequisite to the motion and communication of physical particles. Time is inherently connected with motion and energy. We can understand that three spatial dimensions are necessary and sufficient for the establishment of interference free point to point communication by considering the problems faced by the designers of electrical circuits. One dimension enables only serial connection on a single line. Two dimensions enable us to make direct connections between three points without "crossed wires". In three dimensions we can make direct connections between any number of points. Going to four dimensions introduces unnecessary redundancy, so one might expect that, if space-time is the outcome of an evolutionary process, three spatial dimensions are necessary and sufficient for universal connection.
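This wiring argument can be made concrete with graph theory (a sketch assuming the networkx library): the complete graph on five nodes cannot be drawn in the plane without crossings, while in three dimensions any finite graph can be wired crossing-free:

    import networkx as nx

    # K5: five sources, each wired directly to every other source.
    k5 = nx.complete_graph(5)
    print(nx.check_planarity(k5)[0])        # False: in a plane the wires must cross

    # K4 is the largest complete graph that can still be drawn flat.
    print(nx.check_planarity(nx.complete_graph(4))[0])   # True

    # In three dimensions every finite graph can be embedded without crossings,
    # so 3D suffices for complete point to point connection.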

Einstein's gravitation in 4 dimensional spacetime is modelled as a continuous differentiable manifold, an instance of a continuous Lie group. His happiest thought, the starting point for this theory, was the realization that a person in free fall would not feel their own weight. They would be moving inertially in a Minkowski space, unaware that they were being imperceptibly accelerated by gravitation. They are moving on a geodesic, along which the spacetime geometry is locally Minkowski. They could learn that gravitation is present by observing other bodies in free fall and noting that although these too are in inertial motion, they are nevertheless accelerating relative to one another. So the Moon is freely falling in the space surrounding Earth and appears from Earth to be accelerating. Newton computed the orbit of the Moon on the assumption that the centrifugal force arising from its curved orbit is exactly balanced by the gravitational attraction between Earth and Moon. Lie Group - Wikipedia

Maybe we can say that inertial motion is nothing: 4 dimensions are required for point to point communication, and geodesic deviation is a consequence of the universe being logically closed. We wish to find these features already present in the initial singularity. An argument for this possibility is that the general theory allows for collapse to nothing, so, since we are dealing with a deterministic mathematical structure, it must be reversible, which suggests that the gravitational structure carries no entropy at all: it is nothing.

From Einstein's point of view, both these forces are fictitious, and we seek insight into this situation by considering the possibility of a particle orbiting inside the initial singularity after the emergence of spacetime. We begin with the observation that the "interior" of the initial singularity is all that exists, insofar as to be "outside" the initial singularity is to be in a region of logical inconsistency which cannot exist. Inside is a Hilbert space of orthogonal rays, differentiated, like angels in the Christian heaven, by the fact that they are different species or states. Aquinas, Summa I, 50, 4: Is every angel a different species?


Gravitation appears to be a geometric feature of Minkowski space, the shell or backbone of the Universe. Einstein's mollusc refers to the soft interior of a differentiable manifold, but this mollusc may also have a hard shell that gives overall structure to the animal and grows as the animal grows. The enormous energies we see in cosmic events may be fixed points induced by the gravitational shell. Einstein, Lawson & Penrose (1916, 2005): Relativity: The Special and General Theory

Zurek shows that the selection process for systems to move from imaginary to real is the condition that information can be transferred from Hilbert space to Minkowski space. Now we have the creation of spacetime by the random interactions of vectors and operators, and we imagine that two classes of particles are formed, bosons and fermions. We have two constraints: the system is closed, and therefore the Minkowski space is curved; and the fermion network requires interference free communication, so we require three real dimensions. This, plus the continuity of spacetime, takes care of gravitation. Next come the questions of the zero energy universe, potential and kinetic energy, the Lagrangian and the quantum of action; then zero charge, positive and negative, magnetism and the vector potential, a final brief summary of QED, and then quantum chromodynamics: the mad explorer, crashing through the bush, hoping to discover something worthwhile and get through. Wojciech Hubert Zurek (2008): Quantum origin of quantum jumps: breaking of unitary symmetry induced by information transfer and the transition from quantum to classical

Back to toc

16: Quantum amplitudes and logical processes are invisible

The interiors of fundamental particles are invisible, although in the case of the baryons we can see inside even though we cannot isolate the quarks and gluons. We see these invisible features as the Turing machines that give the particles their properties, i.e. that mediate their output given a certain input. They are invisible because they are fully taken up with maintaining the integrity of the particle of which they are part.

Veltman notes (page 113) that 'unfortunately it was found in the 1930s that the higher order corrections in the series for e and m are all infinite due to integrations over momentum that diverge in the large momentum (or small distance) limit' [maybe an artefact of the fact that the relevant distance is not the distance in Minkowski space but the "distance" in Hilbert space where the action is], and that '. . . the natural scale of QED is the Compton wavelength of the electron, 10⁻¹¹' [or zero if we accept the idea (page 113) that the energy of an electron is infinite, part of the overall problem arising from using the real numbers to describe the quantized world].

What we are thinking is that every additional quantum of action adds a dimension, that is, an oriented complex plane, to the Hilbert space, and we would like to correlate these dimensions with Turing machines of increasing complexity, their superpositions feeding into more and more complex features in Minkowski space.


The machinery of quantum transformation through time is represented by the energy (or Hamiltonian) operator. This operator creates a time sequence of mappings of a Hilbert space onto itself. These mappings fulfill the hypotheses of mathematical fixed point theorems like that found by Brouwer: any continuous function f mapping a compact convex set into itself has a point x0 such that f(x0) = x0. The results of quantum observations may be understood as the fixed points predicted both by this theorem in general and by the more specific theorems of quantum mechanics like the eigenvalue equation. Brouwer fixed point theorem - Wikipedia
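To make the idea concrete, here is a minimal sketch in Python (the map cos(x) on [0, 1] is my own toy example, not part of the physics): iterating a suitable map of a compact set into itself settles onto a point that the motion leaves still.

```python
# A minimal sketch of a fixed point in the spirit of Brouwer's theorem.
# The map f(x) = cos(x) sends the compact interval [0, 1] into itself,
# so it must have a point x0 with f(x0) = x0; simple iteration finds it.
import math

def find_fixed_point(f, x=0.5, tol=1e-12, max_iter=1000):
    """Iterate x -> f(x) until the point stops moving."""
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx - x) < tol:
            return fx
        x = fx
    raise RuntimeError("no fixed point found within max_iter")

x0 = find_fixed_point(math.cos)
print(x0, math.cos(x0))  # both ~0.7390851332, the point that does not move
```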

The evolution of quantum wave functions is invisible. We see only the fixed points revealed by "measurement". We presume here that measurement is not something specific to physicists, but that elements of the Universe represented in Hilbert space continually interact with one another, that is measure one another, and so communicate with one another through the exchange of particles. The creation and annihilation of particles is a reflection of the evolution of wave functions and also controls this evolution, so that we consider the two processes to be duals of one another, carrying the same information in different forms across the boundary between the quantum and classical worlds.

A similar level of uncertainty exists at all other scales, greater and smaller than the human individual. One of the most surprising discoveries of twentieth century physics is the uncertainty principle, which holds at the quantum mechanical level in the physical network, our most fundamental theory of the Universe. The Born Rule is a quantum mechanical representation of this uncertainty. Born rule - Wikipedia (link ref above)

Until the advent of quantum mechanics, physicists were generally inclined to believe that the world was deterministic. They still attribute determinism to the invisible process that underlies quantum observations, but they now have to accept that even though this process may be deterministic, it does not determine the actual outcome of events, but rather the relative frequencies of the spectrum of outcomes that may result from a particular event. Laplace's demon - Wikipedia

Invisibility can arise from three sources. The first is limited resolution. When we measure the diagonal of a unit square with a relatively precise instrument like a micrometer, we might see that it is 1.4142 units, an approximation to √2. With a ruler graduated in units, on the other hand, the best we can say is that the length lies somewhere between 1 and 2. What we can and cannot see depends on the instrument we use. Measurement uncertainty - Wikipedia
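A toy illustration of this point (my own, in Python): we can "measure" the diagonal of a unit square by rounding √2 to the nearest graduation of the instrument in hand.

```python
# Toy resolution demo: the same length "measured" by instruments with
# coarser and finer graduations, modelled as rounding to the nearest mark.
import math

diagonal = math.sqrt(2)          # the true, irrational length
for graduation in (1, 0.1, 0.01, 0.0001):
    measured = round(diagonal / graduation) * graduation
    print(f"graduation {graduation}: measured {measured:g}")
```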

We know that the physical Universe is graduated or pixellated in units of Planck's constant, the measure of the smallest action, the minimum possible event. This is the limit of the resolution at which we can study action in the world; below this limit details are invisible, and the prediction of the nature and occurrence of particular actions is uncertain. Planck constant - Wikipedia

What is not uncertain is the exact nature of the action, because it is coupled to Planck's constant. When an electron moves from one orbital to another in an atom it emits or absorbs a photon with one quantum of angular momentum, and the electron involved changes its orbital angular momentum by one unit also, in the opposite direction, because angular momentum, or action, is conserved. Using quantum electrodynamics, we can sometimes compute the energy change associated with this transition to many decimal places, and there is no reason to suspect that it is not an exact constant of nature. It may be that in nature values like the Planck constant are implemented with unlimited precision. This is why we can construct atomic clocks accurate to one second in the age of the universe. Photon - Wikipedia, W. F. McGrew et al: Atomic clock performance enabling geodesy below the centimetre level (link ref above)

The second source of invisibility is that what we are looking at must cooperate in the process of being seen. The evolution of the wave function is invisible because it can only be seen if it is communicated, and the communication process also requires computation to encode and decode the message. If a computer were to stop to explain itself after every operation, it would be required to explain the operations of explaining itself and so would not be able to make any progress on the original computation. For this reason we can only see the results of halted processes.

There is, therefore, a logical limit to our knowledge of the world which is implicit in the scientific method of hypothesis and testing. We make hypotheses about what is happening in regions where we cannot see, and test them by exploring their consequences to see if they correctly predict phenomena that we can see. A successful test does not necessarily guarantee a correct hypothesis but a failed test tells us that the hypothesis must be revised.

The fixed points of a quantum dynamic system are revealed by the eigenvalue equation: MΨ = mΨ, where m is an eigenvalue corresponding to an eigenvector Ψ of the operator M, often called the measurement operator. The measurement operator models the extraction of information from the quantum system measured. Eigenvalues and eigenvectors - Wikipedia, Jim Branson: Eigenvalue Equations
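A minimal numerical sketch of this equation (the operator below is an invented 2 × 2 Hermitian matrix, chosen only for illustration): the eigenvectors are the directions M leaves fixed, and the eigenvalues are the possible observed values.

```python
# The eigenvalue equation M psi = m psi, checked numerically with NumPy.
import numpy as np

M = np.array([[2.0, 1.0 - 1.0j],
              [1.0 + 1.0j, 3.0]])   # Hermitian: equal to its conjugate transpose

eigenvalues, eigenvectors = np.linalg.eigh(M)
for m, psi in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(M @ psi, m * psi)   # psi is a fixed direction of M
    print(f"eigenvalue {m:.4f}, eigenvector {psi}")
```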

It took physicists nearly thirty years, from 1900 to the late 1920s, to bring quantum mechanics to its definitive form. An important step forward was made by Heisenberg who pointed out that our only task is to explain the observable phenomena. We need not be bound by the classical historical picture of the world but are free to explore all possibilities to find satisfactory explanations. Werner Heisenberg: Quantum-theoretical re-interpretation of kinematic and mechanical relations

Why can't we see the mechanism that yields these results? Here we are proposing that the Universe is digital 'to the core'. We understand this by analogy with computer networks like the internet, and there we find an explanation for the invisibility of process and the visibility of the results of processes. We assume that the observable fixed points in the Universe are the states of halted computers and that the invisible dynamics of the Universe are executed by invisible computers. We suspect the presence of deterministic digital computers because of the precision with which nature determines the eigenvalues of various observations. Bastin & Kilmister (1995): Combinatorial Physics

The third source of invisibility is symmetry. A snowflake is symmetrical, with six identical 'arms'. Because they are identical we cannot tell which is which. If we look away and someone turns the snowflake around, we have no way of telling how far it was turned, or whether it was turned at all.

Traditional theology holds that God is completely mysterious to us and beyond our ken.

Having established the existence of something, the next question is how it exists, in order that we may learn its nature. But because we are unable to know the nature of God, but only what God is not, we are not able to study how God exists, but rather how God does not exist. . . .

We can show how God does not exist by removing from him inappropriate features such as composition, motion and other similar things. . . . Thomas Aquinas, Summa, I, 3: Introduction, Thomas Aquinas, Summa I, 3 Proemium (Latin)

This is the famous via negativa.

Symmetries are situations where nothing observable happens. They are the practical boundaries of the dynamic Universe. We may picture this to a degree by imagining the string of a piano or guitar. When struck, the string vibrates at every point except at the ends, which are held still by the structure of the instrument, and the nodes, which are fixed by the symmetrical motion of the overtones. Symmetry - Wikipedia
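A toy calculation (my own, not from the cited article) makes the picture explicit, listing the motionless points of the first few modes of a unit string:

```python
# Nodes of a vibrating string of length L = 1: the nth mode sin(n*pi*x)
# is zero at x = k/n, so these are the points of the motion that never move.
L = 1.0
for n in (1, 2, 3, 4):                      # fundamental and first overtones
    nodes = [k * L / n for k in range(n + 1)]
    print(f"mode {n}: nodes at {nodes}")
```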

When we consider the Universe as divine, we can imagine the symmetries discovered by physics as the boundaries of the divinity. From a logical point of view, the dynamics of the Universe is consistent. The boundaries of the dynamics are the points beyond which it would become inconsistent, that is, non-existent.

All our experience is experience of God, and all our experiences are in effect measurements of God, that is events that we see as fixed points in the divine dynamics. We can learn a lot more about the natural God than the Christians can learn about their god. The natural God is only partially visible, but since we are continually in contact with it, we have a good chance of learning how it works. We can know nothing about the completely invisible Christian God but what the Christian Churches choose to tell us. Nevertheless true knowledge of God is necessary for survival.

Back to toc

17: Energy bifurcates into potential and kinetic, enabling a zero energy universe. E = mc²

Back to toc

18: Some models: transfinite classical and imaginary computer networks

Cantor's theory of transfinite numbers may be used to describe a transfinite computer network. The key to this description is the fact that the cardinal of the set of Turing machines is the same as the cardinal of the set of natural numbers, so that we can establish a one-to-one correspondence between the two sets. Here we concentrate on the development of the second transfinite cardinal ℵ1 from the first, ℵ0. Since, as Cantor explained, the generation of the transfinite cardinals is a recursive process driven by a "unitary law", this first step is easily extended to the generation of all subsequent transfinite numbers.
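As a concrete illustration of this kind of one-to-one correspondence (my own toy sketch, not Cantor's construction), the Cantor pairing function maps pairs of natural numbers onto single natural numbers and back, the same trick that lets us index the countable set of Turing machines by integers:

```python
# Cantor's pairing function: a bijection between N x N and N, the sort of
# correspondence behind "the set of Turing machines (each described by a
# finite string, hence a number) has the cardinality of the naturals".
def pair(x: int, y: int) -> int:
    return (x + y) * (x + y + 1) // 2 + y

def unpair(z: int) -> tuple[int, int]:
    # Invert the pairing by first recovering the diagonal w = x + y.
    w = int(((8 * z + 1) ** 0.5 - 1) // 2)
    y = z - w * (w + 1) // 2
    return w - y, y

for x in range(20):                 # round-trip check over a small grid
    for y in range(20):
        assert unpair(pair(x, y)) == (x, y)
print(pair(3, 4), unpair(32))       # 32 (3, 4)
```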

To each sequence of natural numbers in the set of permutations there corresponds a sequence of computers. Aristotle studied space and provided a definition of continuity in his Physics which supposes that two lengths are continuous if their ends overlap (network property b above). This definition is reminiscent of Aristotle’s syllogistic logic in which the premises leading to a conclusion overlap by a middle term. This suggests the name logical continuity.

Here I use the concept of logical continuity to apply Cantor's insights to the structure of the universe, using a transfinite computer network as a backbone for the idea that the universe is a divine mind.


Each of the systems outlined above contains an invisible isolated process: Aristotle’s god enjoys the pleasure of thinking about itself [1072b15] while remaining completely independent of the world; to communicate with an inertial system would be to exert a force upon it so that it was no longer inertial; the initial singularity is isolated since it contains the Universe outside which there is, by definition, nothing; according to the theory, isolated quantum systems cannot be observed without changing them; one cannot observe the internal state of a Turing machine without breaking into its process and so destroying its determinism.

Here we propose that the motions of these isolated systems are equivalent to Turing machines and therefore isomorphic to one another, despite their differences in definition and their historical cultural roles. This proposition is based on the premise that the motion of a universal Turing machine embraces all computable functions, that is, all classically observable transformations.


Back to toc

19: Space-time is the operating system of the universe

Back to toc

20: Dynamics, fixed points and particles

In modern physics the deepest layers are understood to be one or more vacua, which are often explained as seething seas of energy perpetually creating and annihilating real and virtual fundamental particles. These vacua are the source of a serious problem. Calculations using the standard theory suggest that their energy density should be approximately one hundred orders of magnitude greater than that observed. No other physical theory has ever been so wrong! It might be claimed that the vacua are only virtual, but we also believe that gravitation sees energy in all its forms, so the calculated energy density of the vacuum would make the universe very different from the place we inhabit. Cosmological constant problem - Wikipedia

Frank Wilczek, one of the developers of quantum chromodynamics, has popularized his views in a book trying to explain what is going on in the depths of the universe. He proposes a new version of the classical aether (now called a condensate), which many thought to have been slain by special relativity. His story seems plausible until we come to page 109, where he lists a few numbers suggesting that the condensate is denser than we actually measure by factors ranging from 10⁴⁴ to infinity. If the universe is to be divine, physics and theology must be mutually consistent. On the one hand Christian theology, with its angry and murderous God, serpents, demons and sin, is quite incredible, and on the other we have the equally absurd dreams of highly respected physicists like Wilczek. It is clear that some revision is necessary in both fields. Standard model - Wikipedia, Frank Wilczek (2004): Nobel Lecture: Asymptotic Freedom: from Paradox to Paradigm, Wilczek (2008): The Lightness of Being: Mass, Ether, and the Unification of Forces

It may be that the approach outlined above sheds some light on the cosmological constant problem since the creation of new states is constrained by the no-cloning requirement that they all be distinct in a system that as yet does not allow the spatial distinction that we associate with fermions.

The culprit seems to be the zero point energy E = ½ℏω at very high frequencies ω. Wilczek explains: 'This so called zero point motion is a consequence of the uncertainty principle.' Obviously, given the cosmological constant problem, zero point energy as understood by standard theory needs correction. This revision might be consistent with the independence of Hilbert and Minkowski space discussed above. How does the uncertainty principle relate to Hilbert space? Frank Wilczek (1999): Quantum Field Theory (page 3)

In spacetime we have the uncertainty relations ΔE.Δt ≥ ℏ/2 and Δx.Δp ≥ ℏ/2, which relate the non-commuting pairs energy-time and momentum-space. It might be argued that the momentum-space uncertainty relation would be violated by a massive particle sitting motionless at the bottom of a potential well, its momentum and spatial uncertainty thus both being zero. If we are to honour the uncertainty relation, this situation is impossible. This relation may be used, for instance, to explain why an electron does not fall to the bottom of the potential well created by the proton in a hydrogen atom. We assume that the spatial relationship between the electron and the proton minimizes the sum of the potential and kinetic energy of the electron. Quantum mechanics explains that an electron confined close to a nucleus would necessarily have a large kinetic energy, so that the minimum total energy (kinetic plus potential) actually occurs at some positive separation. In this spacetime case, where energy is coupled to momentum, zero-point energy is essential for atomic stability. Zero-point energy - Wikipedia
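A back-of-the-envelope check of this minimization (my own sketch, using the standard estimate p ≈ ℏ/r, so that E(r) ≈ ℏ²/2mr² - e²/4πε₀r):

```python
# Toy estimate of the hydrogen ground state: kinetic energy rises as the
# electron is confined (p ~ hbar/r), potential energy falls, and the total
# is minimized at a positive separation, the Bohr radius.
import numpy as np

hbar = 1.054571817e-34       # J s
m_e = 9.1093837015e-31       # electron mass, kg
e = 1.602176634e-19          # elementary charge, C
k = 8.9875517923e9           # Coulomb constant 1/(4 pi eps0), N m^2 / C^2

r = np.linspace(1e-12, 5e-10, 100_000)          # trial separations, m
E = hbar**2 / (2 * m_e * r**2) - k * e**2 / r   # kinetic + potential

i = np.argmin(E)
print(f"minimum near r = {r[i]:.3e} m")     # ~5.29e-11 m, the Bohr radius
print(f"energy there  = {E[i] / e:.2f} eV") # ~ -13.6 eV, the ground state
```

The minimum falls at the Bohr radius with energy about -13.6 eV, the observed ground state of hydrogen.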

The quantum of action is a very precisely defined natural constant which has recently been co-opted to place our system of physical measurements on a completely natural footing: we need no longer depend on a metallic object to define the kilogram, the unit of mass. In Hilbert space, the quantum of action is a measure of the distance between orthogonal base states whose inner product is zero. The transition from one state to another requires one quantum of action. It is, in effect, the opposite of the continuum. A continuum is a state where nothing happens, the foundation of symmetry. A quantum of action, as defined above, changes some p into some not-p; it is the generator of orthogonality and distinction (§8).

The current universe within the singularity emerges as a series of differentiations, the first being the differentiation of action into energy and time, bringing us to the fundamental equation of quantum theory, E = ℏω, where ω is angular frequency, a measure of inverse time. We follow this process up to the emergence of spacetime, guided by the set of principles listed at the beginning of this essay.

We have noted that the action of action is to create action, a process analogous to the traditional procession of God the Son from God the Father. The principle of requisite variety (§3.12 above) suggests that because of its absolute simplicity the initial singularity has no control over its action, that is, the procession of action is a random event insofar as the interval between events is unconstrained, although the outcome of each event, a new quantum of action, is precisely defined.

This is a general feature of quantum mechanics: the eigenvalue equation defines quantum states precisely, while the Born rule provides only a statistical measure of their occurrence. From a statistical point of view, the frequency of these random events is a measure of energy. We may imagine that the local interval (Δt) between events is an inverse measure of local energy (ΔE), consistent with the "uncertainty" relation ΔE.Δt ≈ ℏ.

What are the fixed points in the initial singularity? The quantum theory of measurement provides a clue. A "measurement" is in fact a contact between two quantum systems, one the measurer, the other the measured. Even though some understand the act of measurement as the contact between a classical system (or a physicist) and a quantum system, here we understand that all systems in the universe are quantum systems.

Quantum measurement involves two equations. The first is the eigenvalue equation, which predicts that a measurement will yield one of the spectrum of possibilities corresponding to the fixed eigenvalues of the eigenvectors of the measurement operator.

The second is the Born rule which predicts the probability of each of these results (§10). In a laboratory situation experimentalists may take some care to devise measurement systems that provide the answers that they want, but these answers are still subject to probabilistic distribution. In the case of the initial singularity whose simplicity precludes control (§3, principle 12) we imagine that the system itself solves the eigenvalue equation by the random meeting of two states that share an eigenvector. This event may be exceedingly rare, but the theory suggests that the result will be a physically observable particle in a certain state.

We have here a foundation for the evolution of the "particle zoo" which has revealed itself to the physics community over the last century or so. Particles are selected by the "mating" of randomly generated quantum states that happen to share stationary points, an amplitude φ such that |φ|² yields the probability of the emergence of an observable particle. Particle Data Group - U of California
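A minimal sketch of the two equations working together (state and operator invented for illustration): we expand a random state in the eigenbasis of a "measurement" operator and sample outcomes with the Born rule probabilities |φ|².

```python
# Toy "measurement": the eigenvalue equation fixes the possible outcomes,
# the Born rule fixes their relative frequencies.
import numpy as np

rng = np.random.default_rng(0)

M = np.array([[1.0, 0.5],
              [0.5, -1.0]])                 # an invented Hermitian operator
eigenvalues, eigenvectors = np.linalg.eigh(M)

psi = rng.normal(size=2) + 1j * rng.normal(size=2)
psi /= np.linalg.norm(psi)                  # a random normalized state

amplitudes = eigenvectors.conj().T @ psi    # phi_i = <e_i | psi>
probabilities = np.abs(amplitudes) ** 2     # Born rule: |phi|^2

outcomes = rng.choice(eigenvalues, size=10_000, p=probabilities)
for m, p in zip(eigenvalues, probabilities):
    observed = np.mean(np.isclose(outcomes, m))
    print(f"outcome {m:+.3f}: predicted {p:.3f}, observed {observed:.3f}")
```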

Quantum theory envisages a half-formed and unconstrained reality in perpetual motion which, like the primordial quantum of action, provides unlimited variety by contact and superposition, a process analogous to Cantor's generation of transfinite numbers. Selection occurs by 'measurement', when quantum states meet, communicate and define one another, analogous to a roulette wheel stopping and the ball falling into a stationary slot. Roulette - Wikipedia

Back to toc

21: Quantum electrodynamics: QED

Back to toc

22: Quantum chromodynamics: QCD

Back to toc

23: Methodology: the mathematical community

Because the network model is symmetrical with respect to complexity we can use it to move back and forth from the featureless divine initial singularity to the enormously complex world of daily experience. From an abstract point of view a community is a communication network built from a threesome like the Trinity, the Father, the Son and the Spirit who binds them. We are enmeshed in a web of networks beginning with our relationships to ourselves and expanding through individual personal relationships to the huge array of long distance networks established by travel, postal services and telecommunications.

Following Hilbert, we understand mathematics to be a formal symbolic game whose only constraints are that it is interesting and consistent. The static formal mathematical relationships from which such models are constructed are brought to life in the minds of the mathematical community. The history of this community is documented since about 5000 years ago, and we can see its footprints in older artefacts. George Gheverghese Joseph (2010): The Crest of the Peacock: Non-European Roots of Mathematics

We imagine that the creative force that drives mathematics to grow is the same as the creative force that drives the universe to create itself. Mathematics is cumulative, each new discovery or invention building on what is already known. Its formal structures, if correctly proven, do not decay and need no maintenance, apart from copying texts that deteriorate and the birth and education of new members of the mathematical community to carry on the work. Of course much may have been lost in times of disaster.

The source of this force is entropy, which we may see as the attraction of the future embodied in imagination. In a nutshell, wide open spaces are attractive. We are attracted to the future because of the second law of thermodynamics: entropy (that is complexity) generally increases. Georg Cantor's theory of transfinite numbers shows us how big a wide open space can become. Transfinite numbers - Wikipedia

Throughout recorded history, mathematics has been an important tool not only for understanding our world but also for dealing with social issues like fair trading, distributive justice and the management of democracy. It begins with counting and arithmetic, accounting for discrete classes of objects like coins and sheep. It extends to the measurement of continuous quantities like land and fabric, which inspired geometry. The relationship between physics and mathematics was sealed in Galileo's time when he claimed that mathematics was the language of the universe. Newton took a great step forward by inventing calculus to describe the motion of the Solar System. Gauss and Riemann extended calculus to define differentiable manifolds, which became the mathematical foundation upon which Einstein built the general theory of relativity and opened the way to a picture of the universe as a whole. Differentiable manifold - Wikipedia

Calculus has since become the backbone of mathematical physics. Weyl notes that it provides an exceedingly fruitful mathematical device of making problems "linear" by reverting to infinitely small quantities. If we take quantum theory seriously, the smallest practical quantity in the universe is the quantum of action. A recurrent theme in this essay is the idea that the use of continuous mathematics in quantum field theory could be the source of many of its problems. Hermann Weyl (1985): Space Time Matter

Newtonian physics ruled the world until the middle of the nineteenth century when electricity and magnetism opened up a new field of study. Maxwell's application of calculus showed that light is a form of electromagnetic radiation. About the same time spectroscopists were discovering the close relationships between light and matter and laying the foundations for quantum theory, which began to lead us down into the microscopic inner mechanisms of the universe.

An important development in twentieth century mathematics was the attempt by Whitehead and Russell to express mathematics in purely logical terms. Their approach laid the foundation for Gödel's discovery that formally consistent mathematics is incomplete. Turing's invention of the programmable computer and subsequent engineering developments helped to implement Whitehead and Russell's idea by showing that logic and arithmetic meet naturally at the level of binary arithmetic and binary logic. Whitehead & Russell (1910, 1962): Principia Mathematica

Here I wish to draw a methodological analogy between the mathematical community and the current state of quantum theory, based on the idea that the network model has a foot in both camps. The people in the community are structural particles and the sources of messages, the fermions. The space-time web of communications, conversations, conferences and the literature that connects the players are the bosons, the messengers, the public potential that motivates mathematicians. The interaction of fermions through bosons binds the community into a functioning whole. By being part of our own communities, we may get some feeling for how the system of the world works. Members of any community feel the community "field" through communication.

The Platonic view of mathematics is that its theorems have some sort of independent existence outside the human world so that mathematicians producing new proofs are not so much creating something completely new as discovering something previously unknown. Where these theorems come from we do not know, but in the Platonic world we may see them as ideas in the mind of an eternal God. The common Christian view is that the Universe was intelligently designed and created by an omniscient and omnipotent God. Here I like to think that the discovery of mathematical theorems, initially non-existent, is analogous to the world creating itself from the initial singularity by an evolutionary process of unlimited trial, occasional success and frequent failures. Sometimes we have waited millennia for particular mathematical discoveries like the complex numbers to emerge. Complex number - Wikipedia

We may say that the observable output of the mathematical community is published theorems. All the flow of education, exploration and discussion that leads to the theorems is from this point of view invisible, the symmetry which is eventually broken by the emergence of a theorem. We see a similar phenomenon in our parliaments. Their practical output is legislation. Behind the legislation is an invisible mountain of talk and dealings that feed the meat into legislative sausages, the dynamics behind the fixed points that appear as written laws, the political equivalent of theorems.

The mathematical field in the mathematical community exists in the minds of the mathematicians and their communications. How do we model this field as a local gauge symmetry? One point to note is that, given all the human languages and arbitrary symbolic systems that may be involved, each core mathematical idea has a large space of representations which can be translated or transformed into one another. Here I will assume, by analogy with the mathematical community, that the information attributed to fields is stored in or represented by particles. I presume that an electron, like a mathematician, has a personality that guides its interaction with other particles. Further, I assume that every particle is a dynamic sub-network of the universe represented by an integral number of atomic quanta of action.

The classical computer network model models reality in that each local component executes its operations in Minkowski space-time, driven by inputs from other local components. These all operate on the local substratum of quantum process. We might say that a classical computation is pixellated by the work of the transistors, capacitors and resistors connected by classical communication channels, all of whose behaviour is explained by quantum mechanics. We can see this as a model of a classical world pixellated by fundamental particles whose underlying interactions we try to model by quantum field theory. A computer network is in effect a large scale model of a network of atoms.

Back to toc

24: Some principles

1: Nothing comes from nothing, which suggests that the world, or its source, must be eternal.

2. Everything comes from action.

This is the fundamental principle of this essay, derived from the fundamental Christian definition of God: actus purus derived by Aquinas from the work of Aristotle and endorsed by the Roman Catholic Church.

This lays the foundation for an evolutionary process. Because the initial singularity is without structure, cybernetics tells us that it has no control and so is able to try anything. The only constraint on this freedom is the physical implementation of the logical principle that it is impossible for a contradiction to exist.

All that follows in this essay is my own exploration of how this fundamental principle is worked out in physics to create the world we know.

3. Energy is an immediate consequence of action (section 9).

4. Everything is made of energy [section 15]

5. Imaginary systems are independent of real systems (section ...).

6. All systems may be modelled as trees, that is as layered networks.

7. Imaginary trees are modelled by quantum networks which are in perpetual motion.

8. Real trees are an immediate consequence of interactions in invisible imaginary networks.

9. We model interactions in imaginary networks as logical processes implemented by the contacts between imaginary states, which we call observations.

2: The inevitable and irresistible passage of time represents the perpetual motion at the heart of the universe. Aristotle defined time as the number of motion with respect to before and after. We map time to numbers with clocks. Clocks based on quantized changes in atomic energy are our most precise scientific instruments. W. F. McGrew et al: Atomic clock performance enabling geodesy below the centimetre level

3: We can model the universe as a communication network whose lowest "hardware" layer is the initial singularity, the creator. Theology is the science of God, traditionally understood to be love. Love comprises communication and bonding, which I approximate with a computer network. In such networks more complex layers are built using the services provided by the simpler layers beneath them. Practical networks comprise many software layers built on physical layers that may themselves involve many layers of complexity. Tanenbaum (1996): Computer Networks, Computer network - Wikipedia

4: Communication requires contact. Isaac Newton was forced by circumstances to admit that gravitation was some sort of "action at a distance", which we understand to be impossible. Quantum entanglement led Einstein to imagine "spooky action at a distance", but we shall see in §11 that this happens because quantum mechanics occupies a world where there is no space or spatial distance. In the space-time world contact is maintained by particles moving at the speed of light which follow "null geodesics" whose beginnings and ends coincide in spacetime (see §16). Geodesics in general relativity - Wikipedia

5: Symmetry with respect to complexity. We may model the generation of complexity from simplicity using the representation of the transfinite numbers constructed by Cantor. Each new representation is constructed recursively using permutations and combinations of the elements of the previous representation. Cantor writes: out of ℵ0 proceeds, by a definite law, the next greater cardinal number ℵ1, out of this by the same law the next greater ℵ2, and so on. Georg Cantor (1897, 1955): Contributions to the Founding of the Theory of Transfinite Numbers, page 109

6: Respect for our history requires that we construct our new pictures on a foundation of the old. We see the genesis of science as analogous to the genesis of the universe, a gradual progression from feeling, music, poetry and mythology toward evidence based and economically valuable knowledge about how the world works. Our lives have been facilitated enormously through the provision of energy created by physics, health care and food production based on biology, and peace and cooperation through psychology and politics. In the time of Galileo empirical science began to gain ascendancy over dogmatic theology, but ancient theology has yet to escape the influence of politics and flourishes in many religious institutions. In this essay I wish to extend theology toward the realm of science. This opens the way for theology to bring the benefits of empirical knowledge of divinity into every aspect of our lives.

7: Given that the universe is divine and visible to everybody, theology can become a real science. Following Aristotle, Aquinas perceived theology as a deductive process working from obvious principles (per se nota) and the articles of faith developed by the Christian Churches through their imaginative interpretations of the Bible. The modern view of science includes careful testing and requires that our deductive insights are consistent with the Universe we observe. Aquinas: Summa I, 1, 2: Is sacred doctrine a science?, Fortun & Bernstein (1998): Muddling Through: Pursuing Science and Truths in the Twenty-First Century

8: God reveals themself through every human experience. Traditional Christianity understands revelation as the content of the Bible. The Bible is held to be the work of writers inspired by God to explain matters which are invisible to us. In a divine universe, every experience may be interpreted as revelation of the divinity both within and around ourselves, a vast trove of information exceeding all the literature of the world.

9: We identify the initial singularity predicted by relativity with the classical God. Like this God, the initial singularity is the source of unbounded action. Penrose, Hawking and Ellis understand Einstein's general theory of relativity to imply the existence of a structureless initial singularity as the source of the Universe. Astronomical observation of black holes provides evidence for the existence of such singularities. Hawking and Ellis speculate that the Universe originated as a "time reversed" black hole. The initial singularity and the traditional model of God developed by Aquinas share three properties: they exist; they are completely without structure; and they are the source of the universe. Aquinas maintains that despite their simplicity God possesses a complete plan for the creation of the universe. Here, following modern ideas of the representation of information, I identify the divine mind with the actual universe. We can say no more about the origin of the initial singularity than we can say about the traditional God, and so we assume the traditional position that both are eternal.

10: Given its simple origin, interpretation of our observations of the world may be guided by a 'heuristic of simplicity'. Modern physics attempts to discern the structure of the Universe by observing its current enormously complex state. Quantum entanglement arising from the initial state suggests that every event is influenced to some degree by every other event, so that precise computation of physical parameters requires the superposition of an unlimited number of state vectors. The layered network model, guided by the initial simplicity of the universe, may provide a clearer view of the basic structures established by quantum mechanics and relativity which persist in the present as observable symmetries.

11: The power of creation is limited only by physical instances of the logical principle of contradiction. Discussing the power of God, Aquinas pointed out that God is limited only by their inability to create logical contradictions, such as that Socrates should be both sitting and standing at the same moment. In a model of God, this condition places a first level of constraint on the emergence of a Universe within the initial singularity, which may thus be understood to house only locally consistent structures. Structures breaking this constraint cannot exist permanently. Nevertheless the uncertainty introduced by the quantized structure of the universe (at all scales) seems to allow temporary excursions into undefined situations which may enable the creative power of the world. From this point of view momentum, like jumping, can bridge otherwise impossible gaps. Aquinas, Summa I, 25, 3: Is God omnipotent?

12: Logically consistent mathematics leads, via Gödel's theorem, to incompleteness and to the cybernetic principle of requisite variety. Following Galileo, mathematical modelling has become a primary tool of modern physics. Mathematical insights and methods have progressed well beyond what was available to Galileo. Aquinas believed that an omniscient and omnipotent God has total deterministic control of every event in the world. Gödel found that logically consistent formal systems are not completely determined, and Chaitin interpreted Gödel's work as an expression of the limits to control known as the cybernetic principle of requisite variety. This principle suggests that a completely structureless initial singularity has no power to control its behaviour, so that insofar as it acts, it acts at random. Gregory J. Chaitin (1982): Gödel's Theorem and Information, W Ross Ashby (1964): An Introduction to Cybernetics

13: An entropic force attracts the emergent universe toward increasing complexity. The general theory of relativity shows that the expanding Universe is driven by a strong tendency to create space. This is consistent with the second law of thermodynamics, which tells us that on the whole entropy does not decrease, meaning that the Universe has a tendency to increase its degrees of freedom. We might see this reflected in human aspirations for freedom, and we can see a mathematical foundation for this tendency in Cantor's theory of transfinite numbers. Like gravitation and the traditional God, freedom and increasing entropy are fundamentally attractive, acting like the old Aristotelian final cause. Andrea Falcon (Stanford Encyclopedia of Philosophy): Aristotle on Causality, Cantor's theorem - Wikipedia

14: Zero sum complexification. It may be imagined that conservation of energy requires that all the energy in the current universe was present in the initial singularity to power the big bang. An alternative view is that the total energy of the universe is zero, the potential energy carried by gravitons and other bosons being exactly equal and opposite to the energy of the fermions that emerge as discrete entities within the initial quantum of action. The idea here is that each step in the complexification of the universe passes through a phase of uncertainty to create new entities that add up to nothing, like potential and kinetic energy or positive and negative charge. Zero-energy universe - Wikipedia

15: Information and logical operations are physical entities. Aristotle and Aquinas recognized two principles of being, matter and form. Form or idea was a key concept in Plato's picture of the world. Aristotle brought Plato's forms down to Earth to devise an explanation for change: change occurs when an element of matter assumes a new form. Matter thus constrains form to create a particular object. Quantum mechanics envisages an invisible world of quantum states analogous to Plato's forms. Physical particles are created when quantum states interact with one another in the process called (in laboratory situations) measurement. These particles are physical representations of the quantum processes that create them.

16: Only fixed points are observable. We cannot see photons because they have no rest frame. Their existence between their creation and their annihilation is invisible, and they are in effect a quantum mechanical entity whose existence is purely speculative. On the other hand, we can observe massive particles because they have rest frames. We may illustrate this with a classical analogy. While the roulette wheel is spinning, we cannot see which sector the ball occupies. Only when the wheel comes to rest can we observe the number the ball has chosen. This principle establishes the difference between the quantum and classical worlds. The quantum world is invisible because it is in perpetual motion. The classical world represents the fixed points of the quantum world, many of which feature in our everyday experience, like houses and trees whose components, like atoms and molecules, are also massive and observable with suitable instruments.

Back to toc

25: Some conclusions
PCT and all that

The emergence of space-time introduces new degrees of freedom and new constraints into the system of the universe. The Minkowski metric and the Lorentz transformation are interpreted as consequences of the special principle of relativity that all observers on inertial frames of reference see the same physical symmetries, particularly the same velocity of light. Einstein arrived at this conclusion by studying Maxwell's equations. As we have seen above, the structure of space is intimately connected to the velocity of light, and the velocity of light is intimately connected to the electromagnetic properties of space-time. Special relativity - Wikipedia, Maxwell's equations - Wikipedia

As Einstein discovered, gravitation is the foundation of the large scale structure of the universe, controlling its structure from the first instant until (probably) the last. The very small scale structure of the universe is controlled by the strong and weak forces, but the intermediate structure, including human physiology, is controlled by electromagnetism. The foundation of electromagnetism is electric charge, which is quantized and comes in two equal and opposite varieties, positive and negative.

Einstein felt that gravitation is a logically complete phenomenon and that there is really only one possible gravitational field equation. A similar completeness is to be found in the world of electromagnetism, closely related to both the velocity of light and the structure of space, since magnetism is a relativistic effect of moving electric charge. It is not surprising that Einstein and Weyl both sought (in vain) for a unification of gravitation and electromagnetism. Is this possible if we consider Hilbert and Minkowski spaces as independent degrees of freedom? Hermann Weyl (1985): Space Time Matter (link ref §7)

The electrical force is roughly 10⁴⁰ times stronger than gravity, but we rarely feel it because positive and negative charges are exquisitely balanced on a microscopic scale, so that the world is mostly electrically neutral. This raises a question: is electromagnetism strong, or is gravitation weak? I will assume here that gravitation is weak and that the characteristic strength of the electrical force is closely connected to quantum contact interactions such as we find between the electrical and magnetic phases of the quantum harmonic oscillator, which Einstein showed to be a physical particle, the photon.
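The ratio is easily checked from the known constants (a sketch; since both forces fall off as 1/r², the ratio is independent of distance):

```python
# Ratio of the electrical to the gravitational attraction between an
# electron and a proton.
k = 8.9875517923e9       # Coulomb constant, N m^2 / C^2
G = 6.67430e-11          # gravitational constant, N m^2 / kg^2
e = 1.602176634e-19      # elementary charge, C
m_e = 9.1093837015e-31   # electron mass, kg
m_p = 1.67262192369e-27  # proton mass, kg

ratio = (k * e**2) / (G * m_e * m_p)
print(f"F_electric / F_gravity = {ratio:.2e}")   # ~2.3e39, of order 10^40
```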

From a relativistic point of view, photons are outside space and time, since the Lorentz transformation shows that an observer would see a particle travelling at c to have zero spatial extension and zero time, the characteristics of a null geodesic, the geometric representation of massless bosons. The sources of photons are electrically charged particles, electrons and positrons, which are massive, observable fermions.

Nineteenth century physicists realized that if electromagnetic radiation is a wave in some ether, the ether must be exceedingly rigid to account for the velocity of light, and the forces between the electrical and magnetic components of the waves correspondingly high. We calculate the energy of a photon using the quantum mechanical formula E = ℏω, which suggests that the work done by each cycle of a photon is closely related to the quantum of action, which happens to have the dimensions of angular momentum.
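For example (my own arithmetic, for an arbitrarily chosen frequency), the energy of a single photon of green light:

```python
# Energy of one photon from E = hbar * omega (equivalently E = h * nu),
# here for green light at about 540 THz.
import math

hbar = 1.054571817e-34            # J s
nu = 5.4e14                       # frequency, Hz
omega = 2 * math.pi * nu          # angular frequency, rad/s

E = hbar * omega
print(f"E = {E:.3e} J = {E / 1.602176634e-19:.2f} eV")   # ~2.2 eV
```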

For many, the discovery of the photon made the aether unnecessary, to be replaced by the notion of field, which some, like Auyang quoted above, see as the invisible ontological foundation of the world. Here we see the field as an abstract representation of the network of communication between fermions through the medium of bosons.

Historically, quantum field theory encountered many difficulties arising from mathematical beliefs in point particles and infinite degrees of freedom in spacetime carried across from classical physics. Feynman, Schwinger and Tomonaga solved this problem by substituting measured values of particle properties for mathematical assumptions, thereby suggesting that the infinities encountered in computations were artefacts of the theory. Richard P. Feynman: Nobel Lecture: The Development of the Space-Time View of Quantum Electrodynamics

The approach explored here is to restore the integrity of the quantum of action and give it a natural role in restoring finite integer values to the parameters of physics, leading to an understanding of the Hilbert world as a description of a communication process in perpetual motion constrained by logical consistency. This is based on the idea that a quantum of action is a physical representation of the universal logical operator not-and. Logically, a quantum of action transforms some p into some not-p which the no-cloning theorem requires to be orthogonal to the original p. This quantum mechanical process we take to be the source of spacetime, as described above.
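The universality of not-and can be illustrated directly (a standard construction, sketched here in Python, separate from the quantum claim): every other Boolean operation can be composed from NAND alone.

```python
# Sketch of the universality of not-and (NAND): NOT, AND and OR can all
# be composed from it, the sense in which one primitive logical operator
# can generate all logical structure.
def nand(p: bool, q: bool) -> bool:
    return not (p and q)

def NOT(p):    return nand(p, p)
def AND(p, q): return nand(nand(p, q), nand(p, q))
def OR(p, q):  return nand(nand(p, p), nand(q, q))

for p in (False, True):
    assert NOT(p) == (not p)
    for q in (False, True):
        assert AND(p, q) == (p and q)
        assert OR(p, q) == (p or q)
print("NAND reproduces NOT, AND and OR")
```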

We explore this spacetime using the Lorentz transformation which unites the dualities known as parity (P), charge (C) and time (T) which taken together are a single symmetry, but each of which can have two states, odd and even parity, positive and negative charge, and forward and reverse time.

The description of the relativistic transformation laws in the first chapter of Streater and Wightman's book PCT, Spin, Statistics and All That is based on the Heisenberg picture which honours the requirements of special relativity by treating time and space on the same basis:

to each state of the system under consideration there corresponds a unit vector, say Φ, in a Hilbert space, H. The vector does not change with time, whereas the observables, represented by hermitian linear operators on H, in general do. The scalar product of two vectors Φ and Ψ is denoted by (Φ, Ψ), called the transition amplitude of the corresponding states.

Two vectors which differ only by multiplication by a complex number of modulus one describe the same state, because the results of physical experiments on a state described by Φ may be expressed in terms of the quantities

|(Φ, Ψ)|²

which gives the probability of finding Φ if Ψ is what you have. The set of vectors e^{iα}Φ, where α varies over all the real numbers and the norm of Φ (written ||Φ|| and defined as [(Φ, Φ)]^½) is unity, is called a unit ray. . . . The preceding remarks can be summarized: states of a physical system are represented by unit rays.
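A small numerical check of these remarks (the vectors are invented for illustration): the transition probability |(Φ, Ψ)|² is unchanged when Φ is multiplied by a phase e^{iα}, which is why states are unit rays rather than single vectors.

```python
# |(Phi, Psi)|^2 is invariant under multiplication by a phase of modulus
# one, so every vector on the same unit ray describes the same state.
import numpy as np

rng = np.random.default_rng(1)

def random_state(n=4):
    v = rng.normal(size=n) + 1j * rng.normal(size=n)
    return v / np.linalg.norm(v)

Phi, Psi = random_state(), random_state()
p = abs(np.vdot(Phi, Psi)) ** 2        # transition probability

alpha = 2.0                            # an arbitrary real phase angle
Phi_ray = np.exp(1j * alpha) * Phi     # another vector on the same unit ray
p_ray = abs(np.vdot(Phi_ray, Psi)) ** 2

assert np.isclose(p, p_ray)
print(p, p_ray)                        # equal: the phase drops out
```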

Although the mathematical forms of Hilbert space place little constraint on the possible vectors and operators apart from unitarity, observations in Minkowski space have led to the development of super-selection rules, which are understood to mean that not every unit ray in Hilbert space represents an actual physical system. As discussed above, this suggests that Minkowski space places constraints on the physical realization of the possibilities implicit in Hilbert space, establishing a system analogous to evolution, where the survival of individual organisms places constraints on the possibilities of genetic variation, since many genomes are not viable enough to be reproduced.

In the realm of fundamental particles, viability appears to be controlled by group structures such as the Lorentz and Poincaré groups and their subgroups. The details of these constraints are to be found in the enormous literature of particle physics.

Although we can now see the speed of light as a universal constant, in the beginning we can imagine that it was an arbitrary evolutionary solution to the problem of maintaining quantum contact in a spatially extended system. As we have noted, the velocity of light and null geodesics are aspects of the same innovation. Before the advent of space, quantum interactions were instantaneous, but as the discussion of Alice and Bob above explains, entanglement is not suited to transmitting defined information, which is only possible with the preparation and reception of bosons travelling at c.

Back to toc

Space-time is the operating system of the universe

Our interface with the quantum world is generally called measurement or observation, and we are inclined to think of it as a carefully designed experiment aimed at answering a particular question like 'Does beta decay conserve parity?'. This is a very narrow conception of our interface with the quantum world, since every event that occurs in the classical world, from kicking a football to baking a cake, is an input to the quantum world, which returns its answer in the form of the observable event.

Cooking a cake, for instance, requires the assembly and mixing of specific ingredients, often with specific techniques in a specific order, and the application of the right amount of heat for the right time. The outcome, if everything is done well, is a cake, as planned. The transformation from ingredients to cake is a very complex quantum mechanical event. Failures are possible if things are not done right. Beta decay - Wikipedia

A very common and easily understood interface between the classical and quantum worlds is a computer. I type on the keyboard, which initiates a complex sequence of simple binary logical operations which display my typing on the screen. Most of this transformation is performed by integrated logical circuits which contain millions of simple electronic components like transistors, resistors and capacitors. Each of these acts as an interface between the spacetime world of wires and electric currents and the quantum mechanical marshalling of the electrons which represent the logic of the computation as electric potentials. My own body works in a similar way, my mind controlling my muscles as I type, my eyes transmitting the results to my mind for further appraisal.

Quantum mechanics began life in 1900 as a physical theory, but by about 1980 people began to think of it as the computational basis for the life of the universe. Transistors, the principal functional units in modern classical computers, are quantum devices, and they are designed using quantum mechanics. As components are made smaller we approach quantum limits on making them predictable and deterministic enough to be components of relatively error free classical machines. David Deutsch (1997): The Fabric of Reality: The Science of Parallel Universes - and its Implications

The quantum world is inherently dynamic, in perpetual motion. This motion is considered to be continuous, deterministic and unitary (reversible) until it is interrupted by the interaction of two quantum states initiated by input from the classical world. Quantum mechanics has developed over the last century to understand this motion. The foundations of this picture of the world are quantum mechanical vacua, which are understood to be continuous fields of random motion from which various symmetries select the fundamental particles which construct the universe (§5.2). The vacua are seen as the ground states of the universe, the lowest possible energy states, often represented by the symbol | 0 >.

In modern physics vacua are home to two sorts of energetic excitations which represent real and virtual particles. A real particle is an energetic excitation of the vacuum like an electron or a proton which can be observed in the spacetime world. Virtual particles are not observable but are believed to exist in the quantum world. They have a sort of half reality as concentrations of energy which mediate interactions between real particles. A real photon is a particle of light which can be observed and measured. A virtual photon is a representative of the electromagnetic field that binds electrons and protons together to form the atoms of the elements of the periodic table.

Virtual particles are said to be 'off mass shell', which means that they have temporary lives made possible because quantum uncertainty enables brief evasions of the constraints of classical spacetime physics. They are in effect a sort of creative wild card in quantum field theories which enables systems to get to places they otherwise could not go. A common name for this phenomenon is 'tunnelling', which enables particles to cross barriers that are classically forbidden. The price paid for such events comes in the form of low probability. Despite the intense internal activity of a uranium-238 nucleus, measured by its mass, the probability of an alpha particle (helium nucleus) tunnelling out of this nucleus is so low that it takes about 4.5 billion years for half of a mass of U-238 to decay by this route to thorium-234. On shell and off shell - Wikipedia
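The half-life arithmetic is simple (a sketch of the standard exponential decay law, N(t) = N₀e^(-λt) with λ = ln 2 / t_half):

```python
# Exponential decay of U-238 (alpha decay toward Th-234).
import math

t_half = 4.468e9                  # half-life in years
lam = math.log(2) / t_half        # decay constant, 1/years

for t in (1e9, 4.468e9, 1e10):
    fraction_left = math.exp(-lam * t)
    print(f"after {t:.3g} years, {fraction_left:.3f} of the U-238 remains")
```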

A principle assumed for this essay is that only fixed points are observable (§2.15). This idea flows from the notion that the quantum world is invisible because it is in perpetual motion (§13). Mathematical fixed point theory tells us, however, that given certain conditions moving systems have fixed points, parts of the motion that do not move. These fixed points are the particles we can see in our world. Fixed points have a spectrum of lifetimes, since changes in motion lead to changes in fixed points. Protons live for a very long time, perhaps forever. Other real particles have fleeting lives, billionths of a second or less.

In the network model we understand particles not just as fixed points but as sources, entities capable of sending and receiving messages in the universal network. On this definition I am a particle with a lifetime of about a century, and you are reading some of my output right now. Particles are sources because they stand in the midst of a flow of information entering and leaving them. Food and air flow through me, sustaining my life. Information flows into me through my senses and out through my muscles.

The operating system of my computer handles input and output to the processor, connecting it to the world of data which flows through it. This data is transformed by the computational process defined by the software stored in the memory of my machine. Here, by analogy, we understand spacetime to play the role of an operating system with respect to events in the quantum world, handling input, output and memory. Operating system - Wikipedia

Fixed point theory reveals how fixed points appear in motion. The role of science is to identify fixed points in our moving world. This knowledge is valuable, because the relationship between motion and fixed points works in both directions: changing motion gives us changing fixed points; changing fixed points changes motion. I manipulate my computer and my world by manipulating fixed points, seeking to achieve other fixed points through changing the invisible motions that generate fixed points. This is how I make cakes.

Back to toc


Conclusion

This essay began as a theory of peace published on 2BOB Public Radio Taree in 1987. From my point of view the downside of religion is the sanction of war. By invoking an omnipotent deity, theologians justify a "religious exemption" for the large scale murder and genocide executed by political communities seeking hegemony. Jeffrey Nicholls (1987): A theory of Peace

History shows that the spread of religions has often been facilitated by imperial wars. In many religions, killing an unbeliever has been considered a virtuous act. The Catholic Church had its Crusades, and since that time Christian nations have been warring against one another and systematically murdering, raping and pillaging all the communities around the globe that lack the military forces to resist them. We might consider religion and mass murder as reciprocally related: wherever we see mass murder we might suspect a religious motivation, hidden perhaps by colour, race, caste, wealth, power or some other distinction between human groups.

It is unfortunate that this situation is to some extent guaranteed by the nature of evolution. Ultimately hegemony is gained by reproduction, and in zero sum situations where survival of the fittest is played out, murder, rape and pillage are (at least temporarily) effective strategies. The message of my theory of peace was that the antidote to this terrible reality is to take ourselves out of zero-sum situations. We can do this by taking advantage of the transfinite creation of human spiritual space and the consequent creation of economic space which are facilitated by creative communication and cooperation.

This appears to be the mechanism, present in systems at all scales from fundamental particles to human communities, that has brought the universe to be from an almost infinitesimal beginning. This beginning, identical to the traditional Christian God, has unbounded power of action bounded only by local consistency.

Two ideas have made it possible for me to add some solid flesh to this abstract idea.

First, theology is the traditional theory of everything that we use to understand our position in the world. Ultimately theology is about love and bonding, that is about the creation of networks. An important feature of networks is that they are scale invariant. Every network is built from an atomic structure which comprises two sources and a communication link between them. This structure is invariant, whether the sources are fundamental particles, people or galaxies.

Second, the scale invariance of networks means that we can identify two very common sets of events in our world which occur at very different scales. The first is the mental act of insight which pivots between ignorance and understanding. It occurs randomly but often. At one moment we are faced by a situation we do not understand. Some time later we suddenly see what is going on. In the healing professions, for instance, a practitioner is faced by a series of symptoms. Sometimes the cause is obvious, like a broken bone. At other times considerable diagnostic effort may be required to reveal the cause.

The second set of events is quantum observations or measurements. This is the ubiquitous phenomenon studied in physical laboratories. Physicists set up situations where different pieces of matter can interact with one another. They control what is going in and observe what is coming out, and try to understand the invisible quantum mechanical processes that join output to input. Here quantum mechanics plays the role of the mind of the universe, and the work of the physicist is to discover what the universe is thinking. Michio Kaku (2021): The God Equation: The Quest for a Theory of Everything (link ref §2)

All this is discussed in some detail in this essay and provides us with glimpses of a way to understand our world as the mind of God, a basis for a scientific theology to put our religious beliefs on a realistic basis.

I am getting old now and my days are numbered. I hand these ideas over to others to develop as a step toward world peace. I am not the first nor the last to have this ambition, but we must keep trying if we are to take advantage of the five billion years of usable sunshine that lie ahead of us. I am already wealthy enough, so I don't need to profit from my ideas. They are yours to develop so long as you approach them with scientific honesty.

Theology, the theory of everything, tells us what we are as a function of our environment. Politics, the source of collective action, seeks to determine how we should act, given what we know about ourselves. Together they represent the confluence of knowledge and power whose principal historical manifestations have been the wars that have shaped human history. Margaret Macmillan (2020): War: How Conflict Shaped Us

In the context of evolution, war is an aspect of the selection processes that sift through the possible human cultural futures at the imperial scale of politics. We have seen a history of collapsing empires from ancient times to the destruction of the British and American empires in the last few centuries. In almost every case, vast military supremacy has failed in the face of theological and religious unity. The most recent examples are the failures of the military resources of the now extinct Soviet Union and of the United States in Afghanistan.

Unfortunately power in all its forms, theological and physical, corrupts. The military aspects of both theology and physics have perverted them. Both the Catholic Church and the nuclear weapons establishment think that the raw power that they wield is a good thing, to be developed without limit. In fact it is largely useless because their power makes no contribution to humanity, and in fact degrades it. Christianity died when it sold out to Constantine. Physics died when it sold out to nuclear weapons. Jeff Tollefson: US achieves laser-fusion record: what it means for nuclear-weapons research

By uniting physics and theology I hope to bring them back to life by showing how creation creates peace. We are living on a benign planet in the midst of a universe of enormous violence. If we can understand how this happened, we can understand how to create peace for ourselves. The simple answer is to create space using sunshine, a boundless source of energy. Jeffrey Nicholls (July 2019): Entropy and metaethics

What is the difference between Christianity and a fairy tale? The stories are equally fantastic. The difference is a political institution asserting the truth of the Christian fairy tale, historically on pain of death, so that people accept it as real and live by it, even though there is no evidence whatever for the story as it applies to us before we are born and after we die. In effect we come from nowhere, go nowhere, and our lives are governed by fictions about these two nowheres. The scientific story begins with an initial singularity, big bang and evolution, bringing us to be in an observable context into which we are absorbed and recycled when we die. One may say that all indigenous stories follow a similar pattern, the only difference being how we deal with the here and now bracketed between the two mysteries. Grace James (1912): Green Willow & Other Japanese Fairy Tales

Since time immemorial it seems that people have imagined that invisible forces control the world. Over the same period, some people have claimed the ability to control these invisible forces and developed businesses based on their claims. Traditionally these businesses often trade under the name religion, justified by a body of theory called theology. The picture of natural theology presented here broadens this picture of agency by providing a clear picture of how we, and every other agent in the world, handle the invisible computational side of the world in order to achieve our desires. This is the central idea of both cognitive cosmology and the technologies that derive from it, covering every aspect of human culture.

Religions are a bit like Star Wars and similar movie series, which eventually expand the narrative to produce a prehistory and a posthistory of an original central story. In real life, we only have access to the central story. The prehistory and posthistory are largely invisible to us, imaginative mythology rather than imaginative science built on measured experience.

Insofar as we are guided by some reality while we are alive, we can take a scientific attitude to life and manage our fantasies of before and after in order to optimize our lives, a sort of Lagrangian approach, which forms, for me, the essence of the scientific theology of which I wish to become a doctor before I die.

Christianity is built around the life of Jesus, who spun a story about his origin as the Son of the Creator and his end sitting at the right hand of his Father, judging the living and the dead. From a personal point of view he was unlucky to fall foul of the Roman forces occupying Jerusalem, but his gruesome death added weight and credibility to his story so that it has entrained billions over millennia. Now that I have reached a preachable story I wish to take a similar path to glory, leaving a legacy of peace without unnecessary pain, by pointing out that in a well planned life pain is not particularly necessary and is in many cases due to unscientific beliefs about reality. This is the deepest message of cognitive cosmology, my conclusion.

4: Bugs and patches I: Theology

I wish to identify the universe with the creative mind of God. Practically this requires a union of physics and theology in a cognitive cosmology. Here I identify some theological impediments to this union. §5 deals with problems on the physical side. With the decks cleared we can then proceed to construct a picture of the world. Worldview - Wikipedia

The principal question for theology is "what does it mean to be human". I was brought up in the Irish Roman Catholic tradition in a small town in Australia. I first faced this question explicitly when I started school at the age of four. The nuns got us to learn the Catechism by recitation: Q. Why did God make me? A. God made me to know Him, to love Him, and to serve Him here on Earth, and to be happy with Him for ever in heaven. Until my 40s my humanity was defined by this and other doctrines of the Catholic Church. Now approaching my 80s, I have different ideas.

What do the Taliban, the Communist Party of China and the Catholic Church have in common? The simple answer is the enforced indoctrination of children. I speak here of the Roman Catholic Church because I am a baptized member. I am also an ex-cleric well educated in its ways. One may see analogous positions in many other religious institutions. I see the chief conflicts between the Church and its human environment in the following positions:

1: The Roman Catholic Church's claim of divine right, infallibility and papal supremacy:

As well as claiming infallibility, the Pope enjoys supreme, full, immediate and universal ordinary power in the Church, which he can always freely exercise. Such power has often enabled the Church to ignore human rights and evade justice. Not only has it been responsible for widespread sexual abuse of children, it has frequently attempted to pervert the course of justice to hide these crimes. This evil has cost the Church its claim to moral and ethical superiority in the human community, and civil claims against it continue to absorb a large fraction of the funds donated by the faithful for its upkeep.

The alternative to this approach is a scientifically based theology which provides evidence based foundations for human rights, equality, the rule of law and human agency through democratic politics. John Paul II (1983): Code of Canon Law: §331: Papal Power, Kieran Tapsell (2014): Potiphar's Wife: The Vatican's Secret and Child Sexual Abuse, Papal supremacy - Wikipedia

2: The distinction between matter and spirit:

The Church depends for its livelihood on a claimed monopoly on communication with its God. Part of the cosmology that goes with this claim is that human spirits are specially created by God and placed in each child during gestation. This leads it to claim that the theory of evolution does not fully explain human nature and that neither we nor the Church are really indigenes in this world, but alien pilgrims destined for a post mortem life in a different state. The Church claims, in effect, that there are two truths: the truth of science and its own picture of the world which it considers to be superior to science.

This position is not tolerable in a scientific community which seeks evidence to justify public opinion. Insofar as spirit is real, it must be observable and open to study. If the universe is divine there is no reason to demand special creation for the human spirit. Pope Paul VI (1964): Dogmatic Constitution Lumen Gentium § 48 sqq., Pope John Paul II (1996): Truth Cannot Contradict Truth

3: Deprecation of the world

The Church holds that the God of the Old Testament created a perfect world and then, angered by the disobedience of the first people (the Original Sin), punished us all for the duration of human existence by subjecting us to death, pain, the need to work for a living and the domination of reason by passion. In addition we no longer enjoy the supernatural grace of God considered necessary to admit us to our eternal post mortem reward in heaven. Original sin - Wikipedia

The Christian New Testament attempts to provide a limited happy ending to this divinely dictated disaster. It claims that the Father's murder of his divine Son, Jesus of Nazareth, "saved" us by giving them satisfaction for our crime. Those who are baptized in the Christian rite may now reach heaven. None of the other damage that God did to us and our world will be repaired until the end. We continue to live lives of pain and work in the face of death. At the end of the world God the Father will repair the damage they did to punish us. Those judged to have lived in the required manner will enjoy an eternal life of the blissful vision of God. The rest are damned to an eternity of suffering in Hell.

This whole scenario is purely fictitious and is in effect a fraud on the human race. It is of great value to the Church, since we pay for our salvation by supporting this flawed institution. Insofar as the church promotes social welfare it is valuable, but it must base its claims on demonstrable truth and become a law abiding corporate citizen of the human world.

4: Misunderstanding of pain:

In a similar vein, the Church holds both that pain is punishment for sin, and that endurance of pain, even self inflicted pain, is a source of merit. It overlooks the fact that pain is in general an error signal that enables us to diagnose and treat errors, diseases, corruption and other malfunctions that impair our lives. Included here is the unnecessary pain caused by the false doctrines of the Church.

Often it is necessary to suffer a certain amount of pain to achieve an objective, such as having a baby, which means going beyond our comfortable limits, but there is little real value in pain for its own sake apart from its diagnostic role.

5: Absolutism:

From a scientific point of view, the Catholic model of God and the world is an hypothesis, to be accepted or rejected on the evidence. From the Church's point of view, the fundamentals of its model are not negotiable, and anybody who chooses to disagree with them is ultimately a heretic to be excommunicated from the Church. Historically the Church has tortured or murdered dissidents. The principal sanction in modern times is dismissal. There is no room in the Church for the normal scientific evolution of our understanding of our place in the Universe.

6: Sexism:

Within the Roman Catholic Church, the glass ceiling for women is at ground level: women are excluded from all positions of significant power and expected to play traditional subordinate roles. Even in recent times the Papacy has emphasized that, for reasons probably based on a misunderstanding of history, women must still be barred from the priesthood. Pope John Paul II (1994): Ordinatio Sacerdotalis: Apostolic Letter to the Bishops of the Catholic Church on Reserving Priestly Ordination to Men Alone.

7: Belief in active evil agents, aka Satan and other demons:

The Church claims to save us both from the original sin and from the dangerous activities of an evil being, Satan, whom it claims to have been responsible for the Fall. This is a purely fictitious position that falls under the political ploy often known as a "paper tiger": a claim to protect us from a non-existent threat. Catholic Catechism: §§ 385-412: Satan

This is not to deny that there is evil in the world, much of which arises from the nature of evolution. Evolutionary success is achieved by reproduction. In the zero sum situations where survival of the fittest is played out evils like murder, rape and pillage are (at least temporarily) effective strategies. The antidote to this terrible reality is to take ourselves out of these situations. We can do this by taking advantage of the expansion of human spiritual and economic space which results from realistic global theology. This task is closely related to promoting systems of governance that enable individual agency and control the corrupting tendencies of uncontrolled power.

8: Marketing and Quality:

The New Testament constitution of the Catholic Church claims a right and a duty to induce everyone to hear and accept its version of human existence. This is a natural policy for every organization whose size and power increases in proportion to its membership. The modern world, however, expects any corporation promoting itself in the marketplace to deliver value for value. People contributing to the sustenance of the Church and following its beliefs and practices need to be assured that they will indeed receive the eternal life promised to them. The only potential evidence for this is miracles attributed to saints. Saint - Wikipedia, Matthew Schmaltz: What is the Great Commission and why is it so controversial?

In addition, we need to be assured that the information provided to us by the Church is true judged by modern standards. It has been traditional to exempt churches from the usual requirements of consumer protection legislation but this is inconsistent with modern good marketing and advertising practice. The Church needs to be particularly careful that it is not passing on unverifiable information to children and others with limited critical ability. Ad Gentes (Vatican II): Decree on the Mission Activity of the Church, Lumen Gentium (Vatican II): Dogmatic Constitution on the Church

Back to toc

5: Bugs and patches II: Physics

1: Observers have no role in true science?

A fundamental lesson of quantum theory is that what we see depends on how we look. One of the saddest stories in science is that of Albert Einstein, whose total dedication to the notion of objective reality was perhaps the source of a lifetime of scepticism about quantum theory. The root of the belief in objective reality lies in the idea that the frames of reference that we use to measure nature are not part of nature. Therefore, all the measurements we take of a particular system using different frames of reference must be identical once we adjust for the differences between reference frames.

We might think that the claim that what we see depends on how we look destroys the concept of certain knowledge. The answer to this objection is the central point in Einstein's general theory and all his relativistic work. In order to get an arithmetic grip on the geometry of nature Einstein used Gaussian coordinates which are to a large extent arbitrary, compared for instance to the Cartesian coordinates which impose a fixed metric structure on a Euclidean space. The key to the general theory is, in Einstein's words: The Gauss co-ordinate system has to take the place of a body of reference. The following statement corresponds to the fundamental idea of the general principle of relativity: "All Gaussian co-ordinate systems are essentially equivalent for the formulation of the general laws of nature". Gaussian curvature - Wikipedia, Albert Einstein (2005): Relativity: The Special and General Theory, page 123

The key to getting a deterministic mathematical theory out of this somewhat arbitrary coordinate system is that the only observable points in nature are events and the space-time intervals between them. Whatever coordinate systems we choose must be constrained to give a one to one correspondence between identical distances and identical differences in Gaussian coordinates. An observation is an event, so the foundation of science is equivalent to the foundation of general relativity: all observers, no matter what their state of motion, must agree on what they actually see when it is transformed to the rest frame of the observed system.
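
A small numerical check of this claim, under conventional assumptions (one space dimension, units where c = 1): observers boosted relative to one another assign different coordinates to an event but compute the same spacetime interval.

    import numpy as np

    def boost(v):
        """Lorentz boost along x with speed v, in units where c = 1."""
        g = 1.0 / np.sqrt(1.0 - v ** 2)
        return np.array([[g, -g * v], [-g * v, g]])

    def interval_squared(event):
        t, x = event
        return t ** 2 - x ** 2  # the invariant squared interval

    event = np.array([3.0, 1.0])  # (t, x) of an event relative to the origin
    for v in (0.0, 0.5, 0.9):
        print(v, interval_squared(boost(v) @ event))  # ~8.0 in every frame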

The difference between classical and quantum physics is that while a classical human observer may be considered to be completely independent of the phenomenon measured, quantum observations are in effect the universe observing itself. In the early days of quantum mechanics quantum measurement was taken to be the interaction between a classical observer and a quantum system, but the fact is that all systems in the universe are quantum systems, so all events in the universe (which include "measurements") are quantum interactions.

Beneath the real space layer of certainty reflected in classical relativity we have the quantum mechanical layer which introduces uncertainty because it encompasses invisible states which can only be brought into the realm of certainty by measurement, which means a somewhat unpredictable interaction between two invisible quantum states. In a laboratory situation the state to be measured is unknown; the state used to measure it is to some degree under the control of the experimenter. Because of this uncertainty we cannot predict what a measurement will reveal as we can in a macroscopic classical situation. This, I feel, is why Einstein was convinced that quantum mechanics is incomplete. Every event is in fact completed by the information obtained by a particular observer.

We will encapsulate this idea in a layered network model, the layer of identifiable observations being the outcome of interactions within a layer of invisible states which yield an amplitude ψ. What we observe in the classical layer are particles whose nature is determined by the interacting quantum states and whose probability of appearance is equal to the absolute square of the amplitude: P = |ψ|². The interaction of invisible quantum events, ie an observation, gives rise to visible events, ie particles. What we require in a quantum theory is that linear transformations of the underlying states to different bases do not change P. This criterion is used to demonstrate the equivalence of the matrix, wave and path integral formulations of quantum mechanics.
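
The requirement can be illustrated in a few lines of Python: a unitary change of basis redistributes the amplitudes but leaves the total probability |ψ|² = 1 intact. The state and the unitary below are randomly generated, purely for illustration.

    import numpy as np

    rng = np.random.default_rng(0)

    # A random normalized state in a 4 dimensional Hilbert space.
    psi = rng.normal(size=4) + 1j * rng.normal(size=4)
    psi /= np.linalg.norm(psi)

    # A random unitary change of basis (QR decomposition of a complex matrix).
    u, _ = np.linalg.qr(rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4)))

    print(np.sum(np.abs(psi) ** 2))      # 1.0
    print(np.sum(np.abs(u @ psi) ** 2))  # still 1.0: P is unchanged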

2: Particles or fields?

Einstein derived the general theory of relativity from the assumption that reality remains the same no matter how we look at it so long as we have an algorithm to convert between different points of view. In general relativity this algorithm is the Einstein field equation and the general process of changing points of view is known as a "symmetry transformation": a symmetry transformation is a change of point of view which has no effect on the underlying reality: no matter how we look at a perfect sphere, it remains a perfect sphere: walking around it does not change how it looks any more than travelling around in the universe changes its reality.

Einstein was concerned with the universe as a whole. Fundamental physics is concerned with "fundamental particles", the simplest observable entities in the universe from which all the large visible objects like ourselves and planets are made. In the 1930s Eugene Wigner developed "Wigner's theorem", which used Einstein's idea to define a particle as something that remains the same no matter how we look at it. Wigner's theorem - Wikipedia

The philosopher Auyang writes:

"According to the current standard model of elementary particle physics based on quantum field theory, the fundamental ontology of the world is a set of interacting fields. Two types of fields are distinguished: matter fields [fermions] and interaction fields [bosons]. . . . In fully interactive field theories, the interaction fields are permanently coupled to the matter fields, whose charges are their sources. Fundamental interactions occur only between matter and interaction fields and they occur at a point. . . ." : How is Quantum Field Theory Possible? pp 45-46.

"Field has at least two senses in the physical literature. A field is a continuous dynamical system, or a system with infinite degrees of freedom. A field is also a dynamical variable characterizing such a system or an aspect of the system. Fields are continuous but not amorphous: a field comprises discrete and concrete point entities each indivisible but each having an intrinsic characterization. The description of field properties is local, concentrating on a point entity and its infinitesimal displacement. Physical effects propagate continuously from one point to another and with finite velocity. The world of fields is full, in contrast to the mechanistic world, in which particles are separated by empty space across which forces act instantaneously at a distance." (Auyang page 47)

We see the particles. The fields of quantum field theory are understood to be entities represented by mathematical functions ψ(x) at every point x in space-time, which create and annihilate particles at x. As we see from the article by Kuhlmann quoted at the beginning of this essay, quantum field theory is a vast and difficult labyrinth of theory intended to explain the simplest entities in our world. Can it be simplified? Steven Weinberg (1995): The Quantum Theory of Fields Volume I: Foundations, page 31.

Simplification is achieved by symmetry. All electrons, for instance, are identical: if you have seen one, you have seen them all. Physicists would say that there is just one electron field throughout the universe which creates and annihilates electrons. This essay is built around a much broader symmetry which we call action. The idea is that every discrete entity in the universe is a source, able to act by sending and receiving messages. Every particle, whether it be an electron, a person or a galaxy, has properties which determine how it interacts with its neighbours.

A source is an element of a network which speaks and listens to other sources. Here we proceed on the assumption that the fundamental source of the universe is the quantum of action, identical to the classical divinity. The multiplication and differentiation of the initial singularity is modelled initially on the Christian doctrine of the Trinity to be extended to the transfinite domain in both the classical and quantum theoretical worlds by the network model introduced in §6.

3: The anthropic principle?

When we study the evolution of the universe from a gas of hydrogen and helium to its present state embracing carbon based life forms, we sometimes encounter points at which very improbable events are required for further development. In the theoretical world, we may see this as analogous to Einstein's formulation of the general theory of relativity which finally opened our eyes to the universe as a whole. Einstein crossed an abyss. If he had not lived, would we now be in possession of this theory? The history of science points to long relatively arid periods between the paradigm changes that mark major scientific steps forward. Thomas Kuhn (1996): The Structure of Scientific Revolutions

One way to understand the past through the empirical present is the anthropic cosmological principle. The idea here is that the Universe was deliberately constructed by a designing creator to allow for our evolution. This conclusion arises because some see evolutionary bottlenecks which require precise tuning of various physical parameters to arrive at conditions conducive to life. One of these concerns is the creation of carbon itself. We understand that heavier elements are synthesized by fusion of lighter ones. It turns out that there is no way to make carbon except by the fusion of three helium nuclei. Nuclear physics suggests that this is at first sight a very improbable event, which may nevertheless have been made possible by a couple of coincidences designed in by a creator. Anthropic principle - Wikipedia, John Barrow & Frank Tipler (1996): The Anthropic Cosmological Principle

The first of these is a resonance of beryllium-8 which increases the probability of fusion of two helium-4 nuclei. The second is the existence, predicted by Hoyle, of an excited state of carbon-12 which encourages the fusion of beryllium-8 and helium-4. Without these resonances it may be that the formation of carbon and carbon based life would be impossible. There are other scenarios where different universes with different fundamental physical constants would not have enabled the existence of life, and so prevented the existence of curious people asking questions like these. Triple-alpha process - Wikipedia

Here I assume that the intelligent universe I am trying to describe has overcome these apparent obstacles by its own ingenuity, without the help of a pre-existing creator. Here we meet the ancient theological problem: is the creator identical to the universe, or other than it? As we go along I shall try to point out that the universe as we know it is capable of exploring the same space of consistent possibility as is open to any divinity. As Aquinas notes, the only bound on the power of God is their inability to create inconsistency.

4: Does the world comprise discrete events or continuous processes?

Measurement is basically a matter of counting. We measure sets of discrete objects like trees, people and beans by bringing them into correspondence with the natural numbers. We measure continuous quantities by counting appropriate standard units of length, time, mass and so on. Here we may encounter fractional units which we usually express as decimal fractions carried to a precision appropriate to the task in hand. It was long ago recognised that there are formal quantities like the length of the diagonal of a unit square that can only be represented precisely by real numbers which can in theory be represented by decimals of countably infinite length. Completeness of the real numbers - Wikipedia

Real numbers and real and complex vectors constructed from real numbers are essential components of mathematical physics. This leads us to think that the physical world uses real and complex numbers to represent itself and that it is real in the sense of continuous. We feel free to use differential and integral calculus in our computations and to linearize complex phenomena by imagining them at an infinitesimal scale and then integrating the result to get the bigger picture.

Although complex functions are valuable intermediaries in modelling and computation, we feel that realities, eigenvalues for instance, must be represented by real numbers. Everything we actually observe in detail comes in discrete units, ranging from galaxies and stars, trees, people, atoms, fundamental particles and ultimately quanta of action. Insofar as quanta of action are atomic, their fragmentation into infinitesimals and integration into continuous measures may be convenient, but may misrepresent reality. Logical processes, which I propose here as the foundation of cognitive cosmology, are sets of discrete actions executed as time ordered sequences which we model mathematically as computing machines. I assume therefore that all events represent integral numbers of quanta of action. The probabilities of events, on the other hand, may be represented by real numbers which are the absolute squares of complex amplitudes, which may themselves be sums of the amplitudes of a large number of superposed subsidiary events.

5: Are the boundaries of physics the same as the boundaries of mathematics?

The formalist approach to mathematics promoted by Hilbert sees mathematics as a symbolic game divorced in principle from physical reality and constrained only by the need for internal consistency. He felt that consistent formal mathematics would be complete and computable and was surprised when Gödel and Turing showed that this is not so. Formalism (mathematics) - Wikipedia, Gödel's incompleteness theorems - Wikipedia, Turing machine - Wikipedia

If we assume that mathematics is capable of a faithful representation of the physical, computational and communication structure of the universe, this would lead us to suspect that these theorems also account for uncertainties in the universe and further suggest that a God limited by consistency alone is not capable of complete deterministic knowledge and control of the world.

Uncertainty opens two ways to understand infinity. Literally it means without boundary, unfinished. This places a constraint not on size but on definiteness. Colloquially, it also means very big, as when we say the universe is infinite. We can also imagine another understanding of infinity which relates to the theological concept of omnipotence. Aquinas asks if God is omnipotent. Yes, he says, God can do anything possible. The only limit is that they cannot establish the existence of inherent contradictions, such as that Socrates should be simultaneously sitting and standing.

Let us assume that mathematics and physics are both equally omnipotent in the theological sense, constrained only by the non-existence of actual local contradiction (principle §3.11 above).

6: What is the value of Planck units and the Planck scale: why should h, c, G and kB be 1?

The Planck units are a set of physical units devised by setting Planck's constant ℏ, the speed of light c, the gravitational constant G and the Boltzmann constant kB to the scalar value 1. This convention establishes a Planck time (about 5.4 × 10⁻⁴⁴ second), Planck length (about 1.6 × 10⁻³⁵ metre) and Planck mass (about 2.2 × 10⁻⁸ kilogram) corresponding to the conventional dimensions T, L and M. Apart from their elegance, these units are relatively impractical. Planck units - Wikipedia
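
These values can be recomputed from the defining constants. A short sketch using CODATA values in SI units:

    import math

    hbar = 1.054571817e-34  # J s
    c = 2.99792458e8        # m / s
    G = 6.67430e-11         # m^3 / (kg s^2)

    planck_time = math.sqrt(hbar * G / c ** 5)    # ~5.39e-44 s
    planck_length = math.sqrt(hbar * G / c ** 3)  # ~1.62e-35 m
    planck_mass = math.sqrt(hbar * c / G)         # ~2.18e-8 kg

    print(f"{planck_time:.3e} s, {planck_length:.3e} m, {planck_mass:.3e} kg")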

Some expect that gravitation, macroscopically quite a weak force, will become significant in a quantum theory of gravitation expressed at the Planck scale, but since we have no quantum theory of gravitation, there is little evidence for this expectation. The discussion of the evolution of the physical universe below will contain some suggestions for the origin of these and other constants of nature.

Like the reference frames with which they may be used, units are simply conventions that establish the numerical correspondences between continuous quantities and the units. From a theoretical point of view, it is very satisfying to use natural constants such as the velocity of light and Planck's constant as units for measurement, although their actual values may make them inconvenient and secondary standards are usually established for practical use.

7: Increases in entropy do not just happen, they must be constructed

Entropy is one of the simplest and most useful measurable quantities in the world, since it is intimately connected to communication, which is the foundation of universal structure. It is simply a count of states, and a state, in Cartesian terms, can be any definite and separate entity ranging from a quantum state to a sheep to a galaxy.

On the whole entropy has received a bad press in engineering thermodynamics since it appears to place limitations on the efficiency of heat engines. Entropy is not responsible for this: it is conserved in an ideal Carnot engine. The problem lies with the temperature differences available between the hot and cold sources and the fact that the zero of temperature is so far below the normal exhaust temperature of the average heat engine. On the other hand, once Boltzmann showed that entropy could be understood as a count of the states of a system its reputation has been revised in the light of Shannon's communication theory and it has become a standard measure of information. Entropy - Wikipedia
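
The count-of-states reading of entropy is easy to state in code. A minimal sketch of Shannon's measure, which reduces to Boltzmann's S = k log W when the W states are equally likely (here in bits, so the constant is 1 and the logarithm is base 2):

    import math

    def shannon_entropy(probabilities):
        """H = -sum p log2 p: the information in a choice among states."""
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    print(shannon_entropy([0.25] * 4))         # 2.0 bits: 4 equiprobable states
    print(shannon_entropy([0.5, 0.25, 0.25]))  # 1.5 bits: structure lowers H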

While entropy was considered a "bad thing", the second law, which tells us that entropy generally increases, was not felt to require explanation. From the information theoretical point of view, however, entropy is a scarce and valuable resource which can only be increased by constructing more states, a very important long term goal in the design of integrated circuits and image sensors. Von Neumann has pointed out that quantum measurement increases entropy, so that it is effectively the creation of new states. The assumption here is that the entropy of the structureless initial singularity is zero, and progress in constructing the universe can be measured by the increase of entropy produced by the quantum universe observing itself. Second law of thermodynamics - Wikipedia, John von Neumann (2014): op. cit

8: Are the infinities appearing in quantum field theory real features of nature or are they artefacts suggesting the need for revision of the theory?

The success of the Standard Model has led the idea that the universe is built on a quantum mechanical vacuum to become a settled doctrine. It has had a chequered but triumphant career from Dirac's equation and his sea of positrons through the discovery of renormalization, beta functions and running couplings to its present supremacy marred, perhaps, by some difficult questions: where did the vacuum come from; how does it relate to the initial singularity predicted by the general theory of relativity; why is the cosmological constant computed using the standard model about 100 orders of magnitude distant from measured reality; why can't we renormalize gravitation? This has led to a situation reminiscent of the tough times between Dirac's equation and Shelter Island, when vacuum polarization was finally brought under control. Kerson Huang (2013): A Critical History of Renormalization, Beta function (physics) - Wikipedia, Shelter Island Conference - Wikipedia

These problems have a history going back to the apparently infinite electromagnetic self mass of the classical point electron. Its reappearance in quantum electrodynamics has been to some extent removed from the spotlight by renormalization group theory. Nevertheless, the unification of the standard model and gravitation is still blocked by the non-renormalizability of possible theories of quantum gravity. Now we are engaged in the desperate speculation we call string theory which, if nothing else, seems to violate principle 10 above, the heuristic of simplicity which seems to be implicit in the notion that the universe is derived from an initial singularity. Renormalization group - Wikipedia, Michio Kaku (1998): Introduction to Superstrings and M-Theory

The fundamental problem, it seems to me, is the attempt to describe a universe built from inherently discrete atomic quanta of action using continuous mathematics. This leads, one way or another, to integrals which have zero in the denominator. Much of this essay is devoted to discussing this problem and proposing a solution which suits my desire to unify theology and physics: that the fundamental mechanism of the universe is best described using the logical network of communication and computation implicit in quantum theory.

9: Where does the vacuum come from?:

The quantum field theoretical vacuum is considered to be the base state of the universe from which all the details of particle physics are selected by the operation of various symmetries. Anthony Zee (2010): Quantum Field Theory in a Nutshell

Here we assume that the initial singularity is a quantum of action and that the action of action is to act. From the fundamental equation of quantum mechanics, E = ℏ ∂φ / ∂t we assume that such actions create energy. The absolute simplicity of the initial singularity, coupled with the principle of requisite variety suggests that the initial singularity has no control over its action, so repeated actions lead to a random spectrum of energies, which we may identify with the quantum mechanical vacuum.

This feature of quantum mechanics carries through to all scales. There is no mechanism available for us to predict exactly when a quantum event will occur. At best, through the Born rule, we can estimate the probability of an event occurring in a given time interval, often measured by a half-life.
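
For a decay process with half-life T, this Born rule estimate cashes out as P(event before t) = 1 − 2^(−t/T). A sketch with an illustrative 10 second half-life:

    def decay_probability(t, half_life):
        """Probability that a decay has occurred by time t."""
        return 1.0 - 2.0 ** (-t / half_life)

    for t in (5, 10, 20, 40):  # seconds, with a 10 second half-life
        print(t, round(decay_probability(t, 10.0), 3))
    # 0.293, 0.5, 0.75, 0.938: only probabilities, never exact times
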
Further comments on the path integral

Feynman's method establishes that the true path x(t) will be the one where the phase does not change as a result of small variations of the path. Classical Lagrangian mechanics uses the calculus of variations to identify stationary paths and finds that these coincide with the paths predicted by Newton's method. As we get further from the true path the phases become relatively random and so cancel one another in the superposition. We might assume from this that the true path is equivalent to one complete cycle of phase (φ = 2π), equivalent to precisely one quantum of action. This idea may serve as a connection between physical and logical events. An advantage of the Lagrangian approach (noted by Dirac) is that it is relativistically invariant. Given Feynman's description of the quantum world, we can ask: how does this structure come to be within an initial singularity which is effectively a quantum of action?

Before Michelson and Morley and Einstein, Maxwell and others saw the ether as the representative vehicle of electromagnetic radiation. Now the spirit of formalist mathematics may have penetrated physics and people seem to have become relaxed about formal structures with no representative vehicles. In the realm of field theory many see pure mathematical fields as the ontological foundation of reality. Michelson and Morley: On the relative motion of the earth and the luminiferous ether

The ancient doctrine of immaterial spirits seems to have taken root here, although its justification in terms of the need for intellect to be unhindered by matter has been overridden by neurophysiology, and the quantum foundation of mental process is tens of orders of magnitude finer than neural synapses and action potentials. Human brain - Wikipedia

The root of the problem here seems to be the set of mathematical problems which began with Zeno and came into the mathematical and physical mainstream two thousand years later with Newton's invention and application of calculus to describe the heavens. The core issue is the attempt to represent a continuum by an ordered sequence of infinitesimal but discrete points. This approach seems implicitly self-contradictory, as Zeno clearly illustrated. How does nature represent an infinitesimal point? How does a set of discrete points make a continuum? Zeno's paradoxes - Wikipedia

Aristotle, ever practical, defined continuity as having endpoints in common, rather like the way the links in a chain embrace one another. This deals with Zeno's paradoxes and suggests that there is contradiction in Auyang's claim that "Fields are continuous but not amorphous: a field comprises discrete and concrete point entities . . ." Auyang op cit page 47, Aristotle (continuity): Physics V, iii, 227a10 sqq (ref link above).

The physical reality appears to be that every physical interaction involves at least one quantum of action, a real event of finite size. Fractional quanta, particularly infinitesimal fractions, do not exist. The world proceeds stepwise, like logic. Mathematical continuity is a handy symmetry for dealing with probabilistic issues where the law of large numbers is applicable, but logical processes, such as those involved in cognition, are inevitably discrete, represented by physical entities like electrons, photons and action potentials. Action potential - Wikipedia, Andrey Kolmogorov (1956): Foundations of the Theory of Probability

In quantum mechanics Planck, Dirac and Feynman arrived at a much subtler approach. We begin with Planck's discovery that although the quantum of action is measured by a real number it is, from a physical point of view, an integer with the dimensions of angular momentum, and all real physical events comprise an integral number of quanta of action.

In 1933 Dirac set out to understand the role of action in quantum mechanics. Quantum mechanics began from the Hamiltonian formulation of classical mechanics. Lagrange provided an alternative formulation which has two advantages: it enables one to collect all the equations of motion and express them as the stationary point of a certain action function; and it is relativistically invariant. Since there is no direct route from the Lagrangian to quantum mechanics, Dirac sees that we must take over the ideas of classical Lagrangian theory, not the equations. Paul Dirac (1933): The Lagrangian in Quantum Mechanics

Dirac chooses a route through the theory of contact transformations and finds that the action S, the time integral of the Lagrangian, plays the role of an exponent in a unitary transformation between two diagonal representations of a quantum state. In passing he finds that the classical definition of S does the job.
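
The role of S as an exponent, together with Feynman's stationary phase argument above, can be illustrated numerically: each path contributes a weight exp(iS/ℏ), and only the neighbourhood of the stationary point of S survives the superposition. In the toy sum below the phase kx² is stationary at x = 0; the quadratic phase and the constant k (playing the role of 1/ℏ) are illustrative choices, not the author's model.

    import numpy as np

    k = 200.0  # plays the role of 1/hbar: the larger k, the more classical
    x = np.linspace(-1.0, 1.0, 200001)
    dx = x[1] - x[0]
    contributions = np.exp(1j * k * x ** 2) * dx  # exp(i * phase) per "path"

    near = np.abs(x) < 0.2  # neighbourhood of the stationary point x = 0
    print(abs(contributions[near].sum()))   # large: phases add coherently
    print(abs(contributions[~near].sum()))  # small: rapid phases cancel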

Generation of the universe from the initial singularity

We have guessed that massless bosons travelling at the speed of light on null geodesics enable the creation of space, that is a venue in which discrete massive entities can exist independently of one another through spacelike separation. At the fundamental level, we see such entities as fermions and the exclusion principle as constitutive of the spatial metric. The spacetime approach to this structure is the spin statistics theorem, whose proof derives from the existence of spacelike separation in Minkowski space. In the underlying Hilbert space, bosons are distinguished by their energy or frequency. The emergence of space is accompanied by an increase in the potential entropy of the universe, since energy differences are no longer required to differentiate particles and spatial separation makes room for large numbers of identical fermions, like electrons. Streater & Wightman (2000): PCT, Spin, Statistics and All That

The Hilbert space explanation for the differentiation of fermions is that because they have half integral spin the probability of the superposition of two fermion states is zero. If we assume that Hilbert space is the primordial space and use the principle of simplicity to argue that in the beginning there are only two opposite phases of recursive function at any energy / frequency (a form of digitization), we can account for the primordial existence of spacetime and bosons and fermions. But where does gravitation come in?

The answer that appeals most to me is that the general theory is a measure of the 'ignorance' of the initial singularity. This is because all information and communication is digital, as shown by Shannon's theory of communication, Turing's theory of computation and Gödel's theory of logic discussed above (§§8, 6 & 7). Therefore, insofar as gravitation is described by the continuous functions of differential geometry it carries no information and is therefore consistent with the primordial simplicity of the universe. It is, in other words, a perfect description of a system of zero entropy which is subsequently populated by quantum theory, describing the quantum of action which underlies all communication and the magnificent complexity of the universe which has grown within the gravitational shell. This ignorance is implicit in the notion of general covariance, that all Gaussian coordinate systems are equivalent for the description of physical 'law', which is the foundation of Einstein's work:

The following statement corresponds to the fundamental idea of the general principle of relativity: "All Gaussian coordinate systems are essentially equivalent for the formulation of the general laws of nature." Einstein (2005): Relativity: The Special and General Theory, page 123

We take the fundamental equation of quantum mechanics E = ℏω to be primordial in that the action of action is to act and energy is repeated action. This, combined with the no cloning theorem, establishes each of these actions as equivalent to the universal logical nand gate. At this point quantum theory sees no distinction between potential and kinetic energy. Only energy differences feature in quantum mechanics and we can set the zero of energy wherever we wish.

The modern approach to the "laws of nature" is to see them as symmetries. The modern view of symmetry was pioneered by Emmy Noether, based on the properties of continuous groups. The network model accommodates symmetry naturally by seeing each symmetry as a feature of a particular layer in the network. We take the Universe to be pure action. Symmetry, on the other hand, is absence of action, so that we understand symmetries to be the boundaries of the Universe, the borderlines between action within the Universe and inaction (ie nothing) outside it. Noether's theorem - Wikipedia

The origin of energy through repeated action seems to violate the conservation of energy, the action of action apparently creating energy without bound, the source perhaps of the big bang. To prevent the violation of the conservation of energy and to honour the principle of zero sum complexification (§3, principle 14) we need an inverse to the kinetic energy of the big bang which we understand to be potential energy, manifest as the gravitational potential. Potential energy, we guess, is the fixed point dual to kinetic energy, identical to mass.

As a consequence, the Lagrangian of the universe as a whole, ∫ (KE - PE) dt approximates, within one quantum of action, to 0. In Newtonian terms for [almost] every action there is an equal and opposite reaction.

In the previous section we introduced space as the dual manifestation of time transformed by the speed of light. Now we introduce momentum as the dual manifestation of energy, transformed once again by the velocity of light. For the massless photon, the transformer between time and space, energy is identical to momentum. For massive particles, the same idea is expressed by Einstein's equation E = mc². Although mass and energy are numerically equivalent, they are conceptually quite different.

The most obvious feature of gravitation is the gravitational potential that holds us on the planet and kills us when we fall from significant heights. We may look upon potential energy as kinetic energy travelling in the form of a massless particle at the speed of light. This is the nominal speed for gauge particles like photons, gravitons (if they exist) and gluons. We may understand the potential energy of massive particles as arising from their internal motions moving at light speed, so that their interior world comprises null geodesics which account for their apparent zero size in spacetime. This seems consistent with Wilczek's idea, proposed above, that the mass of baryons is produced by the kinetic motions of their internal components, which are generally much lighter than the baryons themselves. Mass, we might say, is energy trapped in a null geodesic. Potential energy - Wikipedia, Wilczek (2002) op. cit. chapter 10 sqq.

We can understand the 3D component of Minkowski space by thinking of the differences between wired and wireless in practical communication networks. Wireless communication is serial (one dimensional) and channels are distinguished by frequency or energy, as we find in quantum mechanics. Wired networks, on the other hand, need three dimensions so that connections can cross without intersecting. We may consider the case of moving fermions by analogy with city traffic. In a two dimensional road system, time division multiplexing introduced by traffic lights enables traffic streams to cross. Three dimensional structures like overpasses and tunnels enable uninterrupted two dimensional traffic flow, and separation of air traffic in three dimensional space is established by requiring vehicles travelling in different directions to operate at different altitudes. If we understand the emergence of new features in the universe as a matter of random variation and controlled selection, we may see that 3D space is adequate for complete wired connection, so spaces with 4 or more dimensions have no evolutionary raison d'etre and may be selected out.

Wired networks are therefore more like plumbing or electrical power networks. Tuning is not required to discriminate between sources, but switching may be necessary for one source to connect to many others. A wired network transmitting pure unmodulated power shares three properties with gravitation: it exists in four dimensions, three of space and one of time; it can deal indiscriminately with energy in any form; and the direction of motion of signals is determined by potentials.

From an abstract point of view a fixed point is the dual of the compact and convex nature of a set and we can expect to find a different fixed point corresponding to every continuous mapping of the set onto itself: fixed point theory is symmetrical with respect to the set of all continuous functions. In the case of quantum mechanics these fixed points are the solutions to the eigenvalue equation. Their existence is guaranteed by the Hermitian nature of the unitary operators in quantum mechanics. Agarwal, Meehan & O'Regan (2009): Fixed Point Theory and Applications
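
A short sketch of this correspondence: for a randomly generated Hermitian operator, each eigenvector solves H v = E v, so the direction of v is a fixed point of the mapping the operator defines. The operator below is random, purely for illustration.

    import numpy as np

    rng = np.random.default_rng(1)
    a = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
    H = (a + a.conj().T) / 2  # a random Hermitian operator (an observable)

    energies, vectors = np.linalg.eigh(H)
    v = vectors[:, 0]  # one normalized eigenvector
    print(np.allclose(H @ v, energies[0] * v))  # True: v is a fixed direction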

An important feature of the network model is that it is symmetric with respect to complexity. The atom of a network is two sources and a channel, which we may think of quantum mechanically as two fermions and a boson. Sources and connections can exist at any scale. Communications between discrete sources become possible when they share a language or codec, that is a symmetry. Since gravitation is the universal codec which couples all sources without discrimination so long as they have energy, we can imagine that it emerges in about the same epoch of the evolution of the universe as quantum mechanics. Unlike quantum mechanics however, where connection is established by specific values of energy which we compute using the eigenfunctions of specific operators, gravitational connections are indiscriminate. This suggests that they represent a symmetry deeper and simpler than quantum mechanics which reflects the consistent unity of the initial singularity. We would expect gravitation to respond to all the energy levels present in a vacuum, for instance, which is why the cosmological constant problem is so troubling in a universe with infinite degrees of freedom, each with attached zero point energy.

Consequently we might conjecture that gravitation is not quantized. In §8 above we used Shannon's communication theory to connect error free communication with quantization. If gravitation is a universal code which cannot go wrong, there is no ground for quantization. Nevertheless gravitation imposes precise structure on the universe.

The classical general theory of relativity predicts classical gravitational waves which have now been observed, giving us information about large cosmic events occurring at great distances. Great distance and the overall weakness of gravity mean that these waves require very large high precision interferometers for their detection. Gravitational-wave observatory - Wikipedia

In short perhaps we may see gravitation as a hybrid between classical and quantum reality before they became differentiated. Our cities are festooned with power cables, telephone lines and surrounded by clouds of wireless. Our bodies, on the other hand, are largely wired. But in all cases communication requires contact, and a symmetry that unites Hilbert and Minkowski space.

Given this picture, we might understand that energy attracts energy because it is all the same energy created by action, subsequently being mapped into fixed potential energy equal and opposite to the kinetic energy from which it came. Lagrangian mechanics - Wikipedia (ref link above)

We might imagine that the coupling between the two spaces Hilbert and Minkowski which we ascribe to observation describes the inner life of particles, ie it is a story of consciousness in the same way as my conscious awareness is related to my physical actions. So I imagine that quantum computations in Hilbert spaces are to be found inside particles as my consciousness is to be found inside me. What I hope is that a clear distinction between Hilbert and Minkowski removes many of the difficulties in quantum field theory that arise from ignoring this distinction. So we think that every particle is a computer and the relationships between particles are networks.

This, and the idea that gravitation is not quantized, suggests that gravitation must be some primordial quality of the interior space of the initial singularity described by the classical general theory of relativity. This would be consistent with the idea that it is present from the beginning and guides the growth of the universe as a whole as Einstein found. As Einstein noted when he published his field equation for gravitation, the general theory of relativity is a closed logical structure and there is little choice for an alternative. From this point of view, we might see the interior of the initial singularity as a continuous Lie group fulfilling the hypotheses of fixed point theory. General relativity - Wikipedia, Abraham Pais (1982): 'Subtle is the Lord...': The Science and Life of Albert Einstein page 256, Lie Group - Wikipedia

We have built a Hilbert space inside the initial singularity through its propensity to act, our starting point being mathematical fixed point theory. The topological barrier constraining the universe is the boundary between consistency inside and inconsistency outside.


Copyright:

You may copy this material freely provided only that you quote fairly and provide a link (or reference) to your source.

Further reading

Books

Agarwal (2000), Ravi P., and Maria Meehan, Donal O'Regan, Fixed Point Theory and Applications, Cambridge University Press 2009 'This book provides a clear exposition of the flourishing field of fixed point theory. Starting from the basics of Banach's contraction theorem, most main results and techniques are developed. . . . The theory is applied to many areas of current interest in analysis. . . . '
Amazon
  back

Ashby (1964), W Ross, An Introduction to Cybernetics, Methuen 1956, 1964 'This book is intended to provide [an introduction to cybernetics]. It starts from common-place and well understood concepts, and proceeds step by step to show how these concepts can be made exact, and how they can be developed until they lead into such subjects as feedback, stability, regulation, ultrastability, information, coding, noise and other cybernetic topics.' 
Amazon
  back

Augustine (419, 1991), Saint, and Edmond Hill (Introduction, translation and notes), and John E Rotelle (editor), The Trinity, New City Press 399-419, 1991 Written 399 - 419: De Trinitate is a radical restatement, defence and development of the Christian doctrine of the Trinity. Augustine's book has served as a foundation for most subsequent work, particularly that of Thomas Aquinas.  
Amazon
  back

Auyang (1995), Sunny Y., How is Quantum Field Theory Possible?, Oxford University Press 1995 Jacket: 'Quantum field theory (QFT) combines quantum mechanics with Einstein's special theory of relativity and underlies elementary particle physics. This book presents a philosophical analysis of QFT. It is the first treatise in which the philosophies of space-time, quantum phenomena and particle interactions are encompassed in a unified framework.' 
Amazon
  back

Barrow (1996), John D., and Frank J. Tipler, The Anthropic Cosmological Principle, Oxford University Press 1986, 1996 'This wide-ranging and detailed book explores the many ramifications of the Anthropic Cosmological Principle, covering the whole spectrum of human inquiry from Aristotle to Z bosons. Bringing a unique combination of skills and knowledge to the subject, John D. Barrow and Frank J. Tipler - two of the world's leading cosmologists - cover the definition and nature of life, the search for extraterrestrial intelligence, and the interpretation of the quantum theory in relation to the existence of observers.' 
Amazon
  back

Bastin (1995), Ted, and C W Kilmister, Combinatorial Physics, World Scientific 1995 About this book (World Scientific) 'The authors aim to reinstate a spirit of philosophical enquiry in physics. They abandon the intuitive continuum concepts and build up constructively a combinatorial mathematics of process. This radical change alone makes it possible to calculate the coupling constants of the fundamental fields which — via high energy scattering — are the bridge from the combinatorial world into dynamics. The untenable distinction between what is ‘observed’, or measured, and what is not, upon which current quantum theory is based, is not needed. If we are to speak of mind, this has to be present — albeit in primitive form — at the most basic level, and not to be dragged in at one arbitrary point to avoid the difficulties about quantum observation. There is a growing literature on information-theoretic models for physics, but hitherto the two disciplines have gone in parallel. In this book they interact vitally.' 
Amazon
  back

Brown (2018), Kevin, Reflections on Relativity, 2018 ' . . . general relativity teaches us that the principles of special relativity are applicable only over infinitesimal regions in the presence of gravitation, so in a sense the general theory restricts rather than generalizes the special theory. However, we can also regard special relativity as a theory of flat four-dimensional spacetime, characterized by the Minkowski metric (in suitable coordinates), and the general theory generalizes this by allowing the spacetime manifold to be curved, as represented by a wider class of metric tensors. It is remarkable that this generalization, which is so simple and natural from the geometrical standpoint, leads almost uniquely to a viable theory of gravitation.' (page 700) 
Amazon
  back

Cantor (1897, 1955), Georg, Contributions to the Founding of the Theory of Transfinite Numbers (Translated, with Introduction and Notes by Philip E B Jourdain), Dover 1895, 1897, 1955 Jacket: 'One of the greatest mathematical classics of all time, this work established a new field of mathematics which was to be of incalculable importance in topology, number theory, analysis, theory of functions, etc, as well as the entire field of modern logic.' 
Amazon
  back

Casti (1996), John L, Five Golden Rules: Great Theories of 20th-Century Mathematics - and Why They Matter, John Wiley and Sons 1996 Preface: '[this book] is intended to tell the general reader about mathematics by showcasing five of the finest achievements of the mathematician's art in this [20th] century.' p ix. Treats the Minimax theorem (game theory), the Brouwer Fixed-Point theorem (topology), Morse's theorem (singularity theory), the Halting theorem (theory of computation) and the Simplex method (optimisation theory). 
Amazon
  back

Cercignani (2006), Carlo, Ludwig Boltzmann: The Man Who Trusted Atoms, Oxford University Press, USA 2006 'Cercignani provides a stimulating biography of a great scientist. Boltzmann's greatness is difficult to state, but the fact that the author is still actively engaged in research into some of the finer, as yet unresolved issues provoked by Boltzmann's work is a measure of just how far ahead of his time Boltzmann was. It is also tragic to read of Boltzmann's persecution by his contemporaries, the energeticists, who regarded atoms as a convenient hypothesis, but not as having a definite existence. Boltzmann felt that atoms were real and this motivated much of his research. How Boltzmann would have laughed if he could have seen present-day scanning tunnelling microscopy images, which resolve the atomic structure at surfaces! If only all scientists would learn from Boltzmann's life story that it is bad for science to persecute someone whose views you do not share but cannot disprove. One surprising fact I learned from this book was how research into thermodynamics and statistical mechanics led to the beginnings of quantum theory (such as Planck's distribution law, and Einstein's theory of specific heat). Lecture notes by Boltzmann also seem to have influenced Einstein's construction of special relativity. Cercignani's familiarity with Boltzmann's work at the research level will probably set this above other biographies of Boltzmann for a very long time to come.' Dr David J Bottomley  
Amazon
  back

Cohen (1980), Paul J, Set Theory and the Continuum Hypothesis, Benjamin/Cummings 1966-1980 Preface: 'The notes that follow are based on a course given at Harvard University, Spring 1965. The main objective was to give the proof of the independence of the continuum hypothesis [from the Zermelo-Fraenkel axioms for set theory with the axiom of choice included]. To keep the course as self contained as possible we included background materials in logic and axiomatic set theory as well as an account of Gödel's proof of the consistency of the continuum hypothesis. . . .'  
Amazon
  back

Dauben (1990), Joseph Warren, Georg Cantor: His Mathematics and Philosophy of the Infinite, Princeton University Press 1990 Jacket: 'One of the greatest revolutions in mathematics occurred when Georg Cantor (1843-1918) promulgated his theory of transfinite sets. . . . Set theory has been widely adopted in mathematics and philosophy, but the controversy surrounding it at the turn of the century remains of great interest. Cantor's own faith in his theory was partly theological. His religious beliefs led him to expect paradox in any concept of the infinite, and he always retained his belief in the utter veracity of transfinite set theory. Later in his life, he was troubled by attacks of severe depression. Dauben shows that these played an integral part in his understanding and defense of set theory.' 
Amazon
  back

Davies (1992), Paul, The Mind of God: Science and the Search for Ultimate Meaning, Penguin Books 1992 'Paul Davies' "The Mind of God: Science and the Search for Ultimate Meaning" explores how modern science is beginning to shed light on the mysteries of our existence. Is the universe - and our place in it - the result of random chance, or is there an ultimate meaning to existence? Where did the laws of nature come from? Were they created by a higher force, or can they be explained in some other way? How, for example, could a mechanism as complex as an eye have evolved without a creator? Paul Davies argues that the achievement of science and mathematics in unlocking the secrets of nature mean that there must be a deep and significant link between the human mind and the organization of the physical world. . . . ' 
Amazon
  back

Deutsch (1997), David, The Fabric of Reality: The Science of Parallel Universes - and its Implications, Allen Lane Penguin Press 1997 Jacket: 'Quantum physics, evolution, computation and knowledge - these four strands of scientific theory and philosophy have, until now, remained incomplete explanations of the way the universe works. . . . Oxford scholar DD shows how they are so closely intertwined that we cannot properly understand any one of them without reference to the other three. . . .' 
Amazon
  back

Dirac (1983), P A M, The Principles of Quantum Mechanics (4th ed), Oxford UP/Clarendon 1983 Jacket: '[this] is the standard work in the fundamental principles of quantum mechanics, indispensable both to the advanced student and the mature research worker, who will always find it a fresh source of knowledge and stimulation.' (Nature)
Amazon
  back

Feynman (1965), Richard P, and Albert P Hibbs, Quantum Mechanics and Path Integrals, McGraw Hill 1965 Preface: 'The fundamental physical and mathematical concepts which underlie the path integral approach were first developed by R P Feynman in the course of his graduate studies at Princeton, ... . These early inquiries were involved with the problem of the infinite self-energy of the electron. In working on that problem, a "least action" principle was discovered [which] could deal successfully with the infinity arising in the application of classical electrodynamics.' As described in this book, Feynman, inspired by Dirac, went on to develop this insight into a fruitful source of solutions to many quantum mechanical problems.
Amazon
  back

Feynman (2002), Richard, Feynman Lectures on Gravitation, Westview Press 2002 Amazon Editorial Reviews Book Description 'The Feynman Lectures on Gravitation are based on notes prepared during a course on gravitational physics that Richard Feynman taught at Caltech during the 1962-63 academic year. For several years prior to these lectures, Feynman thought long and hard about the fundamental problems in gravitational physics, yet he published very little. . . . Characteristically, Feynman took an untraditional non-geometric approach to gravitation and general relativity based on the underlying quantum aspects of gravity. Hence, these lectures contain a unique pedagogical account of the development of Einstein's general theory of relativity as the inevitable result of the demand for a self-consistent theory of a massless spin-2 field (the graviton) coupled to the energy-momentum tensor of matter. This approach also demonstrates the intimate and fundamental connection between gauge invariance and the principle of equivalence.'
Amazon
  back

Feynman (2005), Richard Phillips, Feynman's Thesis: A New Approach to Quantum Mechanics, World Scientific Publishing Company 2005 Amazon editorial review: 'The young Feynman revealed here was full of invention, verve, and ambition. His new approach to quantum mechanics, after simmering for decades beneath the surface of theoretical physics, burst into new prominence in the 1970s. Now its influence is pervasive, and still expanding. Feynman's original presentation is not only uniquely clear, but also contains insights and perspectives that are not widely known, and might well provide ammunition for another explosion or two.' Frank Wilczek
Amazon
  back

Fortun (1998), Mike, and Herbert J Bernstein, Muddling Through: Pursuing Science and Truths in the Twenty-First Century, Counterpoint 1998 Amazon editorial review: 'Does science discover truths or create them? Does dioxin cause cancer or not? Is corporate-sponsored research valid or not? Although these questions reflect the way we're used to thinking, maybe they're not the best way to approach science and its place in our culture. Physicist Herbert J. Bernstein and science historian Mike Fortun, both of the Institute for Science and Interdisciplinary Studies (ISIS), suggest a third way of seeing, beyond taking one side or another, in Muddling Through: Pursuing Science and Truths in the 21st Century. While they deal with weighty issues and encourage us to completely rethink our beliefs about science and truth, they do so with such grace and humor that we follow with ease discussions of toxic-waste disposal, the Human Genome Project, and retooling our language to better fit the way science is actually done.' 
Amazon
  back

Hawking (1975), Stephen W, and G F R Ellis, The Large Scale Structure of Space-Time, Cambridge UP 1975 Preface: 'Einstein's General Theory of Relativity . . . leads to two remarkable predictions about the universe: first that the final fate of massive stars is to collapse behind an event horizon to form a "black hole" which will contain a singularity; and secondly that there is a singularity in our past which constitutes, in some sense, a beginning to our universe. Our discussion is principally aimed at developing these two results.'
Amazon
  back

Hodges (1983), Andrew, Alan Turing: The Enigma, Burnett 1983 Author's note: '. . . modern papers often employ the usage turing machine. Sinking without a capital letter into the collective mathematical consciousness (as with the abelian group, or the riemannian manifold) is probably the best that science can offer in the way of canonisation.' (530) 
Amazon
  back

James (1912), Grace, Green Willow & Other Japanese Fairy Tales, Macmillan & Co, Senate Random House 1912, 1996 Jacket: 'Japanese Fairy Tales brings together a magnificent selection of stories from all parts of the Land of the Rising Sun. Drawing on a richly imaginative folk tradition, and set against the backdrop of Japan's mythic past, the tales are suffused with the essence of this magical land.'
Amazon
  back

Jech (1997), Thomas, Set Theory, Springer 1997 Jacket: 'This book covers major areas of modern set theory: cardinal arithmetic, constructible sets, forcing and Boolean-valued models, large cardinals and descriptive set theory. . . . It can be used as a textbook for a graduate course in set theory and can serve as a reference book.' 
Amazon
  back

Joseph (2010), George Gheverghese, The Crest of the Peacock: Non-European Roots of Mathematics, Princeton University Press 2010 'From the Ishango Bone of central Africa and the Inca quipu of South America to the dawn of modern mathematics, The Crest of the Peacock makes it clear that human beings everywhere have been capable of advanced and innovative mathematical thinking. George Gheverghese Joseph takes us on a breathtaking multicultural tour of the roots and shoots of non-European mathematics. He shows us the deep influence that the Egyptians and Babylonians had on the Greeks, the Arabs' major creative contributions, and the astounding range of successes of the great civilizations of India and China.' 
Amazon
  back

Kaku (1998), Michio, Introduction to Superstrings and M-Theory (Graduate Texts in Contemporary Physics), Springer 1998 'Called by some "the theory of everything," superstrings may solve a problem which has eluded physicists for the past 50 years -- the final unification of the two great theories of the twentieth century, general relativity and quantum field theory. This is a course-tested comprehensive introductory graduate text on superstrings which stresses the most current areas of interest, not covered in other presentations, including: string field theory, multi loops, Teichmueller spaces, conformal field theory, and four-dimensional strings. The book begins with a simple discussion of point particle theory, and uses the Feynman path integral technique to unify the presentation of superstrings. Prerequisites are an acquaintance with quantum mechanics and relativity. This second edition has been revised and updated throughout.'
Amazon
  back

Kaku (2021), Michio, The God Equation: The Quest for a Theory of Everything, Doubleday 2021 'This is the story of a quest: to find a Theory of Everything. Einstein dedicated his life to seeking this elusive Holy Grail, a single, revolutionary "god equation" which would tie all the forces in the universe together, yet never found it. Some of the greatest minds in physics took up the search, from Stephen Hawking to Brian Greene. None have yet succeeded. In The God Equation, renowned theoretical physicist Michio Kaku takes the reader on a mind-bending ride through the twists and turns of this epic journey: a mystery that has fascinated him for most of his life. He guides us through the key debates in modern physics, from Newton's law of gravity via relativity and quantum mechanics to the latest developments in string theory.'
Amazon
  back

Khinchin (1998), Aleksandr Yakovlevich, The Mathematical Foundations of Quantum Statistics, Dover 1998 'In the area of quantum statistics, I show that a rigorous mathematical basis of the computational formulas of statistical physics . . . may be obtained from an elementary application of the well-developed limit theorems of the theory of probability.' 
Amazon
  back

Kolmogorov (1956), Andrey Nikolaevich, and Nathan Morrison (Translator) (With an added bibliography by A T Bharucha-Reid), Foundations of the Theory of Probability, Chelsea 1956 Preface: 'The purpose of this monograph is to give an axiomatic foundation for the theory of probability. . . . This task would have been a rather hopeless one before the introduction of Lebesgue's theories of measure and integration. However, after Lebesgue's publication of his investigations, the analogies between measure of a set and mathematical expectation of a random variable became apparent. These analogies allowed of further extensions; thus, for example, various properties of independent random variables were seen to be in complete analogy with the corresponding properties of orthogonal functions . . .' 
Amazon
  back

Kuhn (1996), Thomas S, The Structure of Scientific Revolutions, U of Chicago Press 1962, 1970, 1996 Introduction: 'a new theory, however special its range of application, is seldom just an increment to what is already known. Its assimilation requires the reconstruction of prior theory and the re-evaluation of prior fact, an intrinsically revolutionary process that is seldom completed by a single man, and never overnight.' [p 7]  
Amazon
  back

Lonergan (1997), Bernard J F, and Robert M. Doran, Frederick E. Crowe (eds), Verbum : Word and Idea in Aquinas (Collected Works of Bernard Lonergan volume 2), University of Toronto Press 1997 Jacket: 'Verbum is a product of Lonergan's eleven years of study of the thought of Thomas Aquinas. The work is considered by many to be a breakthrough in the history of Lonergan's theology . . .. Here he interprets aspects in the writing of Aquinas relevant to trinitarian theory and, as in most of Lonergan's work, one of the principal aims is to assist the reader in the search to understand the workings of the human mind.' 
Amazon
  back

Macmillan (2020), Margaret, War: How Conflict Shaped Us, Profile Books 2020 'In War, Professor Margaret MacMillan explores the deep links between society and war and the questions they raise. We learn when war began - whether among early homo sapiens or later, as we began to organise ourselves into tribes and settle in communities. We see the ways in which war reflects changing societies and how war has brought change - for better and worse. Economies, science, technology, medicine, culture: all are instrumental in war and have been shaped by it - without conflict we might not have had penicillin, female emancipation, radar or rockets. Throughout history, writers, artists, film-makers, playwrights, and composers have been inspired by war - whether to condemn, exalt or simply puzzle about it. If we are never to be rid of war, how should we think about it and what does that mean for peace?'
Amazon
  back

Newton (1972), Isaac, Philosophiae Naturalis Principia Mathematica, Harvard University Press 1972 One of the most important contributions to human knowledge. First translated from the Latin by Andrew Motte in 1729.
Amazon
  back

Nielsen (2000), Michael A, and Isaac L Chuang, Quantum Computation and Quantum Information, Cambridge University Press 2000 Review: A rigorous, comprehensive text on quantum information is timely. The study of quantum information and computation represents a particularly direct route to understanding quantum mechanics. Unlike the traditional route to quantum mechanics via Schroedinger's equation and the hydrogen atom, the study of quantum information requires no calculus, merely a knowledge of complex numbers and matrix multiplication. In addition, quantum information processing gives direct access to the traditionally advanced topics of measurement of quantum systems and decoherence.' Seth Lloyd, Department of Quantum Mechanical Engineering, MIT, Nature 6876: vol 416 page 19, 7 March 2002. 
Amazon
  back

Pais (1982), Abraham, 'Subtle is the Lord...': The Science and Life of Albert Einstein, Oxford UP 1982 Jacket: 'In this . . . major work Abraham Pais, himself an eminent physicist who worked alongside Einstein in the post-war years, traces the development of Einstein's entire oeuvre. . . . Running through the book is a completely non-scientific biography . . . including many letters which appear in English for the first time, as well as other information not published before.' [Raffiniert ist der Herr Gott, aber boshaft ist er nicht]
Amazon
  back

Polya (1974), George, and Gordon Latta, Complex Variables, John Wiley & Sons Inc 1974 Preface: 'After having lectured for several decades on complex variables to prospective engineers and physicists, I have definite and, I hope, not unrealistic ideas about their requirements and preferences. . . .
I hope that this book is useful not only to future engineers and physicists, but also to future mathematicians. Mathematical concepts and facts gain in vividness and clarity if they are well connected with the world around us and with general ideas, and if we obtain them by our own work through successive stages instead of in one lump.' 
Amazon
  back

Streater (2000), Raymond F, and Arthur S Wightman, PCT, Spin and Statistics, and All That, Princeton University Press 2000 Amazon product description: 'PCT, Spin and Statistics, and All That is the classic summary of and introduction to the achievements of Axiomatic Quantum Field Theory. This theory gives precise mathematical responses to questions like: What is a quantized field? What are the physically indispensable attributes of a quantized field? Furthermore, Axiomatic Field Theory shows that a number of physically important predictions of quantum field theory are mathematical consequences of the axioms. Here Raymond Streater and Arthur Wightman treat only results that can be rigorously proved, and these are presented in an elegant style that makes them available to a broad range of physics and theoretical mathematics.'
Amazon
  back

Tanenbaum (1996), Andrew S, Computer Networks, Prentice Hall International 1996 Preface: 'The key to designing a computer network was first enunciated by Julius Caesar: Divide and Conquer. The idea is to design a network as a sequence of layers, or abstract machines, each one based upon the previous one. . . . This book uses a model in which networks are divided into seven layers. The structure of the book follows the structure of the model to a considerable extent.'  
Amazon
  back

Tapsell (2014), Kieran, Potiphar's Wife: The Vatican's Secret and Child Sexual Abuse, ATF Press 2014 Back cover: 'For 1500 years the Catholic Church accepted that clergy who sexually abused children deserved to be stripped of their status as priests and then imprisoned. . . . That all changed in 1922 when Pope Pius XI issued his decree Crimen Sollicitationis that created a de facto "privilege of clergy" by imposing the "secret of the Holy Office" on all information obtained through the Church's canonical investigations. If the State did not know about these crimes, then there would be no State trials, and the matter could be treated as a purely canonical crime to be dealt with in secret in the Church courts.'
Amazon
  back

Veltman (1994), Martinus, Diagrammatica: The Path to the Feynman Rules, Cambridge University Press 1994 Jacket: 'This book provides an easily accessible introduction to quantum field theory via Feynman rules and calculations in particle physics. The aim is to make clear what the physical foundations of present-day field theory are, to clarify the physical content of Feynman rules, and to outline their domain of applicability. ... The book includes valuable appendices that review some essential mathematics, including complex spaces, matrices, the CBH equation, traces and dimensional regularization. ...' 
Amazon
  back

Veltman (2003), Martinus, Facts and Mysteries in Elementary Particle Physics, World Scientific 2003 'Introduction: The twentieth century has seen an enormous progress in physics. The fundamental physics of the first half of the century was dominated by the theory of relativity, Einstein's theory of gravitation and the theory of quantum mechanics. The second half of the century saw the rise of elementary particle physics. . . . Through this development there has been a subtle change in point of view. In Einstein's theory space and time play an overwhelming dominant role. . . . The view that we would like to defend can perhaps best be explained by an analogy. To us, space-time and the laws of quantum mechanics are like the decor, the setting of a play. The elementary particles are the actors, and physics is what they do. . . . Thus in this book the elementary particles are the central objects.'
Amazon
  back

von Neumann (1983), John, and Robert T Beyer (translator), Mathematical Foundations of Quantum Mechanics, Princeton University Press 1983 Jacket: '. . . a revolutionary book that caused a sea change in theoretical physics. . . . JvN begins by presenting the theory of Hermitean operators and Hilbert spaces. These provide the framework for transformation theory, which JvN regards as the definitive form of quantum mechanics. . . . Regarded as a tour de force at the time of its publication, this book is still indispensable for those interested in the fundamental issues of quantum mechanics.'
Amazon
  back

Weinberg (1995), Steven, The Quantum Theory of Fields Volume I: Foundations, Cambridge University Press 1995 Jacket: 'After a brief historical outline, the book begins anew with the principles about which we are most certain, relativity and quantum mechanics, and then the properties of particles that follow from these principles. Quantum field theory then emerges from this as a natural consequence. The classic calculations of quantum electrodynamics are presented in a thoroughly modern way, showing the use of path integrals and dimensional regularization. The account of renormalization theory reflects the changes in our view of quantum field theory since the advent of effective field theories. The book's scope extends beyond quantum electrodynamics to elementary particle physics and nuclear physics. It contains much original material, and is peppered with examples and insights drawn from the author's experience as a leader of elementary particle research. Problems are included at the end of each chapter.'
Amazon
  back

Weyl (1985), Hermann, Space Time Matter (translated by Henry L Brose), Dover 1985 Amazon customer review: ' The birth of gauge theory by its author: This book bewitched several generations of physicists and students. Hermann Weyl was one of the very great mathematicians of this century. He was also a great physicist and an artist with ideas and words. In this book you will find, at a deep level, the philosophy, mathematics and physics of space-time. It appeared soon after Einstein's famous paper on General Relativity, and is, in fact, a magnificent exposition of it, or, rather, of a tentative generalization of it. The mathematical part is of the highest class, striving to put geometry to the forefront. Actually, the book introduced a far-reaching generalization of the theory of connections, with respect to the Levi-Civita theory. It was not a generalization for itself, but motivated by the dream (Einstein's) of including gravitation and electromagnetism in the same (geometrical) theory. The result was gauge theory, which, slightly modified and applied to quantum mechanics resulted in the theory which dominates present particle physics. Weyl's unified theory was proved wrong by Einstein, and his criticism alone, accepted by Weyl and included in the book, would justify the reading. Though wrong, Weyl's theory is so beautiful that Paul Dirac stated that nature could not afford neglecting such perfection, and that the theory was probably only misplaced. Prophetic words! . . . ' Henrique Fleming 
Amazon
  back

Whitehead (1910, 1962), Alfred North, and Bertrand Arthur Russell, Principia Mathematica (Cambridge Mathematical Library), Cambridge University Press 1910, 1962 The great three-volume Principia Mathematica is deservedly the most famous work ever written on the foundations of mathematics. Its aim is to deduce all the fundamental propositions of logic and mathematics from a small number of logical premisses and primitive ideas, and so to prove that mathematics is a development of logic. Not long after it was published, Goedel showed that the project could not completely succeed, but that in any system, such as arithmetic, there were true propositions that could not be proved.  
Amazon
  back

Wiener (1996), Norbert, Cybernetics or Control and Communication in the Animal and the Machine, MIT Press 1996 The classic founding text of cybernetics. 
Amazon
  back

Wilczek (2008), Frank, The Lightness of Being: Mass, Ether, and the Unification of Forces, Basic Books 2008 ' In this excursion to the outer limits of particle physics, Wilczek explores what quarks and gluons, which compose protons and neutrons, reveal about the manifestation of mass and gravity. A corecipient of the 2004 Nobel Prize in Physics, Wilczek knows what he’s writing about; the question is, will general science readers? Happily, they know what the strong interaction is (the forces that bind the nucleus), and in Wilczek, they have a jovial guide who adheres to trade publishing’s belief that a successful physics title will not include too many equations. Despite this injunction (against which he lightly protests), Wilczek delivers an approachable verbal picture of what quarks and gluons are doing inside a proton that gives rise to mass and, hence, gravity. Casting the light-speed lives of quarks against “the Grid,” Wilczek’s term for the vacuum that theoretically seethes with quantum activity, Wilczek exudes a contagious excitement for discovery. A near-obligatory acquisition for circulating physics collections.' --Gilbert Taylor  
Amazon
  back

Zee (2010), Anthony, Quantum Field Theory in a Nutshell, Princeton University Press 2010 ' Since it was first published, Quantum Field Theory in a Nutshell has quickly established itself as the most accessible and comprehensive introduction to this profound and deeply fascinating area of theoretical physics. Now in this fully revised and expanded edition, A. Zee covers the latest advances while providing a solid conceptual foundation for students to build on, making this the most up-to-date and modern textbook on quantum field theory available. This expanded edition features several additional chapters, as well as an entirely new section describing recent developments in quantum field theory such as gravitational waves, the helicity spinor formalism, on-shell gluon scattering, recursion relations for amplitudes with complex momenta, and the hidden connection between Yang-Mills theory and Einstein gravity. Zee also provides added exercises, explanations, and examples, as well as detailed appendices, solutions to selected exercises, and suggestions for further reading.' 
Amazon
  back

Zemanian (1996), Armen H, Transfiniteness for Graphs, Electrical Networks and Random Walks, Springer Verlag 1996 'A substantial introduction is followed by chapters covering transfinite graphs; connectedness problems; finitely structured transfinite graphs; transfinite electrical networks; permissively finitely structured networks; and a theory for random walks on a finitely structured transfinite network. Appendices present brief surveys of ordinal and cardinal numbers; summable series; and irreducible and reversible Markov chains. Accessible to those familiar with basic ideas about graphs, Hilbert spaces, and resistive electrical networks. (Annotation copyright Book News, Inc. Portland, Or.)'
Amazon
  back

Links

2019 redefinition of the SI base units - Wikipedia, 2019 redefinition of the SI base units - Wikipedia - the free encyclopedia, ' Effective 20 May 2019, the 144th anniversary of the Metre Convention, the SI base units were redefined in agreement with the International System of Quantities. In the redefinition, four of the seven SI base units – the kilogram, ampere, kelvin, and mole – were redefined by setting exact numerical values when expressed in SI units for the Planck constant (h), the elementary electric charge (e), the Boltzmann constant (k_B), and the Avogadro constant (N_A), respectively. The second, metre, and candela were already defined by physical constants and were not subject to correction to their definitions.' back

Action potential - Wikipedia, Action potential - Wikipedia, the free encyclopedia, ' In physiology, an action potential (AP) occurs when the membrane potential of a specific cell location rapidly rises and falls: this depolarization then causes adjacent locations to similarly depolarize. Action potentials occur in several types of animal cells, called excitable cells, which include neurons, muscle cells, endocrine cells and in some plant cells. ' back

Ad Gentes (Vatican II), Decree on the Mission Activity of the Church, 'Divinely sent to the nations of the world to be unto them "a universal sacrament of salvation," the Church, driven by the inner necessity of her own catholicity, and obeying the mandate of her Founder (cf. Mark 16:16), strives ever to proclaim the Gospel to all men. The Apostles themselves, on whom the Church was founded, following in the footsteps of Christ, "preached the word of truth and begot churches." It is the duty of their successors to make this task endure "so that the word of God may run and be glorified (2 Thess. 3:1) and the kingdom of God be proclaimed and established throughout the world.' back

Alan Turing, On Computable Numbers, with an application to the Entscheidungsproblem, 'The "computable" numbers may be described briefly as the real numbers whose expressions as a decimal are calculable by some finite means. Although the subject of this paper is ostensibly the computable numbers, it is almost equally easy to define and investigate computable functions of an integral variable of a real or computable variable, computable predicates and so forth. . . . ' back

Algebraic closure - Wikipedia, Algebraic closure - Wikipedia, the free encyclopedia, 'In mathematics, particularly abstract algebra, an algebraic closure of a field K is an algebraic extension of K that is algebraically closed. It is one of many closures in mathematics. Using Zorn's lemma, it can be shown that every field has an algebraic closure, and that the algebraic closure of a field K is unique up to an isomorphism that fixes every member of K. Because of this essential uniqueness, we often speak of the algebraic closure of K, rather than an algebraic closure of K. . . . The fundamental theorem of algebra states that the algebraic closure of the field of real numbers is the field of complex numbers.' back

Allegory of the cave - Wikipedia, Allegory of the cave - Wikipedia, the free encyclopedia, 'Plato has Socrates describe a gathering of people who have lived chained to the wall of a cave all of their lives, facing a blank wall. The people watch shadows projected on the wall by things passing in front of a fire behind them, and begin to designate names to these shadows. The shadows are as close as the prisoners get to viewing reality. He then explains how the philosopher is like a prisoner who is freed from the cave and comes to understand that the shadows on the wall do not make up reality at all, as he can perceive the true form of reality rather than the mere shadows seen by the prisoners.' back

Analogue Computer - Wikipedia, Analogue Computer - Wikipedia, the free encyclopedia, ' An analog computer or analogue computer is a type of computer that uses the continuous variation aspect of physical phenomena such as electrical, mechanical, or hydraulic quantities to model the problem being solved. In contrast, digital computers represent varying quantities symbolically and by discrete values of both time and amplitude.' back

Andrea Falcon (Stanford Encyclopedia of Philosophy), Aristotle on Causality, ' Each Aristotelian science consists in the causal investigation of a specific department of reality. If successful, such an investigation results in causal knowledge; that is, knowledge of the relevant or appropriate causes. The emphasis on the concept of cause explains why Aristotle developed a theory of causality which is commonly known as the doctrine of the four causes. For Aristotle, a firm grasp of what a cause is, and how many kinds of causes there are, is essential for a successful investigation of the world around us.' back

Anthropic principle - Wikipedia, Anthropic principle - Wikipedia, the free encyclopedia, 'The anthropic principle (from Greek anthropos, meaning "human") is the philosophical consideration that observations of the universe must be compatible with the conscious and sapient life that observes it. Some proponents of the anthropic principle reason that it explains why the universe has the age and the fundamental physical constants necessary to accommodate conscious life. As a result, they believe it is unremarkable that the universe's fundamental constants happen to fall within the narrow range thought to be compatible with life.' back

Apophatic theology - Wikipedia, Apophatic theology - Wikipedia, the free encyclopedia, 'Apophatic theology (from Greek ἀπόφασις from ἀπόφημι - apophēmi, "to deny")—also known as negative theology or via negativa (Latin for "negative way")—is a theology that attempts to describe God, the Divine Good, by negation, to speak only in terms of what may not be said about the perfect goodness that is God. It stands in contrast with cataphatic theology.' back

Aquinas Summa I, 3, 1 Proemium, Of the Simplicity of God, 'Cognito de aliquo an sit, inquirendum restat quomodo sit, ut sciatur de eo quid sit. Sed quia de Deo scire non possumus quid sit, sed quid non sit, non possumus considerare de Deo quomodo sit, sed potius quomodo non sit. . . . Potest autem ostendi de Deo quomodo non sit, removendo ab eo ea quae ei non conveniunt, utpote compositionem, motum, et alia huiusmodi. Primo ergo inquiratur de simplicitate ipsius, per quam removetur ab eo compositio.' back

Aquinas, Summa I, 15, 1, Are there ideas in God?, ' . . . in other agents (the form of the thing to be made pre-exists) according to intelligible being, as in those that act by the intellect; and thus the likeness of a house pre-exists in the mind of the builder. And this may be called the idea of the house, since the builder intends to build his house like to the form conceived in his mind. As then the world was not made by chance, but by God acting by His intellect, as will appear later, there must exist in the divine mind a form to the likeness of which the world was made. And in this the notion of an idea consists.' back

Aquinas, Summa I, 25, 3, Is God omnipotent?, '. . . God is called omnipotent because He can do all things that are possible absolutely; which is the second way of saying a thing is possible. For a thing is said to be possible or impossible absolutely, according to the relation in which the very terms stand to one another, possible if the predicate is not incompatible with the subject, as that Socrates sits; and absolutely impossible when the predicate is altogether incompatible with the subject, as, for instance, that a man is a donkey.' back

Aquinas, Summa, I, 1, 2, Is sacred doctrine a science?, 'I answer that, Sacred doctrine is a science. We must bear in mind that there are two kinds of sciences. There are some which proceed from a principle known by the natural light of intelligence, such as arithmetic and geometry and the like. There are some which proceed from principles known by the light of a higher science: thus the science of perspective proceeds from principles established by geometry, and music from principles established by arithmetic. So it is that sacred doctrine is a science because it proceeds from principles established by the light of a higher science, namely, the science of God and the blessed.' back

Aquinas, Summa, I, 14, 8, Is the knowledge of God the cause of things?, 'Now it is manifest that God causes things by His intellect, since His being is His act of understanding; and hence His knowledge must be the cause of things, in so far as His will is joined to it. Hence the knowledge of God as the cause of things is usually called the "knowledge of approbation".' back

Aquinas, Summa, I, 27, 1, Is there procession in God?, 'As God is above all things, we should understand what is said of God, not according to the mode of the lowest creatures, namely bodies, but from the similitude of the highest creatures, the intellectual substances; while even the similitudes derived from these fall short in the representation of divine objects. Procession, therefore, is not to be understood from what it is in bodies, either according to local movement or by way of a cause proceeding forth to its exterior effect, as, for instance, like heat from the agent to the thing made hot. Rather it is to be understood by way of an intelligible emanation, for example, of the intelligible word which proceeds from the speaker, yet remains in him. In that sense the Catholic Faith understands procession as existing in God.' back

Aquinas, Summa, I, 3, 7, Is God altogether simple?, 'I answer that, The absolute simplicity of God may be shown in many ways. First, from the previous articles of this question. For there is neither composition of quantitative parts in God, since He is not a body; nor composition of matter and form; nor does His nature differ from His "suppositum"; nor His essence from His existence; neither is there in Him composition of genus and difference, nor of subject and accident. Therefore, it is clear that God is nowise composite, but is altogether simple. . . . ' back

Aquinas, Summa: I, 14, 1, Is there knowledge in God?, ' I answer that, In God there exists the most perfect knowledge. . . . it is clear that the immateriality of a thing is the reason why it is cognitive; and according to the mode of immateriality is the mode of knowledge. Hence it is said in De Anima ii that plants do not know, because they are wholly material. But sense is cognitive because it can receive images free from matter, and the intellect is still further cognitive, because it is more separated from matter and unmixed, as said in De Anima iii. Since therefore God is in the highest degree of immateriality as stated above (Question 7, Article 1), it follows that He occupies the highest place in knowledge.' back

Aquinas, Summa: I, 14, 1, Is there knowledge in God?, ' Respondeo dicendum quod in Deo perfectissime est scientia. Ad cuius evidentiam, considerandum est quod cognoscentia a non cognoscentibus in hoc distinguuntur, quia non cognoscentia nihil habent nisi formam suam tantum; sed cognoscens natum est habere formam etiam rei alterius, nam species cogniti est in cognoscente. Unde manifestum est quod natura rei non cognoscentis est magis coarctata et limitata, natura autem rerum cognoscentium habet maiorem amplitudinem et extensionem. Propter quod dicit philosophus, III de anima, quod anima est quodammodo omnia. Coarctatio autem formae est per materiam. Unde et supra diximus quod formae, secundum quod sunt magis immateriales, secundum hoc magis accedunt ad quandam infinitatem. Patet igitur quod immaterialitas alicuius rei est ratio quod sit cognoscitiva; et secundum modum immaterialitatis est modus cognitionis. Unde in II de anima dicitur quod plantae non cognoscunt, propter suam materialitatem. Sensus autem cognoscitivus est, quia receptivus est specierum sine materia, et intellectus adhuc magis cognoscitivus, quia magis separatus est a materia et immixtus, ut dicitur in III de anima. Unde, cum Deus sit in summo immaterialitatis, ut ex superioribus patet, sequitur quod ipse sit in summo cognitionis. back

Archdiocese of Baltimore, Baltimore Cathechism No. 1, 'A Catechism of Christian Doctrine, Prepared and Enjoined by Order of the Third Council of Baltimore, or simply the Baltimore Catechism, is the official national catechism for children in the United States. It was the standard Catholic religion school text in the United States from 1885 to the late 1960s.' back

Aristotle, Aristotle, Metaphysics, ' All men naturally desire knowledge. An indication of this is our esteem for the senses; for apart from their use we esteem them for their own sake, and most of all the sense of sight. Not only with a view to action, but even when no action is contemplated, we prefer sight, generally speaking, to all the other senses.The reason of this is that of all the senses sight best helps us to know things, and reveals many distinctions.' back

Aristotle (continuity), Physics V, iii, 'A thing that is in succession and touches is 'contiguous'. The 'continuous' is a subdivision of the contiguous: things are called continuous when the touching limits of each become one and the same and are, as the word implies, contained in each other: continuity is impossible if these extremities are two. This definition makes it plain that continuity belongs to things that naturally in virtue of their mutual contact form a unity. And in whatever way that which holds them together is one, so too will the whole be one, e.g. by a rivet or glue or contact or organic union. ' 227a10 sqq back

Atlas Detector - CERN, Detector & Technology, ' The detector itself is a many-layered instrument designed to detect some of the tiniest yet most energetic particles ever created on earth. It consists of six different detecting subsystems wrapped concentrically in layers around the collision point to record the trajectory, momentum, and energy of particles, allowing them to be individually identified and measured. A huge magnet system bends the paths of the charged particles so that their momenta can be measured as precisely as possible.' back

Augustine of Hippo - Wikipedia, Augustine of Hippo - Wikipedia, ' Augustine of Hippo, Latin: Aurelius Augustinus Hipponensis; 13 November 354 – 28 August 430), also known as Saint Augustine, was a theologian and philosopher of Berber origin and the bishop of Hippo Regius in Numidia, Roman North Africa. His writings influenced the development of Western philosophy and Western Christianity, and he is viewed as one of the most important Church Fathers of the Latin Church in the Patristic Period. His many important works include The City of God, On Christian Doctrine, and Confessions.' back

Bernard d'Espagnat (1979), The Quantum Theory and Reality, 'The doctrine that the world is made up of objects whose existence is independent of human consciousness turns out to be in conflict with quantum mechanics and with facts established by experiment'
Bernard d'Espagnat, "Quantum theory and reality", Scientific American 241, (November 1979): 5, 128. back

Beta function (physics) - Wikipedia, Beta function (physics) - Wikipedia, the free encyclopedia, ' In theoretical physics, specifically quantum field theory, a beta function, β(g), encodes the dependence of a coupling parameter, g, on the energy scale, μ, of a given physical process described by quantum field theory. It is defined as
β(g) = ∂g / ∂log μ,
and, because of the underlying renormalization group, it has no explicit dependence on μ, so it only depends on μ implicitly through g. This dependence on the energy scale thus specified is known as the running of the coupling parameter, a fundamental feature of scale-dependence in quantum field theory, and its explicit computation is achievable through a variety of mathematical techniques. ' back

Big Bang - Wikipedia, Big Bang - Wikipedia, the free encyclopedia, ' The Big Bang theory is the prevailing cosmological model explaining the existence of the observable universe from the earliest known periods through its subsequent large-scale evolution. The model describes how the universe expanded from an initial state of high density and temperature, and offers a comprehensive explanation for a broad range of observed phenomena, including the abundance of light elements, the cosmic microwave background (CMB) radiation, and large-scale structure. ' back

Born rule - Wikipedia, Born rule - Wikipedia, the free encyclopedia, 'The Born rule (also called the Born law, Born's rule, or Born's law) is a law of quantum mechanics which gives the probability that a measurement on a quantum system will yield a given result. It is named after its originator, the physicist Max Born. The Born rule is one of the key principles of the Copenhagen interpretation of quantum mechanics. There have been many attempts to derive the Born rule from the other assumptions of quantum mechanics, with inconclusive results. . . . The Born rule states that if an observable corresponding to a Hermitian operator A with discrete spectrum is measured in a system with normalized wave function (see bra-ket notation), then the measured result will be one of the eigenvalues λ of A, and the probability of measuring a given eigenvalue λ_i will equal <ψ|P_i|ψ> where P_i is the projection onto the eigenspace of A corresponding to λ_i'. back

Bose-Einstein statistics - Wikipedia, Bose-Einstein statistics - Wikipedia, the free encyclopedia, 'In statistical mechanics, Bose–Einstein statistics (or more colloquially B–E statistics) determines the statistical distribution of identical indistinguishable bosons over the energy states in thermal equilibrium.' back

Boson - Wikipedia, Boson - Wikipedia, the free encyclopedia, 'In particle physics, bosons are particles with an integer spin, as opposed to fermions which have half-integer spin. From a behaviour point of view, fermions are particles that obey the Fermi-Dirac statistics while bosons are particles that obey the Bose-Einstein statistics. They may be either elementary, like the photon, or composite, as mesons. All force carrier particles are bosons. They are named after Satyendra Nath Bose. In contrast to fermions, several bosons can occupy the same quantum state. Thus, bosons with the same energy can occupy the same place in space.' back

Brain size - Wikipedia, Brain size - Wikipedia, the free encyclopedia, 'The size of the brain is a frequent topic of study within the fields of anatomy and evolution. Brain size is sometimes measured by weight and sometimes by volume (via MRI scans or by skull volume). Neuroimaging intelligence testing can be used to study the size of the brain in males and females. One question that has been frequently investigated is the relation of brain size to intelligence.' back

Brouwer fixed point theorem - Wikipedia, Brouwer fixed point theorem - Wikipedia, the free encyclopedia, ' Brouwer's fixed-point theorem is a fixed-point theorem in topology, named after Luitzen Brouwer. It states that for any continuous function f with certain properties there is a point x_0 such that f(x_0) = x_0. The simplest form of Brouwer's theorem is for continuous functions f from a disk D to itself. A more general form is for continuous functions from a convex compact subset K of Euclidean space to itself.' back

Calculus of variations - Wikipedia, Calculus of variations - Wikipedia, the free encyclopedia, ' The calculus of variations may be said to begin with Newton's minimal resistance problem in 1687, followed by the brachistochrone curve problem raised by Johann Bernoulli (1696). It immediately occupied the attention of Jakob Bernoulli and the Marquis de l'Hôpital, but Leonhard Euler first elaborated the subject, beginning in 1733. Lagrange was influenced by Euler's work to contribute significantly to the theory. After Euler saw the 1755 work of the 19-year-old Lagrange, Euler dropped his own partly geometric approach in favor of Lagrange's purely analytic approach and renamed the subject the calculus of variations in his 1756 lecture Elementa Calculi Variationum.' back

Cantor's theorem - Wikipedia, Cantor's theorem - Wikipedia, the free encyclopedia, ' In elementary set theory, Cantor's theorem is a fundamental result which states that, for any set A, the set of all subsets of A (the power set of A, denoted by P(A) ) has a strictly greater cardinality than A itself. For finite sets, Cantor's theorem can be seen to be true by simple enumeration of the number of subsets. Counting the empty set as a subset, a set with n members has a total of 2^n subsets, so that if card(A) = n, then card(P(A)) = 2^n, and the theorem holds because 2^n > n for all non-negative integers.' back
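
For finite sets the counting argument above can be made completely explicit, as in this small Python sketch:

    from itertools import chain, combinations

    def power_set(s):
        """Every subset of s, from the empty set up to s itself."""
        s = list(s)
        return list(chain.from_iterable(combinations(s, r) for r in range(len(s) + 1)))

    A = [1, 2, 3]
    P = power_set(A)
    print(len(A), len(P))          # 3 and 8: card(P(A)) = 2^3 > 3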

Catherine & Johnathan Karoly, Heitor Villa-Lobos: The Jet Whistle, ' Villa-Lobos never forgot his “musical education” – the Rio street bands, the trips to the Amazon, and the music of the movie halls and theaters of his teen-age years. He fused these diverse influences into a powerfully nationalist musical voice. Villa-Lobos composed Assobio a Jato (The Jet Whistle) in New York in 1950. The composer named his work to describe the technique he calls on the flutist to use during its last movement. To produce the effect, the player blows directly and forcefully into the flute with his or her mouth almost covering the mouthpiece. Combined with a glissando, the resulting whistle sounds like a jet taking off.' back

Catholic Catechism: §§ 385-412, Satan, '391 Behind the disobedient choice of our first parents lurks a seductive voice, opposed to God, which makes them fall into death out of envy. Scripture and the Church's Tradition see in this being a fallen angel, called "Satan" or the "devil". The Church teaches that Satan was at first a good angel, made by God: "The devil and the other demons were indeed created naturally good by God, but they became evil by their own doing." ' back

Christianity - Wikipedia, Christianity - Wikipedia, the free encyclopedia, ' Christianity is an Abrahamic monotheistic religion based on the life and teachings of Jesus of Nazareth. Its adherents, known as Christians, believe that Jesus is the Christ, whose coming as the messiah was prophesied in the Hebrew Bible, called the Old Testament in Christianity, and chronicled in the New Testament. It is the world's largest religion, with about 2.4 billion followers.' back

Christopher Shields (Stanford Encyclopedia of Philosophy), The Active Mind of De Anima III 5, ' After characterizing the mind (nous) and its activities in De Anima iii 4, Aristotle takes a surprising turn. In De Anima iii 5, he introduces an obscure and hotly disputed subject: the active mind or active intellect (nous poiêtikos). Controversy surrounds almost every aspect of De Anima iii 5, not least because in it Aristotle characterizes the active mind—a topic mentioned nowhere else in his entire corpus—as ‘separate and unaffected and unmixed, being in its essence actuality’ (chôristos kai apathês kai amigês, tê ousia energeia; DA iii 5, 430a17–18) and then also as ‘deathless and everlasting’ (athanaton kai aidion; DA iii 5, 430a23). This comes as no small surprise to readers of De Anima, because Aristotle had earlier in the same work treated the mind (nous) as but one faculty (dunamis) of the soul (psuchê), and he had contended that the soul as a whole is not separable from the body (DA ii 1, 413a3–5).' back

Classical mechanics - Wikipedia, Classical mechanics - Wikipedia, the free encyclopedia, 'Classical mechanics (commonly confused with Newtonian mechanics, which is a subfield thereof) is used for describing the motion of macroscopic objects, from projectiles to parts of machinery, as well as astronomical objects, such as spacecraft, planets, stars, and galaxies. It produces very accurate results within these domains, and is one of the oldest and largest subjects in science and technology.' back

Claude E Shannon, A Mathematical Theory of Communication, 'The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem. The significant aspect is that the actual message is one selected from a set of possible messages.' back

Claude Shannon (1949), Communication in the Presence of Noise, 'A method is developed for representing any communication system geometrically. Messages and the corresponding signals are points in two “function spaces,” and the modulation process is a mapping of one space into the other. Using this representation, a number of results in communication theory are deduced concerning expansion and compression of bandwidth and the threshold effect. Formulas are found for the maximum rate of transmission of binary digits over a system when the signal is perturbed by various types of noise. Some of the properties of “ideal” systems which transmit at this maximum rate are discussed. The equivalent number of binary digits per second for certain information sources is calculated.' [C. E. Shannon , “Communication in the presence of noise,” Proc. IRE, vol. 37, pp. 10–21, Jan. 1949.] back
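
The best-known result of this paper is the capacity of a band-limited channel in white Gaussian noise, C = W log2(1 + S/N). A worked example (the bandwidth and signal-to-noise figures are merely illustrative, roughly a telephone voice channel):

    import math

    W = 3100.0                     # bandwidth in hertz
    snr = 1000.0                   # signal-to-noise power ratio (30 dB)
    C = W * math.log2(1.0 + snr)   # Shannon capacity in bits per second
    print(f"{C:.0f} bit/s")        # ~30,898 bit/s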

Codec - Wikipedia, Codec - Wikipedia, the free encyclopedia, 'A codec is a device or computer program capable of encoding or decoding a digital data stream or signal. Codec is a portmanteau of coder-decoder or, less commonly, compressor-decompressor.' back

Complete theory - Wikipedia, Complete theory - Wikipedia, the free encyclopedia, 'In mathematical logic, a theory is complete if, for every closed formula in the theory's language, that formula or its negation is demonstrable. Recursively axiomatizable first-order theories that are consistent and rich enough to allow general mathematical reasoning to be formulated cannot be complete, as demonstrated by Gödel's first incompleteness theorem. This sense of complete is distinct from the notion of a complete logic, which asserts that for every theory that can be formulated in the logic, all semantically valid statements are provable theorems (for an appropriate sense of "semantically valid"). Gödel's completeness theorem is about this latter kind of completeness. ' back

Completeness of the real numbers - Wikipedia, Completeness of the real numbers - Wikipedia, the free encyclopedia, 'Intuitively, completeness implies that there are not any “gaps” (in Dedekind's terminology) or “missing points” in the real number line. . . . Depending on the construction of the real numbers used, completeness may take the form of an axiom (the completeness axiom), or may be a theorem proven from the construction. There are many equivalent forms of completeness, the most prominent being Dedekind completeness and Cauchy completeness (completeness as a metric space).' back

Complex number - Wikipedia, Complex number - Wikipedia, the free encyclopedia, 'A complex number is a number that can be expressed in the form a + bi, where a and b are real numbers and i is the imaginary unit, which satisfies the equation i^2 = −1. In this expression, a is the real part and b is the imaginary part of the complex number. Complex numbers extend the concept of the one-dimensional number line to the two-dimensional complex plane (also called Argand plane) by using the horizontal axis for the real part and the vertical axis for the imaginary part.' back

Complex plane - Wikipedia, Complex plane - Wikipedia, the free encyclopedia, ' In mathematics, the complex plane is a geometric representation of the complex numbers established by the real axis and the orthogonal imaginary axis. It can be thought of as a modified Cartesian plane, with the real part of a complex number represented by a displacement along the x-axis, and the imaginary part by a displacement along the y-axis.' back
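
Both of these entries are realized directly in Python's built-in complex type, which makes a convenient sandbox (j stands for the imaginary unit i):

    z = 3 + 4j                     # a + bi, with j playing the role of i
    print(z.real, z.imag)          # 3.0 4.0: coordinates in the complex plane
    print((1j) ** 2)               # (-1+0j): i^2 = -1
    print(abs(z))                  # 5.0: distance from the origin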

Computability theory - Wikipedia, Computability theory - Wikipedia, the free encyclopedia, 'Computability theory, also called recursion theory, is a branch of mathematical logic that originated in the 1930s with the study of computable functions and Turing degrees. The field has grown to include the study of generalized computability and definability. In these areas, recursion theory overlaps with proof theory and effective descriptive set theory. The basic questions addressed by recursion theory are "What does it mean for a function from the natural numbers to themselves to be computable?" and "How can noncomputable functions be classified into a hierarchy based on their level of noncomputability?". The answers to these questions have led to a rich theory that is still being actively researched.' back

Computer network - Wikipedia, Computer network - Wikipedia, the free encyclopedia, 'A computer network, or simply a network, is a collection of computers and network hardware interconnected by communication channels that allow sharing of resources and information. . . . The best known computer network is the Internet. . . . Computer networking can be considered a branch of electrical engineering, telecommunications, computer science, information technology or computer engineering, since it relies upon the theoretical and practical application of the related disciplines.' back

Cosmological constant problem - Wikipedia, Cosmological constant problem - Wikipedia, the free encyclopedia, 'In cosmology, the cosmological constant problem or vacuum catastrophe is the disagreement between measured values of the vacuum energy density (the small value of the cosmological constant) and the zero-point energy suggested by quantum field theory. Depending on the assumptions, the discrepancy ranges from 40 to more than 100 orders of magnitude, a state of affairs described by Hobson et al. (2006) as "the worst theoretical prediction in the history of physics." ' back

Delay line memory - Wikipedia, Delay line memory - Wikipedia, the free encyclopedia, ' Delay line memory is a form of computer memory, now obsolete, that was used on some of the earliest digital computers. Like many modern forms of electronic computer memory, delay line memory was a refreshable memory, but as opposed to modern random-access memory, delay line memory was sequential-access.' back

Differentiable manifold - Wikipedia, Differentiable manifold - Wikipedia, the free encyclopedia, ' In mathematics, a differentiable manifold is a type of manifold that is locally similar enough to a linear space to allow one to do calculus. Any manifold can be described by a collection of charts, also known as an atlas. One may then apply ideas from calculus while working within the individual charts, since each chart lies within a linear space to which the usual rules of calculus apply. If the charts are suitably compatible (namely, the transition from one chart to another is differentiable), then computations done in one chart are valid in any other differentiable chart.' back

Double-slit experiment - Wikipedia, Double-slit experiment - Wikipedia, the free encyclopedia, 'In the double-slit experiment, light is shone at a solid thin plate that has two slits cut into it. A photographic plate is set up to record what comes through those slits. One or the other slit may be open, or both may be open. . . . The most baffling part of this experiment comes when only one photon at a time is fired at the barrier with both slits open. The pattern of interference remains the same as can be seen if many photons are emitted one at a time and recorded on the same sheet of photographic film. The clear implication is that something with a wavelike nature passes simultaneously through both slits and interferes with itself — even though there is only one photon present. (The experiment works with electrons, atoms, and even some molecules too.)' back

Eigenvalues and eigenvectors - Wikipedia, Eigenvalues and eigenvectors - Wikipedia, the free encyclopedia, ' In linear algebra, an eigenvector or characteristic vector of a linear transformation is a nonzero vector that changes at most by a scalar factor when that linear transformation is applied to it. The corresponding eigenvalue, often denoted by λ, is the factor by which the eigenvector is scaled. Geometrically, an eigenvector, corresponding to a real nonzero eigenvalue, points in a direction in which it is stretched by the transformation and the eigenvalue is the factor by which it is stretched. If the eigenvalue is negative, the direction is reversed. Loosely speaking, in a multidimensional vector space, the eigenvector is not rotated. ' back
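
A small numerical check of this definition (Python with numpy; the symmetric matrix is an arbitrary example): each eigenvector v should satisfy Av = λv, i.e. the transformation only stretches it.

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])
    eigvals, eigvecs = np.linalg.eig(A)
    for lam, v in zip(eigvals, eigvecs.T):       # eigenvectors are the columns
        print(lam, np.allclose(A @ v, lam * v))  # 3.0 True, 1.0 True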

Einstein, Podolsky & Rosen (1935), Can the Quantum Mechanical Description of Physical Reality be Considered Complete?, A PDF of the classic paper. 'In a complete theory there is an element corresponding to each element of reality. A sufficient condition for the reality of a physical quantity is the possibility of predicting it with certainty, without disturbing the system. In quantum mechanics in the case of two physical quantities described by non-commuting operators, the knowledge of one precludes the knowledge of the other. Then either (1) the description of reality given by the wave function in quantum mechanics is not complete or (2) these two quantities cannot have simultaneous reality. Consideration of the problem of making predictions concerning a system on the basis of measurements made on another system that had previously interacted with it leads to the result that if (1) is false then (2) is also false. One is thus led to conclude that the description of reality given by the wave function is not complete.' back

Entropy - Wikipedia, Entropy - Wikipedia, the free encyclopedia, 'In statistical mechanics, entropy is an extensive property of a thermodynamic system. It is closely related to the number Ω of microscopic configurations (known as microstates) that are consistent with the macroscopic quantities that characterize the system (such as its volume, pressure and temperature). Under the assumption that each microstate is equally probable, the entropy S is the natural logarithm of the number of microstates, multiplied by the Boltzmann constant k_B. Formally (assuming equiprobable microstates), S = k_B ln Ω.' back
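
The Boltzmann formula quoted here is easily made concrete. A toy example, assuming a system of 100 independent two-state elements so that Ω = 2^100:

    import math

    k_B = 1.380649e-23             # Boltzmann constant, J/K (exact in the 2019 SI)
    ln_omega = 100 * math.log(2)   # ln(Omega) for Omega = 2^100 microstates
    S = k_B * ln_omega
    print(f"S = {S:.3e} J/K")      # ~9.57e-22 J/K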

Entscheidungsproblem - Wikipedia, Entscheidungsproblem - Wikipedia, the free encyclopedia, 'In mathematics, the Entscheidungsproblem (. . . German for 'decision problem') is a challenge posed by David Hilbert in 1928. The Entscheidungsproblem asks for an algorithm that will take as input a description of a formal language and a mathematical statement in the language and produce as output either "True" or "False" according to whether the statement is true or false. . . . In 1936 and 1937 Alonzo Church and Alan Turing respectively, published independent papers showing that it is impossible to decide algorithmically whether statements in arithmetic are true or false, and thus a general solution to the Entscheidungsproblem is impossible. This result is now known as Church's Theorem or the Church–Turing Theorem (not to be confused with the Church–Turing thesis).' back

(ε, δ)-definition of limit - Wikipedia, (ε, δ)-definition of limit - Wikipedia, the free encyclopedia, 'In calculus, the (ε, δ)-definition of limit ("epsilon-delta definition of limit") is a formalization of the notion of limit. It was first given by Bernard Bolzano in 1817. Augustin-Louis Cauchy never gave an (ε, δ) definition of limit in his Cours d'Analyse, but occasionally used ε, δ arguments in proofs. The definitive modern statement was ultimately provided by Karl Weierstrass.' back

Eternity of the world - Wikipedia, Eternity of the world - Wikipedia, the free encyclopedia, ' The eternity of the world is the question of whether the world has a beginning in time or has existed from eternity. It was a concern for both ancient philosophers and the medieval theologians and medieval philosophers of the 13th century. The problem became a focus of a dispute in the 13th century, when some of the works of Aristotle, who believed in the eternity of the world, were rediscovered in the Latin West. This view conflicted with the view of the Catholic Church that the world had a beginning in time. The Aristotelian view was prohibited in the Condemnations of 1210–1277.' back

Euclidean geometry - Wikipedia, Euclidean geometry - Wikipedia, the free encyclopedia, ' Euclidean geometry is a mathematical system attributed to Alexandrian Greek mathematician Euclid, which he described in his textbook on geometry: the Elements. Euclid's method consists in assuming a small set of intuitively appealing axioms, and deducing many other propositions (theorems) from these. . . . The Elements begins with plane geometry, still taught in secondary school (high school) as the first axiomatic system and the first examples of formal proof. It goes on to the solid geometry of three dimensions. Much of the Elements states results of what are now called algebra and number theory, explained in geometrical language.' back

Euler's formula - Wikipedia, Euler's formula - Wikipedia, the free encyclopedia, 'Euler's formula, named after Leonhard Euler, is a mathematical formula in complex analysis that establishes the fundamental relationship between the trigonometric functions and the complex exponential function. Richard Feynman called Euler's formula "our jewel" and "the most remarkable formula in mathematics". (Feynman, Richard P. (1977). The Feynman Lectures on Physics, vol. I. Addison-Wesley, p. 22-10. ISBN 0-201-02010-6.)' back
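
A quick numerical confirmation of the jewel, e^{iθ} = cos θ + i sin θ, using Python's standard cmath module (θ = π/3 is an arbitrary test value):

    import cmath, math

    theta = math.pi / 3
    print(cmath.exp(1j * theta))                       # (0.5+0.866...j)
    print(complex(math.cos(theta), math.sin(theta)))   # the same point
    print(cmath.exp(1j * math.pi) + 1)                 # ~0: Euler's identity e^{i pi} + 1 = 0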

Evolution - Wikipedia, Evolution - Wikipedia, the free encyclopedia, '. . . Charles Darwin and Alfred Wallace were the first to formulate a scientific argument for the theory of evolution by means of natural selection. Evolution by natural selection is a process that is inferred from three facts about populations: 1) more offspring are produced than can possibly survive, 2) traits vary among individuals, leading to different rates of survival and reproduction, and 3) trait differences are heritable. . . . ' back

Fermi-Dirac statistics - Wikipedia, Fermi-Dirac statistics - Wikipedia, the free encyclopedia, 'In statistical mechanics, Fermi-Dirac statistics is a particular case of particle statistics developed by Enrico Fermi and Paul Dirac that determines the statistical distribution of fermions over the energy states for a system in thermal equilibrium. In other words, it is the distribution of the probabilities that each possible energy level is occupied by a fermion.' back

Fermion - Wikipedia, Fermion - Wikipedia, the free encyclopedia, 'In particle physics, fermions are particles with a half-integer spin, such as protons and electrons. They obey the Fermi-Dirac statistics and are named after Enrico Fermi. In the Standard Model there are two types of elementary fermions: quarks and leptons. . . . In contrast to bosons, only one fermion can occupy a quantum state at a given time (they obey the Pauli Exclusion Principle). Thus, if more than one fermion occupies the same place in space, the properties of each fermion (e.g. its spin) must be different from the rest. Therefore fermions are usually related with matter while bosons are related with radiation, though the separation between the two is not clear in quantum physics.' back
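
The two statistics cited in the entries above differ only by a sign in a denominator, which is the entire mathematical distance between bosons crowding into one state and fermions excluding one another. A hedged sketch of the standard occupation formulas (not quoted in the entries themselves; eps, mu and kT below are purely illustrative numbers in arbitrary units):

    import math

    def bose_einstein(eps, mu, kT):
        return 1.0 / (math.exp((eps - mu) / kT) - 1.0)   # can exceed 1

    def fermi_dirac(eps, mu, kT):
        return 1.0 / (math.exp((eps - mu) / kT) + 1.0)   # always between 0 and 1

    print(bose_einstein(0.1, 0.0, 0.5))   # ~4.52: many bosons in one state
    print(fermi_dirac(0.1, 0.0, 0.5))     # ~0.45: at most one fermion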

Feynman, Leighton & Sands FLP III:01, Chapter 1: Quantum Behaviour, 'The gradual accumulation of information about atomic and small-scale behavior during the first quarter of the 20th century, which gave some indications about how small things do behave, produced an increasing confusion which was finally resolved in 1926 and 1927 by Schrödinger, Heisenberg, and Born. They finally obtained a consistent description of the behavior of matter on a small scale. We take up the main features of that description in this chapter.' back

Feynman, Leighton & Sands FLP III:03, Chapter 3: Probability Amplitudes, 'We will begin in this chapter by dealing with some general quantum mechanical ideas. Some of the statements will be quite precise, others only partially precise. It will be hard to tell you as we go along which is which, but by the time you have finished the rest of the book, you will understand in looking back which parts hold up and which parts were only explained roughly. The chapters which follow this one will not be so imprecise. In fact, one of the reasons we have tried carefully to be precise in the succeeding chapters is so that we can show you one of the most beautiful things about quantum mechanics—how much can be deduced from so little.' back

Feynman, Leighton & Sands FLP III:08, Chapter 8: The Hamiltonian Matrix, 'One problem then in describing nature is to find a suitable representation for the base states. But that’s only the beginning. We still want to be able to say what “happens.” If we know the “condition” of the world at one moment, we would like to know the condition at a later moment. So we also have to find the laws that determine how things change with time. We now address ourselves to this second part of the framework of quantum mechanics—how states change with time.' back

Feynman, Leighton & Sands FLP II:02, Chapter 2: Differential Calculus of Vector Fields, ' Ideas such as the field lines, capacitance, resistance, and inductance are, for such purposes, very useful. So we will spend much of our time analyzing them. In this way we will get a feel as to what should happen in different electromagnetic situations. On the other hand, none of the heuristic models, such as field lines, is really adequate and accurate for all situations. There is only one precise way of presenting the laws, and that is by means of differential equations. They have the advantage of being fundamental and, so far as we know, precise. If you have learned the differential equations you can always go back to them. There is nothing to unlearn.' back

Form of the Good - Wikipedia, Form of the Good - Wikipedia, the free encyclopedia, ' "Form of the Good", or more literally "the idea of the good" (ἡ τοῦ ἀγαθοῦ ἰδέα) is a concept in the philosophy of Plato. It is described in Plato's dialogue the Republic (508e2–3), speaking through the character of Socrates. This form is the one that allows a philosopher-in-training to advance to a philosopher-king. It cannot be clearly seen or explained, but it is the form that allows one to realize all the other forms. The definition of the Good is a perfect, eternal, and changeless Form, existing outside space and time, in which particular good things share.' back

Formalism (mathematics) - Wikipedia, Formalism (mathematics) - Wikipedia, the free encyclopedia, ' In foundations of mathematics, philosophy of mathematics, and philosophy of logic, formalism is a theory that holds that statements of mathematics and logic can be thought of as statements about the consequences of certain string manipulation rules. For example, Euclidean geometry can be seen as a game whose play consists in moving around certain strings of symbols called axioms according to a set of rules called "rules of inference" to generate new strings. In playing this game one can "prove" that the Pythagorean theorem is valid because the string representing the Pythagorean theorem can be constructed using only the stated rules.' back

Frank Wilczek (1999), Quantum Field Theory, ' What are the essential features of quantum field theory? This question has no sharp answer. Theoretical physicists are very flexible in adapting their tools, and no axiomization can keep up with them. However I think it is fair to say that there are two characteristic, core ideas of quantum field theory. First: The basic dynamical degrees of freedom are operator functions of space and time – quantum fields, that obey appropriate commutation relations. Second: The interactions of these fields are local in space and time.' back

Gaussian curvature - Wikipedia, Gaussian curvature - Wikipedia, the free encyclopedia, ' In differential geometry, the Gaussian curvature or Gauss curvature of a point on a surface is the product of the principal curvatures, κ1 and κ2, of the given point. It is an intrinsic measure of curvature, i.e., its value depends only on how distances are measured on the surface, not on the way it is isometrically embedded in space. This result is the content of Gauss's Theorema egregium.' back
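
A worked example: for a sphere of radius r both principal curvatures are 1/r, so K = κ1 κ2 = 1/r^2 at every point, while a flat plane has κ1 = κ2 = 0 and K = 0. Because K is intrinsic (the Theorema egregium), no distance-preserving flat map of a sphere exists, which is why every flat map of the Earth distorts some distances.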

General relativity - Wikipedia, General relativity - Wikipedia, the free encyclopedia, 'General relativity or the general theory of relativity is the geometric theory of gravitation published by Albert Einstein in 1916. It is the current description of gravitation in modern physics. General relativity generalises special relativity and Newton's law of universal gravitation, providing a unified description of gravity as a geometric property of space and time, or spacetime. In particular, the curvature of spacetime is directly related to the four-momentum (mass-energy and linear momentum) of whatever matter and radiation are present. The relation is specified by the Einstein field equations, a system of partial differential equations.' back

Genesis, Genesis, from the Holy Bible, King James Version, '1: In the beginning God created the heaven and the earth. 2: And the earth was without form, and void; and darkness was upon the face of the deep. And the Spirit of God moved upon the face of the waters. 3: And God said, Let there be light: and there was light.' back

Geodesics in general relativity - Wikipedia, Geodesics in general relativity - Wikipedia, the free encyclopedia, ' In general relativity, a geodesic generalizes the notion of a "straight line" to curved spacetime. Importantly, the world line of a particle free from all external, non-gravitational force, is a particular type of geodesic. In other words, a freely moving or falling particle always moves along a geodesic.' back

Gödel's incompleteness theorems - Wikipedia, Gödel's incompleteness theorems - Wikipedia, 'Gödel's incompleteness theorems are two theorems of mathematical logic that establish inherent limitations of all but the most trivial axiomatic systems capable of doing arithmetic. The theorems, proven by Kurt Gödel in 1931, are important both in mathematical logic and in the philosophy of mathematics. The two results are widely, but not universally, interpreted as showing that Hilbert's program to find a complete and consistent set of axioms for all mathematics is impossible, giving a negative answer to Hilbert's second problem. The first incompleteness theorem states that no consistent system of axioms whose theorems can be listed by an "effective procedure" (i.e., any sort of algorithm) is capable of proving all truths about the relations of the natural numbers (arithmetic). For any such system, there will always be statements about the natural numbers that are true, but that are unprovable within the system. The second incompleteness theorem, an extension of the first, shows that such a system cannot demonstrate its own consistency.' back

Gravitational-wave observatory - Wikipedia, Gravitational-wave observatory - Wikipedia, the free encyclopedia, 'A gravitational-wave detector (used in a gravitational-wave observatory) is any device designed to measure tiny distortions of spacetime called gravitational waves. Since the 1960s, various kinds of gravitational-wave detectors have been built and constantly improved. The present-day generation of laser interferometers has reached the necessary sensitivity to detect gravitational waves from astronomical sources, thus forming the primary tool of gravitational-wave astronomy.' back

Gregory J. Chaitin (1982), Gödel's Theorem and Information, 'Abstract: Gödel's theorem may be demonstrated using arguments having an information-theoretic flavor. In such an approach it is possible to argue that if a theorem contains more information than a given set of axioms, then it is impossible for the theorem to be derived from the axioms. In contrast with the traditional proof based on the paradox of the liar, this new viewpoint suggests that the incompleteness phenomenon discovered by Gödel is natural and widespread rather than pathological and unusual.'
International Journal of Theoretical Physics 21 (1982), pp. 941-954 back

Hamiltonian (quantum mechanics) - Wikipedia, Hamiltonian (quantum mechanics) - Wikipedia, the free encyclopedia, 'In quantum mechanics, a Hamiltonian is an operator corresponding to the sum of the kinetic energies plus the potential energies for all the particles in the system (this addition is the total energy of the system in most of the cases under analysis). It is usually denoted by H . . .. Its spectrum is the set of possible outcomes when one measures the total energy of a system. Because of its close relation to the time-evolution of a system, it is of fundamental importance in most formulations of quantum theory. The Hamiltonian is named after William Rowan Hamilton, who created a revolutionary reformulation of Newtonian mechanics, now called Hamiltonian mechanics, which is also important in quantum physics. ' back

Hamiltonian mechanics - Wikipedia, Hamiltonian mechanics - Wikipedia, the free encyclopedia, ' Hamiltonian mechanics is a mathematically sophisticated formulation of classical mechanics. Historically, it contributed to the formulation of statistical mechanics and quantum mechanics. Hamiltonian mechanics was first formulated by William Rowan Hamilton in 1833, starting from Lagrangian mechanics, a previous reformulation of classical mechanics introduced by Joseph Louis Lagrange in 1788. Like Lagrangian mechanics, Hamiltonian mechanics is equivalent to Newton's laws of motion in the framework of classical mechanics.' back

Hebrew Bible - Wikipedia, Hebrew Bible - Wikipedia, the free encyclopedia, 'The Hebrew Bible . . . is a term referring to the books of the Jewish Bible as originally written mostly in Biblical Hebrew with some Biblical Aramaic. The term closely corresponds to contents of the Jewish Tanakh and the Protestant Old Testament (see also Judeo-Christian) but does not include the deuterocanonical portions of the Roman Catholic or the Anagignoskomena portions of the Eastern Orthodox Old Testaments. The term does not imply naming, numbering or ordering of books, which varies (see also Biblical canon).' back

Hilbert's program - Wikipedia, Hilbert's program - Wikipedia, the free encyclopedia, ' In mathematics, Hilbert's program, formulated by German mathematician David Hilbert, was a proposed solution to the foundational crisis of mathematics, when early attempts to clarify the foundations of mathematics were found to suffer from paradoxes and inconsistencies. As a solution, Hilbert proposed to ground all existing theories to a finite, complete set of axioms, and provide a proof that these axioms were consistent. Hilbert proposed that the consistency of more complicated systems, such as real analysis, could be proven in terms of simpler systems. Ultimately, the consistency of all of mathematics could be reduced to basic arithmetic.' back

Hindu - Wikipedia, Hindu - Wikipedia, the free encyclopedia, 'In 1995, while considering the question "who are Hindus and what are the broad features of Hindu religion", the Supreme Court of India highlighted Bal Gangadhar Tilak's formulation of Hinduism's defining features:
Acceptance of the Vedas with reverence; recognition of the fact that the means or ways to salvation are diverse; and the realization of the truth that the number of gods to be worshipped is large, that indeed is the distinguishing feature of Hindu religion.' back

History of the Internet - Wikipedia, History of the Internet - Wikipedia, the free encyclopedia, ' The history of the Internet begins with the development of electronic computers in the 1950s. Initial concepts of packet networking originated in several computer science laboratories in the United States, United Kingdom, and France. The US Department of Defense awarded contracts as early as the 1960s for packet network systems, including the development of the ARPANET (which would become the first network to use the Internet Protocol).' back

Human brain - Wikipedia, Human brain - Wikipedia, the free encyclopedia, 'The human brain is the central organ of the human nervous system, and with the spinal cord makes up the central nervous system. The brain consists of the cerebrum, the brainstem and the cerebellum. It controls most of the activities of the body, processing, integrating, and coordinating the information it receives from the sense organs, and making decisions as to the instructions sent to the rest of the body.' back

Human Genome Project - Wikipedia, Human Genome Project - Wikipedia, the free encyclopedia, ' The Human Genome Project (HGP) was an international scientific research project with the goal of determining the base pairs that make up human DNA, and of identifying and mapping all of the genes of the human genome from both a physical and a functional standpoint. . . . The Human Genome Project originally aimed to map the nucleotides contained in a human haploid reference genome (more than three billion). The "genome" of any given individual is unique; mapping the "human genome" involved sequencing a small number of individuals and then assembling these together to get a complete sequence for each chromosome. Therefore, the finished human genome is a mosaic, not representing any one individual.' back

Hylomorphism - Wikipedia, Hylomorphism - Wikipedia, the free encyclopedia, 'Hylomorphism (Greek ὑλο- hylo-, "wood, matter" + -morphism < Greek μορφή, morphē, "form") is a philosophical theory developed by Aristotle, which analyzes substance into matter and form. Substances are conceived of as compounds of form and matter.' back

On the Trinity - Wikipedia, On the Trinity - Wikipedia, the free encyclopedia, 'On the Trinity (Latin: De Trinitate) is a Latin book written by Augustine of Hippo to discuss the Trinity in context of the logos. Although not as well known as some of his other works, it is arguably his masterpiece and of more doctrinal importance than the Confessions or City of God. . . . Arthur West Haddan inferred from [the] evidence that it was written between 400, when he was forty-six years old and had been Bishop of Hippo about four years, and 428 at the latest; but it probably had been published ten or twelve years earlier, in around 417.' back

Inner product space - Wikipedia, Inner product space - Wikipedia, the free encyclopedia, 'In mathematics, an inner product space is a vector space of arbitrary (possibly infinite) dimension with additional structure, which, among other things, enables generalization of concepts from two or three-dimensional Euclidean geometry. The additional structure associates to each pair of vectors in the space a number which is called the inner product (also called a scalar product) of the vectors. Inner products allow the rigorous introduction of intuitive geometrical notions such as the angle between vectors or length of vectors in spaces of all dimensions. It also allows introduction of the concept of orthogonality between vectors. Inner product spaces generalize Euclidean spaces (with the dot product as the inner product) and are studied in functional analysis. An inner product space is sometimes also called a pre-Hilbert space, since its completion with respect to the metric induced by its inner product is a Hilbert space.' back

Internet protocol suite - Wikipedia, Internet protocol suite - Wikipedia, the free encyclopedia, ' The Internet protocol suite is the conceptual model and set of communications protocols used in the Internet and similar computer networks. It is commonly known as TCP/IP because the foundational protocols in the suite are the Transmission Control Protocol (TCP) and the Internet Protocol (IP). . . .. The Internet protocol suite provides end-to-end data communication specifying how data should be packetized, addressed, transmitted, routed, and received. . . .. The technical standards underlying the Internet protocol suite and its constituent protocols are maintained by the Internet Engineering Task Force (IETF). The Internet protocol suite predates the OSI model, a more comprehensive reference framework for general networking systems. ' back

Isaac Newton, Method of Fluxions and Infinite Series with its Application to the Geometry of Curve-Lines, 'The method of fluxions and infinite series with its application to the geometry of curve-lines by the inventor Sir Isaac Newton ... ; translated from the author's Latin original not yet made publick. To which is subjoin'd, A perpetual comment upon the whole work, consisting of annotations, illustrations, and supplements, to make this treatise a compleat institution for the use of learners.' back

Jeff Tollefson, US achieves laser-fusion record: what it means for nuclear-weapons research, ' Housed at Lawrence Livermore National Laboratory in California, the US$3.5-billion facility wasn’t designed to serve as a power-plant prototype, however, but rather to probe fusion reactions at the heart of thermonuclear weapons. After the United States banned underground nuclear testing at the end of the cold war in 1992, the energy department proposed the NIF as part of a larger science-based Stockpile Stewardship Program, designed to verify the reliability of the country’s nuclear weapons without detonating any of them.' back

Jeffrey Nicholls (1987), A theory of Peace, ' The argument: I began to think about peace in a very practical way during the Viet Nam war. I was the right age to be called up. I was exempted because I was a clergyman, but despite the terrors that war held for me, I think I would have gone. It was my first whiff of the force of patriotism. To my amazement, it was strong enough to make even me face death.
In the Church, I became embroiled in a deeper war. Not a war between goodies and baddies, but the war between good and evil that lies at the heart of all human consciousness. Existence is a struggle. We need all the help we can get. Religion is part of that help.' back

Jeffrey Nicholls (July 2019), Entropy and metaethics, ' I propose an answer [to the problematic search for modern ethics] in terms of what Einstein considered to be the most fundamental and irrefutable law of nature, the second law of thermodynamics, which expresses the fact that entropy almost never decreases. In a more morally relevant frame, this law expresses the fact that the universe is inherently creative. Human spirituality, whatever it may be, has emerged from the natural world.' back

Jessica Haines, Two Stones Wave Patterns, back

Jim Branson, Eigenvalue Equations, 'The time independent Schrödinger Equation is an example of an Eigenvalue equation. ' back

John Palmer (Stanford Encyclopedia of Philosophy), Parmenides, ' Immediately after welcoming Parmenides to her abode, the goddess describes as follows the content of the revelation he is about to receive:
You must needs learn all things,/ both the unshaken heart of well-rounded reality/ and the notions of mortals, in which there is no genuine trustworthiness./ Nonetheless these things too will you learn, how what they resolved/ had actually to be, all through all pervading. (Fr. 1.28b-32) ' back

John Paul II (1983), Code of Canon Law: §331: Papal Power, ' Can. 331 The bishop of the Roman Church, in whom continues the office given by the Lord uniquely to Peter, the first of the Apostles, and to be transmitted to his successors, is the head of the college of bishops, the Vicar of Christ, and the pastor of the universal Church on earth. By virtue of his office he possesses supreme, full, immediate, and universal ordinary power in the Church, which he is always able to exercise freely.' back

John Paul II (1994), Ordinatio Sacerdotalis: Apostolic Letter to the Bishops of the Catholic Church on Reserving Priestly Ordination to Men Alone., 'When the question of the ordination of women arose in the Anglican Communion, Pope Paul VI, out of fidelity to his office of safeguarding the Apostolic Tradition, and also with a view to removing a new obstacle placed in the way of Christian unity, reminded Anglicans of the position of the Catholic Church: "She holds that it is not admissible to ordain women to the priesthood, for very fundamental reasons. These reasons include: the example recorded in the Sacred Scriptures of Christ choosing his Apostles only from among men; the constant practice of the Church, which has imitated Christ in choosing only men; and her living teaching authority which has consistently held that the exclusion of women from the priesthood is in accordance with God's plan for his Church".' back

John the Evangelist, The Gospel of John (KJV), ' In the beginning was the Word, and the Word was with God, and the Word was God. The same was in the beginning with God. All things were made by him; and without him was not any thing made that was made. In him was life; and the life was the light of men. And the light shineth in darkness; and the darkness comprehended it not.' back

John von Neumann (2014), Mathematical Foundations of Quantum Mechanics, ' Mathematical Foundations of Quantum Mechanics by John von Neumann, translated from the German by Robert T. Beyer (New Edition), edited by Nicholas A. Wheeler, Princeton UP, Princeton & Oxford. Preface: "This book is the realization of my long-held intention to someday use the resources of TEX to produce a more easily read version of Robert T. Beyer's authorized English translation (Princeton University Press, 1955) of John von Neumann's classic Mathematische Grundlagen der Quantenmechanik (Springer, 1932)." ' back

Juan Yin et al, Lower Bound on the Speed of Nonlocal Correlations without Locality and Measurement Choice Loopholes, ' In their well-known paper, Einstein, Podolsky, and Rosen called the nonlocal correlation in quantum entanglement a "spooky action at a distance." If the spooky action does exist, what is its speed? All previous experiments along this direction have locality and freedom-of-choice loopholes. Here, we strictly closed the loopholes by observing a 12 h continuous violation of the Bell inequality and concluded that the lower bound speed of spooky action was 4 orders of magnitude of the speed of light if Earth's speed in any inertial reference frame was less than 10^-3 times the speed of light. ' back

Kerson Huang (2013), A Critical History of Renormalization, ' The history of renormalization is reviewed with a critical eye, starting with Lorentz's theory of radiation damping, through perturbative QED with Dyson, Gell‐Mann & Low, and others, to Wilson's formulation and Polchinski's functional equation, and applications to "triviality", and dark energy in cosmology.' back

Lagrangian mechanics - Wikipedia, Lagrangian mechanics - Wikipedia, the free encyclopedia, ' Introduced by the Italian-French mathematician and astronomer Joseph-Louis Lagrange in 1788, Lagrangian mechanics is a formulation of classical mechanics and is founded on the stationary action principle. Given a system of point masses and a pair of times t1 and t2, Lagrangian mechanics postulates that the system's trajectory (describing evolution of the system over time) . . . must be a stationary point of the action functional S = ∫ L dt. By convention, L = T − V, where T and V are the kinetic and potential energy of the system, respectively.' back
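
The stationary-action principle lends itself to a direct numerical check: for free fall with L = T − V, the true trajectory yields a smaller action than neighbouring paths with the same endpoints. A minimal sketch (Python with numpy; the value of g, the time grid and the trial perturbation are arbitrary illustrative choices):

    import numpy as np

    g, m = 9.8, 1.0
    t = np.linspace(0.0, 1.0, 1001)
    dt = t[1] - t[0]

    def action(q):
        v = np.gradient(q, dt)                 # numerical velocity
        L = 0.5 * m * v**2 - m * g * q         # L = T - V
        return np.sum(L) * dt                  # S = integral of L dt

    q_true = -0.5 * g * t**2                   # the Newtonian trajectory
    q_bent = q_true + 0.05 * np.sin(np.pi * t) # same endpoints, bent interior
    print(action(q_true) < action(q_bent))     # True: the true path is stationary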

Laplace's demon - Wikipedia, Laplace's demon - Wikipedia, the free encyclopedia, ' We may regard the present state of the universe as the effect of its past and the cause of its future. An intellect which at a certain moment would know all forces that set nature in motion, and all positions of all items of which nature is composed, if this intellect were also vast enough to submit these data to analysis, it would embrace in a single formula the movements of the greatest bodies of the universe and those of the tiniest atom; for such an intellect nothing would be uncertain and the future just like the past would be present before its eyes.' A Philosophical Essay on Probabilities, Essai philosophique sur les probabilités, introduction to the second edition of Théorie analytique des probabilités, based on a lecture given in 1794. back

Lie Group - Wikipedia, Lie Group - Wikipedia, the free encyclopedia, 'In mathematics, a Lie group . . . is a group that is also a differentiable manifold, with the property that the group operations are compatible with the smooth structure. Lie groups are named after Norwegian mathematician Sophus Lie, who laid the foundations of the theory of continuous transformation groups. Lie groups represent the best-developed theory of continuous symmetry of mathematical objects and structures, which makes them indispensable tools for many parts of contemporary mathematics, as well as for modern theoretical physics. . . . One of the key ideas in the theory of Lie groups is to replace the global object, the group, with its local or linearized version, which Lie himself called its "infinitesimal group" and which has since become known as its Lie algebra.' back

Lumen Gentium (Vatican II), Dogmatic Constitution on the Church, 'THE MYSTERY OF THE CHURCH 1. Christ is the Light of nations. Because this is so, this Sacred Synod gathered together in the Holy Spirit eagerly desires, by proclaiming the Gospel to every creature, to bring the light of Christ to all men, a light brightly visible on the countenance of the Church. Since the Church is in Christ like a sacrament or as a sign and instrument both of a very closely knit union with God and of the unity of the whole human race, it desires now to unfold more fully to the faithful of the Church and to the whole world its own inner nature and universal mission.' back

Matrix mechanics - Wikipedia, Matrix mechanics - Wikipedia, the free encyclopedia, 'Matrix mechanics is a formulation of quantum mechanics created by Werner Heisenberg, Max Born, and Pascual Jordan in 1925. Matrix mechanics was the first conceptually autonomous and logically consistent formulation of quantum mechanics. It extended the Bohr Model by describing how the quantum jumps occur. It did so by interpreting the physical properties of particles as matrices that evolve in time. It is equivalent to the Schrödinger wave formulation of quantum mechanics, and is the basis of Dirac's bra-ket notation for the wave function.' back

Matthew Schmaltz, What is the Great Commission and why is it so controversial?, ' Converting others to Christianity raises a fundamental question about whether religious diversity is a reality to be celebrated or an obstacle to be overcome. Given the complex history of missionary activity, the meaning of the Great Commission will continue to be a subject of debate as Christianity confronts a rapidly changing world.' back

Max Planck, On the Law of Distribution of Energy in the Normal Spectrum, Annalen der Physik, vol. 4, p. 553 ff (1901) 'The recent spectral measurements made by O. Lummer and E. Pringsheim and even more notable those by H. Rubens and F. Kurlbaum which together confirmed an earlier result obtained by H. Beckmann show that the law of energy distribution in the normal spectrum, first derived by W. Wien from molecular-kinetic considerations and later by me from the theory of electromagnetic radiation, is not valid generally.' back

Maxwell's equations - Wikipedia, Maxwell's equations - Wikipedia, the free encyclopedia, ' Maxwell's equations are a set of partial differential equations that, together with the Lorentz force law, form the foundation of classical electromagnetism, classical optics, and electric circuits. The equations provide a mathematical model for electric, optical and radio technologies, such as power generation, electric motors, wireless communication, lenses, radar etc. Maxwell's equations describe how electric and magnetic fields are generated by charges, currents, and changes of the fields. One important consequence of the equations is that they demonstrate how fluctuating electric and magnetic fields propagate at the speed of light.' back

Measurement problem - Wikipedia, Measurement problem - Wikipedia, the free encyclopedia, 'The measurement problem in quantum mechanics is the problem of how (or whether) wave function collapse occurs. The inability to observe this process directly has given rise to different interpretations of quantum mechanics, and poses a key set of questions that each interpretation must answer. The wave function in quantum mechanics evolves deterministically according to the Schrödinger equation as a linear superposition of different states, but actual measurements always find the physical system in a definite state. Any future evolution is based on the state the system was discovered to be in when the measurement was made, meaning that the measurement "did something" to the system that is not obviously a consequence of Schrödinger evolution.' back

Measurement uncertainty - Wikipedia, Measurement uncertainty - Wikipedia, the free encyclopedia, 'In metrology, measurement uncertainty is a non-negative parameter characterizing the dispersion of the values attributed to a measured quantity. The uncertainty has a probabilistic basis and reflects incomplete knowledge of the quantity. All measurements are subject to uncertainty and a measured value is only complete if it is accompanied by a statement of the associated uncertainty.' back

Meinard Kuhlmann (Stanford Encyclopedia of Philosophy), Quantum Field Theory, ' Quantum Field Theory (QFT) is the mathematical and conceptual framework for contemporary elementary particle physics. In a rather informal sense QFT is the extension of quantum mechanics (QM), dealing with particles, over to fields, i.e. systems with an infinite number of degrees of freedom. (See the entry on quantum mechanics.) In the last few years QFT has become a more widely discussed topic in philosophy of science, with questions ranging from methodology and semantics to ontology. QFT taken seriously in its metaphysical implications seems to give a picture of the world which is at variance with central classical conceptions of particles and fields, and even with some features of QM.' back

Michelson and Morley, On the relative motion of the earth and the luminiferous ether, The classic paper, 1887. They conclude that "It appears, from all that precedes, reasonably certain that if there be any relative motion between the earth and the luminiferous ether, it must be small; . . . " back

Middle term - Wikipedia, Middle term - Wikipedia, the free encyclopedia, 'In logic, a middle term is a term that appears (as a subject or predicate of a categorical proposition) in both premises but not in the conclusion of a categorical syllogism. The middle term (in bold below) must be distributed in at least one premise but not in the conclusion. The major term and the minor terms, also called the end terms, do appear in the conclusion.' back

Minkowski space - Wikipedia, Minkowski space - Wikipedia, the free encyclopedia, ' In mathematical physics, Minkowski space or Minkowski spacetime is a combination of Euclidean space and time into a four-dimensional manifold where the spacetime interval between any two events is independent of the inertial frame of reference in which they are recorded. Although initially developed by mathematician Hermann Minkowski for Maxwell's equations of electromagnetism, the mathematical structure of Minkowski spacetime was shown to be an immediate consequence of the postulates of special relativity.' back

Neuron - Wikipedia, Neuron - Wikipedia, the free encyclopedia, 'A neuron or nerve cell is an electrically excitable cell that communicates with other cells via specialized connections called synapses. It is the main component of nervous tissue in all animals except sponges and placozoa. Plants and fungi do not have nerve cells. . . . A typical neuron consists of a cell body (soma), dendrites, and a single axon. The soma is usually compact. The axon and dendrites are filaments that extrude from it. Dendrites typically branch profusely and extend a few hundred micrometers from the soma. The axon leaves the soma at a swelling called the axon hillock, and travels for as far as 1 meter in humans or more in other species.' back

New Testament - Wikipedia, New Testament - Wikipedia, the free encyclopedia, 'The New Testament (Koine Greek: Ἡ Καινὴ Διαθήκη, Hē Kainḕ Diathḗkē) is the second major division of the Christian biblical canon, the first such division being the much longer Old Testament.

Unlike the Old Testament or Hebrew Bible, of which Christians hold different views, the contents of the New Testament deal explicitly with 1st century Christianity, although both the Old and New Testament are regarded, together, as Sacred Scripture. The New Testament has therefore (in whole or in part) frequently accompanied the spread of Christianity around the world, and both reflects and serves as a source for Christian theology.' back

Nick Huggett (Stanford Encyclopedia of Philosophy), Zeno's Paradoxes, ' Almost everything that we know about Zeno of Elea is to be found in the opening pages of Plato's Parmenides. There we learn that Zeno was nearly 40 years old when Socrates was a young man, say 20. Since Socrates was born in 469 BC we can estimate a birth date for Zeno around 490 BC. Beyond this, really all we know is that he was close to Parmenides (Plato reports the gossip that they were lovers when Zeno was young), and that he wrote a book of paradoxes defending Parmenides' philosophy. Sadly this book has not survived, and what we know of his arguments is second-hand, principally through Aristotle and his commentators (here I have drawn particularly on Simplicius, who, though writing a thousand years after Zeno, apparently possessed at least some of his book).' back

Nicene Creed - Wikipedia, Nicene Creed - Wikipedia, the free encyclopedia, ' The Nicene Creed (Greek: Σύμβολον τῆς Νίκαιας, Latin: Symbolum Nicaenum) is the profession of faith or creed that is most widely used in Christian liturgy. It forms the mainstream definition of Christianity for most Christians. It is called Nicene because, in its original form, it was adopted in the city of Nicaea (present day Iznik in Turkey) by the first ecumenical council, which met there in the year 325. The Nicene Creed has been normative for the Catholic Church, the Eastern Orthodox Church, the Church of the East, the Oriental Orthodox churches, the Anglican Communion, and the great majority of Protestant denominations.' back

NIST, Kilogram, Mass and Planck's Constant, ' For many observers, the connection between mass on the scale of a liter of water and a constant deriving from the very earliest days of quantum mechanics may not be immediately obvious. The scientific context for that connection is suggested by a deep underlying relationship between two of the most celebrated formulations in physics. One is Einstein's famous E =mc2, where E is energy, m is mass and c is the speed of light. The other expression, less well known to the general public but fundamental to modern science, is E = hν, the first "quantum" expression in history, stated by Max Planck in 1900. Here, E is energy, ν is frequency (the ν is not a “v” but instead the lowercase Greek letter nu), and h is what is now known as the Planck constant.' back
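
Combining the two celebrated formulas in this entry, E = mc^2 and E = hν, gives the mass-frequency equivalence ν = mc^2/h that underlies the Planck-constant definition of the kilogram. The arithmetic for one kilogram, in plain Python:

    m = 1.0                  # kilograms
    c = 299_792_458.0        # speed of light, m/s (exact)
    h = 6.62607015e-34       # Planck constant, J s (exact in the 2019 SI)
    nu = m * c**2 / h
    print(f"{nu:.4e} Hz")    # ~1.3564e+50 Hz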

No-cloning theorem - Wikipedia, No-cloning theorem - Wikipedia, the free encyclopedia, ' In physics, the no-cloning theorem states that it is impossible to create an independent and identical copy of an arbitrary unknown quantum state, a statement which has profound implications in the field of quantum computing among others. The theorem is an evolution of the 1970 no-go theorem authored by James Park, in which he demonstrates that a non-disturbing measurement scheme which is both simple and perfect cannot exist . . .. The aforementioned theorems do not preclude the state of one system becoming entangled with the state of another as cloning specifically refers to the creation of a separable state with identical factors.' back

Nobelprize.org, The Nobel prize in Physics 1921: Albert Einstein, 'The Nobel Prize in Physics 1921 was awarded to Albert Einstein "for his services to Theoretical Physics, and especially for his discovery of the law of the photoelectric effect".' back

Noether's theorem - Wikipedia, Noether's theorem - Wikipedia, the free encyclopedia, 'Noether's (first) theorem states that any differentiable symmetry of the action of a physical system has a corresponding conservation law. The theorem was proved by German mathematician Emmy Noether in 1915 and published in 1918. The action of a physical system is the integral over time of a Lagrangian function (which may or may not be an integral over space of a Lagrangian density function), from which the system's behavior can be determined by the principle of least action.' back
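
A one-line illustration (my gloss, not part of the quoted text): for a free particle whose Lagrangian is invariant under the spatial translation x → x + ε, Noether's recipe delivers conservation of momentum:

    L = \tfrac{1}{2} m \dot{x}^{2}, \qquad x \to x + \epsilon \;\Rightarrow\; \delta L = 0,
    \qquad p = \frac{\partial L}{\partial \dot{x}} = m \dot{x}, \qquad \frac{dp}{dt} = 0.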

Nowak, Plotkin & Jansen (2000), The evolution of syntactic communication, 'Animal communication is typically non-syntactic, which means that signals refer to whole situations. Human language is syntactic, and signals consist of discrete components that have their own meaning. Syntax is a prerequisite for taking advantage of combinatorics, that is, "making infinite use of finite means". The vast expressive power of human language would be impossible without syntax, and the transition from non-syntactic to syntactic communication was an essential step in the evolution of human language. . . . ' back

Observable - Wikipedia, Observable - Wikipedia, the free encyclopedia, ' In physics, an observable is a physical quantity that can be measured. Examples include position and momentum. In systems governed by classical mechanics, it is a real-valued "function" on the set of all possible system states. In quantum physics, it is an operator, or gauge, where the property of the quantum state can be determined by some sequence of operations. . . . Physically meaningful observables must also satisfy transformation laws which relate observations performed by different observers in different frames of reference. These transformation laws are automorphisms of the state space, that is bijective transformations which preserve certain mathematical properties of the space in question. ' back

On shell and off shell - Wikipedia, On shell and off shell - Wikipedia, the free encyclopedia, ' In physics, particularly in quantum field theory, configurations of a physical system that satisfy classical equations of motion are called on shell, and those that do not are called off shell. In quantum field theory, virtual particles are termed off shell because they do not satisfy the energy–momentum relation; real exchange particles do satisfy this relation and are termed on shell (mass shell). In classical mechanics for instance, in the action formulation, extremal solutions to the variational principle are on shell and the Euler–Lagrange equations give the on-shell equations. Noether's theorem regarding differentiable symmetries of physical action and conservation laws is another on-shell theorem.' back

Original sin - Wikipedia, Original sin - Wikipedia, the free encyclopedia, 'Original sin, sometimes called ancestral sin, is, according to a doctrine proposed in Christian theology, humanity's state of sin resulting from the Fall of Man. This condition has been characterized in many ways, ranging from something as insignificant as a slight deficiency, or a tendency toward sin yet without collective guilt, referred to as a "sin nature," to something as drastic as total depravity or automatic guilt by all humans through collective guilt. Those who uphold this doctrine look to the teaching of Paul the Apostle in Romans 5:12-21 and 1 Corinthians 15:22 for its scriptural base, and see it as perhaps implied in an Old Testament passage Psalm 51:5.' back

Orthonormal basis - Wikipedia, Orthonormal basis - Wikipedia, the free encyclopedia, 'In mathematics, an orthonormal basis of an inner product space V (i.e., a vector space with an inner product), or in particular of a Hilbert space H, is a set of elements whose span is dense in the space, in which the elements are mutually orthogonal and of magnitude 1. Elements in an orthogonal basis do not have to have a magnitude of 1 but must be mutually perpendicular. It is easy to change the vectors in an orthogonal basis by scalar multiples to get an orthonormal basis, and indeed this is a typical way that an orthonormal basis is constructed.' back
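
The construction mentioned at the end of the entry, orthogonalizing a set of vectors and rescaling each to magnitude 1, can be sketched in a few lines. This modified Gram-Schmidt loop is my illustration, not drawn from the source:

    import numpy as np

    def orthonormalize(vectors, tol=1e-12):
        """Modified Gram-Schmidt: orthogonalize, then rescale to unit length."""
        basis = []
        for v in vectors:
            w = np.asarray(v, dtype=complex)
            for b in basis:
                w = w - np.vdot(b, w) * b      # remove the component along b
            norm = np.linalg.norm(w)
            if norm > tol:                     # drop vectors already spanned
                basis.append(w / norm)
        return basis

    e1, e2 = orthonormalize([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0]])
    assert abs(np.vdot(e1, e2)) < 1e-12        # mutually orthogonal, magnitude 1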

OSI model - Wikipedia, OSI model - Wikipedia, the free encyclopedia, 'The Open Systems Interconnection model (OSI model) is a conceptual model that characterizes and standardizes the communication functions of a telecommunication or computing system without regard to its underlying internal structure and technology. Its goal is the interoperability of diverse communication systems with standard protocols. The model partitions a communication system into abstraction layers. The original version of the model defined seven layers.' back

P. A. M. Dirac (1933), The Lagrangian in Quantum Mechanics, ' . . . there is an alternative formulation [to the Hamiltonian] in classical dynamics, provided by the Lagrangian. This requires one to work in terms of coordinates and velocities instead of coordinates and momenta. The two formulations are closely related but there are reasons for believing that the Lagrangian one is more fundamental. . . . Secondly the Lagrangian method can easily be expressed relativistically, on account of the action function being a relativistic invariant; . . . ' [This article was first published in Physikalische Zeitschrift der Sowjetunion, Band 3, Heft 1 (1933), pp. 64–72.] back

Pan, Bouwmeester, Daniell, Weinfurter & Zeilinger, Experimental test of quantum nonlocality in three-photon Greenberger–Horne–Zeilinger entanglement, ' Abstract Bell's theorem states that certain statistical correlations predicted by quantum physics for measurements on two-particle systems cannot be understood within a realistic picture based on local properties of each individual particle—even if the two particles are separated by large distances. Einstein, Podolsky and Rosen first recognized the fundamental significance of these quantum correlations (termed ‘entanglement’ by Schrödinger) and the two-particle quantum predictions have found ever-increasing experimental support. A more striking conflict between quantum mechanical and local realistic predictions (for perfect correlations) has been discovered; but experimental verification has been difficult, as it requires entanglement between at least three particles. Here we report experimental confirmation of this conflict, using our recently developed method to observe three-photon entanglement, or ‘Greenberger–Horne–Zeilinger’ (GHZ) states. The results of three specific experiments, involving measurements of polarization correlations between three photons, lead to predictions for a fourth experiment; quantum physical predictions are mutually contradictory with expectations based on local realism. We find the results of the fourth experiment to be in agreement with the quantum prediction and in striking conflict with local realism.' back

Papal supremacy - Wikipedia, Papal supremacy - Wikipedia, the free encyclopedia, 'Papal supremacy is the doctrine of the Roman Catholic Church that the pope, by reason of his office as Vicar of Christ and as pastor of the entire Christian Church, has full, supreme, and universal power over the whole Church, a power which he can always exercise unhindered: that, in brief, "the Pope enjoys, by divine institution, supreme, full, immediate, and universal power in the care of souls." (Code of Canon Law: Can. 331) The doctrine had the most significance in the relationship between the church and the temporal state, in matters such as ecclesiastic privileges, the actions of monarchs and even successions.' back

Particle physics and representation theory - Wikipedia, Particle physics and representation theory - Wikipedia, the free encyclopedia, ' There is a natural connection between particle physics and representation theory, as first noted in the 1930s by Eugene Wigner. It links the properties of elementary particles to the structure of Lie groups and Lie algebras. According to this connection, the different quantum states of an elementary particle give rise to an irreducible representation of the Poincaré group. Moreover, the properties of the various particles, including their spectra, can be related to representations of Lie algebras, corresponding to "approximate symmetries" of the universe.' back

Path integral formulation - Wikipedia, Path integral formulation - Wikipedia, the free encyclopedia, 'The path integral formulation of quantum mechanics is a description of quantum theory which generalizes the action principle of classical mechanics. It replaces the classical notion of a single, unique trajectory for a system with a sum, or functional integral, over an infinity of possible trajectories to compute a quantum amplitude. . . . This formulation has proved crucial to the subsequent development of theoretical physics, since it provided the basis for the grand synthesis of the 1970s which unified quantum field theory with statistical mechanics. . . . ' back

Patricia Curd (Stanford Encyclopedia of Philosophy), Anaxagoras, ' Anaxagoras of Clazomenae (a major Greek city of Ionian Asia Minor), a Greek philosopher of the 5th century B.C.E. (born ca. 500–480), was the first of the Presocratic philosophers to live in Athens. He propounded a physical theory of “everything-in-everything,” and claimed that nous (intellect or mind) was the motive cause of the cosmos. He was the first to give a correct explanation of eclipses, and was both famous and notorious for his scientific theories, including the claims that the sun is a mass of red-hot metal, that the moon is earthy, and that the stars are fiery stones.' back

PDG - U of California, Particle Data Group, ' The 2020 PDG collaboration consists of 237 authors and 5 technical associates from 174 institutions in 26 countries. It is led by a coordination team based mostly at Lawrence Berkeley National Laboratory (LBNL), which has served as PDG's headquarters since inception. . . . In the over 60 years since PDG started with the publication of the first wallet cards (W.H. Barkas and A.H. Rosenfeld, UCRL-8030), the Review of Particle Physics has become the most-cited publication in particle physics.' back

Permutation group - Wikipedia, Permutation group - Wikipedia, the free encyclopedia, 'In mathematics, a permutation group is a group G whose elements are permutations of a given set M and whose group operation is the composition of permutations in G (which are thought of as bijective functions from the set M to itself). The group of all permutations of a set M is the symmetric group of M, often written as Sym(M). The term permutation group thus means a subgroup of the symmetric group. If M = {1,2,...,n} then, Sym(M), the symmetric group on n letters is usually denoted by Sn. The way in which the elements of a permutation group permute the elements of the set is called its group action. Group actions have applications in the study of symmetries, combinatorics and many other branches of mathematics, physics and chemistry.' back
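
Composition as the group operation can be made concrete in a couple of lines (an illustrative sketch, not from the source):

    # A permutation of {0, ..., n-1} as a tuple: p[i] is the image of i.
    def compose(p, q):
        """Group operation in Sym(M): (p o q)(i) = p(q(i))."""
        return tuple(p[q[i]] for i in range(len(p)))

    identity = (0, 1, 2)
    r = (1, 2, 0)                                  # a 3-cycle in S3
    assert compose(r, compose(r, r)) == identity   # r has order 3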

Philosophiae Naturalis Principia Mathematica - Wikipedia, Philosophiae Naturalis Principia Mathematica - Wikipedia, the free encyclopedia, 'The Philosophiæ Naturalis Principia Mathematica (Latin: "mathematical principles of natural philosophy" often Principia or Principia Mathematica for short) is a three-volume work by Isaac Newton published on 5 July 1687. It contains the statement of Newton's laws of motion forming the foundation of classical mechanics, as well as his law of universal gravitation and a derivation of Kepler's laws for the motion of the planets (which were first obtained empirically). The Principia is widely regarded as one of the most important scientific works ever written.' back

Photon - Wikipedia, Photon - Wikipedia, the free encyclopedia, 'A photon is an elementary particle, the quantum of all forms of electromagnetic radiation including light. It is the force carrier for electromagnetic force, even when static via virtual photons. The photon has zero rest mass and as a result, the interactions of this force with matter at long distance are observable at the microscopic and macroscopic levels.' back

Planck constant - Wikipedia, Planck constant - Wikipedia, the free encyclopedia, ' Since energy and mass are equivalent, the Planck constant also relates mass to frequency. By 2017, the Planck constant had been measured with sufficient accuracy in terms of the SI base units, that it was central to replacing the metal cylinder, called the International Prototype of the Kilogram (IPK), that had defined the kilogram since 1889. . . . For this new definition of the kilogram, the Planck constant, as defined by the ISO standard, was set to 6.626 070 15 × 10^-34 J⋅s exactly. ' back

Planck units - Wikipedia, Planck units - Wikipedia, the free encyclopedia, ' In particle physics and physical cosmology, Planck units are a set of units of measurement defined exclusively in terms of four universal physical constants, in such a manner that these physical constants take on the numerical value of 1 when expressed in terms of these units. Originally proposed in 1899 by German physicist Max Planck, these units are also known as natural units because the origin of their definition comes only from properties of nature and not from any human construct. Planck units are only one system of several systems of natural units, but Planck units are not based on properties of any prototype object or particle (that would be arbitrarily chosen), but rather on only the properties of free space.' back

Planck-Einstein relation - Wikipedia, Planck-Einstein relation - Wikipedia, the free encyclopedia, 'The Planck–Einstein relation. . . refers to a formula integral to quantum mechanics, which states that the energy of a photon (E) is proportional to its frequency (ν). E = hν. The constant of proportionality, h, is known as the Planck constant.' back

Pleroma - Wikipedia, Pleroma - Wikipedia, the free encyclopedia, 'Pleroma (Greek πλήρωμα) generally refers to the totality of divine powers. The word means fullness from πληρόω ("I fill") comparable to πλήρης which means "full", and is used in Christian theological contexts: both in Gnosticism generally, and by St. Paul the Apostle in Colossians 2:9 (the word is used 17 times in the NT). Pleroma is also used in the general Greek language and is used by the Greek Orthodox Church in this general form since the word appears in the book of Colossians. Elaine Pagels of Princeton University views the reference in Colossians as something that was to be interpreted in the Gnostic sense.' back

Pope John Paul II (1996), Truth Cannot Contradict Truth, Address to the Pontifical Academy of Sciences October 22, 1996 , 'Consideration of the method used in the various branches of knowledge makes it possible to reconcile two points of view which would seem irreconcilable. The sciences of observation describe and measure the multiple manifestations of life with increasing precision and correlate them with the time line. The moment of transition to the spiritual cannot be the object of this kind of observation, which nevertheless can discover at the experimental level a series of very valuable signs indicating what is specific to the human being.' back

Pope Paul III (1546), Council of Trent: Decree Concerning Original Sin, '1. If anyone does not confess that the first man, Adam, when he transgressed the commandment of God in paradise, immediately lost the holiness and justice in which he had been constituted, and through the offense of that prevarication incurred the wrath and indignation of God, and thus death with which God had previously threatened him, and, together with death, captivity under his power who thenceforth had the empire of death, that is to say, the devil, and that the entire Adam through that offense of prevarication was changed in body and soul for the worse, let him be anathema.' back

Pope Paul VI (1964), Dogmatic Constitution: Lumen Gentium Chapter 7, ' § 48: . . . Already the final age of the world has come upon us (242) and the renovation of the world is irrevocably decreed and is already anticipated in some kind of a real way; for the Church already on this earth is signed with a sanctity which is real although imperfect. However, until there shall be new heavens and a new earth in which justice dwells, the pilgrim Church in her sacraments and institutions, which pertain to this present time, has the appearance of this world which is passing and she herself dwells among creatures who groan and travail in pain until now and await the revelation of the sons of God.' back

Portal: Mind and Brain - Wikipedia, Portal: Mind and Brain - Wikipedia, the free encyclopedia, 'Welcome to the Mind and Brain Portal. This is an interdisciplinary point of entry to such related fields as cognitive psychology, philosophy of mind, neuroscience, and linguistics.' back

Potential energy - Wikipedia, Potential energy - Wikipedia, the free encyclopedia, 'In physics, potential energy is the energy of an object or a system due to the position of the body or the arrangement of the particles of the system. The SI unit for measuring work and energy is the joule (symbol J). The term potential energy was coined by the 19th century Scottish engineer and physicist William Rankine although it has links to Greek philosopher Aristotle's concept of potentiality. Potential energy is associated with a set of forces that act on a body in a way that depends only on the body's position in space.' back

Potentiality and actuality - Wikipedia, Potentiality and actuality - Wikipedia, the free encyclopedia, 'In philosophy, Potentiality and Actuality are principles of a dichotomy which Aristotle used to analyze motion, causality, ethics, and physiology in his Physics, Metaphysics, Ethics and De Anima (which is about the human psyche). The concept of potentiality, in this context, generally refers to any "possibility" that a thing can be said to have. Aristotle did not consider all possibilities the same, and emphasized the importance of those that become real of their own accord when conditions are right and nothing stops them. Actuality, in contrast to potentiality, is the motion, change or activity that represents an exercise or fulfillment of a possibility, when a possibility becomes real in the fullest sense.' back

Projective Hilbert space - Wikipedia, Projective Hilbert space - Wikipedia, the free encyclopedia, ' In mathematics and the foundations of quantum mechanics, the projective Hilbert space P(H) of a complex Hilbert space H is the set of equivalence classes of non-zero vectors v in H, for the relation ∼ on H given by w ∼ v if and only if v = λw for some non-zero complex number λ. The equivalence classes of v for the relation ∼ are also called rays or projective rays. This is the usual construction of projectivization, applied to a complex Hilbert space.' back

Pseudo-Dionysius the Areopagite - Wikipedia, Pseudo-Dionysius the Areopagite - Wikipedia, the free encyclopedia, 'Pseudo-Dionysius the Areopagite (Greek: Διονύσιος ὁ Ἀρεοπαγίτης), also known as Pseudo-Denys, was a Christian theologian and philosopher of the late 5th to early 6th century (writing before 532), probably Syrian, the author of the set of works commonly referred to as the Corpus Areopagiticum or Corpus Dionysiacum. The author pseudonymously identifies himself in the corpus as "Dionysios", portraying himself as the figure of Dionysius the Areopagite, the Athenian convert of St. Paul mentioned in Acts 17:34. This false attribution resulted in the work being given great authority in subsequent theological writing in both East and West, with its influence only decreasing in the West with the fifteenth century demonstration of its later dating.' back

Psychophysical parallelism - Wikipedia, Psychophysical parallelism - Wikipedia, the free encyclopedia, ' In the philosophy of mind, psychophysical parallelism (or simply parallelism) is the theory that mental and bodily events are perfectly coordinated, without any causal interaction between them. As such, it affirms the correlation of mental and bodily events (since it accepts that when a mental event occurs, a corresponding physical effect occurs as well), but denies a direct cause and effect relation between mind and body. This coordination of mental and bodily events has been postulated to occur either in advance by means of God (as per Gottfried Leibniz's idea of pre-established harmony) or at the time of the event (as in the occasionalism of Nicolas Malebranche) or, finally, according to Baruch Spinoza's Ethics, mind and matter are two of infinite attributes of the only Substance-God, which go as one without interacting with each other. On this view, mental and bodily phenomena are independent yet inseparable, like two sides of a coin.' back

Pythagorean theorem - Wikipedia, Pythagorean theorem - Wikipedia, the free encyclopedia, 'In any right triangle, the area of the square whose side is the hypotenuse (the side opposite the right angle) is equal to the sum of the areas of the squares whose sides are the two legs (the two sides that meet at a right angle). This is usually summarized as: The square on the hypotenuse is equal to the sum of the squares on the other two sides.' back
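
In symbols, with legs a and b and hypotenuse c, checked on the familiar 3-4-5 triangle:

    c^{2} = a^{2} + b^{2}, \qquad 3^{2} + 4^{2} = 9 + 16 = 25 = 5^{2}.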

Quantum entanglement - Wikipedia, Quantum entanglement - Wikipedia, the free encyclopedia, 'Quantum entanglement is a physical phenomenon which occurs when pairs or groups of particles are generated, interact, or share spatial proximity in ways such that the quantum state of each particle cannot be described independently of the state of the other(s), even when the particles are separated by a large distance—instead, a quantum state must be described for the system as a whole. . . . Entanglement is considered fundamental to quantum mechanics, even though it wasn't recognized in the beginning. Quantum entanglement has been demonstrated experimentally with photons, neutrinos, electrons, molecules as large as buckyballs, and even small diamonds. The utilization of entanglement in communication and computation is a very active area of research.' back

Quantum harmonic oscillator - Wikipedia, Quantum harmonic oscillator - Wikipedia, the free encyclopedia, 'The quantum harmonic oscillator is the quantum-mechanical analog of the classical harmonic oscillator. Because an arbitrary potential can usually be approximated as a harmonic potential at the vicinity of a stable equilibrium point, it is one of the most important model systems in quantum mechanics. Furthermore, it is one of the few quantum-mechanical systems for which an exact, analytical solution is known.' back
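
The exact solution mentioned here is the energy ladder E_n = ℏω(n + ½). A small numerical sketch (mine, not from the source) builds the annihilation operator as a matrix and recovers the ladder, in units where ℏ = ω = 1:

    import numpy as np

    # Truncated matrix form of the harmonic oscillator: a|n> = sqrt(n)|n-1>.
    N = 12
    a = np.diag(np.sqrt(np.arange(1, N)), k=1)   # annihilation operator
    H = a.conj().T @ a + 0.5 * np.eye(N)         # H = a†a + 1/2

    print(np.linalg.eigvalsh(H)[:5])             # [0.5 1.5 2.5 3.5 4.5]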

Quantum nonlocality - Wikipedia, Quantum nonlocality - Wikipedia, the free encyclopedia, 'Quantum nonlocality is the phenomenon by which the measurements made at a microscopic level necessarily refute one or more notions (often referred to as local realism) that are regarded as intuitively true in classical mechanics. Rigorously, quantum nonlocality refers to quantum mechanical predictions of many-system measurement correlations that cannot be simulated by any local hidden variable theory. Many entangled quantum states produce such correlations when measured, as demonstrated by Bell's theorem.' back

Quantum state - Wikipedia, Quantum state - Wikipedia, the free encyclopedia, 'In quantum physics, a quantum state is a mathematical entity that provides a probability distribution for the outcomes of each possible measurement on a system. Knowledge of the quantum state together with the rules for the system's evolution in time exhausts all that can be predicted about the system's behavior. A mixture of quantum states is again a quantum state. Quantum states that cannot be written as a mixture of other states are called pure quantum states, while all other states are called mixed quantum states. A pure quantum state can be represented by a ray in a Hilbert space over the complex numbers, while mixed states are represented by density matrices, which are positive semidefinite operators that act on Hilbert spaces.' back

Qubit - Wikipedia, Qubit - Wikipedia, the free encyclopedia, 'A quantum bit, or qubit . . . is a unit of quantum information. That information is described by a state vector in a two-level quantum mechanical system which is formally equivalent to a two-dimensional vector space over the complex numbers. Benjamin Schumacher discovered a way of interpreting quantum states as information. He came up with a way of compressing the information in a state, and storing the information on a smaller number of states. This is now known as Schumacher compression. In the acknowledgments of his paper (Phys. Rev. A 51, 2738), Schumacher states that the term qubit was invented in jest, during his conversations with Bill Wootters.' back
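
The state the entry describes is a unit vector α|0⟩ + β|1⟩ with |α|² + |β|² = 1; the squared magnitudes are the measurement probabilities. A minimal sketch (illustrative only):

    import numpy as np

    # A qubit as a normalized 2-component complex vector.
    psi = np.array([1 + 1j, 2 - 1j], dtype=complex)
    psi /= np.linalg.norm(psi)                # enforce |alpha|^2 + |beta|^2 = 1

    p0, p1 = abs(psi[0])**2, abs(psi[1])**2   # Born-rule probabilities
    assert np.isclose(p0 + p1, 1.0)
    print(f"P(0) = {p0:.3f}, P(1) = {p1:.3f}")   # P(0) = 0.286, P(1) = 0.714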

Renormalization - Wikipedia, Renormalization - Wikipedia, the free encyclopedia, ' Renormalization is a collection of techniques in quantum field theory, the statistical mechanics of fields, and the theory of self-similar geometric structures, that are used to treat infinities arising in calculated quantities by altering values of quantities to compensate for effects of their self-interactions. But even if it were the case that no infinities arose in loop diagrams in quantum field theory, it could be shown that renormalization of mass and fields appearing in the original Lagrangian is necessary.' back

Renormalization group - Wikipedia, Renormalization group - Wikipedia, the free encyclopedia, 'In theoretical physics, renormalization group (RG) refers to a mathematical apparatus that allows one to investigate the changes of a physical system as one views it at different distance scales. In particle physics it reflects the changes in the underlying force laws as one varies the energy scale at which physical processes occur. A change in scale is called a "scale transformation" or "conformal transformation." The renormalization group is intimately related to "conformal invariance" or "scale invariance," a symmetry by which the system appears the same at all scales (so-called self-similarity).' back

Ribosome - Wikipedia, Ribosome - Wikipedia, the free encyclopedia, ' Ribosomes are macromolecular machines, found within all living cells, that perform biological protein synthesis (mRNA translation). Ribosomes link amino acids together in the order specified by the codons of messenger RNA molecules to form polypeptide chains. Ribosomes consist of two major components: the small and large ribosomal subunits. Each subunit consists of one or more ribosomal RNA molecules and many ribosomal proteins.' back

Richard P. Feynman (1948), Space-Time Approach to Non-Relativistic Quantum Mechanics, ' Abstract Non-relativistic quantum mechanics is formulated here in a different way. It is, however, mathematically equivalent to the familiar formulation. In quantum mechanics the probability of an event which can happen in several different ways is the absolute square of a sum of complex contributions, one from each alternative way. The probability that a particle will be found to have a path x(t) lying somewhere within a region of space time is the square of a sum of contributions, one from each path in the region. The contribution from a single path is postulated to be an exponential whose (imaginary) phase is the classical action (in units of ℏ) for the path in question. The total contribution from all paths reaching x, t from the past is the wave function ψ(x, t). This is shown to satisfy Schroedinger's equation. The relation to matrix and operator algebra is discussed. Applications are indicated, in particular to eliminate the coordinates of the field oscillators from the equations of quantum electrodynamics.' back

Richard P. Feynman, Nobel Lecture: The Development of the Space-Time View of Quantum Electrodynamics, Nobel Lecture, December 11, 1965: 'We have a habit in writing articles published in scientific journals to make the work as finished as possible, to cover all the tracks, to not worry about the blind alleys or to describe how you had the wrong idea first, and so on. So there isn’t any place to publish, in a dignified manner, what you actually did in order to get to do the work, although, there has been in these days, some interest in this kind of thing. Since winning the prize is a personal thing, I thought I could be excused in this particular situation, if I were to talk personally about my relationship to quantum electrodynamics, rather than to discuss the subject itself in a refined and finished fashion. Furthermore, since there are three people who have won the prize in physics, if they are all going to be talking about quantum electrodynamics itself, one might become bored with the subject. So, what I would like to tell you about today are the sequence of events, really the sequence of ideas, which occurred, and by which I finally came out the other end with an unsolved problem for which I ultimately received a prize.' back

Robin Smith (Stanford Encyclopedia of Philosophy), Aristotle's Logic, 'Aristotle's logic, especially his theory of the syllogism, has had an unparalleled influence on the history of Western thought. It did not always hold this position: in the Hellenistic period, Stoic logic, and in particular the work of Chrysippus, took pride of place. However, in later antiquity, following the work of Aristotelian Commentators, Aristotle's logic became dominant, and Aristotelian logic was what was transmitted to the Arabic and the Latin medieval traditions, while the works of Chrysippus have not survived.' back

Rolf Landauer (1961), Irreversibility and Heat Generation in the Computing Process, ' Abstract: It is argued that computing machines inevitably involve devices which perform logical functions that do not have a single-valued inverse. The logical irreversibility is associated with physical irreversibility, and requires a minimum heat generation, per machine cycle, typically of the order of kT for each irreversible function. The dissipation serves the purpose of standardizing signals and making them independent of their exact logical history. Two simple, but representative, models of bistable devices are subjected to a more detailed analysis of switching kinetics to yield the relationship between speed and energy dissipation, and to estimate the effects of errors induced by thermal fluctuations.' back
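
The "order of kT" figure in the abstract is, in its sharp form, kT ln 2 of heat per erased bit. The arithmetic at room temperature (my check, not from the paper):

    import math

    k_B = 1.380_649e-23              # Boltzmann constant, J/K (exact SI value)
    T = 300.0                        # room temperature, K

    E_bit = k_B * T * math.log(2)    # minimum dissipation per irreversible bit
    print(f"Landauer limit at {T:.0f} K: {E_bit:.3e} J")   # ~2.871e-21 J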

Rolf Landauer (1999), Information is a Physical Entity, 'Abstract: This paper, associated with a broader conference talk on the fundamental physical limits of information handling, emphasizes the aspects still least appreciated. Information is not an abstract entity but exists only through a physical representation, thus tying it to all the restrictions and possibilities of our real physical universe. The mathematician's vision of an unlimited sequence of totally reliable operations is unlikely to be implementable in this real universe. Speculative remarks about the possible impact of that on the ultimate nature of the laws of physics are included.' back

Roulette - Wikipedia, Roulette - Wikipedia, the free encyclopedia, ' Roulette is a casino game named after the French word meaning little wheel which was likely developed from the Italian game Biribi. In the game, players may choose to place bets on either a single number, various groupings of numbers, . . . To determine the winning number, a croupier spins a wheel in one direction, then spins a ball in the opposite direction around a tilted circular track running around the outer edge of the wheel. The ball eventually loses momentum, passes through an area of deflectors, and falls onto the wheel and into one of [the] colored and numbered pockets on the wheel.' back

Russell Shorto, The Irish Affliction, 'Of the various crises the Catholic Church is facing around the world, the central one — wave after wave of accounts of systemic sexual abuse of children by priests and other church figures — has affected Ireland more strikingly than anywhere else. And no place has reacted so aggressively. The Irish responded to the publication in 2009 of two lengthy, damning reports — detailing thousands of cases of rape, sexual molestation and lurid beatings, spanning Ireland’s entire history as an independent country, and the efforts of church officials to protect the abusers rather than the victims — with anger, disgust, vocal assaults on priests in public and demands that the government and society disentangle themselves from the church.' back

Salart, Baas, Branciard, Gisin & Zbinden (2008), Testing the speed of 'spooky action at a distance', ' Abstract Correlations are generally described by one of two mechanisms: either a first event influences a second one by sending information encoded in bosons or other physical carriers, or the correlated events have some common causes in their shared history. Quantum physics predicts an entirely different kind of cause for some correlations, named entanglement. This reveals itself in correlations that violate Bell inequalities (implying that they cannot be described by common causes) between space-like separated events (implying that they cannot be described by classical communication). Many Bell tests have been performed, and loopholes related to locality and detection have been closed in several independent experiments. It is still possible that a first event could influence a second, but the speed of this hypothetical influence (Einstein’s ‘spooky action at a distance’) would need to be defined in some universal privileged reference frame and be greater than the speed of light. Here we put stringent experimental bounds on the speed of all such hypothetical influences. We performed a Bell test over more than 24 hours between two villages separated by 18 km and approximately east–west oriented, with the source located precisely in the middle. We continuously observed two-photon interferences well above the Bell inequality threshold. Taking advantage of the Earth’s rotation, the configuration of our experiment allowed us to determine, for any hypothetically privileged frame, a lower bound for the speed of the influence. For example, if such a privileged reference frame exists and is such that the Earth’s speed in this frame is less than 10^-3 times that of the speed of light, then the speed of the influence would have to exceed that of light by at least four orders of magnitude.' back

Salart, Baas, Branciard, Gisin, & Zbinden 2008, Testing spooky action at a distance, ' In science, one observes correlations and invents theoretical models that describe them. In all sciences, besides quantum physics, all correlations are described by either of two mechanisms. Either a first event influences a second one by sending some information encoded in bosons or molecules or other physical carriers, depending on the particular science. Or the correlated events have some common causes in their common past. Interestingly, quantum physics predicts an entirely different kind of cause for some correlations, named entanglement. This new kind of cause reveals itself, e.g., in correlations that violate Bell inequalities (hence cannot be described by common causes) between space-like separated events (hence cannot be described by classical communication). Einstein branded it as spooky action at a distance. A real spooky action at a distance would require a faster than light influence defined in some hypothetical universally privileged reference frame. Here we put stringent experimental bounds on the speed of all such hypothetical influences. We performed a Bell test during more than 24 hours between two villages separated by 18 km and approximately east-west oriented, with the source located precisely in the middle. We continuously observed 2-photon interferences well above the Bell inequality threshold. Taking advantage of the Earth's rotation, the configuration of our experiment allowed us to determine, for any hypothetically privileged frame, a lower bound for the speed of this spooky influence. For instance, if such a privileged reference frame exists and is such that the Earth's speed in this frame is less than 10^-3 that of the speed of light, then the speed of this spooky influence would have to exceed that of light by at least 4 orders of magnitude.' back

Schrödinger equation - Wikipedia, Schrödinger equation - Wikipedia, the free encyclopedia, ' In quantum mechanics, the Schrödinger equation is a partial differential equation that describes how the quantum state of a quantum system changes with time. It was formulated in late 1925, and published in 1926, by the Austrian physicist Erwin Schrödinger. . . . In classical mechanics Newton's second law, (F = ma), is used to mathematically predict what a given system will do at any time after a known initial condition. In quantum mechanics, the analogue of Newton's law is Schrödinger's equation for a quantum system (usually atoms, molecules, and subatomic particles whether free, bound, or localized). It is not a simple algebraic equation, but in general a linear partial differential equation, describing the time-evolution of the system's wave function (also called a "state function").' back

Second law of thermodynamics - Wikipedia, Second law of thermodynamics - Wikipedia, the free encyclopedia, 'The second law of thermodynamics states that in a natural thermodynamic process, there is an increase in the sum of the entropies of the participating systems. The second law is an empirical finding that has been accepted as an axiom of thermodynamic theory.' back

Sheffer stroke - Wikipedia, Sheffer stroke - Wikipedia, the free encyclopedia, 'In Boolean functions and propositional calculus, the Sheffer stroke, named after Henry M. Sheffer, written "|" . . . denotes a logical operation that is equivalent to the negation of the conjunction operation, expressed in ordinary language as "not both". It is also called nand ("not and") or the alternative denial, since it says in effect that at least one of its operands is false.' back
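
That the stroke suffices to generate all of Boolean logic can be checked mechanically; negation, conjunction and disjunction all reduce to NAND (an illustrative sketch, not from the source):

    def nand(a: bool, b: bool) -> bool:
        """Sheffer stroke: 'not both'."""
        return not (a and b)

    def NOT(a):    return nand(a, a)
    def AND(a, b): return nand(nand(a, b), nand(a, b))
    def OR(a, b):  return nand(nand(a, a), nand(b, b))

    bits = (False, True)
    assert all(NOT(a) == (not a) for a in bits)
    assert all(AND(a, b) == (a and b) for a in bits for b in bits)
    assert all(OR(a, b) == (a or b) for a in bits for b in bits)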

Shelter Island Conference - Wikipedia, Shelter Island Conference - Wikipedia, the free encyclopedia, ' The first Shelter Island Conference on the Foundations of Quantum Mechanics was held from June 2–4, 1947 at the Ram's Head Inn in Shelter Island, New York. Shelter Island was the first major opportunity since Pearl Harbor and the Manhattan Project for the leaders of the American physics community to gather after the war. As Julian Schwinger would later recall, "It was the first time that people who had all this physics pent up in them for five years could talk to each other without somebody peering over their shoulders and saying, 'Is this cleared?'" ' back

Singlet - Wikipedia, Singlet - Wikipedia, the free encyclopedia, 'In theoretical physics, a singlet usually refers to a one-dimensional representation (e.g. a particle with vanishing spin). It may also refer to two or more particles prepared in a correlated state, such that the total angular momentum of the state is zero. Singlets frequently occur in atomic physics as one of the two ways in which the spin of two electrons can be combined; the other being a triplet. A single electron has spin 1/2, and transforms as a doublet, that is, as the fundamental representation of the rotation group SU(2). The product of two doublet representations can be decomposed into the sum of the adjoint representation (the triplet) and the trivial representation, the singlet. More prosaically, a pair of electron spins can be combined to form a state of total spin 1 and a state of spin 0. The singlet state formed from a pair of electrons has many peculiar properties, and plays a fundamental role in the EPR paradox and quantum entanglement.' back

Spacetime - Wikipedia, Spacetime - Wikipedia, the free encyclopedia, 'In physics, spacetime is any mathematical model that combines space and time into a single construct called the space-time continuum. Spacetime is usually interpreted with space being three-dimensional and time playing the role of the fourth dimension. According to Euclidean space perception, the universe has three dimensions of space, and one dimension of time. By combining space and time into a single manifold, physicists have significantly simplified a large amount of physical theory, as well as described in a more uniform way the workings of the universe at both the supergalactic and subatomic levels.' back

Special relativity - Wikipedia, Special relativity - Wikipedia, the free encyclopedia, ' Special relativity . . . is the physical theory of measurement in an inertial frame of reference proposed in 1905 by Albert Einstein (after the considerable and independent contributions of Hendrik Lorentz, Henri Poincaré and others) in the paper "On the Electrodynamics of Moving Bodies". It generalizes Galileo's principle of relativity—that all uniform motion is relative, and that there is no absolute and well-defined state of rest (no privileged reference frames)—from mechanics to all the laws of physics, including both the laws of mechanics and of electrodynamics, whatever they may be. Special relativity incorporates the principle that the speed of light is the same for all inertial observers regardless of the state of motion of the source.' back

Standard model - Wikipedia, Standard model - Wikipedia, the free encyclopedia, 'The Standard Model of particle physics is a theory that describes three of the four known fundamental interactions between the elementary particles that make up all matter. It is a quantum field theory developed between 1970 and 1973 which is consistent with both quantum mechanics and special relativity. To date, almost all experimental tests of the three forces described by the Standard Model have agreed with its predictions. However, the Standard Model falls short of being a complete theory of fundamental interactions, primarily because of its lack of inclusion of gravity, the fourth known fundamental interaction, but also because of the large number of numerical parameters (such as masses and coupling constants) that must be put "by hand" into the theory (rather than being derived from first principles) . . . ' back

Symmetry - Wikipedia, Symmetry - Wikipedia, the free encyclopedia, 'Symmetry (from Greek συμμετρία symmetria "agreement in dimensions, due proportion, arrangement") in everyday language refers to a sense of harmonious and beautiful proportion and balance. In mathematics, "symmetry" has a more precise definition, that an object is invariant to a transformation, such as reflection but including other transforms too. Although these two meanings of "symmetry" can sometimes be told apart, they are related, so they are here discussed together.' back

Telecommunications industry - Wikipedia, Telecommunications industry - Wikipedia, the free encyclopedia, ' The telecommunications industries within the sector of information and communication technology is made up of all telecommunications/telephone companies and internet service providers and plays the crucial role in the evolution of mobile communications and the information society. . . . Digital subscriber line (DSL) is the main broadband telecom technology. The fastest growth comes from (value-added) services delivered over mobile networks. . . . Think of telecommunications as the world's biggest machine. Strung together by complex networks, telephones, mobile phones and internet-linked PCs, the global system touches nearly all of us. [Investopedia]' back

Thomas Ainsworth (Stanford Encyclopedia of Philosophy), Form vs. Matter, 'Aristotle famously contends that every physical object is a compound of matter and form. This doctrine has been dubbed “hylomorphism”, a portmanteau of the Greek words for matter (hulê) and form (eidos or morphê). Highly influential in the development of Medieval philosophy, Aristotle’s hylomorphism has also enjoyed something of a renaissance in contemporary metaphysics.' back

Thomas Aquinas Summa Theologiae I, 1, 2, Is sacred doctrine a science?, 'I answer that, Sacred doctrine is a science. We must bear in mind that there are two kinds of sciences. There are some which proceed from a principle known by the natural light of intelligence, such as arithmetic and geometry and the like. There are some which proceed from principles known by the light of a higher science: thus the science of perspective proceeds from principles established by geometry, and music from principles established by arithmetic. So it is that sacred doctrine is a science because it proceeds from principles established by the light of a higher science, namely, the science of God and the blessed.' back

Thomas Aquinas, Summa I, 3 Proemium (Latin), Summa I, 3: On the simplicity of God, ' (translated) When we know of something that it exists, it remains to inquire how it is, so that we may know what it is. But because we cannot know of God what he is, but rather what he is not, we cannot consider how God is, but rather how he is not. First, therefore, we must consider how he is not; . . . It can be shown how God is not by removing from him whatever does not befit him, such as composition, motion, and the like. First, then, we inquire into his simplicity, by which composition is removed from him. . . . Concerning the first, eight questions are asked. First, whether God is a body. Second, whether there is in him composition of form and matter. Third, whether there is in him composition of quiddity, essence or nature, and subject. Fourth, whether there is in him composition of essence and existence. Fifth, whether there is in him composition of genus and difference. Sixth, whether there is in him composition of subject and accident. Seventh, whether he is in any way composite, or entirely simple. Eighth, whether he enters into composition with other things.' back

Thomas Aquinas, Summa, I, 2, 3, Does God exist?, 'I answer that, The existence of God can be proved in five ways. The first and more manifest way is the argument from motion. . . . ' back

Thomas Aquinas, Summa, I, 3, Introduction, ' When the existence of a thing has been ascertained there remains the further question of the manner of its existence, in order that we may know its essence. Now, because we cannot know what God is, but rather what He is not, we have no means for considering how God is, but rather how He is not. . . . ' back

Topology - Wikipedia, Topology - Wikipedia, the free encyclopedia, 'In mathematics, topology (from the Greek τόπος, place, and λόγος, study) is concerned with the properties of space that are preserved under continuous deformations, such as stretching, crumpling and bending, but not tearing or gluing. . . . The motivating insight behind topology is that some geometric problems depend not on the exact shape of the objects involved, but rather on the way they are put together. For example, the square and the circle have many properties in common: they are both one dimensional objects (from a topological point of view) and both separate the plane into two parts, the part inside and the part outside.' back

Transfinite numbers - Wikipedia, Transfinite numbers - Wikipedia, the free encyclopedia, 'Transfinite numbers are cardinal numbers or ordinal numbers that are larger than all finite numbers, yet not necessarily absolutely infinite. The term transfinite was coined by Georg Cantor, who wished to avoid some of the implications of the word infinite in connection with these objects, which were nevertheless not finite. Few contemporary workers share these qualms; it is now accepted usage to refer to transfinite cardinals and ordinals as "infinite". However, the term "transfinite" also remains in use.' back

Tree of life (biology) - Wikipedia, Tree of life (biology) - Wikipedia, the free encyclopedia, 'The tree of life or universal tree of life is a metaphor used to describe the relationships between organisms, both living and extinct, as described in a famous passage in Charles Darwin's On the Origin of Species (1859).' back

Trinity - Wikipedia, Trinity - Wikipedia, the free encyclopedia, 'The Christian doctrine of the Trinity (from Latin trinitas "triad", from trinus "threefold") defines God as three consubstantial persons, expressions, or hypostases: the Father, the Son (Jesus Christ), and the Holy Spirit; "one God in three persons". The three persons are distinct, yet are one "substance, essence or nature" (homoousios). In this context, a "nature" is what one is, while a "person" is who one is.' back

Triple-alpha process - Wikipedia, Triple-alpha process - Wikipedia, the free encyclopedia, 'The triple-alpha process is a set of nuclear fusion reactions by which three helium-4 nuclei (alpha particles) are transformed into carbon.' back

Truth table - Wikipedia, Truth table - Wikipedia, the free encyclopedia, ' A truth table is a mathematical table used in logic—specifically in connection with Boolean algebra, boolean functions, and propositional calculus—which sets out the functional values of logical expressions on each of their functional arguments, that is, for each combination of values taken by their logical variables (Enderton, 2001). In particular, truth tables can be used to show whether a propositional expression is true for all legitimate input values, that is, logically valid.' back
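
Tabulating an expression over every combination of its variables, and so testing logical validity, takes only a few lines of code (my sketch, not from the source):

    from itertools import product

    def is_tautology(expr, n):
        """Print the truth table of expr over all 2^n rows; report validity."""
        rows = [(vals, expr(*vals)) for vals in product((False, True), repeat=n)]
        for vals, out in rows:
            print(*vals, '->', out)
        return all(out for _, out in rows)

    # Material implication p -> q, written as (not p) or q: not a tautology,
    # because the row p=True, q=False evaluates to False.
    print(is_tautology(lambda p, q: (not p) or q, 2))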

Turing machine - Wikipedia, Turing machine - Wikipedia, the free encyclopedia, ' A Turing machine is a hypothetical device that manipulates symbols on a strip of tape according to a table of rules. Despite its simplicity, a Turing machine can be adapted to simulate the logic of any computer algorithm, and is particularly useful in explaining the functions of a CPU inside a computer. The "machine" was invented in 1936 by Alan Turing, who called it an "a-machine" (automatic machine). The Turing machine is not intended as practical computing technology, but rather as a hypothetical device representing a computing machine. Turing machines help computer scientists understand the limits of mechanical computation.' back
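
A machine of the kind described, a head moving over a tape under a table of rules, takes only a few lines to simulate. This toy machine (my construction, purely illustrative) flips the bits of its input and halts at the first blank:

    def run(rules, tape, state='start', blank='_', max_steps=10_000):
        """Simulate a one-tape Turing machine.
        rules: (state, symbol) -> (new_state, symbol_to_write, head_move)."""
        cells, head = dict(enumerate(tape)), 0
        for _ in range(max_steps):
            if state == 'halt':
                break
            state, cells[head], move = rules[(state, cells.get(head, blank))]
            head += move
        return ''.join(cells[i] for i in sorted(cells))

    flip = {('start', '0'): ('start', '1', +1),
            ('start', '1'): ('start', '0', +1),
            ('start', '_'): ('halt',  '_', 0)}

    print(run(flip, '10110'))   # -> 01001_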

Turing's Proof - Wikipedia, Turing's Proof - Wikipedia, the free encyclopedia, 'Turing's proof is a proof by Alan Turing, first published in January 1937 with the title "On Computable Numbers, with an Application to the Entscheidungsproblem." It was the second proof of the assertion (Alonzo Church's proof was first) that some decision problems are "undecidable": there is no single algorithm that infallibly gives a correct "yes" or "no" answer to each instance of the problem. In his own words: "...what I shall prove is quite different from the well-known results of Gödel ... I shall now show that there is no general method which tells whether a given formula U is provable in K [Principia Mathematica]..." ' back

Ullin T. Place, Is Consciousness a Brain Process?, ' The thesis that consciousness is a process in the brain is put forward as a reasonable scientific hypothesis, not to be dismissed on logical grounds alone. The conditions under which two sets of observations are treated as observations of the same process, rather than as observations of two independent correlated processes, are discussed. It is suggested that we can identify consciousness with a given pattern of brain activity, if we can explain the subject's introspective observations by reference to the brain processes with which they are correlated. It is argued that the problem of providing a physiological explanation of introspective observations is made to seem more difficult than it really is by the ‘phenomenological fallacy’, the mistaken idea that descriptions of the appearances of things are descriptions of the actual state of affairs in a mysterious internal environment.' back

Unitarity (physics) - Wikipedia, Unitarity (physics) - Wikipedia, the free encyclopedia, ' In quantum physics, unitarity means that the sum of the probabilities of all possible outcomes of any event is always 1. This is necessary for the theory to be consistent. This implies that the operator which describes the progress of a physical system in time must be a unitary operator. This operator is e^{iHt} where H is the Hamiltonian of the system and t is time.' back

Unitary operator - Wikipedia, Unitary operator - Wikipedia, the free encyclopedia, ' In functional analysis, a branch of mathematics, a unitary operator . . . is a bounded linear operator U : H → H on a Hilbert space H satisfying UU* = U*U = I where U* is the adjoint of U, and I : H → H is the identity operator. This property is equivalent to the following: 1. U preserves the inner product ( , ) of the Hilbert space, i.e. for all vectors x and y in the Hilbert space, (Ux, Uy) = (x, y) and
2. U is surjective.' back
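
This definition connects to the time-evolution operator of the previous entry: for any Hermitian H, the operator e^{iHt} satisfies UU* = U*U = I. A numerical check (an illustrative sketch using numpy and scipy, not from either source):

    import numpy as np
    from scipy.linalg import expm

    rng = np.random.default_rng(0)
    A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
    H = (A + A.conj().T) / 2          # an arbitrary Hermitian "Hamiltonian"

    U = expm(1j * H * 0.7)            # U = e^{iHt} with t = 0.7
    I = np.eye(4)
    assert np.allclose(U @ U.conj().T, I) and np.allclose(U.conj().T @ U, I)
    # U also preserves inner products: (Ux, Uy) = (x, y) for all x, y.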

Unmoved mover - Wikipedia, Unmoved mover - Wikipedia, the free encyclopedia, ' The unmoved mover (Ancient Greek: ὃ οὐ κινούμενον κινεῖ, romanized: ho ou kinoúmenon kineî, lit. 'that which moves without being moved') or prime mover (Latin: primum movens) is a concept advanced by Aristotle as a primary cause (or first uncaused cause) or "mover" of all the motion in the universe. As is implicit in the name, the unmoved mover moves other things, but is not itself moved by any prior action. In Book 12 (Greek: Λ) of his Metaphysics, Aristotle describes the unmoved mover as being perfectly beautiful, indivisible, and contemplating only the perfect contemplation: self-contemplation. He equates this concept also with the active intellect. This Aristotelian concept had its roots in cosmological speculations of the earliest Greek pre-Socratic philosophers and became highly influential and widely drawn upon in medieval philosophy and theology. St. Thomas Aquinas, for example, elaborated on the unmoved mover in the Quinque viae. ' back

URL - Wikipedia, URL - Wikipedia, the free encyclopedia, ' A Uniform Resource Locator (URL), colloquially termed a web address, is a reference to a web resource that specifies its location on a computer network and a mechanism for retrieving it. A URL is a specific type of Uniform Resource Identifier (URI) although many people use the two terms interchangeably. URLs occur most commonly to reference web pages (http), but are also used for file transfer (ftp), email (mailto), database access (JDBC), and many other applications.' back

USSG - Indiana University, ISO/OSI Network Model, 'The standard model for networking protocols and distributed applications is the International Standard Organization's Open System Interconnect (ISO/OSI) model. It defines seven network layers: Layer 1 - Physical ... Layer 2 - Data Link ... Layer 3 - Network ... Layer 4 - Transport ... Layer 5 - Session ... Layer 6 - Presentation ... Layer 7 - Application.' back

W. F. McGrew et al, Atomic clock performance enabling geodesy below the centimetre level, ' The passage of time is tracked by counting oscillations of a frequency reference, such as Earth’s revolutions or swings of a pendulum. By referencing atomic transitions, frequency (and thus time) can be measured more precisely than any other physical quantity, with the current generation of optical atomic clocks reporting fractional performance below the 10^-17 level. However, the theory of relativity prescribes that the passage of time is not absolute, but is affected by an observer’s reference frame. Consequently, clock measurements exhibit sensitivity to relative velocity, acceleration and gravity potential. Here we demonstrate local optical clock measurements that surpass the current ability to account for the gravitational distortion of space-time across the surface of Earth. In two independent ytterbium optical lattice clocks, we demonstrate unprecedented values of three fundamental benchmarks of clock performance. In units of the clock frequency, we report systematic uncertainty of 1.4 × 10^-18, measurement instability of 3.2 × 10^-19 and reproducibility characterized by ten blinded frequency comparisons, yielding a frequency difference of [−7 ± (5)stat ± (8)sys] × 10^-19, where ‘stat’ and ‘sys’ indicate statistical and systematic uncertainty, respectively. Although sensitivity to differences in gravity potential could degrade the performance of the clocks as terrestrial standards of time, this same sensitivity can be used as a very sensitive probe of geopotential. Near the surface of Earth, clock comparisons at the 1 × 10^-18 level provide a resolution of one centimetre along the direction of gravity, so the performance of these clocks should enable geodesy beyond the state-of-the-art level. These optical clocks could further be used to explore geophysical phenomena, detect gravitational waves, test general relativity and search for dark matter.' back
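
The centimetre claim follows from the weak-field gravitational redshift Δf/f ≈ gΔh/c²: one centimetre of height near Earth's surface shifts a clock fractionally by about 1 × 10^-18, just at the reported uncertainty level. The arithmetic (my check, not from the paper):

    g = 9.81             # surface gravity, m/s^2
    c = 299_792_458.0    # speed of light, m/s
    dh = 0.01            # one centimetre, in metres

    print(f"fractional frequency shift: {g * dh / c**2:.2e}")   # ~1.09e-18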

Werner Heisenberg, Quantum-theoretical re-interpretation of kinematic and mechanical relations, 'The present paper seeks to establish a basis for theoretical quantum mechanics founded exclusively upon relationships between quantities which in principle are observable.' back

Wigner's theorem - Wikipedia, Wigner's theorem - Wikipedia, the free encyclopedia, 'Wigner's theorem, proved by Eugene Wigner in 1931, is a cornerstone of the mathematical formulation of quantum mechanics. The theorem specifies how physical symmetries such as rotations, translations, and CPT are represented on the Hilbert space of states. According to the theorem, any symmetry transformation of ray space is represented by a linear and unitary or antilinear and antiunitary transformation of Hilbert space. The representation of a symmetry group on Hilbert space is either an ordinary representation or a projective representation.' back

Wojciech Hubert Zurek (2008), Quantum origin of quantum jumps: breaking of unitary symmetry induced by information transfer and the transition from quantum to classical, 'Submitted on 17 Mar 2007 (v1), last revised 18 Mar 2008 (this version, v3). Measurements transfer information about a system to the apparatus, and then further on – to observers and (often inadvertently) to the environment. I show that even imperfect copying essential in such situations restricts possible unperturbed outcomes to an orthogonal subset of all possible states of the system, thus breaking the unitary symmetry of its Hilbert space implied by the quantum superposition principle. Preferred outcome states emerge as a result. They provide a framework for the “wavepacket collapse”, designating terminal points of quantum jumps, and defining the measured observable by specifying its eigenstates.' back

Zeno's paradoxes - Wikipedia, Zeno's paradoxes - Wikipedia, the free encyclopedia, 'Zeno's paradoxes are a set of problems generally thought to have been devised by Zeno of Elea to support Parmenides's doctrine that "all is one" and that, contrary to the evidence of our senses, the belief in plurality and change is mistaken, and in particular that motion is nothing but an illusion.' back

Zero-energy universe - Wikipedia, Zero-energy universe - Wikipedia, the free encyclopedia, 'The zero-energy universe hypothesis proposes that the total amount of energy in the universe is exactly zero: its amount of positive energy in the form of matter is exactly cancelled out by its negative energy in the form of gravity. . . . The zero-energy universe theory originated in 1973, when Edward Tryon proposed in the journal Nature that the universe emerged from a large-scale quantum fluctuation of vacuum energy, resulting in its positive mass-energy being exactly balanced by its negative gravitational potential energy.' back

Zero-point energy - Wikipedia, Zero-point energy - Wikipedia, the free encyclopedia, ' Zero-point energy (ZPE) is the lowest possible energy that a quantum mechanical system may have. Unlike in classical mechanics, quantum systems constantly fluctuate in their lowest energy state as described by the Heisenberg uncertainty principle. As well as atoms and molecules, the empty space of the vacuum has these properties. According to quantum field theory, the universe can be thought of not as isolated particles but continuous fluctuating fields: matter fields, whose quanta are fermions (i.e. leptons and quarks), and force fields, whose quanta are bosons (e.g. photons and gluons). All these fields have zero-point energy. These fluctuating zero-point fields lead to a kind of reintroduction of an aether in physics, since some systems can detect the existence of this energy. However this aether cannot be thought of as a physical medium if it is to be Lorentz invariant such that there is no contradiction with Einstein's theory of special relativity.' back

From this point of view, some think of the initial singularity as a bubble of enormous energy that has materialized as a universe. My aim is to connect this idea with the model of God developed by Aquinas. The two stories meet in the initial singularity: both God and the singularity are eternal, structureless and the source of the universe.

In §7 I tried to demonstrate the versatility of this model by imagining the mathematical community as a network. Now I wish to apply the same approach to a comparison of the human intelligence through which we create culture with the divine intelligence which creates the universe. The idea is to compare the vast information processing network built into us, which runs from our senses to our muscles via our neural networks, with the networks of communication that have brought the universe from its beginning to the present. All living things begin as something akin to a fertilized egg. Single celled creatures are their own fertilized eggs, reproducing by division. More complex sexual beings require the cooperation of two or more individuals to start a new life.

In practical communication networks a message originating with user A is passed down through the software layers in A’s computer to the physical layer, which carries it to B’s machine. It is then passed up through B’s software layers until it reaches a form that B can read. By analogy, communication between one system in the Universe and another goes through the space-time interface into the quantum world, which comprises only time and energy; there it is processed before returning to the interface. This system is exactly analogous to what happens when you use your computer: you put information in, it goes into the processor to be processed, and the answer comes out on a screen.

The difference is that the working parts of the computer are also in the spacetime world, and it is through them that your work is passed into the quantum world for processing and then comes back up the chain to your eyes. The processing inside you goes through a similar chain of connection: your eyes, your body and your mind are systems built on the quantum substrate of the world.
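To make the layered picture concrete, here is a minimal sketch (my own illustration, not part of this essay's model) of the encapsulation described above: a message descends through named software layers, each wrapping it in a header, crosses a notional physical layer, and is unwrapped in reverse order as it ascends the receiving stack.

```python
# A minimal sketch (illustration only) of layered message passing:
# the sender wraps the message in one header per layer on the way down
# to the physical layer; the receiver strips them in reverse on the way up.

LAYERS = ["application", "transport", "network", "data link"]

def send_down(message: str) -> str:
    """Descend the sender's stack, each layer adding its header."""
    for layer in LAYERS:
        message = f"<{layer}>{message}"
    return message  # outermost header belongs to the lowest layer

def pass_up(frame: str) -> str:
    """Ascend the receiver's stack, stripping headers in reverse order."""
    for layer in reversed(LAYERS):
        frame = frame.removeprefix(f"<{layer}>")
    return frame

wire = send_down("hello from A")  # crosses the physical layer
assert pass_up(wire) == "hello from A"
```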

We might construct a further analogy between the functioning of our bodies and the quantum processes that connect our genes to the molecular machinery in our cells which executes their instructions to carry out all the business of life. As we know, the biological "discovery" of genetics was the key to the explosion of life through evolution by natural selection. Like all the recorded literature that forms the backbone of our society, genes keep a living record of all the processes of life throughout the world. So we might ask: how did the world evolve before the origin of genes? How can this happen in the divine foundation of the world, where there is no space and no memory? The answer, of course, is that there is no memory: everything is done from scratch in real time, guided only by the demands of consistency. This is how God really works, not eternal but intensely ephemeral; all its fixed points are parts of its dynamics.
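As a loose computational analogy (my own, offered only to sharpen the contrast), genetic memory resembles a cache of past results, while the memoryless foundation recomputes everything on demand; the outcomes agree, only the bookkeeping differs.

```python
# A loose analogy (illustration only): the same recursive function computed
# entirely from scratch on every call (no memory) and with a record of past
# results (memory). The answers are identical; only the bookkeeping differs.

from functools import lru_cache

def from_scratch(n: int) -> int:
    """No record is kept; everything is recomputed in real time."""
    return n if n < 2 else from_scratch(n - 1) + from_scratch(n - 2)

@lru_cache(maxsize=None)
def with_memory(n: int) -> int:
    """Past results are recorded and reused, like a genetic archive."""
    return n if n < 2 else with_memory(n - 1) + with_memory(n - 2)

assert from_scratch(20) == with_memory(20) == 6765
```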

Does this view of the world get rid of all the complexities of quantum field theory and implement the heuristic of simplicity (§3, principle 10)? I hope so. At the fundamental level, the universe is continually creating itself, a fluid foundation upon which we all float. Here we may see why, in our accelerator experiments, we can break everything down to pure energy and yet it all reforms and comes out as new particles. So the real physics comes down to describing the rapid, almost instantaneous evolution that occurs here in the depths of God.

The network model sees lower layers as providing the facilities that enable the function of the layers above them. In the internet, for instance, the transport layer ensures that data arrives in the correct order, error free, and without duplicates or losses. It is in the interests of higher layers to maintain the integrity of lower layers in order to ensure their own survival. A principal social role of government is to maintain the infrastructure upon which peaceful social life depends. Internet protocol suite - Wikipedia
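A toy sketch (the standard textbook idea, not internet source code) of how such guarantees can be delivered: sequence numbers let the receiver restore order and discard duplicates arriving from an unreliable channel.

```python
# Illustration only: sequence numbers give an out-of-order, duplicating
# channel the appearance of an ordered, duplicate-free stream.

def reassemble(packets: list[tuple[int, str]]) -> str:
    """packets are (sequence_number, payload) pairs, possibly out of
    order and duplicated; return the payloads in sequence order."""
    seen: dict[int, str] = {}
    for seq, payload in packets:
        seen.setdefault(seq, payload)  # later duplicates are ignored
    return "".join(seen[seq] for seq in sorted(seen))

# Packets 0 and 2 swapped in transit, packet 1 duplicated:
assert reassemble([(2, "c"), (0, "a"), (1, "b"), (1, "b")]) == "abc"
```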

In the cosmological context we may see the role of the Minkowski layer as the curation of the Hilbert layer to maintain the integrity of the universe. This is achieved by the process of selection which has been interpreted as the collapse of the wave function (§10). The role of spacetime is analogous to that of the operating system in computers and networks, which handles input and output, memory and communication. I guess that every time the universe successfully "measures" itself by an interaction between two quantum states, a particle and a corresponding pixel of spacetime emerge. Quantum mechanics explains spacetime by inventing it all the time: the just-in-time creation of all the parallel processes of the world is produced by a serial stream of action at the heart of God.
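The "measurement" invoked here can be given its conventional textbook form (a sketch of standard collapse, not this essay's own formalism): a normalized state vector is projected onto one basis state, chosen with Born-rule probability.

```python
# Illustration of textbook wave function collapse (not this essay's model):
# a measurement selects basis state n with probability |c_n|^2.

import numpy as np

rng = np.random.default_rng(0)

def measure(state: np.ndarray) -> int:
    """Return the index of the basis state the superposition collapses to."""
    probs = np.abs(state) ** 2
    return int(rng.choice(len(state), p=probs / probs.sum()))

psi = np.array([1, 1j]) / np.sqrt(2)  # equal superposition of two states
outcome = measure(psi)                # 0 or 1, each with probability 1/2
```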

In the Hilbert layer the differentiation of states, that is their orthogonality, is established by their position on the scale of energy or frequency, which we might imagine as running from 0 (the eternal lifetime of the initial singularity) up to the least upper bound of energy. The Hilbert symmetry of quantum mechanics is its indifference to position on this energy scale.
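In conventional notation (standard quantum mechanics, added here as a gloss on this energy scale), the Planck–Einstein relation ties energy to frequency, distinct stationary states are orthogonal, and a state at zero frequency never changes phase, which is the eternal case:

```latex
E = h f, \qquad
\langle m \mid n \rangle = \delta_{mn}, \qquad
\psi_n(t) = e^{-2\pi i f_n t}\,\psi_n(0).
```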

Since particles travelling at the speed of light in Minkowski space follow null geodesics and are in effect outside space and time, they also provide a reservoir of static, or potential, energy. If, as we may guess, this reservoir is equivalent to the negative of the kinetic energy of massive particles, we have a route to understanding the total energy of the universe as zero, that is, equivalent to the energy of the static, eternal, inactive initial singularity within which we exist. Zero-energy universe - Wikipedia (§3.14)
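Tryon's heuristic (the standard form of the zero-energy argument cited here, not a derivation within this essay's model) balances positive mass-energy against negative gravitational potential energy:

```latex
E_{\text{total}} \;=\; \sum_i m_i c^2 \;-\; \sum_{i<j} \frac{G\, m_i m_j}{r_{ij}} \;\approx\; 0.
```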

If we assume that Hilbert space is primordial and use the principle of simplicity to argue that in the beginning there are only two opposite phases of recursive function at any energy or frequency, we can account for the existence of bosons and fermions without any recourse to the velocity of light or spacelike separation. We can then argue for frequency-based network communication via bosons and wired three-dimensional communication via fermions, giving a ground for four-dimensional space when we use the Minkowski trick to maintain contact once space comes into existence. This is a foundation for quantum gravitation, with the additional constraint that closed Lie groups must have curved geodesics so that inertial paths, like the orbit of the moon, can be closed, as fixed point theory requires.
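The "two opposite phases" correspond, in conventional terms, to the two possible exchange symmetries of a two-particle state (standard quantum statistics, offered as a gloss rather than as this essay's derivation):

```latex
\psi(x_2, x_1) = e^{i\theta}\,\psi(x_1, x_2), \qquad
e^{i\theta} = +1 \ \text{(bosons)}, \qquad
e^{i\theta} = -1 \ \text{(fermions)}.
```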

www.naturaltheology.net is maintained by The Theology Company Proprietary Limited ACN 097 887 075 ABN 74 097 887 075 Copyright 2000-2022 © Jeffrey Nicholls