Notes
Sunday 26 January 2020 - Saturday 1 February 2020
[Notebook: DB 84 Pam's Book]
[page 135]
Sunday 26 January 2020
After reading Veltman's book a little light is shining on the role
[page 136]
of particles in the structure of the universe. As Veltman points out, each major problem in the development of quantum field theory has been solved by the postulation and discovery of a new particle whose role in the system is a solution to the prevailing problem. This seems to me to reinforce the view that the universe is an information processing network in which particles play the role of representative vehicles of software components of the overall system, making it, as it must be, a closed system which can deal with all eventualities. This view may serve as a framework for an overall synthesis of the functioning of the universe, one that explains the role of all of its subsystems from the fundamental particles to the galaxies and beyond, and forms the physical framework for a theology which embraces the spiritual nature of the universe in a manner analogous to the Aboriginal conception of the Land [or Country] as an explanation of human life and spirituality. Martinus Veltman: Facts and Mysteries in Elementary Particle Physics
The intellectual effort that dominated my early 20s when I was in the Dominican order was Thomas Aquinas's treatment of the Trinity in the first part of the Summa, qq. 27-43. Now, some 50 years later, I am struggling to place the physicists' Standard Model in a theological understanding of the divine universe. I came to see the Trinity as a simple network of three elements. Now I see the universe as a rather complex network of a transfinite number of elements held together by a formal transfinite computer network that gives a network address to every quantum of action, conceived as the execution of a computation in the life of the universe. Prolegomenon to scientific theology: 4.3.5 The Cantor Theorem. Very exciting for me, but will anyone ever see my message?
Maybe we can think of space-time as the operating system of
[page 137]
the universe, providing the memory and communication for all the other systems (particles) that go to make up the whole system. Have I ever said this before? Cannot remember, but can search my notes.
Over the last 400 or so years the physics community has ferreted out all sorts of details about the functioning of the universe, work begun by the numerical astronomers and Galileo. All these bits and pieces exist as independent snippets of the overall jigsaw and in many cases their boundaries are vague because they have not been put together. I sort of hope that my universal network picture of the universe will put all these snippets together and complete them by establishing their relationship to one another and perhaps help to remove all the ad hoceries like renormalization which have been found necessary to get the isolated bits to work. Veltman points the way to this idea by showing how different Feynman diagrams, when superposed, correct the difficulties in one another. There is an old saying that one should not show an amateur a job half done, and it may be that the universe applies this adage to virtual particles, ie incomplete processes.
Monday 27 January 2020
Maybe the source of infinities in physics is processes that do not halt because they are incomplete and the identification of processes that complete (halt) each other removes the infinities, as Veltman explains with the notion of diagrams complementing and controlling one another. A Feynman diagram is a map of a network process. Feynman diagram - Wikipedia
So a dynamic cutoff is equivalent to a halting computation.
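A toy illustration of this equivalence (my own gloss, not Veltman's): an unbounded process such as the harmonic sum never halts and grows without limit, while imposing a cutoff makes it halt with a finite value.

    # Toy model: an unbounded summation never halts and its value diverges;
    # a cutoff turns it into a halting computation with a finite result.

    def partial_sums(terms):
        """Yield the running sum of an endless stream of terms."""
        total = 0.0
        for t in terms:
            total += t
            yield total

    def harmonic_terms():
        n = 1
        while True:
            yield 1.0 / n      # 1 + 1/2 + 1/3 + ... diverges if never cut off
            n += 1

    def with_cutoff(process, n_max):
        """A 'dynamic cutoff': stop the process after n_max steps so that it halts."""
        value = None
        for i, value in enumerate(process):
            if i + 1 >= n_max:
                break
        return value

    print(with_cutoff(partial_sums(harmonic_terms()), 1000))   # about 7.49, finite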
[page 138]
Tuesday 28 January 2020
Coming to grips with a foundational task, matching the standard model of theology produced by Thomas Aquinas with the standard model of physics proposed by the high energy people starting in the 50s. Both models appear to me to be defective for a similar reason. Aquinas, following Aristotle, sees spirituality [immateriality] as the foundation of knowledge so that god's omniscience follows directly from its spirituality. A feature of spirituality is its absolute simplicity and here is the problem. The modern theory of information requires marks to encode information and an absolutely simple spiritual god has no marks and so can carry no information and so cannot possibly be omniscient.
The physicists divide the world into two segments which they call real and virtual. All the computations of quantum mechanics are carried out in a virtual world of fields. These calculations provide us with an approach to understanding the nature and probability of the particles and interactions that we observe in the real world. For many physicists [and philosophers like Auyang] the real reality is the unobservable world of fields which, like the spiritual world of Aquinas, has no intrinsic marks and is characterized by continuous processes modelled by complex periodic functions. One of the problems with the mathematical world of fields is that it tends to produce infinite results. These results arise because the virtual states that underlie the computations tend to have infinite energies and momenta [resulting from the excursions from the mass shell understood to be permitted by the uncertainty principle] that when integrated tend to yield infinite results that are clearly unrealistic and so have to be modified by renormalization
to eliminate the infinities and get plausible results. In some cases these calculations yield results which agree very precisely with observation. In others they tend to be absurd. Calculations of the cosmological constant are out by up to one hundred orders of magnitude, and it has taken months of supercomputer time to calculate the mass of a proton, a process that seems to be carried out in real time by every proton in the universe. Sunny Auyang: How is Quantum Field Theory Possible?, On shell and off shell - Wikipedia, Frank Wilczek: The Lightness of Being: Mass, Ether, and the Unification of Forces pp 109, 122 sqq.
It may be possible to correct these problems in the standard models by combining them. My approach to modelling the real universe [with a view to showing that it is infinite enough to be called divine] is via a structure I call the transfinite network. The network is constructed by mapping the countable set of Turing machines onto the countable set of natural numbers. Cantor's construction of the transfinite numbers beginning with the set of natural numbers by creating power sets, which are various combinations of the natural numbers, is then mirrored by creating a network by coupling combinations of computing processes. By identifying a halted communication with a quantum of action (which may be a combination of Planck sized quanta of action) this structure provides a real address to every quantum of action in the universe [regardless of its size - eg my life is a quantum of action comprising some 10^60 Planck quanta]. Cantor's theorem - Wikipedia
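A minimal sketch of the counting that underlies this construction (my own illustration, not a proof): index a countable family of machines by the natural numbers, then apply Cantor's diagonal argument to see that the subsets of those indices always outrun the indices themselves.

    # Step 1: a countable set of "machines" indexed by the natural numbers.
    # Step 2: Cantor's diagonal argument: no map from N to subsets of N is onto.

    from itertools import islice

    def machines():
        """Stand-ins for a countable set of Turing machines, one per natural number."""
        n = 0
        while True:
            yield (n, "machine_" + str(n))   # hypothetical labels for illustration
            n += 1

    print(list(islice(machines(), 5)))       # the first few indexed machines

    def diagonal(subset_of):
        """Given any map n -> subset of N, build a subset that no n is mapped to."""
        return lambda n: n not in subset_of(n)

    # Example map: n -> {0, 1, ..., n}
    subset_of = lambda n: set(range(n + 1))
    d = diagonal(subset_of)
    print([n for n in range(10) if d(n)])    # the empty set here, and no n maps to it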
Coupling this model to the virtual model proposed by quantum field theory and the standard model is based on the notion that the real and the virtual worlds have the same entropy or complexity. This automatically imposes a cutoff on virtual computations which is equivalent to reducing the real numbers to the rational numbers, which is in effect to digitize the virtual world. Despite this approach the invisibility theorem nevertheless shows that the virtual world remains invisible to us because a digital process cannot both execute an algorithm and simultaneously communicate every step it is taking [from the virtual world of computation into the real world of monitor or printer]. Development/Model/Invisibility
[page 140]
Computation halts when a quantum of action is complete or vice versa, ie the task is done and the invisible virtual process becomes real and visible [just as it does on this computer when this machine completes the computation required to transform a key stroke into a glyph on the screen, and idles waiting for the next keystroke].
Completion of computation is equivalent to probability 1, ie it happens. While a die is rolling the probabilities of the individual faces appearing when it comes to rest vary; once it does become stationary, the probability of the uppermost face becomes 1, ie things only happen when their probability becomes 1.
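A toy way of putting this (my own illustration): while the die is still rolling each face carries probability 1/6; the moment the process halts, the observed face has probability 1 and the rest have probability 0.

    import random

    def roll_die(steps=20):
        """Simulate a rolling die: probabilities collapse to certainty when it stops."""
        probabilities = {face: 1 / 6 for face in range(1, 7)}   # while still rolling
        face = None
        for _ in range(steps):
            face = random.randint(1, 6)      # the die tumbles from state to state
        # the die comes to rest: the computation halts and the outcome is certain
        probabilities = {f: (1.0 if f == face else 0.0) for f in range(1, 7)}
        return face, probabilities

    print(roll_die())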
The fact that it takes months of supercomputer time to calculate the mass of a proton means that the physicists have yet to discover the real algorithms that run the world since every proton does this computation in real time.
The problem would seem to be all the extra work done following fictitious virtual processes to very high cutoffs. In my thesis I described the real side of the universe with a network model capable of giving a network address to every quantum of action in the universe. This model is imagined to be implemented by a network of Turing machines. The feeling is that the virtual side of the universe follows a similar pattern, so that the uncertainty principle is to be understood in integral rather than continuous arithmetic and the virtual possibilities resulting from the equation ΔxΔp ≈ h are countable rather than strictly infinite. This cuts the virtual space down to something computable, and it is in this space that the world computes its outcomes, just as the outcomes of my mind are computed
[page 142]
in a digital space of synapses and action potentials.
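A toy sketch of what 'integral rather than continuous arithmetic' might mean here (my own assumption, not established physics): measure x and p in integer multiples of base units whose product is h, so that ΔxΔp ≈ h admits only a countable list of possibilities.

    # Toy model: with x in units of x0 and p in units of p0, where x0 * p0 = h,
    # the relation DxDp ~ h becomes n_x * n_p <= N for integers n_x, n_p,
    # and the admissible virtual states form a finite, countable list.

    def discrete_uncertainty_states(N):
        """Integer pairs (n_x, n_p) with 1 <= n_x * n_p <= N."""
        return [(nx, npx)
                for nx in range(1, N + 1)
                for npx in range(1, N + 1)
                if nx * npx <= N]

    print(len(discrete_uncertainty_states(10)))   # a finite count, not a continuum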
I think that the principle of requisite variety requires that the entropy of the real and virtual sides of reality are equal and so we can express them both in terms of the transfinite network.
Virtual computations are the foundations of causality, the invisible mechanistic connections that Hume could not see or know about, the logical connections that hold formal mathematical systems together [and explain the effectiveness of computational modelling in physics and engineering]. Maybe we should realise that form / field is the invisible controller of the world, but perhaps quantum field theory is a big mistake and the real formalism that runs the world is the transfinite computer network. Wouldn't that be nice! David Hume: A Treatise of Human Nature
Particles are analogous to the clear and distinct ideas of the cosmological cognition, ideas that pop out at random and I try to capture their forms in these sentences: my writing is the image of the particle of my thought.
How do elements of the natural network connect? By sharing a frequency, as when a photon couples with an electron, a feature that is selected out by Feynman's path integral, seeking situations where two frequencies overlap with probability 1 to give a seamless bond measured by a quantum of action. Frequency is a measure of mass via the relationship E = hf = mc^2. So we imagine the masses of all fundamental particles are related by their interactions with one another in some not immediately obvious way.
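As a check on this relationship, the frequency corresponding to the electron's rest mass, using standard values of the constants:

    # E = h f = m c^2, so the frequency corresponding to a mass is f = m c^2 / h.

    h = 6.626e-34            # Planck's constant, J s
    c = 2.998e8              # speed of light, m/s
    m_electron = 9.109e-31   # electron rest mass, kg

    f = m_electron * c**2 / h
    print(f"{f:.3e} Hz")     # about 1.24e20 Hz, the electron's Compton frequency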
[page 142]
Wednesday 29 January 2020
The big question at this point is whether I should continue on the theological trajectory or turn to physics. I imagine that the physics might be more intellectually demanding but it will also have the theological benefit of bringing me closer to a synthesis of theology and physics since most of the information in this area comes from the physical side.
I feel that the transfinite computer network model in my thesis takes care of the real side of the universal structure and gives us a machine to logically connect events to one another, and so it is in a sense a virtual model of the whole universe. Before we can apply mathematics we need a model of what we are applying it to. So Boltzmann had a picture of a vast number of atoms in unceasing elastic collision with one another, related their kinetic energy to temperature and devised a mathematical scheme to overcome the continuous nature of spacetime and so compute a measure of the entropy of the system. Carlo Cercignani: Ludwig Boltzmann: The Man Who Trusted Atoms
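A small sketch in Boltzmann's spirit (a textbook illustration, not his actual calculation): distribute discrete quanta of energy among a handful of atoms, count the number of microstates W that realise a given distribution, and take S = k ln W.

    import math

    k = 1.381e-23   # Boltzmann's constant, J/K

    def entropy(occupation):
        """occupation[i] = number of atoms holding i quanta of energy."""
        n = sum(occupation)                 # total number of atoms
        W = math.factorial(n)
        for n_i in occupation:
            W //= math.factorial(n_i)       # multinomial count of microstates
        return W, k * math.log(W)

    # 10 atoms: 6 with no quanta, 3 with one quantum, 1 with two quanta
    print(entropy([6, 3, 1]))               # W = 840 microstates, S = k ln 840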
The transfinite computer network is in effect a means to implement the fundamental structure of evolution. We are in a universe of pure activity which can try everything, that is produce every possible combination of computational atoms (Turing machines) to create larger computational sequences which are then subjected to a selective process to find the ones that are self maintaining [at least partly closed] and self propagating.
The challenge is to successfully apply this model at the foundation of the universe beginning with the initial singularity and, in the first instance, developing a computational picture which gives us the fundamental particles with their properties of charge, mass, spin, bonding etc as we see in our experimental measurements. What are the possibilities of pulling this off? Obviously a number of serious conceptual steps are required that may demand the reconception of many physical ideas and parameters.
My starting point, for a long time, has been the quantization of action. Both the traditional God and the initial singularity may be seen as quanta of action. They are quantized because they are single and complete, fulfilling the requirements of fixed point theory, containing their own boundary since there is nothing outside them, continuous because they have no interior structure, and convex since they are not complex enough to have holes.
The first step, which has [also] been with me for a long time, is the creation of energy using a logical definition of action. In the most general sense, an action takes an initial situation, call it p, and changes it into something else, not-p. At the very beginning we can imagine that [we are in a space of] just 2 states, p and not-p, and if we take not-p as our initial state and change it, we get back to p: not-not-p = p. This structure is very similar to a wave, where we can say that not-up is down and not-down is up. The mechanism of quantum mechanics has a lot to do with waves, that is periodic functions. Non-relativistic quantum mechanics works in a pure time [/ energy] domain, so the next step forward in complexity is to imagine the creation of many different frequencies [since no mechanism actually exists yet to control frequency or energy], that
[page 144]
is a spectrum of energies ranging from 0, equivalent to an eternal unchanging quantum of action, to, let us say, a countable infinity corresponding to very high energy [an approach which echoes the idea that virtual energies may be infinite, constrained only by the uncertainty principle which says that an infinite energy can exist for an infinitesimal time only, which we might identify as just one cycle of the associated wave].
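A minimal sketch of this two-state dynamics (my own toy, mapping p and not-p to +1 and -1): applying 'not' twice returns the initial state, and repeated negation is the simplest possible periodic process.

    def negate(state):
        """Map p to not-p, represented here as +1 to -1 and back."""
        return -state

    p = +1                    # take 'up' as the initial state
    history = [p]
    for _ in range(6):
        p = negate(p)
        history.append(p)
    print(history)            # [1, -1, 1, -1, 1, -1, 1]: the simplest square wave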
The formalism of quantum mechanics represents these frequencies as orthogonal vectors in a Hilbert space of countable dimensions. So now we have a wider spectrum of action: p and not-p are no longer a binary option, since p may be frequency f1 and not-p may be frequency f2, and we can imagine all these frequencies mapped in the Hilbert space, f1 to fℵ0, as distinct entities. In other words we have imagined the creation of a space of frequencies modelled by Hilbert space, and we imagine that all these frequencies can exist simultaneously. This leads us to a definition of space: it is any structure that enables the simultaneous existence of p and not-p.
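A finite-dimensional sketch of this formalism, using numpy (standard quantum mechanics truncated to four dimensions for illustration): each frequency is an orthonormal basis vector, and a superposition assigns each one a probability by the Born rule.

    import numpy as np

    dim = 4
    basis = np.eye(dim)                 # |f1>, |f2>, |f3>, |f4> as orthonormal vectors

    # orthonormality: <fi|fj> = delta_ij
    assert np.allclose(basis @ basis.T, np.eye(dim))

    # a superposition of the four frequency states, normalised to unit length
    psi = np.array([1.0, 2.0, 0.0, 1.0])
    psi = psi / np.linalg.norm(psi)

    probabilities = np.abs(basis @ psi) ** 2     # |<fi|psi>|^2 for each frequency
    print(probabilities, probabilities.sum())    # the probabilities sum to 1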
Here we have a suggestion of the beginning of quantum field theory and the bifurcation of states into boson states, which may be occupied by many particles of the same frequency / energy, and fermion states that can only accept one particle. An added feature of the structure here is that bosons have "spin" 1 and fermions have "spin" ½. Following these initial clues, the task is to follow the emergence of more and more particles with specific properties that distinguish them from one another, so generating a much greater spectrum of not-ps corresponding to every p, that is a large number of different channels through which a specific particle can 'decay' into something else by executing a quantum of action. One reason for doing physics would be to get to the point of writing a
[page 145]
thesis taking the sequences of emergences to the total universe as we now have it, the result being a unification of physics and theology as we now have them.
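A toy statement of the occupancy rule mentioned above (standard Bose/Fermi statistics in caricature): a boson mode accepts any number of identical particles, a fermion mode at most one.

    def occupy(mode_type, n):
        """Can n identical particles share one mode of the given type?"""
        if n < 0:
            return False
        if mode_type == "boson":
            return True                # any occupation number is allowed
        if mode_type == "fermion":
            return n <= 1              # Pauli exclusion: at most one per state
        raise ValueError(mode_type)

    print(occupy("boson", 1000))       # True: e.g. many photons in one laser mode
    print(occupy("fermion", 2))        # False: two electrons cannot share a state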
On the other hand, this would be a long way round. I have been reading physics all my life, so I know a fair bit and can teach myself the new bits I need to continue this story in an ad hoc way, so stick to theology where the real payoff is. I already have enough in hand to argue that the universe is divine and a beautiful consistent model of physics would be the icing on the cake. Veltman's book, detailed in last week's notes, gave me the idea that it was possible insofar as he treats the particles of the Standard Model as a complete set of software to construct a universe, and all we have to do is continue the phylogeny sketched above to arrive at the standard model, going through action, energy, spacetime, spin etc etc, finishing up with the Higgs and something about gravitation. The transfinite computer network provides a framework or stage on which to give a logical account of this [creative] magic, fully implementing Wigner's insight into the role of mathematics in science. Mathematics is 'unreasonably effective' because it is the embodiment of the logic necessary to construct cognitive cosmology. Eugene Wigner: The Unreasonable Effectiveness of Mathematics in the Natural Sciences
Thursday 30 January 2020
Physics avoids action at a distance by postulating continuous fields mapped onto the spacetime domain to connect events manifested by the creation and annihilation of particles. Implicit in this idea is the notion of causality by continuity understood mathematically by the continuity of the real and complex numbers and the corresponding geometric number lines. The foundation of my identification of
[page 146]
god and the universe is the replacement of this geometric notion of continuity by what I call logical continuity, which I take to be exemplified by a deterministic computing machine that transforms its input into its output by a series of deterministic logical steps. This idea forms the foundation of what I call cognitive cosmology.
One of the principal indications we have for the truth of this hypothesis is to be found in our own philosophical discussions where we attempt to prove or disprove the consistency of various sets of ideas by the application of logic. One feature of the logic proposed by Aristotle coincides with his notion of continuity.
The modern mathematical discussion of continuity is founded on the notion of continuity by crowding points, which are by definition isolated entities, so close together that there is in effect no gap between them. So we propose that there is a real number between any two real numbers as there is a point between any two points in a continuum. This approach seems inherently self-contradictory insofar as we are trying to imagine making a continuum out of isolated elements.
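A small illustration of the contrast (my addition): between any two distinct numbers there is always another, for example their midpoint, whereas the integers have genuine gaps.

    def between(a, b):
        """The midpoint lies strictly between a and b whenever a < b."""
        return (a + b) / 2

    print(between(1, 2))                            # 1.5
    print([k for k in range(0, 3) if 1 < k < 2])    # []: no integer between 1 and 2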
Aristotle's ideas of both physical and logical continuity avoid this pitfall. In his syllogistic logic he joins premises by having them overlap through a middle term. In his discussion of physical continuity he proposes that things are continuous if they have endpoints in common.
In the computer world
[page 147]
systems communicate with one another by sharing a memory location, one machine writing into a location that can be read by another. This is how I am communicating now with you, writing this text into a location that you can read. If we were speaking face to face we would be writing our ideas onto the air that we share and can read.
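A sketch of this shared location in a computing idiom (a standard producer and consumer pattern, not a claim about physics): one thread writes into a queue that the other reads, and the queue is the endpoint the two processes have in common.

    import threading, queue

    shared = queue.Queue()          # the memory location the two machines share

    def writer():
        for word in ["logical", "continuity", "by", "shared", "memory"]:
            shared.put(word)        # write into the common location
        shared.put(None)            # signal that the message is complete

    def reader():
        while True:
            word = shared.get()     # read from the common location
            if word is None:
                break
            print("read:", word)

    t1 = threading.Thread(target=writer)
    t2 = threading.Thread(target=reader)
    t1.start(); t2.start()
    t1.join(); t2.join()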
This understanding of continuity is both logically consistent and much more powerful than geometric continuity [since, for instance, we attempt to use logical continuity to prove all our theorems about mathematical continuity] and I feel that it provides a foundation for a much broader logical understanding of how the world works and applies universally from physical to human communication and beyond.
As a foundational work, I might be able to write my next honours thesis on the subject of continuity laying a further foundation for showing that the universe is divine and giving an answer to Wigner's question about the role of mathematics, that is applied logic, in the understanding of the physical world.
Honours thesis II: Continuity: geometric vs logical or Church and State [exploiting the Land/Country/Cosmology analogy between Indigenous people and post-Christians].
In physics, as in sport, we might guess that everything comes down to timing, that is phase.
The next step in the emergence of the universe is the emergence of space-time from time/energy, that is the emergence of energy / momentum. From a dimensional point of view, time/energy and space/momentum are both contained in action, that is angular momentum. One question is whether we go from 1D [time] to 4D via 2D and 3D, or whether 2D and 3D are skipped. I prefer the emergence of 2D, 3D and then 4D. We can approach the question through particles, and begin with the old idea that the role of space is to enable the simultaneous existence of two or more particles. Waves and energy are in effect manifestations of
[page 148]
time division multiplexing insofar as the superposition of up and down is in effect nothing, so to exist one must follow the other in time. The origin of space enables the existence of two or more different orthogonal frequencies, f1 being not-f2 and so on.
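A quick check of the dimensional claim above (standard SI units, my addition): energy times time, momentum times length and angular momentum all reduce to the units of action, J s.

    # Units expressed as exponents of (kg, m, s); multiplying quantities adds exponents.

    energy   = (1, 2, -2)     # J  = kg m^2 s^-2
    time     = (0, 0, 1)      # s
    momentum = (1, 1, -1)     # kg m s^-1
    length   = (0, 1, 0)      # m
    ang_mom  = (1, 2, -1)     # kg m^2 s^-1
    action   = (1, 2, -1)     # J s = kg m^2 s^-1

    def times(a, b):
        return tuple(x + y for x, y in zip(a, b))

    print(times(energy, time) == action)       # True
    print(times(momentum, length) == action)   # True
    print(ang_mom == action)                   # True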
The mechanism of quantum mechanics works basically through the superposition of frequencies / energies which implies that they can in effect exist at one point in space. So the question arises, is this really so physically, even though we can easily model it mathematically [through differential equations with a spectrum of periodic solutions] and we understand that two frequencies can exist in a vibrating physical object and they can be extracted by frequency analysis or physiologically by our hearing. But, at the very simple physical level that we are now exploring, is such superposition possible, and if not how does it contribute to the creation of at least one dimension of space and momentum?
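A concrete illustration of that last point, using numpy (ordinary signal processing, nothing exotic): two frequencies superposed in a single signal are recovered separately by Fourier analysis.

    import numpy as np

    rate = 1000                            # samples per second
    t = np.arange(0, 1, 1 / rate)          # one second of samples
    signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), 1 / rate)

    peaks = freqs[spectrum > 0.25 * spectrum.max()]
    print(peaks)                           # roughly [50. 120.]: both components recovered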
Let us proceed on the assumption that two different frequencies occupy two discrete points in some space. We have here a hint of a fermion, since the angular momentum of one frequency has split into two with half the angular momentum each, or we might say spin ½. We notice here that photons with spin 1 have but 1 discrete spin each although they are bosons and so many with the same frequency can exist in the same state as we see with laser light. Another way of looking at this might be that one photon state has split into two electron states, maybe one electron and one positron.
Friday 31 January 2020
In general, theology, religion and politics are closely intermingled. The barrier between church and state established
[page 149]
in many jurisdictions has been made necessary by the perception among people who have experienced and embraced the democratic and scientific revolutions that Churches like the Roman Catholic Church are resolutely anti-science and anti-democratic and opposed to many dimensions of human rights, particularly the rights of women.
ISCAST - Wikipedia
Saturday 1 February 2020
Moved house. Tennis: the purpose of the game is to place the ball in a spacelike position from the opponent's point of view so that they cannot reach it. The same goes for many other ball games and, one may say, mutatis mutandis, it is the aim of all competition in both physical and cognitive space [although of course, violence in various forms may be used to overcome the spacelike constraint].