A thought experiment

Summary

We demolish the universe to see how it is made, and arrive at a rather paradoxical result. Like a good paradox, it leads us into exciting new vistas, and so we come to a first misty outline of a very interesting view of the universe.

* * * * *

 

You can't make an omelette without breaking eggs
Robespierre (and others)

* * * * *

The idea that some particles may have a binding energy greater than their rest mass has bothered me for quite a while. I have devised this thought experiment to show why. The speculation that follows from the experiment leads me to a comprehensive theory of the universe which looks quite exciting to my naive eye.

* * * * *

The idea that everything can be pulled apart, the parts pulled apart, and so on, is very old. People have long suspected that there must be a limit to this process. Ancient Greek natural philosophers coined the word atom [= uncuttable] for the ultimate unbreakable element.

Modern physicists have been hot on the trail of the atom for nearly a century. Right now they are spending billions of dollars on the Superconducting Super Collider. This enormous machine will be at least twenty kilometres across and will be able to smash particles together with sufficient energy to break them into smaller pieces than ever before. Perhaps these pieces will be atoms. Perhaps it will be back to the taxpayers for another hundred billion dollars to build the Meta-Conducting Ultra-Collider ...

This is a low-budget article: no billion-dollar machines, only speculation. We will be guided in our destructive efforts by nothing more than Heisenberg's uncertainty principle and one or two other selected facts from modern physics.

* * * * *

First we will pause for a moment and look at the magnificent structure we intend to dismantle. Einstein's theory of gravitation gave us the power to look at the universe as a whole. Since it was published in 1915 the general theory has weathered all storms and been greatly extended.

Using Einstein's theory and available astronomical data, we estimate the age of the universe to be about ten billion years and its size, in light years, to be about the same.

The velocity of light, c, is about 300 000 kilometres per second. A suitably bent light ray could circumnavigate the earth seven times per second. Satellites take about an hour and a half to do the same trip, and it would be a two-week drive, non-stop, if there were a freeway round the planet. Light is fast. A light year is the distance light travels in a year, roughly ten thousand billion kilometres. So our current guesstimate of the size of the universe is around 100 thousand billion billion kilometres. It is bigger than anything else!

How much matter is there in the universe? This has proved to be a difficult question but any estimate is better than none.

A good unit for measuring the mass of the universe is the mass of our own star, the sun. The sun is 300 000 times more massive than the earth, which weighs just 6000 billion billion tonnes. I'm sorry if the big numbers give you a headache, but they are the only way to get things in perspective. Our galaxy has the mass of a thousand billion suns, and there are many billions of galaxies in the universe. Demolishing it will be a big job.
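
For readers who like to check the arithmetic, here is a rough tally of the numbers in the last few paragraphs. This is a minimal Python sketch; every figure is the order-of-magnitude estimate used above, not a precise value.

# Rough check of the essay's back-of-envelope numbers (all approximate).
c_km_s = 300_000                        # speed of light, km/s
seconds_per_year = 365.25 * 24 * 3600   # about 3.2e7 s

light_year_km = c_km_s * seconds_per_year
print(f"one light year ~ {light_year_km:.0e} km")     # ~9e12: 'ten thousand billion'

universe_size_km = 1e10 * light_year_km # ~ten billion light years across
print(f"universe size  ~ {universe_size_km:.0e} km")  # ~1e23: '100 thousand billion billion'

earth_mass_t = 6e21                     # '6000 billion billion tonnes'
sun_mass_t = 3e5 * earth_mass_t         # sun ~ 300 000 Earth masses
galaxy_mass_t = 1e12 * sun_mass_t       # 'a thousand billion suns'
universe_mass_t = 1e10 * galaxy_mass_t  # 'many billions of galaxies': assume 1e10
print(f"universe mass  ~ {universe_mass_t:.0e} tonnes")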

Fortunately, we do not have to pull the whole thing apart. The result, as we shall see, will be the same wherever we start. For convenience, I will start with the earth.

We descend at dawn to a rainforest alongside a gorge. The first thing to penetrate our demolition-hardened hearts is the astounding complexity and beauty of it all. Even the drops of dew on the leaves are glistening with tiny rainbows, and there is a ceaseless interplay of light and shadow between millions of leaves, every one different. Since this is part of the feasibility study, we only take a small part of the rainforest, an ugly little lizard whose name could easily be Moloch horridus. He will not be missed. A quick check reveals that our lizard has a mass of one hundred grams.

* * * * *

The next task is to get our piece of Earth outside the universe to work on it. Off we set in a direction calculated to take us through the end of the universe and outside. There is a particularly striking constellation of stars on the way. The journey is long, and many, many stars flash by. Suddenly we see the same constellation again. We have travelled far in one direction and ended up where we started, like a circumnavigator of the earth. The universe is curved in all three dimensions. It is a large closed particle. No matter which way we head, we go around and come back to the start.

Never mind. Using a word or two of magic (covered in the budget for this experiment) we are outside the universe. We are surrounded by nothing. Outside the universe is not just empty space, it is not. Even to say surrounded by nothing does not make sense. There is no time, no space, not one of those attributes that we ascribe to our normal existence. Nothing.

At least nothing will get in the way. Does that make sense? Even language cannot work without something to work on.

Magic again. We have established a workshop and begin dismantling the first little part of the earth. Weigh it again. It is more massive: 100.0000001 grams. Why? To lift it up off the earth we had to supply energy to overcome gravitation. Remember that in the old physics, energy is the ability to do work, and doing work means moving a force through a distance. A lot of energy has gone into that lizard.

Every Einstein fan knows that mass and energy are equivalent. The energy necessary to get the lizard off the earth was its gravitational binding energy. The magical processes we used to get outside the universe did not record the binding energy between the lizard and the sun, the galaxy or the universe. In fact, the energy binding anything to the earth is equal to about one billionth of its mass. Not much, because gravity is a very weak force.
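
That 'one billionth' can be checked with standard constants. The sketch below compares the gravitational binding energy per unit mass at the Earth's surface, GM/R, with the rest energy per unit mass, c^2; it is an order-of-magnitude check only.

# Fraction of its rest mass it costs to lift a body right off the Earth.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_earth = 5.97e24    # mass of the Earth, kg
R_earth = 6.371e6    # radius of the Earth, m
c = 3.0e8            # speed of light, m/s

fraction = G * M_earth / (R_earth * c**2)   # (GM/R) / c^2
print(f"binding energy / rest mass ~ {fraction:.0e}")   # ~7e-10: about a billionth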

Having taken a last look at our captive, we put it to sleep and start dismantling. First dissect out all the anatomical structures, skin, muscles, digestive organs, nerves, eyes and so on. What an amazing amount of detail. A million sheets of blueprint could not contain it.

Now we take the organs apart. We find a great variety of cells, each a little living organism working with the others to make the animal. There is binding energy all the way, and our extraordinarily sensitive measuring equipment tells us that the mass of a set of parts is consistently greater than the mass of the assembly from which it came.

Now we are down to atoms, a very large number of them, but only a dozen or so different kinds. On the outside, the lizard looked quite simple, apart from its horrid spikes. As we cut into it, we were faced with incredible complexity. Now we are moving back to simplicity, out of the world of biology into the world of physics. That extraordinary living structure was made by the complex arrangement of myriads of atoms of just a few different elements.

Note that 'atom' here means the atom of the chemists, not the uncuttable atom of the Greek philosophers. Dalton jumped the gun with his nomenclature.

* * * * *

A carbon atom is first up for demolition. Carbon has a tiny nucleus comprising six protons and six neutrons surrounded by a relatively enormous cloud of electrons. Ten million carbon atoms bonded together like a string of pearls stretch about a millimetre. A few hundred million would cross your thumbnail.

The theory of relativity is our guide in the immensity of the universe. Now we turn to quantum mechanics to show us the way around the microworld that atoms inhabit. From quantum mechanics, we need just one ingredient for our experiment: Heisenberg's uncertainty principle.

When Max Planck was trying to work out how light interacted with matter, he kept coming across infinities in his calculations. Somewhere he was dividing by zero, and the results were as you would expect, garbage.

He found that one cannot make the strength of interaction as small as one likes. There is a physical limit to how delicately one particle can touch another. This limit is called the quantum of action. Planck found his calculations made sense if he assumed that any interaction involved an integral number of quanta of action. The quantum of action is very small, about 10^-34 joule-seconds in the metric system, and is named Planck's constant in honour of the man who first put us on the track of quantum mechanics.

Heisenberg's uncertainty principle says that there is a limit to how closely things can be defined in the universe, and that limit is the quantum of action. If we set out to measure the energy and time of an event, we find that the more closely we measure the energy, the less we can discover about the time. This is not just a measurement difficulty; it is a fundamental property of the universe. Quantities that limit the definition of one another via the uncertainty principle are called complementary variables. Another pair of complementary variables is position and momentum. They are the ones that interest us now.

We will assume that the size of any microscopic variable is roughly equal to the uncertainty predicted for that variable by Heisenberg's principle. Experience shows this to be a reasonable assumption. Now we come to look at an electron in our carbon atom. The electron is free to roam about in the atom subject to various rules that don't matter here. The point is that we know that the electron is somewhere in the atom but we don't know exactly where. So the uncertainty in the position of the electron is roughly the diameter of the atom.

Knowing this we can calculate the uncertainty in the electron's momentum, using Heisenberg's principle. From this and the mass of the electron, we can calculate its energy. This energy turns out to be the binding energy of the electron. The binding energy of a particle thus follows from how closely it is confined.

When all this calculation is done, it turns out that if the electron has a mass of 1 inside the atom, its mass outside is 1.00001. This is ten thousand times greater than the energy that bound the lizard to the earth. By the time we come to wrench the last of the six electrons from our much diminished carbon atom, the binding energy has risen to one thousandth of the mass of the electron.
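
Here is the same calculation as a sketch, assuming the electron's momentum uncertainty is roughly h-bar divided by one atomic diameter and treating its binding energy as the kinetic energy of a particle so confined. The atomic diameter is a round figure, so only the order of magnitude means anything.

# Order-of-magnitude check: binding energy of an electron confined to an atom.
hbar = 1.055e-34       # reduced Planck constant, J s
m_e = 9.11e-31         # electron mass, kg
c = 3.0e8              # speed of light, m/s
atom_diameter = 1e-10  # m, a round figure for a small atom

dp = hbar / atom_diameter       # momentum uncertainty from Heisenberg's principle
E_binding = dp**2 / (2 * m_e)   # kinetic energy of the confined electron
E_rest = m_e * c**2             # electron rest energy

print(f"binding / rest ~ {E_binding / E_rest:.0e}")   # ~7e-6: mass 1 inside, about 1.00001 outside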

We are left with the nucleus, six neutrons and six protons in a little lump one hundred thousand times smaller than the original atom. Sweating and straining, we pull it apart, to find that the binding energy is about 1% of the mass of the original nucleus.

The next step is to pull a proton apart. According to current theory, a proton is made of three quarks, and so far it has seemed impossible to separate them. Not to worry, we will do it anyway. Let the mass of the proton be 1. When we finally get the three quarks into their pigeon holes, their combined mass is 1000. The proportion of binding energy in a particle is creeping up. Where does it all end?

* * * * *

Let's stop when we come to a particle whose binding energy is equal to the mass of the universe. It seems a natural limit. Our efforts to liberate the ultimate particle would use the whole energy of the universe. Call this particle, which we assume to be the most fundamental of all, the archon [Greek 'arche' = first].

How many archons would we get if we demolished the whole universe? Let us say m. Since each has the mass of the universe, it seems that the mass of the parts from a demolished universe is m times greater than the universe we started with.

All is not well. The conservation of energy seems to have run up against the laws of arithmetic. Both are heavyweight foundations of modern thought. Greek has met Greek. Call this state of affairs Jo's paradox.

How do we escape without damage to arithmetic or the conservation of energy?

* * * * *

I want to suggest a new model for the universe that could help.

The model is a modern computer network. A typical network comprises a few hundred computers talking to one another over a net of communication links.

Such networks are organised into layers. At the bottom is the physical machinery, at the top is the human user. In between are layers of software that do all the housekeeping necessary to give each user access to any computer in the network.

Traditionally, lower layers are regarded as hard and higher layers as soft. Hardware and software are thus relative terms, and obey a simple relationship. A given layer of software can operate only if it is supported by all the harder layers beneath it. On the other hand, it can operate independently of all the softer layers above it.

Each layer therefore includes all the layers below it. Layers are numbered from hard to soft, starting with one. A single user command like 'get me Jo' causes the execution of thousands of instructions in different layers in different computers around the network. Computers are very fast, so the user does not notice the delay, nor does she know what is happening. In computer language, the network is transparent to the user. All she sees is what she asks to see.

The separate entities which communicate in a computer network are called hosts. Corresponding layers in two hosts are called peer processes. Actual communication between two hosts takes place in layer one. When a process in host A wishes to communicate with its peer in host B, the message is passed down through all the layers to layer 1 in A, transmitted over the hardware link to layer 1 in B, and then passed up to the peer process in B.
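
The down-across-and-up journey is easy to sketch in code. The three layer names below are hypothetical (real stacks have more layers and add structured headers rather than brackets), but the shape of the round trip is the same.

# Toy model of peer-to-peer communication through a layered stack.
LAYERS = ["application", "transport", "physical"]   # hypothetical stack, soft to hard

def send(message: str) -> str:
    """Host A: wrap the message in one header per layer, top to bottom."""
    for layer in LAYERS:
        message = f"{layer}[{message}]"
    return message          # this is what actually crosses the layer-1 link

def receive(frame: str) -> str:
    """Host B: strip the headers again, bottom to top."""
    for layer in reversed(LAYERS):
        frame = frame.removeprefix(f"{layer}[").removesuffix("]")
    return frame

wire = send("get me Jo")
print(wire)             # physical[transport[application[get me Jo]]]
print(receive(wire))    # get me Jo -- the peers never see the layers between them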

* * * * *

The hardest layer in any computer is the clock. A clock cycles between two states. In the abstract, the only property these states need have is that one is not the other. One cannot imagine a simpler active device than a clock.

Now let us introduce an hypothesis. Suppose that the universe is an enormous communication network and that the archon is the hardest layer, the clock. Its frequency is related to the mass of the universe by Planck's constant:

f = Mc^2 / h

where M is the mass of the universe. From the values of M, c and h we calculate f to be about 10^100 ticks per second.
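
The sum is easy to run with rough inputs. The mass below is the crude estimate from earlier in the essay and could be off by several powers of ten, so the answer should be read only as 'very roughly 10^100'.

# Hypothesised clock frequency of the archon: f = M c^2 / h.
h = 6.626e-34      # Planck's constant, J s
c = 3.0e8          # speed of light, m/s
M = 2e52           # kg -- the essay's rough mass estimate (about 2e49 tonnes)

f = M * c**2 / h
print(f"f ~ {f:.0e} ticks per second")   # ~3e102 with these inputs: 'about 10^100'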

Before we go any further, we must introduce a new unit. To measure something is simply to describe it in a special language. The special language of physics is its units. There are thousands of different units in use, but they all boil down to some combination of mass, length and time.

We are going to go further and reduce all measurements to one unit, the measure of information usually called the bit.

The bit is a very special measure. It is the distance between being something and not being something. It needs no standard, as the metre, kilogram or second do. Or, to put it another way, the standard is absolute: the gap between being and not being. The bit is thus a good candidate for a fundamental unit.

In computerland, each act of a processor is called an operation or op, and it is specified by an opcode. A clock can only do one thing, change to the opposite state. Its only opcode is 'tick'. Since the clock has two equiprobable states, the potential information processing power of one clock op is one bit. We call the one bit op a tick.

There is no way of measuring the frequency of an isolated archon, because there is nothing to compare it to. It is the clock of the universe, and everything takes its time from it. The value we assign to it is arbitrary. In a system of units where h = c = 1, we may as well call the mass of the universe 1 and the frequency of the archon 1 as well.

What have we done? In effect we have digitised the universe. Quantum mechanics is 'quantum' because the universe is digital. If this model is correct, we can apply the powerful mathematics of digital information processing to the universe and understand the process of nature in a new way. This explanation includes ourselves and everything we do, including our efforts to understand the processes of nature!

* * * * *

The analysis is over. Let us put the universe back together again and see how it works.

The clock is the hardest, fastest and simplest thing in the system (never mind the electronic gadgetry necessary to implement it). A computer is an implementation of symbolic logic. A clock is a cyclic NOT machine. To build a computer we also need to implement the logical AND. With these two logical functions we can do anything, given enough cycles.
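
As a quick illustration that these two functions suffice (together with the ability to copy a signal), here is OR built from NOT and AND by De Morgan's law, and XOR built from those.

# NOT and AND as primitives; other logical functions built from them.
def NOT(a: int) -> int: return 1 - a
def AND(a: int, b: int) -> int: return a & b

def OR(a, b):  return NOT(AND(NOT(a), NOT(b)))      # De Morgan's law
def XOR(a, b): return AND(OR(a, b), NOT(AND(a, b))) # 'or, but not both'

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", OR(a, b), XOR(a, b))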

Since this is still a thought experiment, we needn't worry for the moment how the universal computer is actually made. Let us go back to the proton. In physics, the proton is called a particle. In our model, it would be called a process.

Let us consider two protons as peer processes. They want to talk to one another, that is, interact in some way. We know that the archon is layer 1, but we do not know how many layers there are between the archon and the proton, so let's put the proton in layer n. We presume it shares this layer with all the other baryon processes. Proton A sends its message down to layer n-1 (which we believe to be the quark), which sends it to layer n-2 (preon? rishon?) and so on down through all the layers to 1. This is the hardware point of contact. The message then comes up through the layers to proton B. From the point of view of the protons, the quarks and all the other layers down to the archon are transparent.

Now we have equated the terms particle and process. In a computer network we might think of the whole operation as a single process made up of sub-processes which have sub-processes right down to the fundamental logical operations in the hardware. What is or is not a process is thus somewhat arbitrary. When a system is being designed it is customary to break it up into small processes so that the work can be given to different programmers. A proper set of programming and communication rules makes it certain that everything will work when it is finally linked together.

Applying this to the universe, we can think of it as one huge particle, or we can turn our attention to isolated particles (processes). Such a sub-process might be a proton, or a human being or a planet.

* * * * *

What is the solution to Jo's paradox? There's not space for it all, but here's the clue. We have equated mass [= energy] with processing rate. The relative mass attributed to any particle is simply the proportion of the total processing effort that that particle uses to perform its task in the overall network.

When one designs a computer, any process which is performed often is usually made into a subroutine or subprocess. For example, when doing a long multiplication, you often have to add and carry (ADDC) and multiply and carry (MULC). In the course of multiplying two six-digit numbers, you will have to do 36 MULCs and 25 (pairwise) ADDCs.
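
Those counts can be verified directly. The sketch below tallies the digit operations in a six-by-six long multiplication, with carries absorbed into the MULC and ADDC operations as in the text: the k partial products landing in one column need k - 1 pairwise additions.

from collections import Counter

# Digit operations in an n x n long multiplication.
n = 6
mulcs = n * n   # one MULC per pair of digits: 36

# The product of digits i and j lands in column i + j.
columns = Counter(i + j for i in range(n) for j in range(n))
addcs = sum(k - 1 for k in columns.values())   # pairwise ADDCs: 25

print(mulcs, addcs)   # 36 25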

Because MULC and ADDC are used so often they will be written in efficient code and be very fast. Any process that needs them will call them, so they do not serve just one process.

In a single computer, one might have just one each of MULC and ADDC. In a network, there might be one in every computer. There are other reasons for duplicating subroutines: if speed is important, for instance. The point is that subroutines are used often and therefore, in our physical model, will be massive.

So we interpret quarks as subroutines called by protons. Although each proton may call three different quarks, each quark is not exclusive to a particular proton. Our belief that n protons have 3n quarks is thus wrong. On the assumption that the mass of the isolated quarks from a single proton is 1000 times greater than the mass of the proton, we can conclude that each quark serves about 3000 protons.

When the quark is 'in' the proton, we see as its mass only those operations it performs on behalf of that proton, i.e. only 1/1000th of its total operations. When we blast it out into the open, however, we see all its operations as it is being called by 999 other protons. So its mass appears 1000 times greater than when it was in the proton.
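
A toy model of this bookkeeping, using the figures of this paragraph (they are illustrative only, not physics):

# A shared 'quark' subroutine whose apparent mass is the processing we observe.
class Quark:
    def __init__(self):
        self.ops = 0            # total operations performed so far
    def serve(self, ops: int = 1):
        self.ops += ops         # one call on behalf of one proton

quark = Quark()
protons = 1000                  # this paragraph's figure: 999 others plus ours
for _ in range(protons):
    quark.serve()

mass_inside = 1                 # ops done for our proton alone
mass_outside = quark.ops        # all its ops, seen when it is blasted free
print(mass_outside / mass_inside)   # 1000.0 -- a thousand times 'heavier' outside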

If you have not spent much time wandering about in the plumbing of computers, this might seem a bit hard to stomach at first. But it will grow on you.
