natural theology

This site is part of The natural religion project
dedicated to developing and promoting the art of peace.

vol 6: Essays

On modelling the world

In the beginning God created the Heavens and the Earth.
(Genesis 1:1 [unknown authors, circa 1000 BCE]) Genesis

0. Abstract

1. Introduction

2. A definite model: computer networks

3. The physical layer: some isolated fixed points
  3.1 Aristotle’s unmoved mover
  3.2 The world of light
  3.3 The initial singularity
  3.4 The isolated quantum system
  3.5 The Turing machine
  3.6 Isomorphism

4. From isolation to communication
  4.1 The Word of God
  4.2 Quantum measurement

5. Quantum field theory and the computer network
  5.1 Why is the Universe quantized?
  5.2 Calculus and the quantum
  5.3 Action and the path integral
  5.4 Feynman diagrams

6. A transfinite computer network
  6.1 The Cantor Universe
  6.2 The transfinite symmetric universe
  6.3 Cantor symmetry
  6.4 Computation: quantum information theory
  6.5 Creative evolution

7. Some corollaries
  7.1 ‘The Unreasonable effectiveness of mathematics’
  7.2 The view from inside the Universe
  7.3 Network intelligence
  7.4 Past and future
  7.5 Entanglement and ‘spooky action at a distance’
  7.6 Numerical results

0 Abstract

Each of us follows a path through life. To follow a path, one needs navigation, and to navigate one needs both a model (map) of the space to be navigated and a method of locating oneself on the map. Traditionally, navigation in human space is studied by philosophers and theologians with very different world views.

Over the last few centuries, science has provided us with a quite comprehensive picture of our habitat that begins with the initial singularity predicted by general relativity and traces the evolution of the Universe to its present state.

To physicists, this work is an approach to a ‘theory of everything’. It provides a good foundation for practical technologies like computers, power grids, space vehicles and medicine, but has little to say about human spiritual issues. The purpose of this article is to extend the physical model into the spiritual realm, rather as Aristotle generated metaphysics out of physics. Aristotle

Traditionally the majority of philosophers and theologians have postulated a very real and sharp distinction between such categories of being as spiritual and material, living and non-living, intelligent and not-intelligent, eternal and ephemeral and so on.

Here I propose a model that embraces these and all other distinctions in the world in a single consistent picture.


1 Introduction

Physical cosmology has taught us that in terms of space and time we are a small element of a large Universe. Yet we are still inclined to believe that we are unique, rather overlooking the intelligence of the Universe that made us. Here we assume that creative intelligence is universal and attempt to understand how it works to create the Universe, ourselves, and our theories about ourselves.

This essay is based on a model of the Universe derived from the structure of the computer communication networks that are rapidly evolving round us. Some of the oldest questions in philosophy concern the relationship between motion and stillness. As far as we know, Parmenides started the ball rolling when a Goddess told him that true reality was full and eternal, the moving world being somehow less than fully real and an imperfect object of true knowledge. John Palmer - Parmenides

Truth may be defined as an isomorphism between two meaningful entities, often a sentence and reality: the truth of a sentence consists in its agreement with (or correspondence to) reality. Parmenides is speaking of an eternal correspondence between eternal reality and eternal knowledge. Here we admit a time element to truth: ‘he is standing on his head’ remains true as long as he is standing on his head, but becomes false when he falls over. Tarski

We live in a dynamic Universe which nevertheless exhibits points that remain fixed for various lengths of time. Some, the subjects of science, may remain true for all time. Others may be temporary, but we may track changes by revising our sentences as fast as the reality to which they refer is changing. This idea is encapsulated in mathematical sampling theorems, which tell us that as long as we sample a changing reality at least twice as fast as its maximum frequency of change, our samples will be a true representation of the reality sampled. Nyquist
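As a minimal sketch of this sampling idea (the signal, sampling rate and test instant are illustrative numbers of my own, not drawn from any cited source), the following Python fragment samples a band-limited signal faster than twice its highest frequency and then reconstructs an instant that was never sampled directly:

```python
import numpy as np

# A band-limited 'reality': a sum of sinusoids with maximum frequency 3 Hz.
f_max = 3.0
signal = lambda t: np.sin(2 * np.pi * 1.0 * t) + 0.5 * np.cos(2 * np.pi * f_max * t)

# Sample faster than twice the maximum frequency (Nyquist rate = 6 Hz).
fs = 8.0                        # sampling frequency in Hz
n = np.arange(0, 64)            # sample indices
samples = signal(n / fs)        # the discrete record of the changing reality

# Whittaker-Shannon reconstruction from the samples alone.
def reconstruct(t):
    return np.sum(samples * np.sinc(fs * t - n))

t_test = 2.345                  # an instant we never sampled directly
print(signal(t_test), reconstruct(t_test))   # agree closely, up to truncation error
```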

Here we propose that a class of mathematical theorems known as fixed point theorems provide a relatively simple answer to the problematic relationship between motion and stillness. These theorems, pioneered by Brouwer, tell us that whenever a convex compact set is mapped continuously onto itself, at least one point remains unmoved. van Heijenoort, Casti

It is easy to see that the Universe might fulfill the hypotheses of these theorems. Convex means that it has no ‘holes’: one can get from any A to any B without having to go outside the Universe; compact means that it contains its own boundaries. By definition, there is nothing outside the Universe. Brouwer allows us to turn Parmenides around, seeing the dynamics as primary and the forms as fixed points in the motions. The consistency of the form is to be explained by the consistency of the motion that contains it. Brouwer fixed point theorem - Wikipedia
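Brouwer's theorem only asserts that a fixed point exists; it does not say how to find one. Still, a toy numerical sketch (my own illustration, using a map that happens to be a contraction so that iteration converges) shows what an unmoved point of a motion looks like:

```python
import math

# A continuous map of the compact convex set [0, 1] into itself.
f = lambda x: math.cos(x)

# Brouwer guarantees only that a fixed point exists; for this particular
# map the iteration happens to converge to it, so we can exhibit it.
x = 0.5
for _ in range(100):
    x = f(x)

print(x, f(x))   # both approx 0.739085...: a point left unmoved by the motion
```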

This approach is consistent with the view that the Universe is pure act (in the sense used by Aquinas, Summa I, 2, 3) and so divine. It is also consistent with the idea that we are inside rather than outside the divinity, so that the dynamics of the Universe is equivalent to the life of God. Aquinas 13: Does God exist?, Aquinas 113: Is life properly attributed to God?


2 A definite model: computer networks

A network is a set of computers (like the internet) physically connected to enable the transmission of information. Although the processes in individual computers are deterministic, a network is not, since any computer may be interrupted at any point in its process and steered onto a different course by information received.

At its most abstract, a network comprises a set of addressed memories and a set of agents that can read from and write to those memories. The engineering of robust practical computer networks is a complex business. Tanenbaum
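A minimal sketch of this abstraction (the addresses, agents and the trivial ‘computation’ below are my own illustration, not a description of any real protocol) might look like this in Python:

```python
import random

# A network at its most abstract: addressed memory locations plus agents
# that read from and write to them.
memory = {addr: 0 for addr in range(8)}        # addressed memory locations

def agent(name, memory):
    """An agent reads one location and writes a transformed value to another."""
    src, dst = random.sample(list(memory), 2)
    memory[dst] = memory[src] + 1               # a deliberately trivial 'computation'
    return f"{name}: read {src}, wrote {dst}"

# Each step is deterministic, but the interleaving of agents is not fixed
# in advance, so the history of the network as a whole is not.
for step in range(5):
    print(agent(f"agent-{step % 2}", memory))
print(memory)
```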

To deal with these complex problems, networks are constructed in layers. The lowest layer is the physical (or hardware) layer, the wires, transistors, etc necessary to physically realize formal logical operators and connect them together. The physical layer is necessary because all observable information is represented physically, as electric currents, ink on paper, sounds and so on. Rolf Landauer

At the opposite extreme is the user layer, people sending and receiving data through the network. In between these two layers are the various software layers that transform user input into a suitable form for physical transmission and vice versa.

Each subsequent software layer uses the routines of the layer beneath it as an alphabet of operations to achieve its ends. Extrapolating beyond the computer network, a human in the user layer may be a part of a corporate network, reporting through further layers of management to the board of his organization. By analogy to this layered hierarchy, we may consider the Universe as a whole to be the ultimate user of a universal network.

Corresponding layers (‘peers’) of two nodes in a network may communicate if they share a suitable protocol. All such communication uses the services of all layers between the peers and the physical layer. These services are generally invisible or transparent to the peers unless they fail. Thus in conversation we are almost completely unaware of the complex physiological, chemical and physical processes that make our communication possible.


3 The physical layer: some isolated fixed points

Using the network model, we interpret motion as the transmission and receipt of messages. Further, we accept Landauer’s hypothesis that all information is represented physically, so that there are no ‘pure spirits’. Everything is embodied. We begin our discussion with a set of isolated fixed points which have been defined in various historical contexts. We then note their identity before turning to the task of binding them into a communication network.

3.1 Aristotle’s ‘unmoved mover’

Aristotle supposed that nothing could move itself since he saw motion as the realization of a potential, and held it to be axiomatic that no potential could actualize itself. This led him to propose an unmoved mover ‘whose essence is actuality’ as the source of all motion in the world [Aristotle 1071b3 sqq].

Aristotle notes that ‘the object of desire and the object of thought move without being moved’ [1072a26]. The first mover moves things, not by pushing them, but by attracting them as a ‘final cause’.

3.2 The world of light

The fundamental axiom of special relativity is that all observers moving without acceleration (that is, inertially) observe the same laws of physics. In particular, all inertial observers will measure the same velocity of light regardless of their state of motion.

A simple geometrical argument based on this observation leads to the Lorentz transformation. Some well known consequences of this transformation are that if I observe you going past me at a significant fraction of the velocity of light, your clocks will appear slow relative to mine, your rulers will appear foreshortened in the direction of motion and your mass will appear greater.
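As a hedged numerical sketch of these effects (the speed 0.8c is purely illustrative), the Lorentz factor and the resulting dilation and contraction can be computed directly:

```python
import math

c = 299_792_458.0                 # velocity of light, m/s

def lorentz_gamma(v):
    """Lorentz factor for relative speed v."""
    return 1.0 / math.sqrt(1.0 - (v / c) ** 2)

v = 0.8 * c                       # an illustrative relative speed
gamma = lorentz_gamma(v)          # = 5/3 at 0.8c

proper_time = 1.0                 # one second on the moving clock
proper_length = 1.0               # a one-metre ruler at rest in the moving frame

print(gamma)                      # 1.666...
print(proper_time * gamma)        # the moving clock appears slow: 1.67 s pass here
print(proper_length / gamma)      # the moving ruler appears foreshortened: 0.6 m
```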

The mathematical apparatus of the Lorentz transformation was greatly simplified by Minkowski. In the Minkowski representation the space-time interval between events appears to be the same regardless of the relative inertial motions of the observer and observed. Using this metric, we arrive at a property of photons and other particles travelling at the velocity of light: from the point of view of any observer a clock on a photon appears to be stopped and the photon has zero length. Given these facts, we might imagine that from the point of view of a hypothetical observer, photons exist in an eternal world of zero size, another fixed point. The velocity of light is not special here. A similar situation arises whenever the speed of communication is equal to the speed of the entity with which one attempts to communicate. Streater & Wightman

3.3 The initial singularity

Einstein built the theory of general relativity on the special theory. The general theory predicts that the Universe is either expanding or contracting. Observation suggests that the former is true, and we can extrapolate toward the past to a point known as the initial singularity where space-time as we experience it ceases to exist [Hawking & Ellis 1975]. This point, representing the beginning of the whole Universe, is effectively fixed and isolated since there is, by the definition of Universe, nothing outside it. Misner, Thorne & Wheeler, Hawking & Ellis

3.4 The isolated quantum system

Quantum mechanics falls naturally into two sections. The first describes the unitary evolution of state vectors in an isolated quantum system. The second describes what happens when such isolated systems communicate. Here we deal with the isolated system, turning to the theory of communication below.

Quantum theory is developed in a mathematical Hilbert space. Hilbert spaces are a species of function space, each point in the space representing a function or computation which transforms an input domain to an output range. We may consider this text as a function from the natural numbers to the English alphabet. The domain of this function is the natural numbers which we use to number and order the symbols in the text, beginning with 1 for the first symbol and n for the last.

The value of this function at any location is the symbol appearing at that point. We can represent this text in an n dimensional space, each dimension corresponding to a symbol. We can use the same space to represent any n symbol text. Using this approach, each text is represented by a vector or point in n-symbol text space.

Quantum states are represented by vectors in Hilbert space where the symbols are not letters of the alphabet but complex numbers. The evolution of isolated quantum systems is governed by the Schrödinger equation

iℏ ∂ψ / ∂τ = H ψ

where ψ is a state vector, H is the Hamiltonian operator representing the energy associated with each state vector, ℏ is Planck’s constant divided by 2π, τ is time and i is the imaginary unit of the complex numbers.

This equation describes a superposition of an infinity of different frequencies, analogous to the sound of an orchestra. It is a generalized version of the ‘Einstein-de Broglie’ equation, E = ℏω, which represents the fixed mechanical relation between energy E and frequency ω.

Quantum processes are in perpetual motion as long as there is energy present. Given that there is nothing outside the Universe, such processes must map the Universe onto itself, and we would therefore expect fixed points. These fixed points are identified by the eigenvalue (special value) equation H ψ = c ψ, where c is an eigenvalue. This equation is satisfied by those eigenfunctions of the operator H whose evolution changes nothing but the phase of the state vector ψ.
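A small numerical sketch of the eigenvalue equation (the two-state Hamiltonian below is hypothetical, chosen only for illustration) exhibits the fixed energies and the corresponding eigenfunctions:

```python
import numpy as np

# A hypothetical two-state Hamiltonian (Hermitian matrix), arbitrary energy units.
H = np.array([[1.0, 0.5],
              [0.5, 2.0]])

# Solving H ψ = c ψ: the eigenvalues c are the fixed energies, the eigenvectors ψ
# the states whose evolution changes nothing but their phase.
eigenvalues, eigenvectors = np.linalg.eigh(H)

for c, psi in zip(eigenvalues, eigenvectors.T):
    print(c, np.allclose(H @ psi, c * psi))   # H ψ = c ψ holds for each pair
```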

This representation is purely conjecture, since we cannot, by definition, observe an isolated system and gain information from it. Nevertheless, the quantum mechanics built on this idea works, making it practically credible.

3.5 The Turing machine

A Turing machine is a formal deterministic machine generally agreed to be able to compute all computable functions. It thus serves as a definition of ‘computable’. From a practical point of view, a computer is the physical embodiment of an algorithm or set of algorithms. There is a countable infinity of different algorithms. A machine which can be programmed to compute any of these algorithms is called a universal Turing machine. Alan Turing, Davis

Any algorithm except the simplest can be constructed by a sequence of lesser algorithms (like the individual arithmetic operations in multiplication). Modern computers implement Russell’s idea that logic embraces the whole of computable (and communicable) mathematics, since in a binary digital computer all algorithms are ultimately broken down to a sequence of propositional functions. Further, all these operations can be modelled with a single operation known to logicians as the Sheffer Stroke and to computer people as the NAND gate. Russell, Mendelson
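To illustrate this reduction to the Sheffer stroke, here is a short sketch of my own, building NOT, AND, OR and XOR from NAND alone:

```python
# Every propositional function can be built from the Sheffer stroke (NAND) alone.
def NAND(a, b):
    return not (a and b)

def NOT(a):        return NAND(a, a)
def AND(a, b):     return NOT(NAND(a, b))
def OR(a, b):      return NAND(NOT(a), NOT(b))
def XOR(a, b):     return AND(OR(a, b), NAND(a, b))

# Truth table check: the derived gates behave as expected.
for a in (False, True):
    for b in (False, True):
        print(a, b, AND(a, b), OR(a, b), XOR(a, b))
```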

Part of the power of a digital computer lies in its ability to execute very simple logical functions repeatedly (periodically) at high frequency, and to build simple high frequency routines into longer complex routines which execute at a lower frequency. We may thus imagine a computation as a superposition of frequencies analogous to the superposition of frequencies we see in quantum systems.

Further, the eigenvalues selected by the eigenvalue equation correspond to eigenfunctions of the operator involved, which correspond in turn to algorithms or Turing machines. The similarity between quantum mechanics and computation was first noted by Feynman and has now become an important area of research. Feynman, Nielsen & Chuang.

3.6 The isomorphism of these isolated points

Each of the systems outlined above contains an invisible isolated process: Aristotle’s god enjoys the pleasure of thinking about itself [1072b15] while remaining completely independent of the world; to communicate with an inertial system would be to exert a force upon it so that it was no longer inertial; the initial singularity is isolated since it contains the Universe outside which there is, by definition, nothing; according to the theory, isolated quantum systems cannot be observed without changing them; one cannot observe the internal state of a Turing machine without breaking into its process and so destroying its determinism.

Here we propose that the motions of these isolated systems are equivalent to Turing machines and therefore isomorphic to one another despite their differences in definition and their historical cultural roles. This proposition is based on the premise that the motion of a universal Turing machine embraces all computable functions, that is all observable transformations.


4 From isolation to communication

4.1 The Word of God

In ancient times many cultures established a one to one correspondence between their Gods and different features of human existence like love, war, reproduction and so on. The Hebrews, on the contrary, became monotheists, attributing all functions to one God, who was ultimately transformed into the Christian God.

One of the thorniest problems facing Christian theologians was reconciling Hebrew monotheism with the Trinity that the early writers had woven into the New Testament: the Father (reminiscent of the Old Testament God), the Son (who came to Earth as a man) and the Holy Spirit (who guides the evolution of Christianity).

The standard psychological model of the Trinity was first suggested by Augustine and developed by Aquinas (Augustine ad 400; Aquinas ad 1265). Their theories of knowledge require that the known exist in the knower, a representation often called a ‘mental word’ by analogy to the spoken word derived from it. Augustine, Aquinas 160

Aquinas saw this representation as accidental in human knowledge but essential in God. Thus he considered God’s knowledge of Himself, the Word of God, equivalent to God, distinguished only by the relationships of paternity and sonship established by the procession of the Word from the Father (Lonergan 1997; 2007). Lonergan

From our point of view, the procession of the Son from the Father is equivalent to the creation of a minimal two unit network within an isolated system. Christianity, guided by its interpretation of the Biblical text, proposes that the Spirit is the substantial love between the Father and the Son, but stops the process at this point. Here we see this ancient model as representing the first steps in the creation of a transfinite network of independent but communicating agents which is isomorphic to the Universe as we know it.

4.2 Quantum measurement

We learn about isolated quantum systems by observing or measuring them. Mathematically, quantum mechanics represents a measurement as an operator called an observable.

In quantum mechanics the outcomes of measurement are restricted to eigenvalues corresponding to eigenfunctions of the observable. The mathematical ideas involved here are complex but, since we are dealing with communication and we are natural communicators, quantum measurement is easily explained in terms of human experience and communication theory.

Quantum mechanics tells us, in effect, that the things we see are the things we look for. From this point of view, the information available from quantum mechanics lies not in what we see, but in the frequency of appearance of the different things seen.

The measured observable S is the operator representing the system we are using to sense the unknown state ψ of the system to be measured. We can only observe elements of the set of the eigenfunctions {sk} of the operator S. No other vectors are visible, although we might assume that they exist at some point in the evolution of the measured system.

The probability of finding a given outcome is given by Born’s rule:
pk = |<sk|ψ>|², where pk is the probability of observing the kth eigenfunction sk. Born rule - Wikipedia
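A hedged sketch of Born’s rule at work (the two-component state below is invented purely for illustration) also anticipates the point made later in this section: repeated measurements behave like a communication source emitting letters with fixed probabilities.

```python
import numpy as np

# A hypothetical state ψ expressed in the eigenbasis {s_k} of the observable S.
psi = np.array([0.6, 0.8j])                  # amplitudes <s_k|ψ>
psi = psi / np.linalg.norm(psi)              # normalize so probabilities sum to 1

p = np.abs(psi) ** 2                         # Born's rule: p_k = |<s_k|ψ>|^2
print(p, p.sum())                            # [0.36 0.64], total 1.0

# Repeated measurements look like a communication source emitting
# 'letters' k with these probabilities.
outcomes = np.random.choice(len(p), size=1000, p=p)
print(np.bincount(outcomes) / 1000)          # observed frequencies approach p
```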

Perhaps the biggest surprise in quantum mechanics is that quantum measurement does not reveal the fixed value of a given set of parameters at a given time, like the position of Mars at time t.

Instead it yields a probability distribution, telling us that if the observed system is in state ψ and we observe with operator S, we will see one or other of a fixed set of results, each appearing with a certain probability. When we measure the spectrum of an atom, we find ‘lines’ of fixed frequency corresponding to various processes in the atom, and each line has a certain ‘weight’ which tells us how frequently the corresponding process occurs.

The sum of the probabilities of the possible outcomes of an observation must be 1, since it is certain that one and only one of them will appear at each measurement. This constraint is identical to the constraint on a communication source: the sum of the probabilities of emission of the various letters of the source alphabet must also be 1.

This mathematical similarity leads us to consider that a sequence of quantum measurements is (at the level of letter frequencies) statistically identical to the output of a communication source.


5 Quantum field theory and the computer network

The formalism of quantum mechanics makes no particular reference to ordinary physical space or time, but deals essentially with the motion of state vectors in Hilbert space. Historically, quantum mechanics was first applied to Newtonian space, but it soon became clear that a true description of the world must combine quantum mechanics with special relativity in Minkowski space. The result of this marriage is quantum field theory (QFT).

QFT is the foundation of the Standard Model, which although very successful, fails to comprehend gravitation and suffers from some logical and mathematical difficulties. Veltman

Perhaps the most counterintuitive feature of the Universe, from the point of view of the continuous mathematical models of classical physics, is that all physical observations are quantized or ‘digital’. Experimental physics revolves around classifying (‘binning’) and counting events. When we observe the output of physicists and mathematicians (‘the literature’) we see that it too is quantized, into discrete volumes, articles, words and symbols.

Many of the problems of QFT arise from the attempt to explain these quantized observations using continuous mathematics. Here we propose that these problems can be overcome by applying discrete mathematics, which embraces integral (Diophantine) arithmetic, logic and the theory of computation (Chaitin 1987). Chaitin

5.1 Why is the Universe quantized?

Here we take the quantization of the Universe as evidence that it can be modelled as a communication system. Science proceeds by measurement. Shannon, who founded the mathematical theory of communication, saw that entropy can be used as a measure of information [Shannon 1948, Khinchin 1957]. The information carried by a point in any space is equivalent to the entropy of the space. Claude E Shannon, Khinchin

Entropy is simply a count, usually converted to a logarithm for ease of computation. In communication theory we imagine a message source A with a source alphabet of letters ai, each emitted with probability pi. The sum of these probabilities is taken to be 1, meaning that at any moment the source is emitting one and only one letter. The entropy H of such a source is defined to be H = − Σi pi log2 pi. By using the logarithm to base 2 we measure entropy (and information) in bits.
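For a concrete (and entirely illustrative) source alphabet, the entropy is a one-line calculation:

```python
import math

# Entropy of a source with an illustrative alphabet and emission probabilities.
p = {'a': 0.5, 'b': 0.25, 'c': 0.125, 'd': 0.125}   # probabilities sum to 1

H = -sum(pi * math.log2(pi) for pi in p.values())
print(H)    # 1.75 bits per letter for this source
```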

The mathematical theory of communication is not concerned with the meaning of messages, only with the rate of error free transmission of strings of symbols from a certain source over a certain channel.

Given this measure of information, Shannon sought limits on the rate of error free communication over noiseless and noisy channels (Shannon 1948). The theory he developed is now well known and lies at the heart of communication engineering. Claude Shannon

In essence, Shannon showed that by encoding messages into larger blocks or packets, these packets can be made so far apart in message space that the probability of confusing them (and so falling into error) approaches zero. This is identical to the quantization observed wherever we look in the Universe.

For a given channel, Shannon’s theorems define a maximum rate of information transmission C. A system that transmits without errors at the rate C is an ideal system. Features of an ideal system that are relevant here are:

1. In order to avoid error, there must be no overlap between signals representing different messages. They must, in other words, be orthogonal, as with the eigenfunctions of a quantum mechanical basis [Zurek 2007]. In other words, error free communication demands quantization of messages. Wojciech Hubert Zurek
2. Such ‘basis signals’ may be chosen at random in the signal space, provided only that they are orthogonal. The same message may be encoded in any orthogonal basis provided that the transformations used by the transmitter and receiver to encode and decode the message are modified accordingly.
3. The signals transmitted by an ideal system are indistinguishable from noise. The fact that a set of physical observations looks like a random sequence is not therefore evidence for meaninglessness. Until the algorithms used to encode and decode such a sequence are known, nothing can be said about its significance.
4. As the system approaches the ideal and the length of the transmitted signal increases, the delay at the transmitter while it takes in a chunk of message for encoding, and the corresponding delay at the receiver, increase indefinitely. The ideal rate C is only reached when packets comprise a countably infinite number of bits.
5. Only in the simplest cases are the mappings used to encode and decode messages linear and topological. For practical purposes, however, they must all be computable. In addition, in order to recover encoded messages, the computations used to encode messages must be invertible so that the decoded message is identical to the original.

5.2 Calculus and the quantum

Experience has shown that mathematical models are extraordinarily effective tools for predicting the behaviour of the physical world (Wigner 1960). The general approach to modelling a physical system is to create a ‘configuration space’ large enough to be placed into one to one correspondence with all possible physical states of the system and then to seek ‘laws’ or ‘symmetries’ (often expressed as equations) that constrain events in the configuration space to just those that are actually observed in the world. Eugene Wigner

We may date modern mathematical physics from Newton’s discovery of calculus which provides a means to discuss ephemeral states of motion using static mathematical symbolism to express the invariant features (stationary points) of a motion.

The configuration space for the Newtonian dynamics of the Solar system is a three dimensional Euclidean space and a universal time that applies equally to all points in the space. Newton’s fundamental insight is that acceleration (a) is proportional to force (F), expressed by the equation a = F/m, where the mass m is the constant of proportionality between force and acceleration.

Acceleration is the rate of change of velocity (v) with respect to time, in symbols a = dv/dt. Velocity itself is the rate of change of position (x) with respect to time: v = dx/dt. Putting these together we write a = d2x/dt2 = F/m.

Such differential equations are statements of differential calculus, and enable us to calculate accelerations and forces from measurements of position. The moving system changes, the differential equation does not. Using the laws discovered by Kepler in the astronomical measurements of Brahe, Newton was able to arrive at his law of gravitation: the gravitational force acting between two heavenly bodies is proportional to the product of their masses divided by the square of the distance between them.

The inverse of differentiation is integration, which enables us to work in the opposite direction, from forces to positions. Given the law of gravitation, integration enables us to predict the future positions of the planets given their current positions. Together differentiation and integration are the subject of calculus.
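As a hedged sketch of this program (scaled units with GM = 1, a deliberately simple integrator, and initial conditions chosen to give a circular orbit; none of these numbers come from the text), one can integrate a = F/m forward in time and recover a planetary orbit:

```python
import math

# Integrating a = F/m forward in time to predict future positions.
GM = 1.0
x, y = 1.0, 0.0          # initial position
vx, vy = 0.0, 1.0        # initial velocity (circular orbit of radius 1, period 2π)
dt = 0.001

def acceleration(x, y):
    r3 = (x * x + y * y) ** 1.5
    return -GM * x / r3, -GM * y / r3     # inverse-square attraction

for _ in range(int(2 * math.pi / dt)):    # integrate over one orbital period
    ax, ay = acceleration(x, y)
    vx, vy = vx + ax * dt, vy + ay * dt   # simple semi-implicit Euler step
    x, y = x + vx * dt, y + vy * dt

print(x, y)   # close to the starting point (1, 0) after one full orbit
```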

Mathematically, calculus is closely related to continuity. Newton’s discoveries led to a close reexamination of the problems of continuity reflected in Zeno’s paradoxes and the ancient discovery that no rational number corresponds to certain geometric magnitudes like the diagonal of a unit square.

The remarkable success of Newton’s methods led most physicists and philosophers to assume that the physical Universe is continuous. Quantum mechanics was born, however, when Planck discovered that continuous mathematics cannot describe the relationship between ‘matter’ and ‘radiation’.

The development of quantum mechanics was rather slow and painful, since it was not clear how to apply the traditional mathematical methods of calculus to discrete quantum events. The eventual solution is Born’s rule, stated above (section 4.2).

The continuous equations of quantum mechanics do not describe the motions of particles in the Newtonian sense. They describe instead the probabilities of various discrete events. This raises the problem of reconciling the quantum mechanical and Newtonian descriptions of the world. A common route begins with Feynman’s path integral method (Feynman 1965).

5.3 Action and the path integral

Although Newton’s method of predicting the future positions of the planets looks simple in theory, the difficulty of applying it to systems of three or more bodies motivated the search for more powerful methods.

An important result of this search is Hamilton’s principle: that the world appears to optimize itself using the principle of stationary action. Hamilton defined action (S) as the time integral of the Lagrangian function (L). The Lagrangian is the difference between the kinetic (T) and potential (V) energy of a system: L = T - V; S = ∫ L dt. The action is said to be stationary when small variations in the path do not change the action.

Hamilton’s principle has proven enormously fruitful, and serves as a bridge between classical and quantum mechanics. One way to understand this is Feynman’s path integral formulation of quantum mechanics.

A classical particle moving from a to b moves on a definite path through space and time which may be precisely computed using Hamilton’s principle. In contrast, Feynman assumed that all possible paths from a to b are equally probable, and then used Hamilton’s principle and the quantum mechanical principle of superposition to select a particular path.

It is clear from the solution to the Schrödinger equation set down above that one quantum of action corresponds to one period of the state vector or, in our model, one elementary computation. The rate of rotation of a state vector is directly proportional to the energy associated with it so that the total number of revolutions made by the state vector associated with a particle following a certain path is exactly equivalent to the action associated with that path.

It is an axiom of quantum mechanics that the amplitude for an event which can occur in many independent and indistinguishable ways is obtained by adding the amplitudes for the individual events: ψ = Σi ψi.

This is the superposition principle, illustrated by the famous two slit experiment. Feynman added amplitudes for all possible indistinguishable paths to obtain the total amplitude for the event, an idea that originated with Dirac.

Feynman’s principal contribution was to apply the principle of stationary action to this superposition. The path of stationary action is that path whose neighbours have almost the same action, so that the amplitudes add ‘constructively’ thus (according to Born’s rule) maximizing the probability that this is the path actually taken. Feynman’s approach thus provides a rational basis for Hamilton’s principle: the classical trajectory is the one whose neighbours all have the same action.
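A toy path sum (my own construction: a free particle, a single intermediate position per path, and illustrative units with ℏ = m = 1) shows the mechanism at work. Paths near the stationary-action path have nearly equal actions and add constructively, while distant paths have rapidly varying phases and cancel:

```python
import numpy as np

# A free particle goes from x=0 at t=0 to x=0 at t=2 via an intermediate
# position x at t=1. The action of each two-segment path is
# S(x) = m/2 * (x^2/dt + x^2/dt); the classical (stationary) path has x = 0.
hbar, m, dt = 1.0, 1.0, 1.0
x = np.linspace(-10, 10, 20001)                # intermediate positions (the 'paths')
S = 0.5 * m * (x**2 / dt + x**2 / dt)          # action of each path
amp = np.exp(1j * S / hbar)                    # amplitude for each path

near = np.abs(x) < 1.0                          # paths near the stationary path x = 0
far = (x > 5.0) & (x < 7.0)                     # an equally wide band of distant paths

print(abs(amp[near].sum()))   # large: nearly equal actions add constructively
print(abs(amp[far].sum()))    # small: rapidly varying phases cancel
```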

How does this look from the computational point of view? We regard each quantum of action as the physical manifestation of one logical operation. A ‘path’ then corresponds to an algorithm, an ordered sequence of logical operations.

The efficiency of such algorithms is measured by the number of operations necessary to go from the initial state a to the final state b. More efficient algorithms require fewer operations and are very similar to one another, so that the algorithms clustered around the most efficient algorithm will be very close together; that is, their action will be stationary under slight variations.

5.4 Feynman diagrams

The Feynman diagram is a natural complement to the path integral. The path integral gives equal weight to all possible paths between two quantum states and uses superposition and Hamilton’s principle to select the most probable path. A Feynman diagram is a representation of such a path, and the probability of a particular transition is calculated as a superposition of all the relevant Feynman diagrams.

A Feynman diagram represents a network in which the lines represent particles (messages) and the vertices or nodes where the lines meet represent interactions between the particles. The path integral approach assigns equal weight to all possible paths, and from that computes the probability of a particular transition. The Feynman diagram aggregates these weights into a larger structure that represents the fact that particles interact in every possible way: many path integrals contribute to one Feynman diagram.

Taking an even broader view, one may see the whole Universe as an aggregate of aggregates of Feynman diagrams, thus building up a universal network that embraces all physical interactions. In the network picture, we see particles as messages and vertices in the Feynman diagram as computers in the network whose inputs and outputs are messages. We can now turn from this intuitive approach to the Universe as a computer network to a more formal development of the idea.


6 A transfinite computer network

6.1 The Cantor Universe

The purpose of this section is to construct a logical phase space in which to model the behaviour of the universal network.

We see a network as a set of memory locations whose states can be changed by the users of the network. As in everyday networks and computers, every memory location has an address and content which may be read from and written to by a processor.

In a finite network, these addresses may be placed into correspondence with the natural numbers. To address the memory in an infinite network, we turn to the transfinite numbers invented by Georg Cantor (1898). Cantor

Cantor established the transfinite cardinal and ordinal numbers by Cantor’s theorem: given a set of any cardinal n, there exists a set with a greater cardinal. He proved this using his famous diagonal argument. A modern axiomatic proof relies on the axiom of the power set: given a set S, there exists the set P(S) of all subsets of S called the power set of S. If the cardinal of S is n, the cardinal of P(S) is 2^n. This is true for all finite or infinite values of n greater than 1.
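The finite shadow of Cantor’s theorem is easy to exhibit (a sketch only): for a set of n elements the power set has 2^n elements, always strictly more.

```python
from itertools import chain, combinations

def power_set(s):
    """All subsets of s: the power set P(s)."""
    s = list(s)
    return [set(c) for c in
            chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))]

# For a finite set of cardinal n, the power set has cardinal 2^n > n.
for n in range(5):
    S = set(range(n))
    print(n, len(power_set(S)))     # prints n and 2**n
```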

Using this theorem, Cantor constructed the formal ‘Cantor universe’ by transfinite recursion. The set N of natural numbers is said to be countably infinite. Cantor used the symbol ℵ0 to represent the cardinal of N. The power set of N, P(N), has the next greatest cardinal ℵ1, and so on without end. Cantor hypothesized that ℵ1 was the cardinal of the continuum. Cohen (1966) later found that this continuum hypothesis is independent of the axioms of set theory. Cohen

6.2 A transfinite symmetric universe

Here we construct a Cantor universe in a slightly different way, using permutations rather than subsets to generate greater cardinals. Permutation simply means rearrangement, so the triplet {a, b, c} may be arranged in six ways abc, acb, bac, bca, cab, cba. In general n different things can be arranged in n × (n-1) × . . . × 2 × 1 different ways, called factorial n and written n!. The set of permutations of a number of objects forms a group called the symmetric group.

We begin with the natural numbers, N. These numbers have a natural order, but we can also order them in any other way we like to give a set of permutations of N. We know that there are n! permutations of n objects, and conjecture that the set of permutations of a set of cardinal ℵn has the next greater cardinal, ℵn+1. Every one of these permutations is formally different.

We take this structure as the configuration or memory space for a transfinite, symmetric network. The essential feature of this model is the fact that by ordering sets to create position significant numerals, we gain a huge increase in the complexity of numbers that can be represented with a given stock of symbols. This fact, it seems, lies at the root of the creative power of the Universe.

Cantor made a very bold claim when he presented his transfinite cardinal and ordinal numbers to the world. He wrote:

The concept of "ordinal type" developed here, when it is transferred in like manner to "multiply ordered aggregates", embraces, in conjunction with the concept of "cardinal number" or "power" introduced in §1, everything capable of being numbered (Anzahlmassige) that is thinkable and in this sense cannot be further generalized. It contains nothing arbitrary, but is the natural extension of the concept of number. (Cantor 1898; 1955: 117)

We might call this structure a transfinite symmetric network. Transfinite because it is isomorphic to Cantor’s transfinite cardinal and ordinal numbers. Symmetric because it contains all possible permutation (= symmetric) groups which in turn contain all possible groups.

6.3 ‘Cantor symmetry’

Cantor [1955: 109], explaining the genesis of transfinite numbers, writes:

We shall show that the transfinite cardinal numbers can be arranged according to their magnitude, and, in this order, form, like the finite numbers, a “well ordered aggregate” in an extended sense of the words. Out of ℵ0 proceeds by a definite law, the next greater cardinal number ℵ1, and out of this by the same law the next greater, ℵ2 and so on . . . without end.

This ‘definite law’ is indifferent to the size of the number upon which it operates to create the successor to that number. Let us call this definite law Cantor symmetry or symmetry with respect to complexity.

In the symmetric universe as we have described it, the ‘definite law’ is the process of generating all the permutations of a given set of operations. The order of operations in the real world is often important (to make an omelette, beat the eggs then cook rather than cook then beat the eggs).

Such operations are ‘non-commutative’. One of the many features that distinguishes quantum from classical mechanics is non-commutative multiplication. Cantor symmetry provides us with a heuristic principle for understanding interactions at all levels of the Universe. At this level of abstraction, the gossip at a cocktail party and the inner workings of an atom share the same network features, and we may use one to throw light upon the other.
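Non-commutative multiplication is easy to exhibit with the Pauli matrices used to describe quantum two-state systems (a standard example, sketched here in Python):

```python
import numpy as np

# Two of the Pauli matrices of quantum two-state systems.
sigma_x = np.array([[0, 1], [1, 0]])
sigma_y = np.array([[0, -1j], [1j, 0]])

# Order matters: like beating the eggs then cooking versus cooking then beating.
print(sigma_x @ sigma_y)           # [[ 1j, 0], [0, -1j]]
print(sigma_y @ sigma_x)           # [[-1j, 0], [0,  1j]]
print(np.allclose(sigma_x @ sigma_y, sigma_y @ sigma_x))   # False: non-commutative
```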

Cantor symmetry applies to quantum mechanics so that our understanding of two state systems is essentially the same as our understanding of a system with a transfinite number of states.

6.4 Computation and quantum information theory

The transfinite symmetric universe is a 'Platonic' structure whose principal properties are:

1. Unlimited size, sufficient to be placed in one to one correspondence with any real system of events, no matter how complex. We measure the size of an event by the number of quanta of action involved in its execution. By this measure, the largest event is the life of the Universe itself, which contains all other events.
2. Layered, since each new level of complexity arises from permutations of the elements of the layer beneath it, giving a sequence of ever increasing complexity ℵ0, ℵ1, ℵ2 . . .

Our task now becomes to find the constraints on this space which enable us to map it to the Universe of experience. We may imagine a communication system as a set of computable functions that can be strung together (as in an ordinary computer) to transform the input of the channel to its output. Isolated Turing machines are deterministic. This determinism is broken, however, when processes communicate. Turing envisaged this situation when he described the oracle- or o-machine. An oracle machine, like any real computer, can stop and wait for input from an outside source or oracle.

In practical communication networks, processing may also be interrupted by a message which claims priority over the current process. A message entering a network using an acceptable protocol changes the state of the machine receiving it. Given communication delay (which may effectively cause space-like separation of machines) it is impossible to predict when a machine will be interrupted and what the effect of the interrupt may be. Uncertainty (ie unpredictability) thus enters a network, even if all the functions driving the network are deterministically computable.
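The following sketch (entirely my own, with arbitrary parameters) illustrates how a handful of deterministic machines become unpredictable as soon as messages can interrupt them at unforeseeable moments:

```python
import random

# Each machine runs a deterministic counting process, but messages arriving
# from other machines interrupt it and steer it onto a new course. The arrival
# times are not knowable in advance, so the joint history of the network is
# unpredictable even though every individual step is deterministic.
random.seed()                       # no fixed seed: each run differs

state = [0, 0, 0]                   # three deterministic machines
for step in range(20):
    for i in range(3):
        state[i] += 1               # each machine's own deterministic process
    if random.random() < 0.3:       # a message arrives at an unpredictable moment
        src, dst = random.sample(range(3), 2)
        state[dst] = state[src]     # the interrupt resets the receiver's course

print(state)                        # different on every run
```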

This opens the way to understanding the probabilistic nature of quantum mechanics if we interpret quantum systems as networks. In this way we replace a deterministic continuous formalism carrying a probabilistic interpretation with a set of deterministic machines whose communication with one another produces a set of events whose occurrences are unpredictable.

The events themselves, however, fall into equivalence classes which are determined by the communication protocols in use. The number of possible protocols is restricted to the number of possible Turing machines, that is to a countable infinity. If we consider quantum eigenfunctions to be computable, this suggests that the number of eigenfunctions (ie observables) in the Universe is itself countably infinite. One motivation for the development of quantum computers is the belief that they will be much more powerful than digital machines. This hope is based on two features of current quantum mechanics, continuity and superposition.

First, because quantum mechanical superpositions in Hilbert space have continuous complex amplitudes, it is felt that even a vector in two dimensional Hilbert space (a ‘qubit’) may encode an infinity of detail and therefore be equal to an infinite digital memory.

Second, because in an infinite dimensional Hilbert space each vector may be decomposed into a superposition of an infinity of other vectors, all of which are acted upon simultaneously by operators on the space, it is felt that quantum computations can be massively parallel, computing all the instances of an algorithm represented by each element of the superposition simultaneously.

To the contrary, we have the approach taken by algorithmic information theory, which measures the total information in a transformation by the length of the program required to implement the transformation (Chaitin 1987).

From this point of view, the equation x = y defined on the real numbers does not represent an infinite amount of information, but only the few bits represented by the symbolic expression ‘x = y’. This alerts us to the fact that the entropy we assign to a set depends on how we decide to count its elements (Chaitin 1975). From a computational point of view, the algorithmic measure seems most appropriate.

Further, from the point of view of communication theory, a continuum is the essence of confusion. In continuous mathematics, the real information is carried only by the singularities, eigenvalues and other fixed points we may find within or on the boundaries of a continuous set.

From this we conclude that one can neither compute a continuum, nor use a continuum to compute, except insofar as it can be broken into disjoint pieces. All computation modelled by the Turing machine must be digital or quantized, that is, executed using discrete symbols or discrete physical entities.

Quantum mechanics makes sense if we look at it from the point of view of communication. It is an intuitively satisfactory point of view, since we are born natural communicators. It emerges from the above discussion that when we communicate with the world, even a quantum system, we are part of a network.

6.5 Creative evolution

Cosmological theory, supported by a wealth of observation, suggests that the Universe began as a ‘formless and void’ initial singularity (Genesis; Hawking & Ellis 1975). We have a quite comprehensive picture of the development of the Universe within this singularity, but very little information about (i) why the initial singularity existed in the first place and (ii) what motivated its complexification (Teilhard de Chardin 1980). Teilhard de Chardin

There is really nothing to be said about (i), except to accept that we are here. The second question is more amenable to treatment. As Parmenides noted, we can only know dynamic systems through their stationary points. The stationary points in the transfinite computer network are the elements of the Cantor universe which we understand to be memory locations which carry information.

The ‘expansion’ of the Cantor universe is driven by Cantor’s theorem, which tells us that given any set, there exists a set of higher cardinal. The proof of Cantor’s theorem is non-constructive: it is proved by showing that its falsity implies a contradiction. Insofar as the Universe is consistent, it necessarily increases its cardinality. We take this ‘Cantor force’ to be the ultimate driver of universal expansion.

The theory of evolution is founded on the Malthusian observation that the exponential growth of reproducing organisms will eventually exhaust any resource supply, no matter how great. As a result the number of organisms has an upper limit and fitter organisms will survive at the expense of the less fit.

We note first that while there are only a countably infinite number of different Turing machines (computable functions), the phase space of the model is transfinitely bigger, so that we may see computation as a limiting resource in the Universe. This lays the foundation for an evolutionary scheme: many possibilities confronting limited resources select for those possibilities that use the resources most efficiently to secure their own survival and replication.

Although the evolutionary paradigm was first developed in the biological realm, the layered network model suggests that it accounts for the selection of all structure in the Universe, from the most elementary particles in the physical layer through all user layers to the Universe as a whole.


7 Some corollaries

7.1 ‘The Unreasonable effectiveness of mathematics’

Eugene Wigner noted the ‘unreasonable effectiveness of mathematics’ in the physical sciences (Wigner 1960). If there is any truth in the picture painted here, this fact may have a simple explanation. Mathematics is a consistent symbolic system devised by the mathematical community. We may think of the mathematical literature as a set of stationary points in the dynamics of this community. Wigner

More generally, stationary points (particles or messages) in the observed Universe also form a symbolic system whose consistency is guaranteed by the dynamic processes which contain them. Given that the symmetric network spans the whole space of consistent symbolic systems, it may not be surprising to find that mathematics is wonderfully effective as a descriptor of the world, as Galileo proposed.

7.2 The view from inside the Universe

The transfinite computer network is a dynamical model whose size is limited only by consistency. It was originally developed in a theological context as a candidate model of God. The idea is to map it onto the observed world as evidence for the thesis that the Universe is divine. Traditional models of God and the World suggest that they could not be more different. This difference is partly based on ‘proofs’ for the existence of God which prove, in effect, that God is other than the World [Aquinas 1981].

In the picture proposed here, we see ourselves as part of the living God. As noted above, Christian theologians long ago developed models of God that allowed for a Trinity of personalities in God. Here we allow for a transfinity of personalities, where we mean by personality a communication source, any entity capable of sending and receiving messages, whether it be an atom or a God.

Our existence depends upon many layers of structure beginning with fundamental particles and moving through atoms, molecules and cells, each of which is a layered network within itself and connected to its peers through an ‘internet’.

Further we ourselves are elements of larger networks, families, tribes and nations. In all of these structures, the fundamental binding element is communication, and it is hard to imagine any feature of human experience that does not fall within the network paradigm.

7.3 Network intelligence

Only two sources may be involved in any particular communication within a network, but, since each of these machines is connected to many other machines, what we see if we look at a given machine for long enough is not just the machine we are looking at, but a superposition of all the inputs from all the machines in the network, weighted by the probability of communication between them and the machine we are watching.

We may see this scenario as analogous to the structure of a biological neural network like our own central nervous systems, where memory in the network is encoded in the weights of synaptic junctions between the individual neurons in the network. We may see these weights as analogous to the amplitudes for the propagation of signals between the various vertices of a Feynman diagram.

There can be little doubt that the neural networking in our central nervous systems is the hardware layer of the human mind and the seat of our intelligence. In the network paradigm proposed here, the Universe as a whole is a network similar to our individual neural networks so that we may see the cosmic intelligence as isomorphic (up to a Cantor symmetry) with our microcosmic minds.

7.4 Past and future

Observable events lie on the boundary between past and future. Turing answered Hilbert’s Entscheidungsproblem by devising a machine that could perform anything that might reasonably be called a computation and then showing that there was a set of incomputable problems which such a machine could never complete.

In the context of Cantorian set theory, Turing showed that there is just a countable infinity of different Turing machines, capable of evaluating a countable infinity of computable functions. Since there is a transfinite number of possible functions mapping the set of natural numbers onto itself, computable functions represent at most a vanishingly small fraction of possible functions.

Until a computer has halted, its state is uncertain. Once it is halted, its state becomes definite. In the network no computer can determine its inputs, and so its output, if any, is also indeterminate. As a consequence, the past is definite, and not subject to any uncertainty principle. The future, however, cannot be completely determined by the past. The network model therefore suggests that the boundary between past and future can be modelled by the boundary between halted and not-halted computations.

7.5 ‘Entanglement’ and ‘spooky action at a distance’
[Einstein, Podolsky and Rosen 1935]

In a layered network, each layer provides an alphabet of processes or tools which are used by higher layers for their own purposes. Thus molecules use atoms, cells use molecules, people use computers, corporations use people and so on. If we assume that the fundamental hardware of the Universe is the initial singularity, peer processes in all the layers of complexity which have arisen from this singularity still use it as the ultimate physical layer for all their communications with one another.

In this picture, space-time as we experience it is an emergent property of the Universe, a layer built on the lower layer described by quantum mechanics. The structure of space-time is described by special relativity and depends upon the finite velocity of light.

As noted above, the algorithms used to defeat error in communication delay the transmission and reception of messages. In order to encode a message of n symbols into a transmissible signal function, the encoding machine must wait for the source to emit n symbols, even if the computation required for encoding is instantaneous. Since computation itself involves error free communication within the computer, we can expect it too to add delay. We are therefore led to believe that communication delay and quantization are connected. Further, since the least delay corresponds to the maximum velocity, we propose that the algorithm used to encode information into signals that travel at the velocity of light is the fastest algorithm in the Universe.

As ‘nature abhors a vacuum’ physicists traditionally abhor ‘action at a distance’. This abhorrence has been the historical motivation for field theories and the notion that the forces between systems are mediated by the exchange of particles.

The Einstein, Podolsky and Rosen thought experiment, since frequently realized, proposed that entangled particles could act upon one another at a distance even though their separation was spacelike, requiring something greater than the velocity of light to account for their correlation (if it is due to communication) (Aspect et al 1982). Aspect

We assume that the velocity of light is finite rather than infinite because of the delay in the transmission of a photon from point to point due to error preventative encoding. Conversely, we might speculate that communications that cannot go wrong, that is communications that effectively carry no information, might require no encoding and therefore travel at infinite velocity.

The observed correlations between entangled photons have been found to propagate at many times c, and the measurements are compatible with infinite velocity, ie instantaneous transmission over any distance. Salart: Testing the speed of 'spooky action at a distance'.

In a practical communication network a message originating with user A is passed down through the software layers in A’s computer to the physical layer which carries it to B’s machine. It is then passed up through B’s software layers until it reaches a form that B can read. By analogy, communication between one system in the Universe and another must pass down to the ultimate physical layer (which we might identify as the initial singularity) then up again to the peer system receiving the message. It may be that the simplicity of the lower layers of this network makes encoding for error prevention unnecessary, so that instantaneous communication is possible.

7.6 Numerical results

The first successful quantum field theory, quantum electrodynamics, has enabled physicists to compute the outcome of experiments with precision in the realm of parts per billion (Feynman 1988). Such precision compels one to believe that the theory is an accurate description of reality.

Quantum mechanical measurements are essentially frequencies, that is rational numbers expressing the ratio of the occurrences of two events. A probability is a prediction of a frequency. When we say that the probability of a tossed fair coin coming up heads is 1/2 we are predicting that if we throw a coin a sufficient number of times, the proportion of heads to total throws will approach as closely as we like to 1/2.

The measurement and computation of frequencies requires both definitions of the events to be compared (heads, tails) and a count of these events. The quantum mechanical eigenvalue equation embodies both these requirements: the eigenfunction which defines the event and the eigenvalue which measures its frequency.

One must ask: if the continuum calculus methods of conventional quantum mechanics work so well, how are they to be reconciled with the digital approach suggested here? The matter clearly requires much further discussion of a kind more appropriately published in physics journals, but an outline of the answer seems to be present in the layering of networks. Physics deals with the physical layer of the universal network, which may nevertheless comprise a number of layers beginning with the totally simple, isolated and meaningless initial singularity upon which are built quantum mechanics and space-time.

A foundation of quantum mechanics is the principle of superposition which works because quantum mechanics is linear. When the amplitudes of two states are added the resulting amplitude is simply the sum of the two added states: the whole is no greater or less than the sum of its parts. This contrasts with more complex non-linear systems for which the relationship between whole and parts is more complex, as illustrated by the old adage ‘the straw that breaks the camel’s back’: a minor input leading to a major effect. Thus, in the quantum mechanical context, continuous and digital mathematics are indistinguishable.


Copyright:

You may copy this material freely provided only that you quote fairly and provide a link (or reference) to your source.


Further reading

Books


Aquinas, Thomas, Summa Theologica (translated by Fathers of the English Dominican Province), Tabor Publishing 1981 'Brother Thomas raised new problems in his teaching, invented a new method, used new systems of proof. To hear him teach a new doctrine, with new arguments, one could not doubt that God, by the irradiation of this new light and by the novelty of this inspiration, gave him the power to teach, by the spoken and written word, new opinions and new knowledge.' (William of Tocco, T's first biographer) 
Aristotle, and (translated by H Tredennick and G Cyril Armstrong), Metaphysics X-XIV, Oeconomica and Magna Moralia, Harvard University Press, ; William Heinemann Ltd. 1977 Introduction III Aristotle's Metaphysical Theory: 'The theory of universal science, as sketched by Plato in The Republic, was unsatisfactory to Aristotle's analytical mind. He felt that there must be a regular system of sciences, each concerned with a different aspect of reality. At the same time it was only reasonable to suppose that there is a supreme science, which is more ultimate, more exact, more truly Wisdom than any of the others. The discussion of this science, Wisdom, Primary Philosophy or Theology, as it is variously called, and of its scope, forms the subject of the Metaphysics. page xxv 
Augustine, Saint, and Edmond Hill (Introduction, translation and notes), and John E Rotelle (editor), The Trinity, New City Press 1991 Written 399 - 419: De Trinitate is a radical restatement, defence and development of the Christian doctrine of the Trinity. Augustine's book has served as a foundation for most subsequent work, particularly that of Thomas Aquinas.
Cantor, Georg, Contributions to the Founding of the Theory of Transfinite Numbers (Translated, with Introduction and Notes by Philip E B Jourdain), Dover 1955 Jacket: 'One of the greatest mathematical classics of all time, this work established a new field of mathematics which was to be of incalculable importance in topology, number theory, analysis, theory of functions, etc, as well as the entire field of modern logic.' 
Carnot, Sadi, and Translated by R H Thurston; edited and with an introduction by E Mendoza, Reflections on the Motive Power of Fire: and other papers on the second law of thermodynamics by E Clapeyron and R Clausius., Peter Smith Publisher 1977 Reflections: Everyone knows that heat can produce motion. ... in these days when the steam-engine is everywhere so well known. ... To develop this power, to appropriate it to our uses, is the object of heat engines. ... Notwithstanding the work of all kinds done by steam-engines, notwithstanding the satisfactory condition to which they have been brought today, their theory is very little understood, and the attempts to improve them are still directed almost by chance. ... In order to consider in the most general way the principle of the production of motion by heat, it must be considered independently of any mechanism or any particular agent. It is necessary to establish principles applying not only to steam-engines but to all imaginable heat engines, whatever the working substance and whatever the method by which it is operated. ... [Here enters the seed of entropy] The production of motive power is then due in steam-engines not to an actual consumption of caloric, but to its transportation from a warm body to a cold body, that is, to its reestablishment of equilibrium - an equilibrium considered as destroyed by any cause whatever, by chemical action such as combustion, or by any other.' pages 3-7. 
Casti, John L, Five Golden Rules: Great Theories of 20th-Century Mathematics - and Why They Matter, John Wiley and Sons 1996 Preface: '[this book] is intended to tell the general reader about mathematics by showcasing five of the finest achievements of the mathematician's art in this [20th] century.' p ix. Treats the Minimax theorem (game theory), the Brouwer Fixed-Point theorem (topology), Morse's theorem (singularity theory), the Halting theorem (theory of computation) and the Simplex method (optimisation theory). 
Cercignani, Carlo, Ludwig Boltzmann: The Man Who Trusted Atoms, Oxford University Press, USA 2006 'Cercignani provides a stimulating biography of a great scientist. Boltzmann's greatness is difficult to state, but the fact that the author is still actively engaged in research into some of the finer, as yet unresolved issues provoked by Boltzmann's work is a measure of just how far ahead of his time Boltzmann was. It is also tragic to read of Boltzmann's persecution by his contemporaries, the energeticists, who regarded atoms as a convenient hypothesis, but not as having a definite existence. Boltzmann felt that atoms were real and this motivated much of his research. How Boltzmann would have laughed if he could have seen present-day scanning tunnelling microscopy images, which resolve the atomic structure at surfaces! If only all scientists would learn from Boltzmann's life story that it is bad for science to persecute someone whose views you do not share but cannot disprove. One surprising fact I learned from this book was how research into thermodynamics and statistical mechanics led to the beginnings of quantum theory (such as Planck's distribution law, and Einstein's theory of specific heat). Lecture notes by Boltzmann also seem to have influenced Einstein's construction of special relativity. Cercignani's familiarity with Boltzmann's work at the research level will probably set this above other biographies of Boltzmann for a very long time to come.' Dr David J Bottomley  
Chaitin, Gregory J, Algorithmic Information Theory, Cambridge UP 1987 Foreword: 'The crucial fact here is that there exist symbolic objects (i.e., texts) which are "algorithmically inexplicable", i.e., cannot be specified by any text shorter than themselves. Since texts of this sort have the properties associated with random sequences of classical probability theory, the theory of describability developed . . . in the present work yields a very interesting new view of the notion of randomness.' J T Schwartz 
Cohen, Paul J, Set Theory and the Continuum Hypothesis, Benjamin/Cummings 1966-1980 Preface: 'The notes that follow are based on a course given at Harvard University, Spring 1965. The main objective was to give the proof of the independence of the continuum hypothesis [from the Zermelo-Fraenkel axioms for set theory with the axiom of choice included]. To keep the course as self contained as possible we included background materials in logic and axiomatic set theory as well as an account of Gödel's proof of the consistency of the continuum hypothesis. . . .'  
Davies, Paul, The Mind of God: Science and the Search for Ultimate Meaning, Penguin Books 1992  
Davis, Martin, Computability and Unsolvability, Dover 1982 Preface: 'This book is an introduction to the theory of computability and non-computability usually referred to as the theory of recursive functions. The subject is concerned with the existence of purely mechanical procedures for solving problems. . . . The existence of absolutely unsolvable problems and the Goedel incompleteness theorem are among the results in the theory of computability that have philosophical significance.'
Dirac, P A M, The Principles of Quantum Mechanics (4th ed), Oxford UP/Clarendon 1983 Jacket: '[this] is the standard work in the fundamental principles of quantum mechanics, indispensible both to the advanced student and the mature research worker, who will always find it a fresh source of knowledge and stimulation.' (Nature)  
Faraday, Michael, Experimental Researches in Electricity (Volume 1), General Books LLC 2010  
Feynman, Richard P, and Robert B Leighton, Matthew Sands, The Feynman Lectures on Physics (volume 3) : Quantum Mechanics, Addison Wesley 1970 Foreword: 'This set of lectures tries to elucidate from the beginning those features of quantum mechanics which are the most basic and the most general. ... In each instance the ideas are introduced together with a detailed discussion of some specific examples - to try to make the physical ideas as real as possible.' Matthew Sands 
Feynman, Richard P, and Albert P Hibbs, Quantum Mechanics and Path Integrals, McGraw Hill 1965 Preface: 'The fundamental physical and mathematical concepts which underlie the path integral approach were first developed by R P Feynman in the course of his graduate studies at Princeton, ... . These early inquiries were involved with the problem of the infinite self-energy of the electron. In working on that problem, a "least action" principle was discovered [which] could deal successfully with the infinity arising in the application of classical electrodynamics.' As described in this book, Feynman, inspired by Dirac, went on to develop this insight into a fruitful source of solutions to many quantum mechanical problems.
Feynman, Richard, QED: The Strange Story of Light and Matter, Princeton UP 1988 Jacket: 'Quantum electrodynamics - or QED for short - is the 'strange theory' that explains how light and electrons interact. Thanks to Richard Feynman and his colleagues, it is also one of the rare parts of physics that is known for sure, a theory that has stood the test of time. ... In this beautifully lucid set of lectures he provides a definitive introduction to QED.'
Feynman, Richard, Feynman Lectures on Computation, Perseus Publishing 2007 Amazon Editorial Reviews Book Description 'The famous physicist's timeless lectures on the promise and limitations of computers When, in 1984-86, Richard P. Feynman gave his famous course on computation at the California Institute of Technology, he asked Tony Hey to adapt his lecture notes into a book. Although led by Feynman, the course also featured, as occasional guest speakers, some of the most brilliant men in science at that time, including Marvin Minsky, Charles Bennett, and John Hopfield. Although the lectures are now thirteen years old, most of the material is timeless and presents a "Feynmanesque" overview of many standard and some not-so-standard topics in computer science such as reversible logic gates and quantum computers.'  
Galilei, Galileo, and Stillman Drake (translator), Discoveries and Opinions of Galileo: Including the Starry Messenger (1610 Letter to the Grand Duchess Christina), Doubleday Anchor 1957 Amazon: 'Although the introductory sections are a bit dated, this book contains some of the best translations available of Galileo's works in English. It includes a broad range of his theories (both those we recognize as "correct" and those in which he was "in error"). Both types indicate his creativity. The reproductions of his sketches of the moons of Jupiter (in "The Starry Messenger") are accurate enough to match to modern computer programs which show the positions of the moons for any date in history. The appendix with a chronological summary of Galileo's life is very useful in placing the readings in context.' A Reader. 
Hallett, Michael, Cantorian set theory and limitation of size, Oxford UP 1984 Jacket: 'This book will be of use to a wide audience, from beginning students of set theory (who can gain from it a sense of how the subject reached its present form), to mathematical set theorists (who will find an expert guide to the early literature), and for anyone concerned with the philosophy of mathematics (who will be interested by the extensive and perceptive discussion of the set concept).' Daniel Isaacson. 
Hawking, Steven W, and G F R Ellis, The Large Scale Structure of Space-Time , Cambridge UP 1975 Preface: Einstein's General Theory of Relativity ... leads to two remarkable predictions about the universe: first that the final fate of massive stars is to collapse behind an event horizon to form a 'black hole' which will contain a singularity; and secondly that there is a singularity in our past which constitutes, in some sense, a beginning to our universe. Our discussion is principally aimed at developing these two results.' 
Heath, Thomas Little, Thirteen Books of Euclid's Elements (volume 1, I-II), Dover 1956 'This is the definitive edition of one of the very greatest classics of all time - the full Euclid, not an abridgement. Utilizing the text established by Heiberg, Sir Thomas Heath encompasses almost 2500 years of mathematical and historical study upon Euclid.' 
Honderich, Ted, and (editor), The Oxford Companion to Philosophy, Oxford University Press 1995 Preface: 'The brave, large aim of this book is to bring philosophy together between two covers better than ever before. This is not a job for one man, or one woman, or a few, or a team, although it has been tried often enough. So 249 of us have joined forces.' 
Huang, Kerson, Statistical Mechanics, John Wiley 1987 'Preface: ... The purpose of this book is to teach statistical mechanics as an integral part of theoretical physics, a discipline that aims to describe all natural phenomena on the basis of a single unifying theory. This theory, at present, is quantum mechanics. ... Before the subject of statistical mechanics proper is presented, a brief but self contained discussion of thermodynamics and the classical kinetic theory of gases is given. The order of this development is imperative, from a pedagogical point of view, for two reasons. First, thermodynamics has successfully described a large part of macroscopic experience, which is the concern of statistical mechanics. It has done so not on the basis of molecular dynamics but on the basis of a few simple and intuitive postulates stated in everyday terms. If we first familiarize ourselves with thermodynamics, the task of statistical mechanics reduces to the explanation of thermodynamics. Second, the classical kinetic theory of gases is the only known special case in which thermodynamics can be derived nearly from first principles, ie, molecular dynamics. A study of this special case will help us to understand why statistical mechanics works.'
Khinchin, A I, Mathematical Foundations of Information Theory (translated by P A Silvermann and M D Friedman), Dover 1957 Jacket: 'The first comprehensive introduction to information theory, this book places the work begun by Shannon and continued by McMillan, Feinstein and Khinchin on a rigorous mathematical basis. For the first time, mathematicians, statisticians, physicists, cyberneticists and communications engineers are offered a lucid, comprehensive introduction to this rapidly growing field.' 
Kuhn, Thomas S, Black-Body Theory and the Quantum Discontinuity 1894-1912, University of Chicago Press 1987 Jacket: '[This book] traces the emergence of discontinuous physics during the early years of this century. Breaking with historiographic tradition, Kuhn maintains that, though clearly due to Max Planck, the concept of discontinuous energy change does not originate in his work. Instead it was introduced by physicists trying to understand the success of his brilliant new theory of black-body radiation.' 
Lonergan, Bernard J F, and Robert M. Doran, Frederick E. Crowe (eds), Verbum : Word and Idea in Aquinas (Collected Works of Bernard Lonergan volume 2) , University of Toronto Press 1997 Jacket: 'Verbum is a product of Lonergan's eleven years of study of the thought of Thomas Aquinas. The work is considered by many to be a breakthrough in the history of Lonergan's theology ... . Here he interprets aspects in the writing of Aquinas relevant to trinitarian theory and, as in most of Lonergan's work, one of the principal aims is to assist the reader in the search to understand the workings of the human mind.' 
Lonergan, Bernard J F, and Michael G Shields (translator), Robert M Doran & H Daniel Monsour (editors), The Triune God: Systematics, University of Toronto press 2007 Translated from De Deo Trino: Pars systematica (1964) by Michael G Shields. Amazon Product Description 'Buried for more than forty years in a Latin text written for seminarian students at the Gregorian University in Rome, Bernard Lonergan's 1964 masterpiece of systematic-theological writing, De Deo trino: Pars systematica, is only now being published in an edition that includes the original Latin along with an exact and literal translation. De Deo trino, or The Triune God, is the third great installment on one particular strand in trinitarian theology, namely, the tradition that appeals to a psychological analogy for understanding trinitarian processions and relations. The analogy dates back to St Augustine but was significantly developed by St Thomas Aquinas. Lonergan advances it to a new level of sophistication by rooting it in his own highly nuanced cognitional theory and in his early position on decision and love. Suggestions for a further development of the analogy appear in Lonergan's late work, but these cannot be understood and implemented without working through this volume. This is truly one of the great masterpieces in the history of systematic theology, perhaps even the greatest of all time.' 
Maxwell, James Clerk, Treatise on Electricity and Magnetism (vol 1), OUP 1998 First published 1873 
Mendelson, Elliott, Introduction to Mathematical Logic, van Nostrand 1987 Preface: '... a compact introduction to some of the principal topics of mathematical logic. . . . In the belief that beginners should be exposed to the most natural and easiest proofs, free swinging set-theoretical methods have been used."  
Misner, Charles W, and Kip S Thorne, John Archibald Wheeler, Gravitation, Freeman 1973 Jacket: 'Einstein's description of gravitation as curvature of spacetime led directly to that greatest of all predictions of his theory, that the universe itself is dynamic. Physics still has far to go to come to terms with this amazing fact and what it means for man and his relation to the universe. John Archibald Wheeler. . . . this is a book on Einstein's theory of gravity. . . . ' 
Nielsen, Michael A, and Isaac L Chuang, Quantum Computation and Quantum Information, Cambridge University Press 2000 Review: A rigorous, comprehensive text on quantum information is timely. The study of quantum information and computation represents a particularly direct route to understanding quantum mechanics. Unlike the traditional route to quantum mechanics via Schroedinger's equation and the hydrogen atom, the study of quantum information requires no calculus, merely a knowledge of complex numbers and matrix multiplication. In addition, quantum information processing gives direct access to the traditionally advanced topics of measurement of quantum systems and decoherence.' Seth Lloyd, Department of Quantum Mechanical Engineering, MIT, Nature 6876: vol 416 page 19, 7 March 2002. 
Pais, Abraham, 'Subtle is the Lord...': The Science and Life of Albert Einstein, Oxford UP 1982 Jacket: In this ... major work Abraham Pais, himself an eminent physicist who worked alongside Einstein in the post-war years, traces the development of Einstein's entire ouvre. ... Running through the book is a completely non-scientific biography ... including many letters which appear in English for the first time, as well as other information not published before.' 
Russell, Bertrand, The Principles of Mathematics, W W Norton & Co 1903, 1938, 1996 Amazon Product Description 'Russell's classic The Principles of Mathematics sets forth his landmark thesis that mathematics and logic are identical—that what is commonly called mathematics is simply later deductions from logical premises. His ideas have had a profound influence on twentieth-century work on logic and the foundations of mathematics.' 
Streater, Raymond F, and Arthur S Wightman, PCT, Spin, Statistics and All That, Princeton University Press 2000 Amazon product description: 'PCT, Spin and Statistics, and All That is the classic summary of and introduction to the achievements of Axiomatic Quantum Field Theory. This theory gives precise mathematical responses to questions like: What is a quantized field? What are the physically indispensable attributes of a quantized field? Furthermore, Axiomatic Field Theory shows that a number of physically important predictions of quantum field theory are mathematical consequences of the axioms. Here Raymond Streater and Arthur Wightman treat only results that can be rigorously proved, and these are presented in an elegant style that makes them available to a broad range of physics and theoretical mathematics.' 
Tanenbaum, Andrew S, Computer Networks, Prentice Hall International 1996 Preface: 'The key to designing a computer network was first enunciated by Julius Caesar: Divide and Conquer. The idea is to design a network as a sequence of layers, or abstract machines, each one based upon the previous one. ... This book uses a model in which networks are divided into seven layers. The structure of the book follows the structure of the model to a considerable extent.'  
Teilhard de Chardin, Pierre, The Phenomenon of Man, Collins 1965 Sir Julian Huxley, Introduction: 'We, mankind, contain the possibilities of the earth's immense future, and can realise more and more of them on condition that we increase our knowledge and our love. That, it seems to me, is the distillation of the Phenomenon of Man.'  
van Heijenoort, Jean, From Frege to Goedel: A Source Book in Mathematical Logic 1879 - 1931. , iUniverse.com 1999 Amazon book description: 'Collected here in one volume are some thirty-six high quality translations into English of the most important foreign-language works in mathematical logic, as well as articles and letters by Whitehead, Russell, Norbert Weiner and Post…This book is, in effect, the record of an important chapter in the history of thought. No serious student of logic or foundations of mathematics will want to be without it.' 
Veltman, Martinus, Diagrammatica: The Path to the Feynman Rules, Cambridge University Press 1994 Jacket: 'This book provides an easily accessible introduction to quantum field theory via Feynman rules and calculations in particle physics. The aim is to make clear what the physical foundations of present-day field theory are, to clarify the physical content of Feynman rules, and to outline their domain of applicability. ... The book includes valuable appendices that review some essential mathematics, including complex spaces, matrices, the CBH equation, traces and dimensional regularization. ...' 
von Neumann, John, and Robert T Beyer (translator), Mathematical Foundations of Quantum Mechanics, Princeton University Press 1983 Jacket: '. . . a revolutionary book that caused a sea change in theoretical physics. . . . JvN begins by presenting the theory of Hermitean operators and Hilbert spaces. These provide the framework for transformation theory, which JvN regards as the definitive form of quantum mechanics. . . . Regarded as a tour de force at the time of its publication, this book is still indispensable for those interested in the fundamental issues of quantum mechanics.' 
Wigner, Eugene, Symmetries and Reflections: Scientific Essays , MIT Press 1970 Jacket: 'This volume contains some of Professor Wigner's more popular papers which, in their diversity of subject and clarity of style, reflect the author's deep analytical powers and the remarkable scope of his interests. Included are articles on the nature of physical symmetry, invariance and conservation principles, the structure of solid bodies and of the compound nucleus, the theory of nuclear fission, the effects of radiation on solids, and the epistemological problems of quantum mechanics. Other articles deal with the story of the first man-made nuclear chain reaction, the long term prospects of nuclear energy, the problems of Big Science, and the role of mathematics in the natural sciences. In addition, the book contains statements of Wigner's convictions and beliefs as well as memoirs of his friends Enrico Fermi and John von Neumann. Eugene P. Wigner is one of the architects of the atomic age. He worked with Enrco Fermi at the Metallurgical Laboratory of the University of Chicago at the beginning of the Manhattan Project, and he has gone on to receive the highest honours that science and his country can bestow, including the Nobel Prize for physics, the Max Planck Medal, the Enrico Fermi Award and the Atoms for Peace Award. '. 
Zemansky, Mark W, Heat and Thermodynamics, McGraw-Hill 1957 back
Papers
Aspect, Alain, P. Grangier, G. Roger, "Experimental Realization of the Einstein-Podolsky-Rosen-Bohm Gedankenexperiment: A New Violation of Bell's Inequalities", Physical review Letters, 49, 2, 12 July, 1982, page 91-94. The linear-polarization correlation of pairs of photons emitted in a radiative cascade of calcium has been measured. The new experimental scheme, using two-channel polarizers (i.e., optical analogues of Stern-Gerlach filters), is a straightforward transposition of Einstein-Podolsky-Rosen-Bohm gedankenexperiment. The present results, in excellent agreement with the quantum mechanical predictions, lead to the greatest violation of generalized Bell's inequalities ever achieved.'. back
Chaitin, Gregory J, "Randomness and Mathematical Proof", Scientific American, 232, 5, May 1975, page 47-52. 'Although randomness can be precisely defined and can even be measured, a given number cannot be proved random. This enigma establishes a limit in what is possible in mathematics'. back
Einstein, Albert, "Zur Elektrodynamik bewegter Körper (On the electrodynamics of moving bodies)", Annalen der Physik, 17, , 1905, page 891. Introduction: 'It is known that Maxwell's electrodynamics - as understood at the present time - when applied to moving bodies, leads to asymmetries which do not appear to be inherent in the phenomena. Take, for example, the reciprocal electrodynamic action of a magnet and a conductor. The observable phenomenon here depends only on the relative motion of the conductor and the magnet, whereas the customary view draws a sharp distinction between the two cases in which either the one or the other of these bodies is in motion. For if the magnet is in motion and the conductor at rest, there arises in the neighbourhood of the magnet an electric field of a certain definite energy, producing a current at the places where parts of the conductor are situated. But if the magnet is stationary and the conductor is in motion, no electric field arises in the neighbourhood of the magnet. In the conductor, however, we find an electromotive force, to which in itself there is no corresponding energy, but which gives rise - assuming equality of relative motion in the two cases discussed - to electric currents of the same path and intensity as those produced by the electric forces in the former case. Examples of this sort together with the unsuccessful attempts to discover any motion of the earth relatively to the "light medium" suggest that the phenomena of electrodynamics as well as of mechanics possess no properties corresponding to the idea of absolute rest.'. back
Nowak, Martin A, Joshua B Plotkin and Vincent A A Jansen, "The evolution of syntactic communication", Nature, 404, 6777, 30 March 2000, page 495-498. Letters to Nature: 'Animal communication is typically non-syntactic, which means that signals refer to whole situations. Human language is syntactic, and signals consist of discrete components that have their own meaning. Syntax is requisite for taking advantage of combinatorics, that is 'making infinite use of finite means'. ... Here we present a model for the population dynamics of language evolution, define the basic reproductive ratio of words and calculate the maximum size of a lexicon.'. back
Nyquist, Harry, "Certain topics in telegraph transmission theory", Transactions of the American Institute of Elecrical Engineering, 47, 617-644, April 1928, page . back
Planck, Max, "On the Law of Distribution of Energy in the Normal Spectrum", Annalen der Physik, 4, , 1901, page 553-. 'Moreover, it is necessary to interpret ... [the total energy of blackbody radiation] not as a continuous infinitely divisible quantity, but as a discrete quantity composed of an integral number of finite equal parts.' . back
Salart, Daniel, et al, "Testing the speed of 'spooky action at a distance'", Nature, 454, , 14 August 2008, page 861-864. 'Correlations are generally described by one of two mechanisms: either a first event influences a second one by sending information encoded in bosons or other physical carriers, or the correlated events have some common causes in their shared history. Quantum physics predicts an entirely different kind of cause for some correlations, named entanglement. This reveals itself in correlations that violate Bell inequalities (implying that they cannot be described by common causes) between space-like separated events (implying that they cannot be described by classical communication). Many Bell tests have been performed, and loopholes related to locality and detection have been closed in several independent experiments. It is still possible that a first event could influence a second, but the speed of this hypothetical influence (Einstein's 'spooky action at a distance') would need to be defined in some universal privileged reference frame and be greater than the speed of light. Here we put stringent experimental bounds on the speed of all such hypothetical influences. We performed a Bell test over more than 24 hours between two villages separated by 18 km and approximately east–west oriented, with the source located precisely in the middle. We continuously observed two-photon interferences well above the Bell inequality threshold. Taking advantage of the Earth's rotation, the configuration of our experiment allowed us to determine, for any hypothetically privileged frame, a lower bound for the speed of the influence. For example, if such a privileged reference frame exists and is such that the Earth's speed in this frame is less than 10^-3 times that of the speed of light, then the speed of the influence would have to exceed that of light by at least four orders of magnitude.' back
Shannon, Claude E, "The mathematical theory of communication", Bell System Technical Journal, 27, , July and October, 1948, page 379-423, 623-656. 'A Note on the Edition Claude Shannon's ``A mathematical theory of communication'' was first published in two parts in the July and October 1948 editions of the Bell System Technical Journal [1]. The paper has appeared in a number of republications since: • The original 1948 version was reproduced in the collection Key Papers in the Development of Information Theory [2]. The paper also appears in Claude Elwood Shannon: Collected Papers [3]. The text of the latter is a reproduction from the Bell Telephone System Technical Publications, a series of monographs by engineers and scientists of the Bell System published in the BSTJ and elsewhere. This version has correct section numbering (the BSTJ version has two sections numbered 21), and as far as we can tell, this is the only difference from the BSTJ version. • Prefaced by Warren Weaver's introduction, ``Recent contributions to the mathematical theory of communication,'' the paper was included in The Mathematical Theory of Communication, published by the University of Illinois Press in 1949 [4]. The text in this book differs from the original mainly in the following points: • the title is changed to ``The mathematical theory of communication'' and some sections have new headings, • Appendix 4 is rewritten, • the references to unpublished material have been updated to refer to the published material. The text we present here is based on the BSTJ version with a number of corrections.. back
Shannon, Claude E, "Communication in the Presence of Noise", Proceedings of the IEEE, 86, 2, February 1998, page 447-457. Reprint of Shannon, Claude E. "Communication in the Presence of Noise." Proceedings of the IEEE, 37 (January 1949) : 10-21. 'A method is developed for representing any communication system geometrically. Messages and the corresponding signals are points in two function spaces, and the modulation process is a mapping of one space into the other. Using this representation, a number of results in communication theory are deduced concerning expansion and compression of bandwidth and the threshold effect. Formulas are found for the maximum rate of transmission of binary digits over a system when the signal is perturbed by various types of noise. Some of the properties of "ideal" systems which transmit this maximum rate are discussed. The equivalent number of binary digits per second of certain information sources is calculated.' . back
Tarski, Alfred, "The Semantic Conception of Truth and the Foundations of Semantics", Philosophy and Phenomenological Research, 4, 3, 1944, page 341 - 376. back
Links
Alain Aspect, Philippe Grangier, and Gerard Roger, Experimental Realization of the Einstein-Podolsky-Rosen-Bohm Gedankenexperiment: A New Violation of Bell's Inequalities, 'The linear-polarization correlation of pairs of photons emitted in a radiative cascade of calcium has been measured. The new experimental scheme, using two-channel polarizers (i.e., optical analogues of Stern-Gerlach filters), is a straightforward transposition of Einstein-Podolsky-Rosen-Bohm gedankenexperiment. The present results, in excellent agreement with the quantum mechanical predictions, lead to the greatest violation of generalized Bell's inequalities ever achieved.' back
Alan Turing, On Computable Numbers, with an application to the Entscheidungsproblem, 'The “computable” numbers may be described briefly as the real numbers whose expressions as a decimal are calculable by finite means. Although the subject of this paper is ostensibly the computable numbers, it is almost equally easy to define and investigate computable functions of an integral variable or a real or computable variable, computable predicates, and so forth. The fundamental problems involved are, however, the same in each case, and I have chosen the computable numbers for explicit treatment as involving the least cumbrous technique.' back
Albert Einstein, On the Electrodynamics of Moving Bodies, An english translation of the paper that founded Special relativity. 'Examples of this sort, [in the contemporary application of Maxwell's electrodynamics to moving bodies] together with the unsuccessful attempts to discover any motion of the earth relatively to the ``light medium,'' suggest that the phenomena of electrodynamics as well as of mechanics possess no properties corresponding to the idea of absolute rest. They suggest rather that, as has already been shown to the first order of small quantities, the same laws of electrodynamics and optics will be valid for all frames of reference for which the equations of mechanics hold good.' back
Alfred Tarski, The semantic concept of truth and the foundation of semantics, originally published in Philosophy and Phenomenological Research 4 (1944). 'Our discussion will be centered around the notion of truth. The main problem is that of giving a satisfactory definition of this notion, i.e. a definition that is materially adequate and formally correct. . . . ' back
Aquinas 113, Summa I, 18, 3: Is life properly attributed to God?, Life is in the highest degree properly in God. In proof of which it must be considered that since a thing is said to live in so far as it operates of itself and not as moved by another, the more perfectly this power is found in anything, the more perfect is the life of that thing. ... back
Aquinas 13, Summa: I 2 3: Whether God exists?, I answer that the existence of God can be proved in five ways. The first and more manifest way is the argument from motion. . . . The second way is from the nature of the efficient cause. . . . The third way is taken from possibility and necessity . . . The fourth way is taken from the gradation to be found in things. . . . The fifth way is taken from the governance of the world. back
Aquinas 160, Summa: I 27 1 Is there procession in God?, 'Our Lord says, "From God I proceeded" (Jn. 8:42).' back
Aristotle, Metaphysics, 'Written 350 B.C.E, Translated by W. D. Ross. Book I Part 1 "ALL men by nature desire to know. An indication of this is the delight we take in our senses; for even apart from their usefulness they are loved for themselves; and above all others the sense of sight. For not only with a view to action, but even when we are not going to do anything, we prefer seeing (one might say) to everything else. The reason is that this, most of all the senses, makes us know and brings to light many differences between things. ' back
Born rule - Wikipedia, Born rule - Wikipedia, the free encyclopedia, 'The Born rule (also called the Born law, Born's rule, or Born's law) is a law of quantum mechanics which gives the probability that a measurement on a quantum system will yield a given result. It is named after its originator, the physicist Max Born. The Born rule is one of the key principles of the Copenhagen interpretation of quantum mechanics. There have been many attempts to derive the Born rule from the other assumptions of quantum mechanics, with inconclusive results. . . . The Born rule states that if an observable corresponding to a Hermitian operator A with discrete spectrum is measured in a system with normalized wave function (see Bra-ket notation), then the measured result will be one of the eigenvalues λ of A, and the probability of measuring a given eigenvalue λi will equal ⟨psi|Pi|psi⟩, where Pi is the projection onto the eigenspace of A corresponding to λi.' back
Brouwer fixed point theorem - Wikipedia, Brouwer fixed point theorem - Wikipedia, the free encyclopedia, 'Brouwer's fixed-point theorem is a fixed-point theorem in topology, named after Luitzen Brouwer. It states that for any continuous function f with certain properties there is a point x0 such that f(x0) = x0. The simplest form of Brouwer's theorem is for continuous functions f from a disk D to itself. A more general form is for continuous functions from a convex compact subset K of Euclidean space to itself. back
Christopher Shields, Aristotle (Stanford Encyclopedia of Philosophy), First published Thu Sep 25, 2008 Aristotle (384–322 B.C.E.) numbers among the greatest philosophers of all time. Judged solely in terms of his philosophical influence, only Plato is his peer: . . . A prodigious researcher and writer, Aristotle left a great body of work, perhaps numbering as many as two-hundred treatises, from which approximately thirty-one survive.[1] His extant writings span a wide range of disciplines, from logic, metaphysics and philosophy of mind, through ethics, political theory, aesthetics and rhetoric, and into such primarily non-philosophical fields as empirical biology, where he excelled at detailed plant and animal observation and taxonomy. In all these areas, Aristotle's theories have provided illumination, met with resistance, sparked debate, and generally stimulated the sustained interest of an abiding readership. back
Claude E Shannon, A Mathematical Theory of Communication, 'The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem. The significant aspect is that the actual message is one selected from a set of possible messages.' back
Claude Shannon, Communication in the Presence of Noise, 'A method is developed for representing any communication system geometrically. Messages and the corresponding signals are points in two “function spaces,” and the modulation process is a mapping of one space into the other. Using this representation, a number of results in communication theory are deduced concerning expansion and compression of bandwidth and the threshold effect. Formulas are found for the maximum rate of transmission of binary digits over a system when the signal is perturbed by various types of noise. Some of the properties of “ideal” systems which transmit at this maximum rate are discussed. The equivalent number of binary digits per second for certain information sources is calculated.' back
Daniel W Graham - Heraclitus, Heraclitus, 'A Greek philosopher of Ephesus (near modern Kuşadası, Turkey) who was active around 500 BCE, Heraclitus propounded a distinctive theory which he expressed in oracular language. He is best known for his doctrines that things are constantly changing (universal flux), that opposites coincide (unity of opposites), and that fire is the basic material of the world. The exact interpretation of these doctrines is controversial, as is the inference often drawn from this theory that in the world as Heraclitus conceives it contradictory propositions must be true.' back
Einstein, Podolsky and Rosen, Can the Quantum Mechanical Description of Physical Reality be Considered Complete??, A PDF of the classic paper. 'In a complete theory there is an element corresponding to each element of reality. A sufficient condition for the reality of a physical quantity is the possibility of predicting it with certainty, without disturbing the system. In quantum mechanics in the case of two physical quantities described by non-commuting operators, the knowledge of one precludes the knowledge of the other. Then either (1) the description of reality given by the wave function in quantum mechanics is not complete or (2) these two quantities cannot have simultaneous reality. Consideration of the problem of making predictions concerning a system on the basis of measurements made on another system that had previously interacted with it leads to the result that if (1) is false then (2) is also false, One is thus led to conclude that the description of reality given by the wave function is not complete.' back
Eugene Wigner, The Unreasonable Effectiveness of Mathematics in the Natural Sciences, 'The first point is that the enormous usefulness of mathematics in the natural sciences is something bordering on the mysterious and that there is no rational explanation for it. Second, it is just this uncanny usefulness of mathematical concepts that raises the question of the uniqueness of our physical theories.' back
Genesis, The Book of Genesis, 'Genesis is the first book of the Pentateuch (Genesis, Exodus, Leviticus, Numbers, Deuteronomy), the first section of the Jewish and the Christian Scriptures. Its title in English, “Genesis,” comes from the Greek of Gn 2:4, literally, “the book of the generation (genesis) of the heavens and earth.” Its title in the Jewish Scriptures is the opening Hebrew word, Bereshit, “in the beginning.”' back
George Smith, Isaac Newton (Stanford Encyclopedia of Philosophy) , First published Wed Dec 19, 2007 'Isaac Newton (1642–1727) is best known for having invented the calculus in the mid to late 1660s (most of a decade before Leibniz did so independently, and ultimately more influentially) and for having formulated the theory of universal gravity — the latter in his Principia, the single most important work in the transformation of early modern natural philosophy into modern physical science. Yet he also made major discoveries in optics beginning in the mid-1660s and reaching across four decades; and during the course of his 60 years of intense intellectual activity he put no less effort into chemical and alchemical research and into theology and biblical studies than he put into mathematics and physics.' back
Jenann Ismael, Quantum Mechanics (Stanford Encyclopedia of Philosophy), First published Wed Nov 29, 2000; substantive revision Tue Sep 1, 2009 'Quantum mechanics is, at least at first glance and at lea st in part, a mathematical machine for predicting the behaviors of microscopic particles — or, at least, of the measuring instruments we use to explore those behaviors — and in that capacity, it is spectacularly successful: in terms of power and precision, head and shoulders above any theory we have ever had. . . . The question of what kind of a world it describes, however, is controversial; there is very little agreement, among physicists and among philosophers, about what the world is like according to quantum mechanics.' back
John Palmer - Parmenides, Parmenides (Stanford Encyclopedia of Philosophy), First published Fri Feb 8, 2008 'Parmenides of Elea, active in the earlier part of the 5th c. BCE., authored a difficult metaphysical poem that has earned him a reputation as early Greek philosophy's most profound and challenging thinker. His philosophical stance has typically been understood as at once extremely paradoxical and yet crucial for the broader development of Greek natural philosophy and metaphysics. He has been seen as a metaphysical monist (of one stripe or another) who so challenged the naïve cosmological theories of his predecessors that his major successors among the Presocratics were all driven to develop more sophisticated physical theories in response to his arguments.' back
Margherita Barile, Epsilon-Delta Proof -- from Wolfram MathWorld, 'A proof of a formula on limits based on the epsilon-delta definition. An example is the following proof that every linear function f(x) = ax + b is continuous at every point x0. The claim to be shown is that for every epsilon > 0 there is a delta > 0 such that whenever |x - x0| < delta, |f(x) - f(x0)| < epsilon.' back
Meinard Kuhlmann, Quantum Field Theory (Stanford Encyclopedia of Philosophy), First published Thu Jun 22, 2006 'Quantum Field Theory (QFT) is the mathematical and conceptual framework for contemporary elementary particle physics. . . . QFT is presently the best starting point for analysing the fundamental features of matter and interactions.

During the last two decades QFT became a more and more vividly discussed topic in philosophy of physics. QFT is an attractive topic for philosophers with respect to methodology, semantics as well as ontology. Indeed, from a methodological point of view QFT is much more a set of formal strategies and mathematical tools than a closed theory. Its development was accompanied by problems provoked by the application of badly defined mathematics. Nevertheless, empirically such pragmatic approaches have been far more successful so far than more rigorous formulations. How could such a theory work for more than 70 years? Since mathematical reasoning dominated the heuristics of QFT, its interpretation is open in most areas which go beyond the immediate empirical predictions.' back

Peter MacHamer, Galileo Galilei (Stanford Encyclopedia of Philosophy), First published Fri Mar 4, 2005; substantive revision Thu May 21, 2009 'Galileo Galilei (1564–1642) has always played a key role in any history of science and, in many histories of philosophy, he is a, if not the, central figure of the scientific revolution of the 17th century. His work in physics or natural philosophy, astronomy, and the methodology of science still evoke debate after over 360 years. His role in promoting the Copernican theory and his travails and trials with the Roman Church are stories that still require re-telling. This article attempts to provide an overview of these aspects of Galileo's life and work, but does so by focusing in a new way on his discussions of the nature of matter.' back
Richard Kraut - Plato, Plato (Stanford Encyclopedia of Philosophy), First published Sat Mar 20, 2004; substantive revision Thu Sep 17, 2009 'Plato (429–347 B.C.E.) is, by any reckoning, one of the most dazzling writers in the Western literary tradition and one of the most penetrating, wide-ranging, and influential authors in the history of philosophy. . . . Few other authors in the history of philosophy approximate him in depth and range: perhaps only Aristotle (who studied with him), Aquinas, and Kant would be generally agreed to be of the same rank.' back
Rolf Landauer, Information is a Physical Entity, 'Abstract: This paper, associated with a broader conference talk on the fundamental physical limits of information handling, emphasizes the aspects still least appreciated. Information is not an abstract entity but exists only through a physical representation, thus tying it to all the restrictions and possibilities of our real physical universe. The mathematician's vision of an unlimited sequence of totally reliable operations is unlikely to be implementable in this real universe. Speculative remarks about the possible impact of that, on the ultimate nature of the laws of physics are included.' back
Thales - Wikipedia, Thales - Wikipedia, the free encyclopedia, 'Thales of Miletus (Greek: Θαλῆς (ὁ Μιλήσιος), Thalēs; c. 624 – c. 546 BC) was a pre-Socratic Greek philosopher from Miletus in Asia Minor, and one of the Seven Sages of Greece. Many, most notably Aristotle, regard him as the first philosopher in the Greek tradition. Aristotle reported Thales' hypothesis about the nature of matter – that the originating principle of nature was a single material substance: water.
According to Bertrand Russell, "Western philosophy begins with Thales." Thales attempted to explain natural phenomena without reference to mythology and was tremendously influential in this respect.'
back
Wojciech Hubert Zurek, Quantum origin of quantum jumps: breaking of unitary symmetry induced by information transfer and the transition from quantum to classical, 'Submitted on 17 Mar 2007 (v1), last revised 18 Mar 2008 (this version, v3)) "Measurements transfer information about a system to the apparatus, and then further on -- to observers and (often inadvertently) to the environment. I show that even imperfect copying essential in such situations restricts possible unperturbed outcomes to an orthogonal subset of all possible states of the system, thus breaking the unitary symmetry of its Hilbert space implied by the quantum superposition principle. Preferred outcome states emerge as a result. They provide framework for the ``wavepacket collapse'', designating terminal points of quantum jumps, and defining the measured observable by specifying its eigenstates. In quantum Darwinism, they are the progenitors of multiple copies spread throughout the environment -- the fittest quantum states that not only survive decoherence, but subvert it into carrying information about them -- into becoming a witness.' back

www.anewtheology.net is maintained by The Theology Company Proprietary Limited ACN 097 887 075 ABN 74 097 887 075 Copyright 2000-2014 © Jeffrey Nicholls