##### vol **6:** Essays

### Physical Theology, 2006

##### Contents

Abstract

Introduction

The standard model of God

A new model of God

A symmetric Universe

Relativity

Quantum mechanics

Creation

Evolution

Conclusion

##### Abstract

This essay proposes a transfinite network to link physics and theology. It follows the time-worn scientific path of uniting apparently disparate elements of the world by creating a mathematical space large enough to hold them both. By redefining continuity in terms of logical inference and infinity in terms of permutation, we hope to construct a model of reality which unifies our notions of 'God' and 'the world'. The approach is 'formalist'. Cohen, Hofstadter

##### Introduction

I address a fundamental issue for humanity: how do we construct and validate our models of God? 'God' here means the whole of reality. Since we must survive within God so defined, knowledge of God is of supreme practical value.

Traditional western theology divides reality into 'God' and 'The World' and holds that God is essentially invisible. Our knowledge of God comes through ancient texts. Modern scientific epistemology, on the other hand, takes Einstein's view that we can trust information only when it is obtained by contact with the entity we wish to know. A trustworthy theology must therefore be scientific, that is, devoted to producing models based on direct experience of God. Scientific theology is only possible if God is observable, that is if, in the spirit of Occam, we ignore the distinction between God and the world.

The scientific method is empirical. Because we are all studying the same Universe our results are compatible, comparable and communicable. One Universe plus scientific method will eventually lead us to one theology, as it does in the other sciences. In this theology, God is truly creator and judge: all our activity is divine activity, and our schemes work or do not work depending on whether they are judged fit or unfit by reality.

For a traditional believer (which I once was) the identification of God and the Universe seems a priori impossible. While God is perfect, eternal and unchanging the World is imperfect, temporary and subject to change. A little reflection shows that this impossibility is model dependent, and need not appear in a revised model of God. We begin with a brief account of the Western model.

##### The standard model of God

The standard model of God has a very long history which culminated in the work of the medieval theologian Thomas Aquinas (1224 - 1274). Aquinas built on the work of Aristotle (384 - 322 BC). Aristotle's model of reality distinguished two elements, potential (the ability to be) and act (actual being), and proposed an axiom of actualization: no potential being can actualize itself.

Aristotle used his model to establish the existence of an unmoved mover. To move is to realize the potential to move. Since no potential can actualize itself, any motion implies a chain of moved movers which must have a beginning if anything is to move at all. Since things do move, there must be an unmoved mover. Aristotle, *Physics*, Book VIII Chapter vi

Aquinas provides five ways to prove the existence of God, the first of which is identical to Aristotle's argument for the unmoved mover, renamed 'God'. Although called 'proofs for the existence of God', these texts are rather arguments for the partition of reality into God and not-God. As the proofs make clear, we and our world are part of not-God. Aquinas 13

In each case, this partition is based on the premiss that the world cannot explain some aspect of itself, which must therefore be explained by God. Since the traditional God is invisible, however, we do not have access to this explanation, but must 'take it on faith'. Science, on the other hand, believes that there is an accessible and reasonable explanation for all phenomena, that is, all messages received from our environment, which includes our hearts.

For Aquinas, God is pure act, the realization of all possibility. From this axiom he derives all the traditional properties of God: simplicity, perfection, goodness, infinity, eternity, unity, omniscience, life, omnipotence, justice, mercy, providence and beatitude. Aquinas 14, *Summa*, qq 3-26

In addition to the monotheism inherited from the Hebrews, Christianity embraced a divine Trinity, asserting the existence of three distinct divine Persons in the one God. Aquinas modeled the Trinity on the properties of relationship, knowledge and desire, also derived from Aristotle. The Trinity takes Christianity some distance in the direction of religions which attribute many personalities or incarnations to the power that controls human life. We follow this course below, unfolding a transfinite array of 'personalities' (independent sources of information) from an initial single source which shares many of the attributes of the traditional God. Aquinas 160, *Summa*, qq 27-43

The Christian God is a living God. Since Aquinas accepted the traditional definition of life as self-motion, he faced the difficulty that motion implies potential and God is pure act. Following Aristotle, Aquinas distinguished between transeunt and immanent action. Transeunt action actualizes something other than the agent, while immanent action, like the life of God, perfects the agent itself. Aristotle 1, *Metaphysics*, Book IX chapter viii, 1050a22

On the present hypothesis we are part of God, and so all our actions may be considered part of God's life. The modern descendants of Aristotle's potency and act are potential and kinetic energy. Here there is no axiom of actualization: the two forms of energy are 'peers', freely interchangeable and equally real, as illustrated by the harmonic oscillator.

##### A new model of God

Theology creates and studies models of God. The hypothesis here is that the terms God and the Universe refer to the same thing, whose defining property is that there is nothing outside it and so no external constraints upon it. Any constraints we find must arise from self-consistency.

To unite God and the Universe, we need a new model of God. In effect, I want to replace classical continua, which are traditionally a characteristic of physical bodies, with logical continua (represented by Turing machines), which are a characteristic of networks.

Whatever God is, it is big: at least as big physically as the observed Universe. The use of mathematical function spaces in quantum mechanics suggests that the Universe is even bigger behind the scenes. We estimate 'big' here in terms of complexity (measured by information or entropy) rather than by spatial extent.

The mathematical archetype of big is 'Cantor's Paradise', the transfinite Universe of sets and correspondences invented by Georg Cantor. Although our engineered realizations of computation and communication theory are finite, the relevant mathematics often holds in transfinite domains as well. It is in this respect invariant with respect to complexity. It derives this feature from set theory itself, which deals indifferently with aggregates of any size. We therefore propose to model God with a transfinite network whose basic properties are complexity invariant. Cantor, Dauben

Symmetry with respect to complexity is the foundation of abstract knowledge. It allows us to map the transfinite scale of complexity to finite models, talking (as we do) with equal facility about atoms, people, planets, and feelings. The essential features of communication in the transfinite network are the same regardless of the complexity of the sources and of the messages exchanged. Since physics is the most general and abstract theory of the Universe, we might expect the foundations of physics to lie in the nature of communication alone, with no further specialization.

Soon after the development of set theory, Cantor and others realized that any attempt to work with the set of all sets (which might serve as a model of God) leads to contradiction. Formally, the set of all sets does not exist. This result is in accord with the ancient notion that we cannot comprehend God. Nevertheless, set theory provides a means to talk consistently about subsets of some larger aggregate. Modelling and communication of God must therefore be local. This is also a feature of our Universe imposed by the finite velocity of communication and explicated in detail by Einstein's theory of relativity. Mendelson

##### A symmetric Universe

Let us construct a 'symmetric Universe', modelled on the Cantor Universe. We begin with the set *N* of natural numbers 1, 2, 3, . . . . *N* is said to be infinite because given any natural number *n* there is a greater natural number *n+1*. Cantor chose the symbol *ℵ_{0}* to represent the cardinal of *N*. *ℵ_{0}* is the first transfinite number.
In modern set theory the axiom of the power set provides that given any set *S*, there exists a power set *P(S)* which contains all the subsets of *S*. Cantor's theorem states that the cardinal of *P(S)* is strictly greater than the cardinal of *S*, even when *S* is transfinite. This theorem gives rise to the second transfinite number, *ℵ_{1}*, the third, *ℵ_{2}*, and so on without end. The axiom of the power set is blind to the size of any set we choose, and so acts in a uniform (or symmetric) way on all sets. Cantor's theorem is based on the hypothesis that counting by one to one correspondence is also blind to the size of the sets compared. Jech

Cardinal number ignores the nature of the elements of a set. Set theory relies heavily on a second, more concrete abstraction, the ordinal number or ordinal type of a set. The *ℵ_{0}* natural numbers have a natural order, but we may permute this order to obtain the *ℵ_{1}* permutations of the *ℵ_{0}* natural numbers. These *ℵ_{1}* permutations may be ordered alphabetically based on the order of the natural numbers. These symbols may themselves be permuted to create *ℵ_{2}* distinct symbols, and so on without end. Let us call the set of all permutations associated with each cardinal a peer group, analogous to the peer levels in computer network engineering. Tanenbaum
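The claims above concern transfinite sets, but their finite shadows are easy to check. The sketch below (stdlib Python, using small finite sets as stand-ins for the transfinite case) counts subsets and permutations, showing that both the power set and the peer group of permutations outrun the set that generates them:

```python
from math import factorial

def power_set_size(s):
    """Cantor's theorem in miniature: |P(S)| = 2^|S| > |S|."""
    return 2 ** len(s)

def peer_group_size(s):
    """Number of permutations (orderings) of a finite set: |S|!."""
    return factorial(len(s))

s = {1, 2, 3, 4}
assert power_set_size(s) > len(s)   # 16 > 4
assert peer_group_size(s) > len(s)  # 24 > 4

# The counting rules are 'blind to the size of the set': they act
# uniformly whatever set we start from.
for n in range(1, 8):
    assert 2 ** n > n and factorial(n) >= n
```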
The set of permutations of a set *S* of cardinal *n* forms the symmetric group on *n* objects. Group theory finds (Cayley's theorem) that every group of order *n* is a subgroup of the symmetric group on *n* objects. We call the Cantor Universe interpreted as a transfinite hierarchy of groups of permutations the symmetric Universe. The symmetric Universe is unlimited in size and complexity; every element of it is unique; and it contains, as a subgroup, every possible abstract group. As Cantor stated, it is large enough to represent anything thinkable, which we assume to include all local manifestations of God.
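Cayley's construction behind this statement can be sketched for a small group: each element acts on the group itself by left multiplication, and that action is a permutation. The cyclic group Z_4 below is an illustrative choice:

```python
def cayley_embedding(elements, op):
    """Map each group element to the permutation it induces on the
    group by left multiplication (Cayley's theorem)."""
    index = {g: i for i, g in enumerate(elements)}
    return {g: tuple(index[op(g, h)] for h in elements) for g in elements}

# The cyclic group Z_4 under addition mod 4.
z4 = [0, 1, 2, 3]
embed = cayley_embedding(z4, lambda a, b: (a + b) % 4)

# Distinct elements induce distinct permutations ...
assert len(set(embed.values())) == 4
# ... and the identity element induces the identity permutation,
# so Z_4 sits inside the symmetric group on 4 objects.
assert embed[0] == (0, 1, 2, 3)
```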

We use permutation rather than combination to generate our model because we are ultimately modelling a concrete set of symbols (the Universe) and accept Landauer's thesis that all information is encoded physically. We assume that nothing is different (ie no message is different) unless it is physically different and that every physical event is unique and can be placed into correspondence with a unique sequence of symbols. The address of an element arises partly from within itself and partly from its place in the overall system. Landauer

The symmetric Universe serves as a configuration space for the transfinite network. We can map this space onto our Universe by assuming that every event in our Universe corresponds to a transition between permutations or sets of permutations in the symmetric Universe.

We gain insight into the functioning of the transfinite network by comparing it to an ordinary computer with multiple processors. The symmetric Universe corresponds to the memory. We assume the logical equivalent of Newton's first law, that the state of a memory location is only changed when it is written to by a process. We model processes with Turing machines. Each execution of each computable function results in change of state of the entire system.
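The role of Turing machines as state-changing processes can be made concrete with a minimal simulator. The machine below is an invented illustration: it reads a binary tape and inverts each bit, changing the 'memory' only when it writes, in the spirit of the logical first law above.

```python
def run_turing_machine(tape, rules, state="scan", blank=" "):
    """Minimal Turing machine: rules map (state, symbol) to
    (new_state, symbol_to_write, head_movement)."""
    tape = list(tape) + [blank]  # memory: only changed by writes
    pos = 0
    while state != "halt":
        symbol = tape[pos]
        state, write, move = rules[(state, symbol)]
        tape[pos] = write
        pos += move
    return "".join(tape).strip()

# An invented rule table: flip each bit and move right; halt on blank.
rules = {
    ("scan", "0"): ("scan", "1", +1),
    ("scan", "1"): ("scan", "0", +1),
    ("scan", " "): ("halt", " ", 0),
}
assert run_turing_machine("0110", rules) == "1001"
```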

Simple logical machine operations are 'not', 'and' and 'or'. We may imagine the operation of a conventional computer as a time evolving network. Connections are continually being made and broken, each cycle transferring a certain amount of information. The making and breaking of connections corresponds to the physical creation and annihilation of particles.

Such atomic operations may be organized into more complex processes. This embedding confers meaning on physical events. A multiplication operation may itself be part of the execution of a climate model and the execution of the climate model part of a more complex scientific, social and political process of deciding how to manage our interface with the Universe.

Conflict between processes and failure can result if a deterministic processor is faced with memory changed by processes outside its control. Change in the environment of a deterministic processor is the source of indeterminism in the model.

We can continue the development of this model by fitting it to quantum mechanics and relativity.

##### Relativity

At the heart of Einstein's theory of relativity is an epistemological hypothesis: that information comes from contact. Here we propose a vast network of 'personalities' talking to one another, a transfinite extension of the triune threesome developed by Platonic theologians in the Patristic age. Everywhere we look we experience things talking to us, and we only experience those that do talk to us. We begin with the conversation which holds us on the Earth and models the large scale structure of our Universe. Kelly

The network model suggests that gravitation is a protocol that constrains the way the whole system fits together. General relativity is founded on two propositions. The first, in Einstein's words, is that 'All Gaussian coordinate systems are essentially equivalent for the formulation of the general laws of nature'. The second is that all communication takes place by contact. These propositions constitute the 'principle of general covariance'. Einstein

Gaussian coordinates generalize Cartesian coordinates, the principal requirement being that there is a continuous relationship between Gaussian coordinates and the space they measure. Intervals between Gaussian coordinates are defined with the help of a metric tensor. Two events are in contact (ie become one event) if the interval between them is zero.

Einstein developed relativity in two stages. Newton worked in a rigid Euclidean space endowed with a universal time and instantaneous communication at a distance. The special theory follows as soon as we introduce delay into communication. The result is a four dimensional space-time with a Minkowski metric and a group of Lorentz transformations which tell us what uniformly moving observers look like to each other when they communicate at light speed. The only special thing about the velocity of light, *c*, is that it is fixed and finite. Horse-drawn communications also induce relativistic structure on the networks they serve, where *c* = horse speed. Newton
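The invariance underlying the special theory can be checked directly. A minimal sketch (the event coordinates are arbitrary illustrative values): a Lorentz boost changes an observer's coordinates for an event but leaves the Minkowski interval unchanged.

```python
import math

def lorentz_boost(ct, x, beta):
    """Boost an event (ct, x) into a frame moving at v = beta * c."""
    gamma = 1.0 / math.sqrt(1.0 - beta ** 2)
    return gamma * (ct - beta * x), gamma * (x - beta * ct)

def interval(ct, x):
    """Minkowski interval squared: s^2 = (ct)^2 - x^2."""
    return ct ** 2 - x ** 2

event = (5.0, 3.0)  # arbitrary event in some observer's frame
for beta in (0.1, 0.5, 0.9):
    boosted = lorentz_boost(*event, beta)
    # Every uniformly moving observer agrees on the interval.
    assert abs(interval(*event) - interval(*boosted)) < 1e-9
```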

By his own account, the happiest thought of Einstein's life was the realization that an observer in free fall does not feel his own weight. Although gravitation can be transformed away locally, this cannot be done globally. Each observer has its own path of free fall (geodesic), and the web of geodesics marks out the curvature of space. Pais

General covariance accepts that every event is unique and goes its own way. The complexity of transformations required to transform a randomly chosen event into my local frame of reference must be equal to the variety of relative motions possible in the Universe. The set of transformations allowed under general relativity is much larger than the set of Lorentz transformations. Here we assume that relativity operates at every scale in the Universe, wherever there is communication. In the human sphere, general covariance allows us to put ourselves in one another's shoes and see the world from one another's point of view.

In his 1915 paper, Einstein represented general covariance in the language of Riemann's differential geometry. Riemann represented *n* dimensional space by an *n* dimensional Gaussian coordinate system and found that the whole structure of the resulting curved or dynamic space could be encoded as a field of metric tensors, *g_{mn}*. He also found that 'The basis of metrical determination must be sought outside the manifold in the binding forces which act on it'. Einstein 1, Jammer

This determination for the actual space we inhabit was provided by Einstein, who found that the metrical structure of space-time, represented by the Einstein tensor *G*, is a function of the energy tensor *T*: *G = 8πT*. The curvature in a region of space-time is a function of the energy in that region.

The symmetric Universe is naturally covariant, since each event in a peer group is represented by a unique permutation of the relevant alphabet and group transformations exist to convert every element of the peer group into every other. Of these transformations, a subset is computable. The computable subset of the transfinite network we will call a computable manifold, the logical analogue of the continuous differentiable manifold of general relativity.

We define computable transformations as transformations whose symbolic expressions can be represented as Turing machines. Turing found that the set of computable functions is *countably infinite*, ie its cardinal is *ℵ_{0}*.

Let us identify every execution of a Turing machine in the computable manifold with an event in the world, measured by one or more 'quanta of action'. The rate of communication *f* between two points is therefore (from a quantum mechanical point of view) measured by energy according to the relationship *E = hf*.
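On this reading, *E = hf* counts events per second. A small sketch using the SI value of Planck's constant (the frequency chosen, roughly that of green light, is an illustrative value):

```python
# E = hf read as a traffic measurement: energy counts quanta of
# action per second.
h = 6.62607015e-34  # Planck constant, joule-seconds (exact, SI 2019)

def energy_from_frequency(f):
    """Energy of one quantum at frequency f (Hz)."""
    return h * f

def events_per_second(energy):
    """Invert E = hf: quanta of action per second at this energy."""
    return energy / h

f_green = 5.6e14                    # Hz, roughly green light
E = energy_from_frequency(f_green)  # ~3.7e-19 joules per photon
assert abs(events_per_second(E) - f_green) < 1e-3
```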

In general relativity, the shape of space is determined by the distribution of energy. Energy, that is traffic, attracts. At this level of abstraction, the accretion of stars and planets and the accretion of human groups and cities are driven by the same force, the attraction of the 'bright lights' of a network. Bandwidth attracts bandwidth.

In addition to its position in the physical layer of the transfinite network, each event has a symbolic role in the higher layers of the network. As in a computer network, symbols are processed through multiple layers of software before they are presented to the user. So we may model God (that is, the whole system) as the ultimate user of the transfinite network. Every particle is a user at its own peer level.

It may seem at first sight that since the transformations in
Riemann space are continuous and differentiable, the cardinal of the
set of permissible transformations must be the cardinal of the
continuum, that is *ℵ_{1}* or greater. To explore this question, we turn to quantum mechanics.

##### Quantum mechanics

Since its birth in 1900, quantum mechanics has enjoyed a reputation for profound obscurity. The clouds began to part in the eighties, when Feynman, Deutsch and others began to interpret the quantum formalism in terms of communication and computation. Feynman 1, Deutsch

The foundation of quantum mechanics, first seen by Planck, is that everything we see in the Universe is a discrete (quantum) event. Here we suppose that there are as many formally different quanta of action as there are Turing machines. This observation connects the logical model to the physical Universe of experience.

Why is all communication quantized and countable? The answer to this question may lie in the theory of communication. Assume that stable structures in this Universe, like ourselves and atoms, are maintained by some sort of dynamic control. Assume further that effective control requires a sufficiently low error rate in communication. The theory of communication discovered by Shannon tells us that we can avoid error if the messages we use to communicate are so far apart (in some abstract space) that the probability of their being confused approaches zero. Error free communication requires quantization. Shannon
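Shannon's argument can be illustrated with the simplest error-resisting code. The sketch below uses a 5-fold repetition code (an illustrative choice, not Shannon's optimal construction): spacing the two codewords far apart in Hamming distance drives the residual error well below the raw channel error.

```python
from itertools import product

def encode(bit, n=5):
    """Repetition code: the two codewords 00000 and 11111 are
    maximally far apart in Hamming distance."""
    return [bit] * n

def decode(received):
    """Majority vote: an error survives only if most copies flip."""
    return 1 if sum(received) > len(received) // 2 else 0

def residual_error(p, n=5):
    """Probability that majority decoding fails when each bit is
    flipped independently with probability p."""
    err = 0.0
    for flips in product([0, 1], repeat=n):
        if sum(flips) > n // 2:  # majority of copies corrupted
            err += p ** sum(flips) * (1 - p) ** (n - sum(flips))
    return err

p = 0.1
assert residual_error(p) < p       # coding reduces the error rate
assert decode(encode(1)) == 1      # noiseless round trip
assert decode([1, 1, 0, 1, 0]) == 1
```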

Shannon's scheme for the defeat of error also introduces delay into communications, since the encoder must wait until the source has emitted a certain number of symbols before it can encode them into an error resistant packet. Since the velocity of light is the maximum attainable, we might assume that the code used to transmit information by massless particles is the shortest possible.

Quantum mechanics began as 'wave mechanics' and gave rise to talk of 'particle-wave duality'. This duality is epistemologically asymmetrical. We observe particles. We postulate waves. By waves we mean periodic functions, traditionally continuous functions like sines, cosines and complex exponentials. There is also digital periodicity, as the name recursive function theory applied to the theory of computation suggests. Much of the power of a digital computer arises from its ability to repeat simple cycles of operations very quickly.

The waves are part of the abstract model, and provide a means of visualizing quantum processes. No-one can deny the utility of wave models as a bridge between the classical notion of continuum and the particulate nature of experience, but the classical continuous formalism of wave mechanics may not have the power to tell the whole story. Brandt

Quantum mechanics is a natural fit to the symmetric Universe because the function space, Hilbert space, which it inhabits is a direct descendant of the Cantor Universe. Quantum mechanics represents states of a physical system by functions or vectors in spaces of finite, countably infinite or transfinite dimensions. Observations or measurements of a physical system are represented by operators in this space. The laws of quantum mechanics are expressed as constraints on these vectors and operators. von Neumann

In the century of its existence, quantum mechanics has shown itself to be perfect, as far as it goes. For some, particularly Einstein, this is not far enough. Quantum mechanics is incomplete in that it does not predict the precise outcome of individual events, but it does predict (to apparently unlimited precision) both the 'stationary states' that represent permanent structures in the physical world, and the probability of transitions between various pairs of stationary states.

This probabilistic interpretation is reflected in quantum algebra. All vectors are normalized to 1, which insures (in the Born interpretation of the formalism) that the sum of the probabilities of the different possible outcomes of a particular quantum history is 1. This feature makes a source of quantum events formally identical to an information source as defined in the mathematical theory of communication. The alphabet of the source is the set of eigenvalues of the operator used to observe it. Khinchin
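The formal identity between a normalized state and an information source can be shown in a few lines. The amplitudes below are arbitrary illustrative values:

```python
import math

# A state vector with arbitrary complex amplitudes, one per
# eigenvalue in the 'alphabet' of the observable.
state = [complex(1, 1), complex(0, 2), complex(1, 0)]

# Normalize so the squared magnitudes sum to 1.
norm = math.sqrt(sum(abs(a) ** 2 for a in state))
state = [a / norm for a in state]

# Born rule: squared magnitudes are outcome probabilities.
probabilities = [abs(a) ** 2 for a in state]
assert abs(sum(probabilities) - 1.0) < 1e-12

# The state now defines an information source, with a Shannon
# entropy (in bits) like any other source.
entropy = -sum(p * math.log2(p) for p in probabilities if p > 0)
assert entropy > 0
```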

The Hilbert space required to represent the joint state of two particles is the tensor product of the Hilbert spaces representing the individuals. The state space of a quantum system thus grows exponentially with the number of particles involved. From this we conclude that the cardinal of the joint state of *ℵ_{0}* particles is *ℵ_{1}*, and so on. This rate of growth allows a bijective mapping between the Cantor Universe and quantum mechanics.

An important feature of the construction of joint states is entanglement. The states of entangled particles are correlated in a non-classical manner which enables information about the state of one particle to be inferred from measurements of the state of another. The particles in entangled states are thus 'logically bound' to one another and can be used to transmit information about quantum states by 'teleportation'. Nielsen
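The exponential growth of the joint state space is easy to exhibit with a naive tensor product on plain lists (a finite sketch of the transfinite claim above):

```python
import math

def kron(u, v):
    """Tensor (Kronecker) product of two state vectors."""
    return [a * b for a in u for b in v]

# Each extra two-state particle doubles the dimension of the joint state.
qubit = [1.0, 0.0]
state = [1.0]
for _ in range(10):
    state = kron(state, qubit)
assert len(state) == 2 ** 10

# An entangled Bell state (|00> + |11>)/sqrt(2): it cannot be written
# as kron(u, v) for any single-particle u, v, and its outcomes are
# perfectly correlated.
bell = [1 / math.sqrt(2), 0.0, 0.0, 1 / math.sqrt(2)]
p = [a ** 2 for a in bell]
assert abs(p[0] + p[3] - 1.0) < 1e-12  # only '00' and '11' ever occur
```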

The quantum states are not visible, but part of the model. In this respect quantum mechanics resembles all other explanations of the world which postulate a hidden process to explain a visible one. What we see are events, modelled as the operations that transform one state into another.

Let us think of the four-space of general relativity as the physical user interface of the Universe. All events are observed in this space. Behind this user interface is the process which gives meaning to the observed structure. In a computer the pixels on the screen are controlled by software and user input to give an image whose behaviour is explained by the software.

The software of the universal process is described by quantum mechanics. In quantum mechanics, one looks with an operator, an observable. What one sees after repeated observations is the frequency with which the observed system makes transitions between various pairs of eigenstates of the observable. From this point of view, quantum mechanics is equivalent to network traffic analysis, predicting, for instance, how often an atom will emit or absorb a photon in a given environment.

In more anthropomorphic (human friendly) terms, to observe is to ask a question. To ask a question is to set up a frame of reference (the 'eigenvectors of the question') which the respondent may use to frame an answer meaningful to the questioner.

##### Creation

The union of quantum mechanics and relativity is quantum field theory. One consequence of special relativity is that mass and energy are equivalent. Physical particles may be created and annihilated provided that energy, momentum and action are conserved. Quantum field theory sees every event as the coupled creation and annihilation of particles, and computes the frequency of these events. Zee

The most successful quantum field theory is quantum electrodynamics, described by Feynman, whose approach is called the path integral method. One integrates the action along every possible path from *a* to *b*. Action is measured by phase, a number readily available from the formalism. The most probable path is that whose action is stationary: phases in the vicinity of this path are relatively constant and so add to give a large probability amplitude for the path. Feynman 2
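The stationary phase argument can be illustrated with a toy sum over a one-parameter family of paths. The quadratic action S(x) = x² (in units of ħ) is an invented example: near its stationary point x = 0 the phases add coherently, while far from it they spin rapidly and cancel.

```python
import cmath

def amplitude(xs, dx):
    """Sum the amplitude exp(iS) over a family of paths labelled by x,
    with toy action S(x) = x^2 in units of Planck's constant."""
    return sum(cmath.exp(1j * x * x) for x in xs) * dx

dx = 0.001
# Paths near the stationary point of the action: |x| <= 1.
near = amplitude([i * dx for i in range(-1000, 1001)], dx)
# Paths far from it: 5 <= x <= 10, where the phase oscillates rapidly.
far = amplitude([i * dx for i in range(5000, 10001)], dx)

# The near-stationary region dominates the total amplitude.
assert abs(near) > 5 * abs(far)
```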

A logical version of this scenario may be constructed using the ancient principle of *bonum ex integro* ('good comes from the whole'). An engine will not run (nor will anything else happen, for that matter) unless all systems are go. Let us suppose that the inner product ('overlap integral') of quantum mechanics tells us how often a logical continuum forms between *a* and *b*. In logical terms the transition from *a* to *b* only happens when it is proven. This is equivalent to 100% overlap between input and output, that is, logical contact. Each of these contacts is modelled by a Turing machine in the computable manifold.

We assume that what we observe is computable. The frequency with which various computable functions are executed thus corresponds to the frequencies of certain events. In neural network terms, we may think of Turing machines or computable functions as synapses in the cosmic nervous system. Synapse - Wikipedia

Much difficulty in quantum field theory comes from the infinities arising from division by zero in the continuous formalism. These infinities, and the large (unobserved) energy of the vacuum predicted by quantum field theory, might be overcome by using the countable formalism of the computable network. We are only required to explain the phenomena, not the vagaries of inappropriate mathematical models.

A Turing machine is a deterministic entity, implementing the predicate calculus which was shown by Gödel to be complete. The principle of requisite variety in cybernetics tells us that simple systems cannot control more complex ones. Every quantum event has a history (or, as a physicist would say, a preparation). This preparation is effected by communicating with the system to be studied. If that system is more complex than the preparation, the outcome will not be determined. The relationship between relatively finite communication and relatively infinite possibility may thus explain the incompleteness of quantum mechanics. Gödel, Ashby

Although Einstein felt the incompleteness of quantum mechanics to be a defect, it is seen here as a clear manifestation of the openness of the Universe. It also suggests that no local manifestation of God is capable of intelligent design, that is deterministic progress into the future along a preordained path. It is always at risk of 'decoherence' arising from communications with its environment.

##### Evolution

Quantum mechanics and relativity describe the basic network structure of the Universe, and can be applied at any level of complexity, but what about the detail, the actual messages that pass over the network?

It is of the essence of a network that it be able to convey any possible message. In practical terms, all possible (ie 'well formed') strings are transmissible. The designation 'well formed' is relative to a protocol or grammar which partitions the full set of permutations of an alphabet into grammatical and ungrammatical. How is this partition implemented? The answer seems to be evolution by natural selection. This occurs because there are more possibilities in most situations than can be expressed in physical messages.
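The partition of strings by a protocol can be sketched with an invented toy grammar over a two-letter alphabet:

```python
from itertools import product

def well_formed(s):
    """An invented toy grammar: strings of 'a' and 'b' are
    grammatical unless they contain the forbidden substring 'bb'."""
    return "bb" not in s

# All permutations of symbols of length 4 over the alphabet {a, b}.
strings = ["".join(p) for p in product("ab", repeat=4)]
grammatical = [s for s in strings if well_formed(s)]
ungrammatical = [s for s in strings if not well_formed(s)]

# The protocol partitions the full set of strings ...
assert len(grammatical) + len(ungrammatical) == len(strings) == 16
# ... and only a fraction of all possible strings are transmissible.
assert len(grammatical) < len(strings)
```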

Modern solutions of Einstein's equation coupled with astronomical observations give unequivocal support for the big bang hypothesis, that in the last thirteen billion years, the Universe has grown from an initial singularity to its current size.

The metaphysical properties of the initial singularity were analyzed by ancient theologians like Aquinas. During its growth, the Universe has remained forever one because there is nothing outside to fragment it. From a formal point of view we might ascribe this expansion to Cantor's theorem: for every set there is a set of greater cardinal number. A countable string of actions may be permuted to give an uncountable variety of such strings. Cantor's proof is non-constructive: it is held to be formally true because its negation implies a formal contradiction.

If the hypotheses of the proof are fulfilled, we must expect the conclusion to follow. God, communicating with itself using distinct countable symbols, can, by permutation, create uncountably infinite structures. Since this happens at all scales, we too are part of the creation and communication process. We model this as a network, because looked at as a whole, a network communicates with itself.

We thus have a force for unity and a force for diversity. Between them, these forces exert selective pressure on the Turing machines that inhabit the computable network. In the competition for limited computing resources, inefficient algorithms and strings of algorithms are selected against.

We may gain further insight into the evolutionary process by considering the 'P—NP' question, which explores the relationship between foresight and hindsight. Turing established that there are truths inaccessible to finitely defined deterministic processes. We take this to mean that in certain circumstances the future cannot be determined by any process executing in 'polynomial time', a member of the class P. Many steps into the future must therefore be classified as NP, requiring 'exponential time' or chance to find a solution, which may then be checked in polynomial time. Applying this to evolution, we see natural selection as a P process which tests the solutions reached by indeterminate processes and classifies them as fit or unfit.
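The asymmetry between finding and checking can be sketched with subset sum, a standard NP problem (the numbers below are arbitrary illustrative values):

```python
from itertools import combinations

def search(numbers, target):
    """Blind search: try all 2^n subsets until one sums to target.
    This is the 'exponential time' (or chance) side of the story."""
    for r in range(len(numbers) + 1):
        for subset in combinations(numbers, r):
            if sum(subset) == target:
                return subset
    return None

def check(subset, target):
    """Hindsight: verifying a proposed solution (a 'certificate')
    takes only one pass over it."""
    return sum(subset) == target

numbers = [3, 9, 8, 4, 5, 7]
solution = search(numbers, 15)
# Selection need only test candidates, not foresee them.
assert solution is not None and check(solution, 15)
```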

##### Conclusion

The beautiful fit between the large scale structure of the Universe and Einstein's general theory of relativity, and between quantum mechanics and the detailed structure of the Universe, are examples of the 'unreasonable effectiveness of mathematics in the natural sciences' noted by Wigner. Wigner

The effectiveness of mathematics seems more reasonable if we note that both mathematics and the observed Universe are symbolic systems. An event, as understood by quantum mechanics, is a symbol, that is a definite and separate addressable thing which may be used with a suitable code to convey meaning, like these letters. Logically, there is no difference between the way elementary particles talk and people talk. The conversation of elementary particles is the physical layer of human conversation.

The 'religions of the book' hold that God communicated with us once for all. The source of this notion would seem to be epistemological ideas attributed to Parmenides: we can only have reliable communication with (ie truly know) our environment if it does not change. Fundamental reality must therefore be eternal or rigid. Since God does not change, there is no need for us to receive updates from God. Burnet: Parmenides

Here we take a dynamic view. We are always communicating with God. Both we and God can change and still maintain true communication as long as our frequency of interaction with God matches God's frequency of change.

Uncontrolled power tends to corrupt communication because it can prevent negative reviews of its actions. Traditionally, God is so great that He judges the powerful and the weak with complete indifference. Historically, dictators have overcome this problem by taking control of God, claiming to rule by divine right.

We can reassert the power of God by recognizing that it exists independently of any subset of itself, and is open to observation and verification by all. Given a scientific theology, we can sift spurious claims about God from genuine ones, and so place the governance of our societies on a sound realistic footing.

The best feature of this model, to my mind, is its community. It sees a Universe structured by conversation. The totality of this conversation is measured by the space-time size of the Universe, and its local intensity by local energy density.

Conversation is natural to us. We know its properties intuitively. The network paradigm allows us to transform the subject of our inquiry into our own frame of reference and then ask how we would behave in the circumstances. With this model, we can use our personal experience to explore all aspects of the life of God.