Notes
[Notebook: DB 61 Warm]
[Sunday 6 May 2007 - Saturday 12 May 2007]
Sunday 6 May 2007
[page 34]
Monday 7 May 2007
Let us assume that things are unitary in the initial singularity but that it is simple: no coding, no delay, no spacelike separation, no memory and no past or future light cones. Let us further assume that the probability flow in the singularity is ℵ0 events per second or, given the peculiarities of transfinite arithmetic, ℵ0 per any discrete unit of time. Since we may as well make the unit of time 1/ℵ0, the rate of probability flow becomes 1. With no spatial extension, we can safely say complexity 0, spatial extension 0, no light cone past or future, no questions of consistency or error.
No complexity without memory. No memory without frequency difference - memory = low frequency change.
How does this thing bootstrap itself into its present form? Memory, coding, space, error correction etc all come into existence simultaneously, like the Word, the Spirit and the ℵ0 other gods around us.
[page 35]
These are formed by an evolutionary process whose selective and optimizing tendencies are exposed by the effectiveness of the variational method in deriving possible equations of motion of particles in the world.
In the course of an event, like a mating, observer and observed become one, yet retain their proper frames of reference (of proper time).
If gravitation represents the hardware of the Universe, then quantum mechanics may already be logical, giving meaning to the hardware. Each layer of the network gives meaning to its alphabet, which is the set of functions available in the layer below it. [function = tool]
Energy scales: if the energy of the Universe is conserved the number of particles is an inverse function of their mass energy. [this is not an exact inverse because there is some loss through the conversion of the particulate energy into gravitational potential, eg the loss of energy of photons in an expanding space.]
We are reminded of MTW's words . . . Everything comes into existence at once, chicken and egg are effectively simultaneous. Misner, Thorne and Wheeler, page 71.
FLUCTUATION - CAPITAL INVESTMENT [POTENTIAL = CAPITAL]
Reality puts certain bounds on business proposals, but human credulity regularly breaks those bounds.
Gravitation is completely indifferent to meaning, as we should expect from a wide open communication channel, ie one with a complete basis 'alphabet'.
[page 36]
As sets can be elements of sets and networks elements of networks, so (I have forgotten) . . . I am a particle comprising many sub-particles and [I am] an element of various superparticles. The binding between the particulate elements of particles is a function of their communication rate.
We have the initial singularity, the classical god modelled as pure act, ie as a clock pulsing at ℵ0 'Hz'. Now break it into father and son. The son is the image (ie a copy) of the father, but the copy, although perfectly representing the original, is not the original, and so we have two divine persons. They fall in love with one another (so obviously they have been communicating) and their offspring is the Holy Spirit. [insert from page 38] Now put this myth mathematically. Retell it in the Catholic idiom on the assumption that the initial singularity is the classical God.
Science itself is emerging from the mist.
Tuesday 8 May 2007
Wednesday 9 May 2007
Consider a system in which we set ℵ0 = 1. By Cantor's theorem we might then say ℵ1 = 2 and so on, giving us a transfinite way to generate the natural numbers. Now we ask: how does the Universe bootstrap itself out of the mist? The answer comes from uncertainty in communication and the exponential power of error correction, which is made visible in the numbers.
Here is some transfinite arithmetic. The scheme above depends upon the existence and uniqueness of the
[page 39]
empty set, which is a subset of every set. See Jech.
A small increase in the cardinal of the alphabet gives a large increase in the number of words. There are 256 different 8 letter binary strings, and 6561 trinary strings, ie about 26 times as many. In error correcting terms, this means that if we code a set of binary strings into trinary strings we can put these strings further apart in message space and so reduce the probability of confusion. The point is that the gain in complexity in the message space more than justifies (ie gains positive fitness from) increasing the size of the alphabet above the minimal two letters.
This encoding requires memory and process (that is complexity) that can pay for itself in selective advantage.
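A toy Python sketch of the arithmetic above; the detection rule (reserving the digit 2) is my own illustration, not from these notes:

binary = 2 ** 8          # 256 distinct 8-letter binary strings
ternary = 3 ** 8         # 6561 distinct 8-letter ternary strings
print(ternary / binary)  # ~25.6: the 'about 26 times as many' above

# Read each bit as a ternary digit; the digit 2 never appears in a
# valid codeword, so any symbol corrupted into a 2 is detectable:
# the codewords sit further apart in the larger message space.
def looks_corrupted(word):
    return '2' in word

print(looks_corrupted('01021010'))  # True: noise pushed it off a codeword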
In physics a system at equilibrium has no tendency to change, ie it is not subject to a potential and its entropy is at a maximum. Although this equilibrium condition makes many calculations easier (entropy is conserved), things are always being pushed or drawn away from their equilibrium by outside influences, as the sun does the earth. So to stay at equilibrium a system must either be isolated or (less satisfactorily) be so big that nothing affects it.
The big bang (which continues) shows that the initial singularity was not at equilibrium and it continues to 'decay' into lower and lower energy levels, one of which is us.
We also gradually differentiate our minds by contact with the world of our individual birth. This explains the growth
[page 40]
of networks where generic hardware becomes differentiated into a user node at a set of software levels.
We have no trouble representing complex numbers in a digital computer, which simply uses longer strings if it wants a better approximation to the real numbers.
The symmetric Universe is the biggest structure that we can make given any chosen value of the cardinal of the natural numbers.
Transfinite cardinal arithmetic has built-in uncertainty: aleph(n) = aleph(n) + aleph(n) = aleph(n) × aleph(n), while aleph(n)^aleph(n) = aleph(n+1), etc. At this level of resolution the size of the alphabet (2 .. aleph(n)) is irrelevant. All that is important is the transfinite exponent, which measures the length of the string of multiplications.
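In standard notation, with the generalized continuum hypothesis assumed for the last step:

\[ \aleph_n + \aleph_n = \aleph_n \cdot \aleph_n = \aleph_n, \qquad \aleph_n^{\aleph_n} = 2^{\aleph_n} = \aleph_{n+1}. \]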
We might call this the Platonic limit. The practical limit, which leads to complexification by error correction, takes advantage of the power of finite (computable) exponentiations (finite in terms of algorithmic information theory, ie the length of the list of machine states).
We begin with an identity operator and an inversion operator defined by the property that two inverses applied one after the other make an identity, something like electron spin or photon polarization.
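A minimal numerical illustration in Python (my example; the Pauli matrix stands in for the spin or polarization flip):

import numpy as np

I = np.eye(2)                    # identity operator
X = np.array([[0., 1.],
              [1., 0.]])         # inversion: bit flip / spin flip

assert np.allclose(X @ X, I)     # two inversions make an identity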
Perturbation is natural in a network. Here I am sitting in the sun then the phone rings and . . . my life takes on a new direction which could be seen as a new vector in a Hilbert space resulting from the
[page 41]
transformation of the original by the phone call operator. The caller is my ex, so I mount a particular observable (operator) to interface between my inner states and the input and output going over the phone line.
How does entropy increase? It is orthogonal to unitarity. We increase entropy by increasing the cardinal of the alphabet while reducing the probabilities of the individual letters so the system remains normalized. And we increase the number of states by 'stringing' states together in an ordered way, so that n states give us n! possible strings.
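A quick Python check of both mechanisms, assuming uniform probabilities (my illustration):

from math import log2, factorial

# Bigger alphabet: a uniform source over n letters carries log2(n)
# bits per letter while staying normalized (each p_i = 1/n).
for n in (2, 3, 26):
    print(n, log2(n))

# Stringing: n distinct states admit n! orderings, so ordering alone
# contributes log2(n!) bits.
n = 10
print(log2(factorial(n)))   # ~21.8 bits from ordering alone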
SHANNON - EVOLUTION (survival = accurate communication)
All communications have a timelike component, and many are purely timelike (like sitting still in my own reference frame).
Time = local sequence of events. Aristotle: the number of motions relative to before and after. Physics, Book 4, chapter 11, 220a25.
Equilibrium = effectively isolated for a sufficient relaxation time to maximize entropy over a local (potential free) energy surface.
Evolution is driven by scarcity of resources. The scarce resource in formal systems is computability, the countable number of Turing machines.
All sounds as good as can be expected, but now we must find the strongest objections:
1. We can compute nothing. We are entitled to take over the entire existing formalism and methods (which are implemented in digital computers anyway). The path integral method and Feynman diagrams are all reminiscent of network behaviour. We try to compute the probability of an event by examining everything in its past light cone and creating a tree of possible consequences for each relevant input state.
2. Just too weird. It does not do much for physics beyond being compatible with it, but it provides an overall network vision of life which embraces the physical layer and the software layers built upon it that lead to stable structures like you, me and the sun. In other words, the apparent weirdness is a small price to pay for breaking the ancient barrier between matter and spirit.
3. There is infinite information content in the continuum? The quantum computing community hopes that because the coefficients a and b in the expression for a qubit |q> = a|0> + b|1> are continuous, this state represents an uncountable infinity of superposed states which may be used in parallel to create computations much faster than the digital Turing machine. There is (and can be) no real evidence for this. We can never make enough measurements on the same qubit to show that a and b are really real numbers and not rationals (a countable set, nevertheless large). (A sketch of this point follows the list.)
4. What about the Lagrangian? Survival of the fit.
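The sketch promised in objection 3, in Python (the amplitudes and shot count are invented for illustration): finitely many measurements yield only outcome frequencies, which cannot separate an irrational |a|² from a rational neighbour.

import random

def frequency_of_zero(p0, shots):
    # simulate 'shots' measurements of a qubit with P(|0>) = p0
    return sum(random.random() < p0 for _ in range(shots)) / shots

p_irrational = 2 ** 0.5 / 2        # |a|^2 = sqrt(2)/2, irrational
p_rational = 7071 / 10000          # a rational neighbour
print(frequency_of_zero(p_irrational, 10**5))
print(frequency_of_zero(p_rational, 10**5))   # statistically indistinguishable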
[page 43]
The simplest way to imagine the network is in the time domain. Local frequency and energy density are at a maximum in the initial singularity. This frequency represents the minimum of memory. Then, as loops get longer and more complex, they become slower. A given processor may be able to operate at gigaflop speeds but still take months to run an instance of a climate model. The climate model, in turn, gives meaning or context to every bit that moves during a long computation. An analyst chasing a bug in the code has sufficient information (if it can be used) to locate the fault down to a few bits. This is possible because there are deterministic threads running through the model.
The problems raised by calculus and continuity were solved by algorithms, most of which involve going to the limit in some way. My favourites are the epsilon-delta arguments, which establish an algorithm that is held to hold all the way to the limit.
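The definition in question, for the record:

\[ \lim_{x \to a} f(x) = L \iff \forall \varepsilon > 0\ \exists \delta > 0 : 0 < |x - a| < \delta \implies |f(x) - L| < \varepsilon. \]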
Finally, if the Universe can be modelled as a transfinite computer network, we have an explanation for the unreasonable effectiveness of mathematics in the physical sciences (Wigner).
Thursday 10 May 2007
Hille, Analytic Function Theory.
In gravitation there are no cycles longer than the smallest, and so no memory; all processes are strictly meaningless and must be carried out, as in set theory,
[page 44]
by exhaustive one-one correspondence rather than by the more complex algorithms that make compressed addition, multiplication and other operations possible.
NEAREST NEIGHBOUR processes are both logically and classically continuous. The constraint of classical continuity separates the functions of general relativity from all possible functions. Because there is no meaning in gravitation, there can be no error. We might say that all mutations are silent.
Memory and algorithm arise 'simultaneously', the algorithm defining the cycle that constitutes the memory by lasting through many shorter cycles.
MEMORY is a concept from the FREQUENCY (TIME) DOMAIN.
Distance Physical/Meaningful.
Physical = a set of touching rods defining the distance (no overlap). Meaningful = a symbol that represents the distance (eg the number of unit rods it contains).
Hille, page 7: 'On the basis of the Dedekind postulate we can obtain the representation of the elements of the real number system by means of decimal fractions.'
We talk about the continuum using discrete language, eg books like Hille.
[page 45]
A network is a connected system of events. Sometimes we draw networks as though all the events happen simultaneously. This is a mathematical ruse, integrating over time to measure the traffic through a node. Or we can consider the network in spacetime, each link being created and annihilated. Quantum mechanically we integrate to find the overlap between different states. In real life the time ordering of connections, and the correlations between nodes so induced, give us a causal structure. In this structure we replace the classical cause / not-cause dichotomy with a spectrum running from 0 (no connection) to 1 (100% correlated).
Symmetric network implements no cloning.
The transfinite network embraces all the interesting features found in mathematics.
Complexity in our Universe has both a space and a time dimension, proportional to energy and momentum. Not perhaps complexity but cardinality. The same if we ignore the actual structure and just count the parts.
What does gravitation really mean? There are 20 degrees of freedom in the curvature tensor, which arise from ten degrees of freedom from each of the two metrics required to compute the curvature tensor.
Each degree of freedom is an independent variable
[page 46]
that must be stored in memory. We need ten memories for each metric, 20 to express the curvature.
What does hardware do? It provides a set of states that can be manipulated to model logical operations. It is bound by logical confinement. At no point must it admit inconsistency in its structure. Inconsistency cannot be without meaning, since it requires the comparison of two representations of the same thing that happen to be different.
The thing to do now is to write a network simulation that behaves like gravitation.
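A first, very small step in that direction (a toy of my own, not a theory of gravity): a nearest-neighbour ring network in which 'distance' is simply the number of ticks a message needs to arrive, anticipating the note below that communication delay becomes the metric for distance.

def delay(n_nodes, src, dst):
    ticks, pos = 0, src
    while pos != dst:            # messages hop one neighbour per tick
        step = 1 if (dst - pos) % n_nodes <= n_nodes // 2 else -1
        pos = (pos + step) % n_nodes
        ticks += 1
    return ticks

print(delay(12, 0, 5))    # 5 ticks: metric distance on the ring
print(delay(12, 0, 11))   # 1 tick the short way round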
The Universe has to cook up the whole minimal story at once, the bootstrap.
Gleick, Genius, page 348: 'Why should the correct theories be the computable ones?'
Still need to know what curvature of space means in network terms: the $64 question. We say gravitational communication is dust, or maybe a fluid of bosons, ie indistinguishable particles.
Fermions obey Shannon and have a structural role. Bosons are blind to order and transformation.
. . .
[page 47]
The 'correspondence principle' for the digital version of the world is that we simply assume that nature computes things in exactly the same way as physicists, embodying the same theorems. Why does the world want to compute a line element ds²? What is it used for? Where does it fit in the process?
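The line element in question, its ten independent components g_{μν} being the 'ten memories' per metric noted above:

\[ ds^2 = g_{\mu\nu}\, dx^{\mu} dx^{\nu}. \]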
What is 'flat' in network terms? Velocity of transmission of messages is constant, ie coding delay is constant.
Friday 11 May 2007
Two particles may be said to exist when the hardware is executing different software at different times.
Physics is 'real time' processing. Do we assume that all processing in the Universe is devoted to error prevention and correction, in line with the concept that the only goal for an entity is to survive, avoiding fatal errors and nipping potentially fatal errors in the bud as soon as they are detected? The best performance is achieved if exponentially growing errors are detected almost as soon as they occur, so that they do not propagate.
Mathematical physics applies complex arithmetic to physical situations without postulating much in the way of mechanism. Eg Feynman's path integral method breaks spacetime down into infinitesimal 'cubes', calculates the propagation of amplitude across each one and integrates the lot, using a variational method to select, 'sniff out', the path 'actually taken' by a particle. This seems very messy. What underlying events
[page 48]
does it (or the theory of relativity, say) actually represent?
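A brute-force Python toy of the path integral idea (my sketch; units hbar = m = dt = 1, and the lattice and endpoints are invented for illustration):

import itertools, cmath

xs = range(-2, 3)                 # 5 lattice positions
a, b, steps = 0, 1, 3             # endpoints and number of time slices

def action(path):                 # free particle: S = sum of (dx)^2 / 2
    return sum(0.5 * (x1 - x0) ** 2 for x0, x1 in zip(path, path[1:]))

# Sum exp(iS) over every lattice path from a to b: the 'tree of
# possible consequences' for each input state.
amp = sum(cmath.exp(1j * action((a,) + mid + (b,)))
          for mid in itertools.product(xs, repeat=steps - 1))
print(abs(amp) ** 2)              # relative probability of arriving at b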
Mechanism = series of logical connections, thigh bone-leg bone-ankle bone which are represented in spacetime by bearings, bolts, glue etc and in physics by energy minima - things will bond if they can reduce their energy by doing so.
What is the 'mechanism' for differentiation and integration? We examine two stages in the building of a house, which is represented by an algorithm whose infinitesimal steps are something like 'drive in a nail', 'pour the concrete slab', etc. In an abstract way we are looking for a set whose cardinal is the number of parts in the house and whose differential (at any point in construction) is 'add one part', the integral being the whole house.
So differentiation exposes each step in a process (algorithm) and integration runs the whole algorithm to get the overall effect.
In ordinary continuous calculus the differentials are merely quantities, dx, ie cardinals with no formal structure, whereas in mechanistic differentiation each differential is an atomic operation at whatever level of resolution we are working.
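The house metaphor in Python (the step list and the one-part-per-step rule are my own framing):

import itertools

steps = [1, 1, 1, 1, 1]                     # drive a nail, pour the slab, ...
totals = list(itertools.accumulate(steps))  # 'integration': run the algorithm
diffs = [b - a for a, b in zip([0] + totals, totals)]  # 'differentiation'
print(totals[-1])                           # the whole house: 5 parts
print(diffs == steps)                       # True: each atomic step recovered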
I am trying to create 'concrete' images to go with the mathematical hocus-pocus that we use to describe the world, often mutually incompatible strings of reasoning. In other words, I am introducing the notion of building blocks in the form of logical functions where the older generation of physics used only continuous functions. Continuity
[page 49]
is a strong constraint, so we would expect it to apply at the simpler (less complex) levels of the Universe. Only a small (countable) fraction of functions are continuous. The rest, which include all except nearest-neighbour permutations, are neither continuous nor differentiable in the analytic sense.
By moving to discrete transfinite mathematics we have at last a model that can match the cardinal of the Universe at any level of integration.
Size of structure: S = Σᵢ pᵢ, where pᵢ is the size of piece i. Complexity of structure = n!, where n is the number of pieces.
So what is the mechanism (the logical system) of gravitation that explains Einstein's equations?
We explain them by saying that they are the only self consistent set of equations that fit the large scale facts.
In relativity we observe with photons; in war, with scouts (among other techniques).
The scale invariance of networks means that I can apply my own experience of being a node in a network to understanding the feelings of any node at any scale.
We interpret the layers in a network as layers of meaning, since general covariance means that we can assign any meaning to any symbol, just as long as the behaviour of the symbol (eg an employee) is consistent
[page 50]
with the layer that is using it (the corporation).
Incorporate = make into a body. One person may be a member of many corporations.
There is very little information in a well behaved function whose epitome is the plane wave of fixed spatial frequency which we interpret in spacetime as fixed momentum.
Continuous = tape
discrete = ram (permutation)
The transfinite cardinals and ordinals are digital in that (in the Platonic world) they are all distinct and discrete, not to be confused with one another. In the real world, however, 'error' limits resolution and so many things are rather vague. Not error but 'effective cardinality' which is the inverse of the 'error rate'.
Given the cardinal of a process (system) evolution will eventually approach the most efficient exploitation of that cardinal in the given environment.
CARDINAL <==> ENERGY (or anything countable)
Continuity is a strong constraint on cardinality, reducing the group of permutations to only those permutations which move every element one step in one direction or another. Given the group of permutations of n elements (n! of them), there are only n continuous functions?
[page 51]
continuous / permutation = aleph(n) / aleph(n+1)
Quantum mechanics has already turned every point in a continuum of any cardinality into a dimension of Hilbert space. We go one step further and turn every part into a process or event of transfinite complexity communicating with other processes through 4-space.
A stock price is the integral of many opinions about the value of the stock. The price is an equilibrium between those who think the stock is overvalued (sellers) and those who think it is undervalued (buyers).
Saturday 12 May 2007
'Wide open communication channel' = complete basis set = no possibility of error, since every symbol is equally acceptable, ie there is no 'meaning' to partition the set of possible symbols into acceptable and unacceptable, as in a dictionary.
She attracts me. I love being attracted.
LOVE <==> ATTRACTION. Empedocles (Wikipedia). A rose by any other name. So each protocol is a communication channel that may mediate attraction or repulsion.
Quantum field theory (Zee) explains attraction and repulsion. [can we explain it in terms of entropy: increased entropy attracts, reduced entropy repels]
What are gravitation's methods of calculation? Let us say
[page 52]
that it can be implemented in some Turing machine. A Turing machine itself requires communication. First the machine must communicate with its tape, moving back and forth, much more laborious than random access memory, but universal nevertheless (and very slow).
Our fundamental observation is that the Standard Model works, with few exceptions. From a theoretical point of view, however, some of the assumptions necessary to make the model work seem far fetched, particularly when it comes to points and infinity. This shows in Dirac's delta function: one must superpose an infinite number of frequencies of a plane wave to define a point. The digital approach seems to avoid the problem. Points are defined not in terms of a vector of n real numbers in an n-dimensional manifold, but by what? Events = processes. Every process within a process is a unique event addressed by a tree rooted in some prior process.
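The representation alluded to: a point requires plane waves of every frequency,

\[ \delta(x) = \frac{1}{2\pi} \int_{-\infty}^{\infty} e^{ikx}\, dk. \]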
How many channels of communication does it take to make a Turing machine? A ram machine? It would be nice if there were 10, like the metric.
head <--> tape position (address)
reader <--> tape character
writer <--> tape character
We can explore the space of computable functions limited only by time and memory.
We align kinetic energy with processing speed and potential
[page 53]
energy with the size of memory, so that in the world so modelled the sum of processing speed and memory is conserved. [and there is an optimum partition of this sum, predicted by a stationary value of the Lagrangian for the process]
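For what it is worth, with an invented fitness function f(s, m) = sm (processing speed times memory) the stationary point of the Lagrangian under the conservation constraint is the even split:

\[ \mathcal{L}(s, m, \lambda) = sm - \lambda(s + m - E), \qquad \partial_s \mathcal{L} = \partial_m \mathcal{L} = 0 \implies s = m = \frac{E}{2}. \]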
Processing speed = communication speed. The closer we are together the faster the conversation can proceed. This condition is imposed by communication delay. [which becomes our metric for 'distance']
Every point is a communication between two processes in the Universe. Every now and then one of these events contains me. My uniqueness is maintained by my sharing some of the processing power of the Universe.
What I am vaguely swimming toward is an expression of the software of the Universe that explains our experience and links our experience to all other events in the world. Reductionism? Yes, we are all one. But not complete reductionism. The network shows how one can be many and many one.
Davis, page vii: 'anything that can be made completely precise can be programmed for an all-purpose digital computer. However, in this form, the assertion is false. In fact one of the basic results of the theory of computability (namely the existence of non-recursive, recursively enumerable sets) may be interpreted as asserting the possibility of programming a given computer in such a way that it is impossible to program a computer (either a copy of the given computer or another machine) so as to determine whether or not a given item will be part of the output of the given computer.'
[page 54]
Page xvi: 'Problems . . . which inquire as to the existence of an algorithm for deciding the truth or falsity of a whole class of statements are called decision problems to distinguish them from ordinary mathematical questions concerning the truth or falsity of single propositions.'
Message <--> decision. I have made a decision and now I am communicating it to other nodes on the network.
'the problem . . . of verifying that an alleged algorithm is indeed an algorithm is not so simple as might be supposed and is, in fact, unsolvable.' So the digital Universe is unsolvable.
The quarks in a proton exchange gluons to keep themselves together.
Davis, page xvii: 'III. There is no algorithm that enables one to decide whether an alleged algorithm for computing the values of a function whose domain of definition is the set of positive integers, and all of whose values are positive integers, is indeed such an algorithm.'
Because there are ℵ1 functions and only ℵ0 algorithms?
Meaningless communications: one merely counts the number of symbols. Gravitation sees only the number (frequency) of events, not their content and relationships.
Davis page 3: Turing machine has countably infinite memory. It is all potential except for the dynamics
[page 55]
of the processor.
Finite number of internal states.
Operation defined by the internal state and the character read.
Next act: 1. Halt. 2. Change symbol or move, and change internal configuration.
. . .
[A quadruple is a sequence of four symbols: the current internal state, the scanned symbol, an act (write a symbol, or move left or right), and the next internal state.]
'A Turing machine is a finite (non-empty) set of quadruples that contains no two quadruples whose first two symbols are the same.' Davis, page 5.
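A minimal Python sketch of Davis's definition (the example program, which extends a block of 1s by one, is my invention; keying the quadruples by their first two symbols enforces the no-two-alike condition):

quads = {('q0', '1'): ('R', 'q0'),   # run right along the block
         ('q0', 'B'): ('1', 'q1')}   # hit a blank: write 1, enter q1, halt

def run(quads, tape, state='q0', pos=0, fuel=1000):
    cells = dict(enumerate(tape))              # finite window of a blank tape
    while fuel:
        rule = quads.get((state, cells.get(pos, 'B')))
        if rule is None:                       # no applicable quadruple: halt
            break
        act, state = rule
        if act == 'L':
            pos -= 1
        elif act == 'R':
            pos += 1
        else:
            cells[pos] = act                   # write a symbol
        fuel -= 1
    return ''.join(cells.get(i, 'B') for i in range(min(cells), max(cells) + 1))

print(run(quads, '111'))   # -> '1111'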
We imagine the cosmic Turing machine whose alphabet is particles and whose configurations are fields.
particle in + field in --> particle out + field out = 1 move.
Instantaneous state of Turing machine =
1. tape (= environment - external state)
2. Internal state (= internal state)
3. Square scanned = communication between machine and environment.
[page 56]
Turing machine is locally determined: '. . . we visualize each instantaneous description of a Turing machine as determining at most one immediately subsequent instantaneous description of the same Turing machine in a manner determined by an appropriate quadruple of this machine.' Davis, page 6.
Weinberg Three Minutes page 39: 'Thus the wavelength of any ray of light simply increases in proportion to the separation between typical galaxies as the Universe expands.'
Independence of gravitation and quantum mechanics indicated by the expansion of the Universe where spacetime has grown without any change to quantum mechanics?
Weinberg page 43: '. . . the galaxies are moving apart because they were thrown apart by some sort of explosion in the past.' How do you throw spacetime?
'In order to calculate the motion of any typical galaxy relative to our own, draw a sphere with us at the centre and the galaxy of interest on the surface; the motion of this galaxy is precisely the same as if the Universe consisted only of the matter within this sphere, with nothing outside.' Birkhoff's Theorem (Wikipedia).