Notes
[Sunday 25 January 2009 - Saturday 31 January 2009]
[Notebook: DB 65 Symmetric U]
[page 80]
Sunday 25 January 2009
What we call the laws of physics are in fact (?) the constraints imposed by the simplicity of the Universe's early stages, ie its lower layers.
[page 81]
Monday 26 January 2009
PERIODIC = BORING (ie repetitive like the boring old soldier who tells the same tales repeatedly). The more skilled storyteller, like the more interesting musician, brings in unpredictable details every now and then.
Tuesday 27 January 2009
Hughes page 77: Systems are indifferent to (symmetric with respect to) coordinates not included in their Lagrangian (Hughes; Emmy Noether - Wikipedia). As systems become more complex they become connected to more coordinates and their Lagrangians become more complex, until we arrive at something like the Lagrangian of the Standard Model (Veltman page 249 sqq.). If the layered network model is good, we should be able to order the elements of the Standard Model Lagrangian into layers, thus giving ourselves a more digestible perspective on the whole thing.
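A one-line check of this indifference claim (standard Lagrangian mechanics, not anything specific to Hughes' page): a coordinate missing from the Lagrangian has a conserved conjugate momentum.

\[
\frac{d}{dt}\frac{\partial L}{\partial \dot q} - \frac{\partial L}{\partial q} = 0,
\qquad
\frac{\partial L}{\partial q} = 0 \;\Rightarrow\; p_q = \frac{\partial L}{\partial \dot q} = \text{constant}.
\]

For example L = \tfrac{1}{2}m(\dot x^2 + \dot y^2) - V(x) contains no y, so p_y = m\dot y is conserved: the system is symmetric with respect to (indifferent to) the absent coordinate y.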
Wednesday 28 January 2009
To say that a coordinate is missing from a Lagrangian is to say that it has no input into the energy of the system. In physics the possession of energy (ie the ability to communicate) is the touchstone of existence, as we say a particle exists when the corresponding field mode is excited (has energy). In human government terms, a policy can be said to exist when it is funded, ie has cashflow.
CLASSICAL = PAST (COMPLETE)
QUANTUM = PRESENT (INCOMPLETE)
[page 82]
Thursday 29 January 2009
Zurek: 'Decoherence, einselection, and the quantum origins of the classical', arXiv:quant-ph/0105127v3, 19 June 2003.
Einselection = environment induced superselection
Envariance = environment assisted invariance
Zurek's idea seems completely consistent with the network model insofar as the choice of a common set of eigenfunctions between two communicating systems is equivalent to sharing a protocol or common language in order to communicate and so can be seen as a negotiated joint effort between the system of interest and the environment with which it is communicating.
Pointer states = the set of orthogonal eigenfunctions shared by both parties in a communication.
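A toy numerical sketch of einselection (my own illustration, not Zurek's calculation; the model and numbers are invented): an environment that repeatedly imprints random phases in the {|0>, |1>} basis destroys the off-diagonal coherences of a qubit's density matrix, leaving the diagonal 'pointer' entries as the surviving shared alphabet.

import numpy as np

rng = np.random.default_rng(0)

psi = np.array([1.0, 1.0]) / np.sqrt(2)      # superposition (|0> + |1>)/sqrt(2)
rho = np.outer(psi, psi.conj())              # pure-state density matrix

def dephase(rho, n_kicks=1000, strength=2.0):
    """Average over random relative phases imprinted by the environment."""
    acc = np.zeros_like(rho, dtype=complex)
    for _ in range(n_kicks):
        phi = rng.normal(0.0, strength)
        U = np.diag([1.0, np.exp(1j * phi)])  # environment-induced phase kick
        acc += U @ rho @ U.conj().T
    return acc / n_kicks

print(np.round(rho, 3))            # off-diagonals 0.5: coherence present
print(np.round(dephase(rho), 3))   # off-diagonals ~ 0.5*exp(-strength**2/2) ~ 0.07: einselected

The diagonal (pointer) entries are untouched by the kicks, which is the sense in which the pointer basis is the negotiated protocol that survives the interaction.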
Continuous encodings are a tiny subset of all possible encodings and restrict our ability to apply Shannon's theorems to error correction (Shannon 1949).
There are no quantum measurements. All are classical, only the explanations are quantum, ie quantum mechanics explains them.
Wheeler (1978, 1983), quoted in Zurek (page 2): 'No [quantum] phenomenon is a phenomenon until it is a recorded (observed) phenomenon.' (Wheeler & Zurek)
[page 83]
Zurek page 3: Many Worlds Interpretation (Everett III): 'MWI is incomplete. It does not explain what is effectively classical and why.'
Classical is the outcome of a computation. Nothing is worse than a computer that 'hangs' (refuses to halt and yield an output). Often the only way out is to restart, losing the work in progress.
Friday 30 January 2009
Zurek page 4: 'Our aim is to explain why does the quantum Universe appear classical.'
Maybe more to the point is: why does the classical Universe have a quantum explanation? We have two opposite meanings of classical: 1. it obeys classical logic, as in a classical computer; and 2. it is continuous. 1 is good. 2 is a simplifying assumption which appears true macroscopically because Planck's quantum of action is so small, but if we consider larger events like reproduction or a hurricane, which are of themselves quantized, we see that the classical Universe is not continuous.
Zurek (continues): 'This question can be motivated only in a Universe divided into systems [ie a discontinuous Universe] . . . In the absence of systems the Schrödinger equation dictates deterministic evolution and the problem of interpretation seems to disappear.
'There is no need for 'collapse' in a Universe with no systems. The division into systems is imperfect. As a consequence, the Universe is a collection of open (interacting) quantum systems.'
[page 84]
In other words, a network of open systems (oracle machines: Turing; Oracle machine - Wikipedia).
All of the Universe appears classical to observers: all that is non-classical is the explanation of the classical appearances. Quantum mechanics currently does this with continuous functions, and the collapse hypothesis. The network model does it with Turing machines.
Zurek page 4: '. . . typically observers use environment as a "communication channel" and monitor it to find out about the system.'
Why do continuous functions work so well in physics? Because we are mainly concerned with 'material' measures like mass, velocity and so on, which can be expressed in cardinal numbers without reference to any order which may exist within the set so measured. From a Newtonian dynamic point of view all we need to know is that a certain book has a mass of one kilogram. The meaning of the book, whether Bible or fashion magazine full of advertisements, is irrelevant. Pure quantum mechanics has the same property, dealing with a conserved quantity we call probability. It is only when we begin to apply these models to particular cases that we begin to take into account the interaction of ordered parts, as in a Newtonian investigation of vehicle suspension or the quantum mechanical examination of an atom. Quantum mechanics, because it works in infinite dimensional vector space, is better suited to examining complex systems, but it is still very restrained by its assumption of continuity compared to the ability of a Turing machine (Ross Ashby page 133: 'A common and very powerful constraint is that of continuity. It is a constraint because whereas the function that changes arbitrarily can undergo any change, the continuous function can change, at each step, only to a neighbouring value.')
Zurek p 6: 'The freedom of basis choice -- basis ambiguity -- is guaranteed by the principle of superposition.'
[page 85]
We believe in musical superposition because we can hear the different voices in an orchestra or choir. Quantum mechanical superpositions arise because the same differential equation may be satisfied by any frequencies that honour the boundary conditions (overtones).
Let us say that the layers of the cosmic network are parametrized by inverse time, that is energy. The aim of high energy physics is to get as near as possible to the physical layer of the network by generating higher and higher energies per particle. Ultimately we expect all the energy in the Universe to be concentrated in a two state system, the fundamental clock which is the dynamic hardware for the whole system. As the Universe complexifies, this clock is shared among all the processes in the Universe by a sort of time or frequency division multiplexing. So my life depends on my tiny fraction of time on the universal Central Processing Unit and all the other personalities in the Universe (users) similarly get their cut just like the users on a computer network with a single CPU.
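A toy round-robin scheduler makes the multiplexing analogy concrete (purely an illustration of the computing analogy, not a physical claim; the job names and tick counts are invented):

from collections import deque

def round_robin(jobs, quantum=1):
    """One shared clock (CPU) doled out to many processes, quantum ticks at a time.
    jobs: dict of name -> ticks still needed. Returns the order of execution."""
    queue = deque(jobs.items())
    trace = []
    while queue:
        name, remaining = queue.popleft()
        used = min(quantum, remaining)
        trace.extend([name] * used)
        if remaining > used:
            queue.append((name, remaining - used))
    return trace

trace = round_robin({"me": 3, "you": 3, "hurricane": 2})
print(trace)                             # me, you, hurricane, me, you, hurricane, me, you
print(trace.count("me") / len(trace))    # my fraction of the universal clock: 0.375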
In orthodox quantum mechanics the state vectors are never at rest but always moving like an orchestra with a superposition of orthogonal frequencies whose sum is the energy of the system of interest, and whose relative energies are encoded in the amplitudes associated with each possible state. In a classical computer on the other hand, the clock ticks discontinuously and the algorithms in the computer are executed in discrete steps, the transitions between steps being masked by the clock. The clock frequency represents the total energy of the process, which is shared by the different subroutines which make up the overall process, each being called and returning its value with a particular (and possibly variable) frequency as the
[page 86]
computation progresses. In the Schrödinger equations of motion of a quantum system, the energy attributed to each frequency is encoded in the Hamiltonian matrix, which also encodes the transition probabilities (frequencies) between the modes represented by the basis vectors. In a classical computer the role of the Hamiltonian is played by the program, which is made up of layered subroutines, some used more frequently than others. One section of why_quantized must be devoted to mapping a Turing machine into the Schrödinger equation, insofar as that is possible, and pointing out the differences between the two, particularly as concerns time ordering and the restrictions of continuity and linearity.
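A minimal numerical sketch of the correspondence suggested here (my construction, not an actual Turing-machine-to-Schrödinger mapping; the Hamiltonian entries are invented): the diagonal of a 2x2 Hamiltonian carries the mode energies, the off-diagonal the coupling (transition amplitude), and the same evolution can be generated either continuously or by repeating one discrete 'clock tick'.

import numpy as np
from scipy.linalg import expm

hbar = 1.0
H = np.array([[1.0, 0.2],
              [0.2, 1.5]])                    # mode energies 1.0, 1.5; coupling 0.2

def U_continuous(t):
    return expm(-1j * H * t / hbar)           # Schrödinger evolution exp(-iHt/hbar)

dt = 0.1
U_tick = expm(-1j * H * dt / hbar)            # one discrete 'clock tick'

psi0 = np.array([1.0, 0.0])                   # start in mode 0
psi_cont = U_continuous(10 * dt) @ psi0
psi_disc = np.linalg.matrix_power(U_tick, 10) @ psi0

print(np.abs(psi_cont) ** 2)   # probability of each mode after t = 1
print(np.abs(psi_disc) ** 2)   # identical: ten ticks reproduce the continuous result

The differences the note asks for show up as soon as the per-tick operator is not generated by a single fixed H, for example when different subroutines (different unitaries) are called at different steps; then time ordering matters.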
Wigner, Symmetries page 155: 'The Orthodox view: The possible states of a system can be characterized according to quantum mechanical theory by state vectors. These state vectors -- and this is an almost verbatim quotation of von Neumann -- change in two ways. As a result of the passage of time they change continuously according to Schrödinger's time dependent equation -- this equation will be called the equation of motion of quantum mechanics. The state vector also changes discontinuously, according to probability laws, if a measurement is carried out on the system. This second type of change is often called the reduction of the wavefunction. It is this reduction of the state vector which is unacceptable to many of our colleagues.' (Wigner, von Neumann)
page 166: '. . . the state vector is a shorthand expression of that part of our information concerning the past of the system which is relevant for predicting (as far as possible) the future behaviour thereof.'
[page 87]
Shannon's theory tells us that we can construct orthogonal vectors in message space, ie vectors with no overlap which are sharply distinct from one another, just like orthogonal basis states in Hilbert space.
Shannon (1949): 'III Geometrical Representation of Signals. . . . Essentially, we have replaced a complex entity (say a television signal) in a simple environment [the signal requires only a plane for its representation as f(t)] by a simple entity (a point) in a complex environment (2TW dimensional space [where T is the duration of the signal and W its bandwidth]).
'If noise is added to the signal in transmission, it means that the point corresponding to the signal has been moved a certain distance in the space proportional to the rms value of the noise. This noise produces a small region of uncertainty about each point in the space.
'Mathematically the simplest types of mappings are those in which [the signal space and the message space] have the same number of dimensions.
It seems that this is the case in quantum mechanics. [but what about projectors?]
'It is not possible to map the message space onto the signal space in a one-to-one continuous manner (this is known mathematically as a topological mapping) unless the two spaces have the same dimensionality.'
[page 88]
However, there is no good reason to confine ourselves to topological mappings apart from the fact that non-topological mappings may be complex and non-linear, although well within the powers of a suitably powerful computer.
Theorem: 'Let P be the average transmitter power and suppose the noise is white thermal noise of power N in band W. By sufficiently complicated encoding systems it is possible to transmit binary digits at the rate

C = W log2((P + N)/N)

with as small a frequency of errors as desired. It is not possible by any encoding method to send at a higher rate and have an arbitrarily low frequency of errors.'
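A numerical illustration of the reconstructed formula (the bandwidth and power figures are my own examples, not Shannon's):

import math

def capacity(W_hz, P_watts, N_watts):
    """Shannon capacity C = W * log2((P + N) / N) in bits per second."""
    return W_hz * math.log2((P_watts + N_watts) / N_watts)

print(capacity(3000, 1e-3, 1e-6))   # 3 kHz band at 30 dB SNR: about 2.99e4 bit/s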
Here enters the velocity of light via coding delay.
'. . . we can send at the rate C but reduce errors by using more involved coding and longer delays at the transmitter and receiver. The transmitter will take long sequences of binary digits and represent the entire sequence by a particular signal function of long duration. The delay is required because the transmitter must wait for the full sequence before the signal can be determined. Similarly the receiver must wait for the full signal function before decoding the binary digits.
'VIII Discussion. We will call a system that transmits without errors at the rate C an ideal system. Such a system cannot be achieved with any finite encoding process, but can be approximated as closely as desired. As we approximate
[page 89]
more closely to the ideal the following effects occur.
1. The rate of transmission . . . approaches C
2. The rate of errors approaches zero
3. The transmitted signal approaches white noise . . .
4. The threshold effect becomes very sharp. If the noise is increased over the value for which the system was designed, the frequency of errors increases very rapidly
5. The required delays at transmitter and receiver increase indefinitely . . . '
Khinchin
Saturday 31 January 2009
Wigner: Relativistic Invariance and Quantum Phenomena
page 51: 'The principal theme of this discourse is the great difference between the relation of special relativity and quantum theory on the one hand and general relativity and quantum theory on the other.
'The difference between the two relations is, briefly, that while there are no conceptual problems to separate the theory of special relativity from quantum theory, there is hardly any common ground between the general theory of relativity and quantum mechanics.'
Quantum Electrodynamics: Tomonaga, Schwinger, Feynman and Dyson. Quantum electrodynamics - Wikipedia
'What is meant is . . . that the concepts which are used in quantum mechanics, measurements of positions, momenta and the like are the same concepts in terms of which the special relativistic postulate is formulated.'
[page 90]
Wigner page 52: 'This is not so with the general theory of relativity. The basic premise of this theory is that coordinates are only auxiliary quantities which can be given arbitrary values for every event.' [but they are strongly constrained by the need for continuity]
That is, the space-time coordinates for an event are not intrinsic to the event itself, which is a private communication between the participants in the event who, like lovers, are oblivious to their surroundings [ie can only couple pairwise at a given time]. It is only when a higher layer makes use of the communications between peers of the lower layer for its own purposes that the relationship between events acquires meaning and the coordinates of 'atomic' events (like the execution of a logical function) become important to some more complex event (the computation of the product of two numbers).
Arithmetic and logic meet in the binary domain, arithmetic giving higher meaning to binary logical operations.
'Evidently, the usual statements about future positions of particles as specified by their coordinates, are not meaningful statements in [general] relativity. This is a point which cannot be emphasized strongly enough and is the basis of a much deeper dilemma than the more technical questions of the Lorentz invariance of the quantum field equations.'
'Real coordinates' are relationships established in a higher layer (a 'user') between elements of a lower layer ('an alphabet of tools').
I have been able to think myself so far from orthodoxy because I am an isolated particle, scientifically speaking part of neither the theological nor the physical establishment.
[page 91]
Wigner page 53: 'Relativistic Quantum Theory of Elementary Systems: . . . Two cases have been distinguished: the particle either can or cannot be transformed to rest. . . . if a particle cannot be transformed to rest, its velocity must always be equal to the velocity of light. Every other velocity can be transformed to rest. The rest-mass of these particles is zero because a non-zero rest-mass would entail infinite energy if moving with light velocity.'
On the other hand 'rest-mass' is meaningless for a particle that cannot be transformed to rest! So photons and gravitons are 'outside' the mechanism of special relativity. Space-time has no meaning for them and so on the layered model they are lower than the space-time layer. They do not have space-time coordinates, which is consistent, however, with the coordinate free approach of general relativity.
Higher layers are invisible to lower layers, and so although a higher layer may manipulate a lower layer, the lower layer does not see it and so is indifferent to it. This indifference (we suppose) explains the universality of the velocity of light. From this point of view direction of motion and spin (polarization) are attributes given to photons by the higher layers using them.
page 54: 'Instead of the question: "Why do particles with zero rest-mass have only two directions of polarization?" the slightly different question "Why do particles with a finite rest-mass have more than two directions of polarization?" is proposed.'
= Why can't the angular momentum of a particle with finite rest mass be parallel to its velocity?
[page 92]
Wigner page 54: 'The statement that the spin is parallel to the velocity is a relativistically invariant statement' [for particles with m = 0].
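A compact version of the standard argument behind this (textbook reasoning, not a quotation from Wigner): define the helicity

\[
h = \frac{\vec S \cdot \vec p}{|\vec p|}.
\]

For m > 0 an observer can always boost to a frame moving faster than the particle; in that frame \vec p is reversed while the spin is not, so h changes sign and 'spin parallel to velocity' is frame dependent. For m = 0 the particle moves at c in every frame, no overtaking boost exists, and h is Lorentz invariant, which is why a massless particle gets by with just the two polarization states h = \pm s.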
Time is reversible in the domain where Turing machines are reversible.
Wigner page 63: We want to see space-time as an emergent property of the Universe, somehow made by using the massless particles, photon and graviton, which are never at rest, to create the space-time Universe of experience. How? From this point of view, the ur-dimension is velocity, specifically c. In special relativity c is a local phenomenon, all observers in all inertial frames observing the same c. We try to explain this by the coding delay associated with error-free communication. The simplest coding takes one bit from the source and maps it to one signal in the channel and vice versa. In a Universe with no choice of signals or messages, this process is essentially error free and so, we suspect, instantaneous (Salart et al).
'The events of special relativity are coincidences, that is collisions between particles.'
The less violent among us might see this as mapping from one particle to another. On the Trinitarian analogy, we might say gravitation = father, photon = son or vice versa, and the system works by mapping [photon to graviton] and back. There being as yet no space-time, there is no velocity involved in the process, but we may associate a quantum of action with each instance of mapping, and from the frequency of mapping derive an energy.
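A minimal formal gloss of the last sentence (my reading, on the assumption that each mapping carries one quantum of action h): if the mapping recurs with frequency \nu, the associated energy is

\[
E = h\nu, \qquad \text{so that the action per cycle is } E \cdot \frac{1}{\nu} = h.
\]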
Assuming c = 1, we can equate distance and time,
[page 93]
given that they exist. [We might say the possibility of this equation derives from them being the 'same thing' underneath.]
Wigner page 69:
'. . . states will be generated by looking at the same state -- the standard state -- [for us something near the initial singularity] from various coordinate systems. Hence each Lorentz frame of reference will define a state of the system -- the state which the standard state appears to be from the point of view of this coordinate system. . . . two states of the system will be identical only if the Lorentz frames of reference which define them are identical.'
Here no cloning says no identical frames.
The scheme embodies the 'what we see depends on how we look' paradigm, and given that relations of looking (= transformation, communication) are real (as in Thomas' god and Einstein's physics), we might see how the same logical operation (eg mapping from photon to graviton) can be seen to be different from different points of view, just as each operation in the Central Processing Unit of a computer is differentiated from every other by its relationship to all the other operations in the execution of the program.
Let our ur-alphabet be C, P, T (Wigner page 74; Streater & Wightman): particle conjugation, space inversion, time inversion. For every annihilation there is a creation.