natural theology

This site is part of The natural religion project
dedicated to developing and promoting the art of peace.

vol III Development:

Chapter 4: Physics

page 8: Why is the Universe quantized?

The observed universe is quantized

The observable world comprises a large number of discrete objects and events, ranging from stars and galaxies to grains of dust, atoms and fundamental particles. Even those things which appear continuous to the naked eye yield a discrete structure under sufficient magnification.

Historically, physicists have found it very difficult to accept that the universe is quantized, even though the evidence is everywhere. Perhaps we are deceived by the apparent continuity of motion and large smooth objects. Modern physics shows us that this quantization or pixellation of the world continues down to the tiny scale measured by Planck's constant, about 6.6 × 10⁻³⁴ joule seconds. There are no events smaller than this atom of action. Planck constant - Wikipedia

We conceive of the universe as an infinite communication network. Everything in the universe can communicate with everything else, at least by gravitation. We link this formal network model to the world by assuming that the quantum of action measures the smallest possible message from one point in space and time to another.

Larger messages, that is larger actions, involve ordered sets of quanta of action. Messages travel from the past to the future and may travel through space. The network model enables me to conceive of myself as a message from my birth to my death, from my creation to my annihilation. In our dynamic universe almost everything has a finite lifetime, so that everything may be seen as an event, an instance of a message.

Dissecting a message

We can understand a network as an ordered set of messages. Each message involves two sources and a channel. A message begins with a connection between the sources, continues with an exchange of information and ends with a disconnection. We have become very familiar with this paradigm through telephone networks. We establish a phone call by dialling a connection. We then converse for a period, and when the conversation is finished, we hang up, breaking the connection. We may think of this process as the fundamental symmetry or algorithm of a network.
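The connect, exchange, disconnect cycle described above can be sketched in code. The class and method names here are illustrative, not taken from any real networking library.

```python
class Channel:
    """A toy channel between two sources, tracking the three phases of a message."""

    def __init__(self, a, b):
        self.ends = (a, b)
        self.connected = False
        self.log = []

    def connect(self):               # phase 1: establish the connection ('dial')
        self.connected = True
        self.log.append("connect")

    def exchange(self, message):     # phase 2: exchange information ('converse')
        if not self.connected:
            raise RuntimeError("no connection")
        self.log.append(("exchange", message))
        return message

    def disconnect(self):            # phase 3: break the connection ('hang up')
        self.connected = False
        self.log.append("disconnect")

# One complete message, analogous to a phone call
call = Channel("Alice", "Bob")
call.connect()
reply = call.exchange("hello")
call.disconnect()
```

Every message in the network model, however complex, follows this same three-phase pattern.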

Although many messages appear to involve analogue and continuous processes, we assume here that in the final analysis all messages involve digital processes. We therefore discuss the communication process by analogy with familiar digital computer networks like the internet. History of the Internet - Wikipedia

Why is the world quantized?

We suppose that the world is quantized because it is an error resisting communication network. The mathematical theory of communication shows that we can make communication error free by coding our messages into packets that are so far apart in message space that the probability of their confusion is negligible. Claude E Shannon: A Mathematical Theory of Communication

Error free communication therefore requires discrete symbols to represent information. A continuum, by contrast, contains no discrete symbols, and so, from an information theoretic point of view, it can carry no information.

Shannon sought the limits of error free communication over noiseless and noisy channels. The theory he developed is now well known and lies at the heart of communication networks worldwide. The validity of these strategies is illustrated by our current ability to send gigabytes of information error free over noisy physical channels. The quantization of communication at the microscopic level suggests the hypothesis that our world embodies a communication network that has evolved to resist error. Survival requires error free operation, since a fatal error is the end of survival. Khinchin: Mathematical Foundations of Information Theory
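Shannon's point that coding can drive down the error rate can be illustrated with the simplest possible code, a three-fold repetition code sent over a channel that randomly flips bits. Repetition codes are far from Shannon's optimal codes, so this is only a toy demonstration of the principle; the channel model and parameters are assumptions for the example.

```python
import random

def flip(bit, p, rng):
    """Pass one bit through a channel that flips it with probability p."""
    return bit ^ (rng.random() < p)

def send_raw(bits, p, rng):
    """Uncoded transmission: each bit crosses the noisy channel once."""
    return [flip(b, p, rng) for b in bits]

def send_coded(bits, p, rng):
    """Repetition coding: send each bit three times, decode by majority vote."""
    decoded = []
    for b in bits:
        received = [flip(b, p, rng) for _ in range(3)]
        decoded.append(1 if sum(received) >= 2 else 0)
    return decoded

rng = random.Random(0)
message = [rng.randint(0, 1) for _ in range(10_000)]
p = 0.1  # the channel flips 10% of bits

raw_errors = sum(a != b for a, b in zip(message, send_raw(message, p, rng)))
coded_errors = sum(a != b for a, b in zip(message, send_coded(message, p, rng)))
# Majority voting fails only when 2 or 3 copies flip, with probability
# 3p²(1−p) + p³ ≈ 0.028 < 0.1, so the coded transmission makes fewer errors.
```

Better codes than repetition approach error-free transmission at rates near the channel capacity, which is Shannon's theorem.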

Computation and coding

Shannon showed that appropriate coding enables error free communication, but his work did not reveal the codes to be used. The search for optimal codes has involved much work and continues. We can be certain, however, that encoding and decoding processes must be deterministic so that the original message can be recovered exactly. Davis: Computability and Unsolvability, Hill: Coding Theory

In the early days of communication technology, electronic analogue methods like frequency modulation were used to implement Shannon's ideas. These are quite limited, however, and the full power of encoding and decoding can only be realized using digital computers. Seen in this way, encoding and decoding messages are instances of logical continuity. When the output of an encoding algorithm is fed to the input of the decoding algorithm, the ultimate output is the original message. Frequency modulation - Wikipedia
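The requirement that decoding exactly invert encoding, so that chaining the two algorithms reproduces the original message, can be sketched as follows; the byte-wise XOR cipher used here is an arbitrary illustrative choice.

```python
KEY = 0b10110100  # an arbitrary fixed key for the illustration

def encode(message: bytes) -> bytes:
    """Deterministic encoding: XOR each byte of the message with the key."""
    return bytes(b ^ KEY for b in message)

def decode(signal: bytes) -> bytes:
    """XOR with the same key is its own inverse, so decode exactly undoes encode."""
    return bytes(b ^ KEY for b in signal)

original = b"the quick brown fox"
# Logical continuity: feeding the encoder's output to the decoder
# yields the original message, bit for bit.
assert decode(encode(original)) == original
```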

Communication and quantum mechanics

The mathematical formalism of quantum mechanics assumes that the state space of the physical Universe can be represented by state vectors in a complex Hilbert space of finite or infinite dimension. The joint state of two communicating quantum systems is represented by vectors in the tensor product of the Hilbert spaces of the constituent systems.

In the standard theory, the continuous evolution of state vectors in an isolated quantum system is described by unitary operators on their Hilbert space governed by Schrödinger’s equation. Since such a system is isolated, however, this continuous evolution is not directly observed but is inferred from the observed success of its consequences.

Mathematically this evolution is deterministic and reversible, so that we may think of it as a process of encoding the same message in different bases. The Schrödinger equation applies equally at all energies and all levels of complexity of state vectors. The only truly isolated system is the Universe as a whole, represented in its simplest state by the initial singularity.

Quantum amplitudes φ are complex numbers, and so not directly observable. The result of a quantum computation is a probability, computed by taking the absolute square of the relevant amplitude, |φ|², which is equivalent to the product of φ with its complex conjugate φ*, i.e. φφ*. The theory tells us that the amplitude of a transformation from state ψ to state φ, written <φ|ψ>, is the complex conjugate of the amplitude of the inverse transformation <ψ|φ>, i.e. <φ|ψ> = <ψ|φ>*. This formalism suggests that a complete quantum event requires a message to pass in both directions between two sources. Complex number - Wikipedia
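These amplitude relations can be checked numerically with ordinary complex arithmetic; the amplitude value below is an arbitrary example.

```python
phi = 0.6 + 0.8j                       # an illustrative amplitude <phi|psi>

# probability = |phi|^2 = phi * conjugate(phi)
prob = (phi * phi.conjugate()).real
assert abs(prob - abs(phi) ** 2) < 1e-12
assert abs(prob - 1.0) < 1e-12         # 0.36 + 0.64 = 1 for this value

# <phi|psi> is the complex conjugate of the inverse amplitude <psi|phi>
amp_forward = phi
amp_reverse = phi.conjugate()
assert amp_forward == amp_reverse.conjugate()
```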

'Collapse of the wave function'

The continuous evolution of an isolated quantum system is believed to be interrupted by an observation or measurement. When we observe a system, we do not see the whole continuous system of superposed states, but only one or other of the eigenvalues corresponding to the basis states (eigenvectors) of the operator we use to observe the system. The mathematical formalism of quantum mechanics cannot predict which eigenvector we will observe, only the relative frequencies of the observed eigenstates.
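What the formalism does predict, the relative frequencies of the eigenvalues, can be sketched as a sampling process governed by the Born rule. The two-state system and its amplitudes below are illustrative assumptions, not drawn from any particular experiment.

```python
import random
from collections import Counter

# Illustrative amplitudes for a two-state system
amplitudes = {"spin up": 0.6 + 0.0j, "spin down": 0.0 + 0.8j}

# Born rule: the probability of each eigenstate is the absolute
# square of its amplitude; the probabilities sum to 1.
born = {state: abs(a) ** 2 for state, a in amplitudes.items()}
assert abs(sum(born.values()) - 1.0) < 1e-12

rng = random.Random(42)

def measure():
    """Each observation yields a single eigenvalue, chosen at random
    with the Born probabilities; no theory predicts which one."""
    return rng.choices(list(born), weights=list(born.values()))[0]

counts = Counter(measure() for _ in range(100_000))
# The long-run relative frequencies approach 0.36 and 0.64.
```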

Zurek has shown that this restriction on the completeness of observation is necessary if we are to obtain information from a quantum system. This suggests that the quantization of observation and the requirements of mathematical communication theory are consistent with one another. Wojciech Hubert Zurek

The mathematical formalism of quantum mechanics envisages the simultaneous existence of all of the infinity of solutions to the Schrödinger equation (see axiom 3 in the previous page, Quantum mechanics). This poses something of a logical problem, since it appears to violate the principle of contradiction, which requires that a system cannot be both p (state a, say) and not p (state b) at the same time.

When we observe a quantum system (axioms 5 and 6) we see only one of these possible solutions. This phenomenon is called the 'collapse' of the wave function. Wave function collapse - Wikipedia

The standard theory cannot explain why this happens. Although the eigenvalues are determined to a high degree of precision, we have only a probabilistic prediction of which eigenvalue will be observed in a particular event. There has been a long debate, since the invention of quantum mechanics, about the relationship between the continuous evolution of quantum states and the result of an observation, which picks out one element of the formally infinite superposition of solutions to the wave equation.

Perhaps the most extreme solution to this problem is the 'many worlds' interpretation of quantum mechanics proposed by Hugh Everett III in 1957, which has been supported by many authors since. The idea is that all the solutions to the differential equation are real, but we only see one of them in this world. The others are real in other worlds, unobservable to us. This implies that every observation of a quantum system effectively generates a large number of new worlds, which, given the enormous frequency of quantum events and the great age of the Universe, implies the existence of a high transfinite number of parallel universes. This idea is sometimes called the 'multiverse'. Everett III, Deutsch: The Fabric of Reality

The computer network approach to quantum mechanics may reveal a more plausible explanation. There are three points of importance. First, quantum processes appear to determine eigenvalues to unlimited precision, that is, quantum evolution is deterministic. Second, the evolution of a quantum process is invisible, and its actual structure must be inferred from observation. Third, the occurrence of particular eigenvalues in a quantum observation is random, even though this randomness has a probability structure determined by the Born rule (axiom 6 in the previous page 6: Quantum mechanics). Because theory and observation agree to a high degree of precision, we can be confident that classical quantum mechanics is close to the mark, even though its interpretation may be open to question.

The network model assumes that quantum evolution is a computation process. It assumes that eigenvectors represent the structure of these computations and that observed eigenvalues are the halted states that emerge when these computations are complete. Such computations are deterministic. Further, as explained in Chapter 2, page 6: Invisibility, a computational process cannot be observed without interrupting it.

Standard quantum mechanics assumes that the set of different eigenvectors may have the cardinality of the real numbers. The network model restricts the set of eigenvectors to computable functions, of which there is only a countable infinity, equivalent to the integers. Alan Turing

The probabilistic nature of quantum outcomes may be explained by the properties of an ideal communication system.

Shannon writes:

We will call a system that transmits without errors at the [limiting] rate C an ideal system. Such a system cannot be achieved with any finite encoding process, but can be approximated as closely as desired. As we approximate more closely to the ideal, the following effects occur:

1. The rate of transmission of binary digits approaches C.
2. The frequency of errors approaches zero.
3. The transmitted signal approaches a white noise in statistical properties. This is true, roughly speaking, because the various signal functions must be distributed at random . . .
4. The threshold effect becomes very sharp. If the noise is increased over a value for which the system was designed, the frequency of errors increases very rapidly.
5. The required delays at transmitter and receiver increase indefinitely. . . . Claude Shannon: Communication in the Presence of Noise

We can see these features of the ideal system reflected in the structure of the world revealed by quantum mechanics and everyday observation:

1. To avoid error there must be no overlap between signals representing different messages. They must, in other words, be orthogonal, as with the eigenfunctions of a quantum mechanical measurement operator, or otherwise distinct, like all the distinct things we observe in the world. Every stable object in the world, like myself, may be seen as a message moving through space-time. This idea was noted by Rene Descartes (1596 - 1650), who felt that knowledge should be based on 'clear and distinct ideas'. Descartes: Meditations on First Philosophy, Rene Descartes - Wikipedia

2. Such ‘basis signals’ may be chosen at random in the signal space, provided only that they are orthogonal. The same message may be encoded into any satisfactory basis provided that the transformations used by the transmitter and receiver to encode the message into the signal and decode the signal back to the message are inverses of one another.

3. Since the signals transmitted by an ideal system are indistinguishable from noise, the fact that a set of physical observations looks like a random sequence is not evidence for meaninglessness. Until the algorithms used to encode and decode such a sequence are known, little can be said about its significance.

4. Only in the simplest cases are the mappings used to encode and decode messages linear and topological. For practical purposes, however, they must all be computable with available machines. In the abstract mathematical treatment, these transformations must be Turing computable. Computable function - Wikipedia

5. As a system approaches the ideal, the length of the transmitted packets, the delay at the transmitter while it takes in a chunk of message for encoding, and the corresponding delay at the receiver, increase indefinitely. On the assumption that all information is embodied physically, we may consider any fixed physical object like an atom (or myself) as a message. From this point of view, we may see the tendency for larger and more complex systems to evolve as a response to the need to eliminate communication errors if systems are to be stable. Natural selection selects for stable systems.
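Point 2 above, that the same message may be encoded into any orthogonal basis provided the receiver applies the inverse transform, can be sketched with a randomly chosen rotation of the plane standing in for the basis change.

```python
import math
import random

# A randomly chosen orthogonal basis: a rotation of the plane by theta
theta = random.Random(7).uniform(0, 2 * math.pi)

def encode(x, y):
    """Rotate the message (x, y) into the signal basis."""
    return (x * math.cos(theta) - y * math.sin(theta),
            x * math.sin(theta) + y * math.cos(theta))

def decode(u, v):
    """Apply the inverse rotation to recover the message exactly."""
    return (u * math.cos(theta) + v * math.sin(theta),
            -u * math.sin(theta) + v * math.cos(theta))

message = (3.0, 4.0)
recovered = decode(*encode(*message))
# Whatever basis was chosen, encode followed by decode is the identity.
```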

Further, the statistical properties of quantum observations are identical to the statistical properties of a communication source. Like the probabilities of emission of the various letters of a source, the probabilities of observing the various eigenstates of a quantum system are normalized to 1. This constraint is established in quantum theory by the unitarity of the evolution and observation operators. This leads us to think of the eigenstates of a quantum observation as the letters of the alphabet of a communication source.

Planck's quantum of action corresponds to the smallest possible message

The universal network is built from events involving just one quantum of action. We may see this as analogous to the fact that engineered computations, no matter how complex, can be constructed from the two logical operations not and and, which combine to give us the nand operation or Sheffer stroke. Sheffer stroke - Wikipedia
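The universality of the Sheffer stroke is easy to verify directly; a minimal sketch:

```python
def nand(a, b):
    """The Sheffer stroke: true unless both inputs are true."""
    return not (a and b)

# The familiar operations built from nand alone
def NOT(a):
    return nand(a, a)

def AND(a, b):
    return nand(nand(a, b), nand(a, b))

def OR(a, b):
    return nand(nand(a, a), nand(b, b))

# Exhaustive check over all truth values
for a in (False, True):
    assert NOT(a) == (not a)
    for b in (False, True):
        assert AND(a, b) == (a and b)
        assert OR(a, b) == (a or b)
```

Since any Boolean function can be expressed with not, and and or, nand alone suffices to build any computation.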

Here I imagine that the fixed points of the Universe are messages in the universal network. Every particle in the Universe, large and small may be understood as a message, that is physically embodied information. The universal network runs on a countable infinity of different fundamental processes corresponding to the computable functions represented by halting turing machines. Geometrical continuity is replaced by the more powerful notion of logical continuity implemented formally by mathematical proof and practically by symbolic computing machines. Rolf Landauer: Information is a physical entity Model page 4: Logical continuity

Since all information is encoded physically, practical computer networks are built on a physical layer which correlates physical states or signals with the information sent and received by higher layers in the network. This hardware layer is driven by various strata of software. A stable network requires error free communication, so the first software layer in practical networks is usually devoted to error detection and correction.

An ‘atomic’ communication is represented by the transmission of a single packet from one source to another. Practical point to point communication networks connect many sources, all of which are assigned addresses so that addressed packets may be steered to their proper recipients. This ‘post office’ work is implemented by further network layers.
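A minimal sketch of the two services just described: an error detecting layer (here a single parity bit) and a 'post office' layer that steers addressed packets. The packet structure and field names are invented for the example.

```python
def parity(payload: bytes) -> int:
    """A one-bit checksum: the parity of the XOR of all payload bytes."""
    x = 0
    for b in payload:
        x ^= b
    return bin(x).count("1") % 2

def make_packet(address: str, payload: bytes) -> dict:
    """Wrap a payload with its address and checksum."""
    return {"to": address, "payload": payload, "parity": parity(payload)}

def deliver(packet: dict, mailboxes: dict) -> None:
    # Error-detection layer: reject packets whose checksum fails
    if parity(packet["payload"]) != packet["parity"]:
        raise ValueError("corrupted packet")
    # 'Post office' layer: steer the packet to its addressee
    mailboxes[packet["to"]].append(packet["payload"])

mailboxes = {"alice": [], "bob": []}
deliver(make_packet("bob", b"hello"), mailboxes)
```

Real protocols use stronger checksums and retransmission, but the layering principle is the same.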

Each subsequent software layer uses the layer beneath it as an alphabet of operations to achieve its ends. The topmost layer, in computer networks, comprises human users. These people may be a part of a corporate network, reporting through further layers of management to the board of an organization. By analogy to this layered hierarchy, we may consider the Universe as a whole as the ultimate user of the universal network.

Processes in corresponding layers (‘peers’) of two sources in a network may communicate if they share a suitable protocol. All such communication uses the services of all layers between the peers and the physical layer. These services are generally invisible or transparent to the peers unless they fail. Thus two people in conversation are generally unaware of the huge psychological, physiological and physical complexity of the systems that make their communication possible.

In the physical world, this may imply that all communications in the Universe are routed through the initial singularity understood as the lowest hardware layer of the universal network. Insofar as we see the initial singularity as formally identical to the classical God, this provides an interpretation of the ancient belief that God sees everything.

Feynman's path integral

Since Newton's time, the standard approach to the study of motion has been through differential and integral calculus. We represent the motion of a particle by a mathematical function whose domain is often time and whose range is space, so that the function represents the spatial motion of the particle as time passes. We differentiate this function to find the instantaneous rate of change, or velocity, and we integrate the velocity between two times to find the overall motion in that period. Calculus - Wikipedia

Quantum mechanics, as we have described it in a previous page (page 7: Quantum mechanics) operates purely in the domain of energy / frequency. Since the world exists in space as well as time, quantum mechanics must be combined with special relativity to give a comprehensive description of the world. The resulting marriage is called quantum field theory. It has been a very difficult union which owes a lot to Richard Feynman's invention of the path integral method. Quantum field theory - Wikipedia

Feynman started from Dirac's use of the Lagrangian approach in quantum mechanics. The beauty of this approach is that it studies processes as a whole in terms of an 'action' which is an integral over the entire motion from an initial to a final state. An important feature of the Lagrangian approach is that the action is a relativistic invariant, so that it opens the way for quantum theory to extend to space-time. P. A. M. Dirac: The Lagrangian in Quantum Mechanics, Lagrangian - Wikipedia, Path integral formulation - Wikipedia

The essence of the Lagrangian method is to seek states of stationary action, either maximum or minimum, using the calculus of variations. The path integral method is based on the idea that a quantum system may take any possible path from the initial to the final state, and that all these paths interfere, that is they are to be superposed (integrated) to arrive at the actual path taken by the system. This path will, in the limit, be the path of constructive interference, since all other paths will be cancelled out by destructive interference.
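The stationary phase idea can be illustrated with a toy sum over unit amplitude contributions exp(iS(x)), where x parametrizes the paths: where the action S has a stationary point the phases align and reinforce, while elsewhere they cancel. The quadratic and linear actions below are arbitrary illustrative choices, not drawn from any physical system.

```python
import cmath

def path_sum(x_values, action):
    """Superpose unit-amplitude contributions exp(i*S(x)) over all 'paths' x."""
    return sum(cmath.exp(1j * action(x)) for x in x_values)

n = 2001
xs = [-10 + 20 * k / (n - 1) for k in range(n)]   # paths parametrized by x

stationary = path_sum(xs, lambda x: 2 * x * x)    # action stationary at x = 0
sliding = path_sum(xs, lambda x: 20 * x)          # action nowhere stationary

# Contributions near the stationary point interfere constructively,
# so the first sum is much larger in magnitude than the second,
# in which the phases largely cancel.
```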

The result of this functional integration is to identify a path with a stationary phase. Such a path, on the present hypothesis, is equivalent to the halting of the computation described by one eigenfunction, yielding one eigenvalue.

(revised 19 May 2016 )

Copyright:

You may copy this material freely provided only that you quote fairly and provide a link (or reference) to your source.

Further reading

Books


Davis, Martin, Computability and Unsolvability, Dover 1982 Preface: 'This book is an introduction to the theory of computability and non-computability usually referred to as the theory of recursive functions. The subject is concerned with the existence of purely mechanical procedures for solving problems. . . . The existence of absolutely unsolvable problems and the Goedel incompleteness theorem are among the results in the theory of computability that have philosophical significance.' 
Descartes, Rene, and John Cottingham (editor), Bernard Williams (Introduction), Descartes: Meditations on First Philosophy: With Selections from the Objections and Replies, Cambridge University Press 1996 Amazon Product Description 'This authoritative translation by John Cottingham of the Meditations is taken from the much acclaimed three-volume Cambridge edition of the Philosophical Writings of Descartes. It is based on the best available texts and presents Descartes' central metaphysical writings in clear, readable modern English.' 
Deutsch, David, The Fabric of Reality: The Science of Parallel Universes - and its Implications, Allen Lane Penguin Press 1997 Jacket: 'Quantum physics, evolution, computation and knowledge - these four strands of scientific theory and philosophy have, until now, remained incomplete explanations of the way the universe works. . . . Oxford scholar DD shows how they are so closely intertwined that we cannot properly understand any one of them without reference to the other three. . . .' 
Everett III, Hugh, and Bryce S Dewitt, Neill Graham (editors), The Many Worlds Interpretation of Quantum Mechanics, Princeton University Press 1973 Jacket: 'A novel interpretation of quantum mechanics, first proposed in brief form by Hugh Everett in 1957, forms the nucleus around which this book has developed. The volume contains Dr Everett's short paper from 1957, "'Relative State' formulation of quantum mechanics" and a far longer exposition of his interpretation entitled "The Theory of the Universal Wave Function" never before published. In addition other papers by Wheeler, DeWitt, Graham, Cooper and van Vechten provide further discussion of the same theme. Together they constitute virtually the entire world output of scholarly commentary on the Everett interpretation.' 
Hill, Raymond, A First Course in Coding Theory, Oxford University Press, USA 1990 Amazon Editorial Reviews Book Description: 'Algebraic coding theory is a new and rapidly developing subject, popular for its many practical applications and for its fascinatingly rich mathematical structure. This book provides an elementary yet rigorous introduction to the theory of error-correcting codes. Based on courses given by the author over several years to advanced undergraduates and first-year graduated students, this guide includes a large number of exercises, all with solutions, making the book highly suitable for individual study.' 
Khinchin, Aleksandr Yakovlevich, Mathematical Foundations of Information Theory (translated by P A Silvermann and M D Friedman), Dover 1957 Jacket: 'The first comprehensive introduction to information theory, this book places the work begun by Shannon and continued by McMillan, Feinstein and Khinchin on a rigorous mathematical basis. For the first time, mathematicians, statisticians, physicists, cyberneticists and communications engineers are offered a lucid, comprehensive introduction to this rapidly growing field.' 
Papers
d'Espagnat, Bernard, "Quantum theory and reality", Scientific American, 241, 5, November 1979, page 128-140. 'Most particles or aggregates of particles that are ordinarily regarded as separate objects have interacted at some time in the past with other objects. The violation of separability seems to imply that in some sense all these objects constitute an indivisible whole. Perhaps in such a world the concept of an independently existing reality can retain some meaning, but it will be an altered meaning and one removed from everyday experience.' (page 140). back
Dirac, P A M, "The Lagrangian in Quantum Mechanics", Physikalische Zeitschrift der Sowjetunion, 3, 1, 1933, page 64-72. 'Quantum mechanics was built up on a foundation of analogy with the Hamiltonian theory of classical mechanics. . . . there is an alternative formulation of classical dynamics provided by the Lagrangian. This requires one to work in terms of coordinates and velocities instead of coordinates and momenta. The two formulations are, of course, closely related, but there are reasons for believing that the Lagrangian one is the more fundamental.' Reprinted in Julian Schwinger (editor), Selected Papers on Quantum Electrodynamics, Dover, New York, 1958.. back
Landauer, Rolf, "Information is a physical entity", Physica A, 263, 1, 1 February 1999, page 63-7. 'This paper, associated with a broader conference talk on the fundamental physical limits of information handling, emphasizes the aspects still least appreciated. Information is not an abstract entity but exists only through a physical representation, thus tying it to all the restrictions and possibilities of our real physical universe. The mathematician's vision of an unlimited sequence of totally reliable operations is unlikely to be implementable in this real universe. Speculative remarks about the possible impact of that, on the ultimate nature of the laws of physics are included.'. back
Planck, Max, "On the Law of Distribution of Energy in the Normal Spectrum", Annalen der Physik, 4, 1901, page 553-. 'Moreover, it is necessary to interpret ... [the total energy of blackbody radiation] not as a continuous infinitely divisible quantity, but as a discrete quantity composed of an integral number of finite equal parts.' back
Shannon, Claude E, "The mathematical theory of communication", Bell System Technical Journal, 27, , July and October, 1948, page 379-423, 623-656. 'A Note on the Edition Claude Shannon's ``A mathematical theory of communication'' was first published in two parts in the July and October 1948 editions of the Bell System Technical Journal [1]. The paper has appeared in a number of republications since: • The original 1948 version was reproduced in the collection Key Papers in the Development of Information Theory [2]. The paper also appears in Claude Elwood Shannon: Collected Papers [3]. The text of the latter is a reproduction from the Bell Telephone System Technical Publications, a series of monographs by engineers and scientists of the Bell System published in the BSTJ and elsewhere. This version has correct section numbering (the BSTJ version has two sections numbered 21), and as far as we can tell, this is the only difference from the BSTJ version. • Prefaced by Warren Weaver's introduction, ``Recent contributions to the mathematical theory of communication,'' the paper was included in The Mathematical Theory of Communication, published by the University of Illinois Press in 1949 [4]. The text in this book differs from the original mainly in the following points: • the title is changed to ``The mathematical theory of communication'' and some sections have new headings, • Appendix 4 is rewritten, • the references to unpublished material have been updated to refer to the published material. The text we present here is based on the BSTJ version with a number of corrections.. back
Shannon, Claude E, "Communication in the Presence of Noise", Proceedings of the IEEE, 86, 2, February 1998, page 447-457. Reprint of Shannon, Claude E. "Communication in the Presence of Noise." Proceedings of the IEEE, 37 (January 1949) : 10-21. 'A method is developed for representing any communication system geometrically. Messages and the corresponding signals are points in two function spaces, and the modulation process is a mapping of one space into the other. Using this representation, a number of results in communication theory are deduced concerning expansion and compression of bandwidth and the threshold effect. Formulas are found for the maximum rate of transmission of binary digits over a system when the signal is perturbed by various types of noise. Some of the properties of "ideal" systems which transmit this maximum rate are discussed. The equivalent number of binary digits per second of certain information sources is calculated.' . back
Links
Alan Turing, On Computable Numbers, with an application to the Entscheidungsproblem, 'The "computable" numbers may be described briefly as the real numbers whose expressions as a decimal are calculable by some finite means. Although the subject of this paper is ostensibly the computable numbers, it is almost equally easy to define and investigate computable functions of an integral variable of a real or computable variable, computable predicates and so forth. . . . ' back
Aristotle - Physics, The Internet Classic Archive | Physics Aristotle, Written 350 B.C.E Translated by R. P. Hardie and R. K. Gaye back
Calculus - Wikipedia, Calculus - Wikipedia, the free encyclopedia, 'Calculus (Latin, calculus, a small stone used for counting) is a discipline in mathematics focused on limits, functions, derivatives, integrals, and infinite series. This subject constitutes a major part of modern university education. It has two major branches, differential calculus and integral calculus, which are related by the fundamental theorem of calculus. Calculus is the study of change, in the same way that geometry is the study of shape and algebra is the study of equations.' back
CERN, LHC Homepage, 'The Large Hadron Collider (LHC) sits in a circular tunnel 27 km in circumference. The tunnel is buried around 50 to 175 m. underground. It straddles the Swiss and French borders on the outskirts of Geneva. The first collisions at an energy of 3.5 TeV per beam took place on 30th March 2010. The LHC is designed to collide two counter rotating beams of protons or heavy ions. Proton-proton collisions are foreseen at an energy of 7 TeV per beam. The beams move around the LHC ring inside a continuous vacuum guided by magnets. The magnets are superconducting and are cooled by a huge cryogenics system. The cables conduct current without resistance in their superconducting state. The beams will be stored at high energy for hours. During this time collisions take place inside the four main LHC experiments.' back
Claude E Shannon, A Mathematical Theory of Communication, 'The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem. The significant aspect is that the actual message is one selected from a set of possible messages.' back
Claude Shannon, Communication in the Presence of Noise, 'A method is developed for representing any communication system geometrically. Messages and the corresponding signals are points in two “function spaces,” and the modulation process is a mapping of one space into the other. Using this representation, a number of results in communication theory are deduced concerning expansion and compression of bandwidth and the threshold effect. Formulas are found for the maximum rate of transmission of binary digits over a system when the signal is perturbed by various types of noise. Some of the properties of “ideal” systems which transmit at this maximum rate are discussed. The equivalent number of binary digits per second for certain information sources is calculated.' back
Complex number - Wikipedia, Complex number - Wikipedia, the free encyclopedia, 'A complex number is a number that can be expressed in the form a + bi, where a and b are real numbers and i is the imaginary unit, which satisfies the equation i² = −1. In this expression, a is the real part and b is the imaginary part of the complex number. Complex numbers extend the concept of the one-dimensional number line to the two-dimensional complex plane (also called Argand plane) by using the horizontal axis for the real part and the vertical axis for the imaginary part.' back
Computable function - Wikipedia, Computable function - Wikipedia, the free encyclopedia, 'Computable functions (or Turing-computable functions) are the basic objects of study in computability theory. They make precise the intuitive notion of algorithm. Computable functions can be used to discuss computability without referring to any concrete model of computation such as Turing machines or register machines. Their definition, however, must make reference to some specific model of computation.' back
Frequency modulation - Wikipedia, Frequency modulation - Wikipedia, the free encyclopedia, 'In telecommunications and signal processing, frequency modulation (FM) is the encoding of information in a carrier wave by varying the instantaneous frequency of the wave. This contrasts with amplitude modulation, in which the amplitude of the carrier wave varies, while the frequency remains constant.' back
History of the Internet - Wikipedia, History of the Internet - Wikipedia, the free encyclopedia, 'The history of the Internet begins with the development of electronic computers in the 1950s. Initial concepts of packet networking originated in several computer science laboratories in the United States, United Kingdom, and France.[1] The US Department of Defense awarded contracts as early as the 1960s for packet network systems, including the development of the ARPANET (which would become the first network to use the Internet Protocol).' back
Lagrangian - Wikipedia, Lagrangian - Wikipedia, the free encyclopedia, 'The Lagrangian, L, of a dynamical system is a function that summarizes the dynamics of the system. It is named after Joseph Louis Lagrange. The concept of a Lagrangian was originally introduced in a reformulation of classical mechanics by Irish mathematician William Rowan Hamilton known as Lagrangian mechanics. In classical mechanics, the Lagrangian is defined as the kinetic energy, T, of the system minus its potential energy, V. In symbols, L = T - V. ' back
Louis de Broglie - Wikipedia, Louis de Broglie - Wikipedia, the free encyclopedia, 'Louis-Victor-Pierre-Raymond, 7th duc de Broglie . . . 15 August 1892 – 19 March 1987) was a French physicist who made groundbreaking contributions to quantum theory. In his 1924 PhD thesis he postulated the wave nature of electrons and suggested that all matter has wave properties. This concept is known as the de Broglie hypothesis, an example of wave-particle duality, and forms a central part of the theory of quantum mechanics.' back
Max Planck, On the Law of Distribution of Energy in the Normal Spectrum, Annalen der Physik, vol. 4, p. 553 ff (1901) 'The recent spectral measurements made by O. Lummer and E. Pringsheim and even more notable those by H. Rubens and F. Kurlbaum which together confirmed an earlier result obtained by H. Beckmann show that the law of energy distribution in the normal spectrum, first derived by W. Wien from molecular-kinetic considerations and later by me from the theory of electromagnetic radiation, is not valid generally.' back
Path integral formulation - Wikipedia, Path integral formulation - Wikipedia, the free encyclopedia, 'The path integral formulation of quantum mechanics is a description of quantum theory which generalizes the action principle of classical mechanics. It replaces the classical notion of a single, unique trajectory for a system with a sum, or functional integral, over an infinity of possible trajectories to compute a quantum amplitude. . . . This formulation has proved crucial to the subsequent development of theoretical physics, since it provided the basis for the grand synthesis of the 1970s which unified quantum field theory with statistical mechanics. . . . ' back
Planck constant - Wikipedia, Planck constant - Wikipedia, the free encyclopedia, 'Classical statistical mechanics requires the existence of [Planck's constant] (but does not define its value). Eventually, following upon Planck's discovery, it was recognized that physical action cannot take on an arbitrary value. Instead, it must be some multiple of a very small quantity, the "quantum of action", now called the Planck constant. Classical physics cannot explain this fact. In many cases, such as for monochromatic light or for atoms, this quantum of action also implies that only certain energy levels are allowed, and values in between are forbidden.' back
Planck-Einstein relation - Wikipedia, Planck-Einstein relation - Wikipedia, the free encyclopedia, 'The Planck–Einstein relation. . . refers to a formula integral to quantum mechanics, which states that the energy of a photon (E) is proportional to its frequency (ν). E = hν. The constant of proportionality, h, is known as the Planck constant.' back
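The Planck-Einstein relation quoted above, E = hν, is simple to compute with. A minimal sketch in Python (the 500 nm wavelength is an illustrative assumption, not from the source):

```python
# Energy of a single photon via the Planck-Einstein relation E = h * nu.
h = 6.62607015e-34  # Planck constant, J.s (exact by the 2019 SI definition)
c = 2.99792458e8    # speed of light in vacuum, m/s

def photon_energy(wavelength_m):
    """Return photon energy in joules for a wavelength given in metres."""
    nu = c / wavelength_m   # frequency from wavelength
    return h * nu           # Planck-Einstein relation

# Example: green light at roughly 500 nm carries about 4e-19 J per photon
E = photon_energy(500e-9)
```

This illustrates the scale of the quantum of action discussed in the main text: a single photon of visible light carries an energy some fifteen orders of magnitude below everyday scales.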
Quantum field theory - Wikipedia, Quantum field theory - Wikipedia, the free encyclopedia, 'Quantum field theory (QFT) provides a theoretical framework for constructing quantum mechanical models of systems classically described by fields or (especially in a condensed matter context) of many-body systems. . . . In QFT photons are not thought of as 'little billiard balls', they are considered to be field quanta - necessarily chunked ripples in a field that 'look like' particles. Fermions, like the electron, can also be described as ripples in a field, where each kind of fermion has its own field. In summary, the classical visualisation of "everything is particles and fields", in quantum field theory, resolves into "everything is particles", which then resolves into "everything is fields". In the end, particles are regarded as excited states of a field (field quanta).' back
Rene Descartes - Wikipedia, Rene Descartes - Wikipedia, the free encyclopedia, 'René Descartes (. . . 31 March 1596 – 11 February 1650) was a French philosopher, mathematician and writer who spent most of his life in the Dutch Republic. He has been dubbed the father of modern philosophy, and much subsequent Western philosophy is a response to his writings,' back
Rolf Landauer, Information is a Physical Entity, 'Abstract: This paper, associated with a broader conference talk on the fundamental physical limits of information handling, emphasizes the aspects still least appreciated. Information is not an abstract entity but exists only through a physical representation, thus tying it to all the restrictions and possibilities of our real physical universe. The mathematician's vision of an unlimited sequence of totally reliable operations is unlikely to be implementable in this real universe. Speculative remarks about the possible impact of that, on the ultimate nature of the laws of physics are included.' back
Sheffer stroke - Wikipedia, Sheffer stroke - Wikipedia, the free encyclopedia, 'The NAND operation is a logical operation on two logical values. It produces a value of true, if — and only if — at least one of the propositions is false. . . . Like its dual, the NOR operator (also known as the Peirce arrow or Quine dagger), NAND can be used by itself, without any other logical operator, to constitute a logical formal system (making NAND functionally complete). This property makes the NAND gate crucial to modern digital electronics, including its use in NAND flash memory and computer processor design.' back
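The functional completeness of NAND quoted above can be demonstrated directly: NOT, AND and OR can each be built from NAND alone. A minimal sketch in Python (the function names are illustrative):

```python
def nand(a, b):
    """The Sheffer stroke: true unless both inputs are true."""
    return not (a and b)

# NOT, AND and OR expressed using only NAND
def not_(a):
    return nand(a, a)

def and_(a, b):
    return nand(nand(a, b), nand(a, b))

def or_(a, b):
    return nand(nand(a, a), nand(b, b))

# Check against the standard truth tables
for a in (False, True):
    for b in (False, True):
        assert and_(a, b) == (a and b)
        assert or_(a, b) == (a or b)
    assert not_(a) == (not a)
```

Since every two-valued logical operator can be composed from NOT, AND and OR, this small check is the core of the completeness claim.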
Sylvia Berryman, Ancient Atomism (Stanford Encyclopedia of Philosophy), 'A number of important theorists in ancient Greek natural philosophy held that the universe is composed of physical ‘atoms’, literally ‘uncuttables’. Some of these figures are treated in more depth in other articles in this encyclopedia: the reader is encouraged to consult individual entries on Leucippus, Democritus, Epicurus and Lucretius. These philosophers developed a systematic and comprehensive natural philosophy accounting for the origins of everything from the interaction of indivisible bodies, as these atoms—which have only a few intrinsic properties like size and shape—strike against one another, rebound and interlock in an infinite void.' back
Universal Turing Machine - Wikipedia, Universal Turing Machine - Wikipedia, the free encyclopedia, 'Alan Turing's universal computing machine (alternately universal machine, machine U, U) is the name given by him (1936-1937) to his model of an all-purpose "a-machine" (computing machine) that could process any arbitrary (but well-formed) sequence of instructions called quintuples. This model is considered by some (for example, Davis (2000)) to be the origin of the stored program computer -- used by John von Neumann (1946) for his "Electronic Computing Instrument" that now bears von Neumann's name: the von Neumann architecture. This machine as a model of computation is now called the Universal Turing machine.' back
Wave function collapse - Wikipedia, Wave function collapse - Wikipedia, the free encyclopedia, 'In quantum mechanics, wave function collapse is the phenomenon in which a wave function—initially in a superposition of several eigenstates—appears to reduce to a single eigenstate (by "observation"). It is the essence of measurement in quantum mechanics, and connects the wave function with classical observables like position and momentum. Collapse is one of two processes by which quantum systems evolve in time; the other is continuous evolution via the Schrödinger equation.' back
Wojciech Hubert Zurek, Quantum origin of quantum jumps: breaking of unitary symmetry induced by information transfer and the transition from quantum to classical, '(Submitted on 17 Mar 2007 (v1), last revised 18 Mar 2008 (this version, v3)) Measurements transfer information about a system to the apparatus, and then further on -- to observers and (often inadvertently) to the environment. I show that even imperfect copying essential in such situations restricts possible unperturbed outcomes to an orthogonal subset of all possible states of the system, thus breaking the unitary symmetry of its Hilbert space implied by the quantum superposition principle. Preferred outcome states emerge as a result. They provide framework for the "wavepacket collapse", designating terminal points of quantum jumps, and defining the measured observable by specifying its eigenstates. In quantum Darwinism, they are the progenitors of multiple copies spread throughout the environment -- the fittest quantum states that not only survive decoherence, but subvert it into carrying information about them -- into becoming a witness.' back

www.naturaltheology.net is maintained by The Theology Company Proprietary Limited ACN 097 887 075 ABN 74 097 887 075 Copyright 2000-2016 © Jeffrey Nicholls