vol 3: Development
chapter 2: Model
page 10: Entropy

Entropy

We describe the world in terms of information and information processing rather than the energy and energy processing studied by classical physics and engineering. Although information and entropy seem conceptually different, they are both measured simply by counting. Entropy - Wikipedia

The amount of information carried by any point in a certain space is equal to the entropy of that space. The entropy of the space is a count of the number of points in the space.

Entropy first entered science through the study of heat engines, which led to thermodynamics and statistical mechanics. Thermodynamically, entropy is the quantity which is conserved by a formal reversible heat engine like the Carnot engine. Carnot, Zemansky

Entropy is a measure of the complexity of a system, usually expressed as a logarithm of the number of states the system can occupy. A reversible system is a system in which transformations occur at constant complexity.

In classical physics, each degree of freedom contributes equally to the complexity of the system, so that the calculation of entropy is a simple count, expressed as a logarithm for mathematical convenience. This is the Boltzmann entropy, expressed in the famous equality

S = k log W. Cercignani

where W is the number of 'complexions' or states of the system to be measured.
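
As a minimal sketch of this counting, assuming nothing beyond the formula above (the function name and the example value of W are illustrative, not from the text), the Boltzmann entropy of a system with W equally probable states can be computed directly in Python:

    import math

    K_B = 1.380649e-23  # Boltzmann's constant k, in joules per kelvin

    def boltzmann_entropy(w: int) -> float:
        """S = k log W for a system with W equally probable states (complexions)."""
        return K_B * math.log(w)

    # A hypothetical system with a million accessible states:
    print(boltzmann_entropy(10**6))  # about 1.9e-22 J/K

Doubling the number of states adds only k log 2 to S, which is one reason the logarithm is the mathematically convenient form: it makes the entropies of independent subsystems add.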

Information theory adds a weighting to this count to give us the Shannon entropy

H = − Σi pi log pi. Khinchin

When all the probabilities pi of the states, symbols or letters are equal, H is at a maximum and the two equations become the same (up to the constant k).
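
A short Python sketch (the probability values here are illustrative, not from the text) shows the weighted count at work and confirms that H is greatest when all the probabilities are equal:

    import math

    def shannon_entropy(probabilities) -> float:
        """H = -sum_i p_i log2 p_i, in bits; terms with p_i = 0 contribute nothing."""
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))      # about 1.36 bits: a biased source
    print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits = log2(4), the maximum

In the uniform case the weighted sum reduces to the simple count: with W equal probabilities, H = log W.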

The classical second law of thermodynamics tells us that the entropy of a closed physical system never decreases. No explanation is given for this phenomenon, but it is seen to be closely linked to the direction of time. Increasing time correlates with increasing entropy. Entropy (arrow of time) - Wikipedia

A system is closed when it is not in communication with its environment: effectively it has no environment. The universe is such a system, having, by definition, nothing outside it. We see the second law as a consequence of Cantor's theorem: given any transfinite cardinal, there is always a greater cardinal.

From the point of view of engineering thermodynamics, entropy has had bad press over the years, being associated with the 'heat death' of the universe and bounds on the efficiency of heat engines. Heat death of the universe - Wikipedia

Here we are less interested in the energy-momentum physical view of the universe and more interested in its intelligence, complexity and creativity. From a physical point of view, the second law of thermodynamics seems to point to increasing disorder. From an information theoretical point of view, however, entropy and information are numerically the same thing.

The network model allows us to picture the flow of information in the universe as a flow of entropy. The flow of entropy is bandwidth, and the universe (as seen by physics) is an extremely broadband system. Physics / action

In most meaningful messages, like this text, the symbols are not equiprobable. In English, letters like space, e, i and a are very much more frequent than letters like j, q, x and z. The lack of equiprobability reduces the entropy of sentences. Shannon showed that the extra entropy made available by coding messages so as to establish equiprobability among the symbols could be used to reduce the probability of error when transmitting a message over a noisy channel. Shannon
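
The effect is easy to see by counting letters in a sample of English, as in the following Python sketch (the sample sentence and the function name are illustrative, not drawn from Shannon): the entropy of the observed letter frequencies falls short of the maximum the same set of symbols could carry if they were equiprobable.

    import math
    from collections import Counter

    def empirical_entropy(text: str) -> float:
        """Shannon entropy, in bits per symbol, of the observed character frequencies."""
        counts = Counter(text)
        total = len(text)
        return -sum((n / total) * math.log2(n / total) for n in counts.values())

    sample = "in most meaningful messages the symbols are not equiprobable"
    h_observed = empirical_entropy(sample)
    h_uniform = math.log2(len(set(sample)))  # entropy if the symbols used were equiprobable

    print(h_observed, h_uniform)  # the observed entropy lies below the equiprobable maximum

The gap between the two figures is the redundancy that, on Shannon's analysis, can be recoded to protect a message against noise.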

The 'equipartition of energy' manifest in classical statistical mechanics suggests that the universe is 'aware' of Shannon's discovery. Huang

From a static (or spatial) point of view we would be inclined to give the initial singularity an entropy of zero, since it is only one state, and log(1) = 0 to any base. If we think of the singularity dynamically as a Turing machine executing NOPs, however, we might attribute to it an infinite sequence of states distinguished only by their position in the sequence with entropy in proportion to the length of the sequence.

We may then imagine the complexification of our model universe as the conversion of dynamic entropy into static entropy, or perhaps the conversion of action into memory to produce the large scale durable structures we see around us. We return to this question in more detail in physics.

We may see entropy as a measure of meaning. The 'amount of meaning' in an event is its cardinal, and places it somewhere in the transfinite hierarchy of meaning. The unreachable 'god' stands at the topless top of this hierarchy, Cantor's Absolute. The absolute is a mathematically impossible figment of the imagination that effectively bounds mathematics and any mathematical representation of the universe. Hallett, Absolute infinite - Wikipedia

Further reading

Books

Carnot, Sadi (translated by R. H. Thurston; edited and with an introduction by E. Mendoza), Reflections on the Motive Power of Fire: and other papers on the second law of thermodynamics by E. Clapeyron and R. Clausius, Peter Smith Publisher 1977. Reflections: 'Everyone knows that heat can produce motion. ... in these days when the steam-engine is everywhere so well known. ... To develop this power, to appropriate it to our uses, is the object of heat engines. ... Notwithstanding the work of all kinds done by steam-engines, notwithstanding the satisfactory condition to which they have been brought today, their theory is very little understood, and the attempts to improve them are still directed almost by chance. ... In order to consider in the most general way the principle of the production of motion by heat, it must be considered independently of any mechanism or any particular agent. It is necessary to establish principles applying not only to steam-engines but to all imaginable heat engines, whatever the working substance and whatever the method by which it is operated. ... [Here enters the seed of entropy] The production of motive power is then due in steam-engines not to an actual consumption of caloric, but to its transportation from a warm body to a cold body, that is, to its reestablishment of equilibrium - an equilibrium considered as destroyed by any cause whatever, by chemical action such as combustion, or by any other.' pages 3-7.
Cercignani, Carlo, Ludwig Boltzmann: The Man Who Trusted Atoms, Oxford University Press, USA 2006 'Cercignani provides a stimulating biography of a great scientist. Boltzmann's greatness is difficult to state, but the fact that the author is still actively engaged in research into some of the finer, as yet unresolved issues provoked by Boltzmann's work is a measure of just how far ahead of his time Boltzmann was. It is also tragic to read of Boltzmann's persecution by his contemporaries, the energeticists, who regarded atoms as a convenient hypothesis, but not as having a definite existence. Boltzmann felt that atoms were real and this motivated much of his research. How Boltzmann would have laughed if he could have seen present-day scanning tunnelling microscopy images, which resolve the atomic structure at surfaces! If only all scientists would learn from Boltzmann's life story that it is bad for science to persecute someone whose views you do not share but cannot disprove. One surprising fact I learned from this book was how research into thermodynamics and statistical mechanics led to the beginnings of quantum theory (such as Planck's distribution law, and Einstein's theory of specific heat). Lecture notes by Boltzmann also seem to have influenced Einstein's construction of special relativity. Cercignani's familiarity with Boltzmann's work at the research level will probably set this above other biographies of Boltzmann for a very long time to come.' Dr David J Bottomley  
Hallett, Michael, Cantorian set theory and limitation of size, Oxford UP 1984 Jacket: 'This book will be of use to a wide audience, from beginning students of set theory (who can gain from it a sense of how the subject reached its present form), to mathematical set theorists (who will find an expert guide to the early literature), and for anyone concerned with the philosophy of mathematics (who will be interested by the extensive and perceptive discussion of the set concept).' Daniel Isaacson. 
Huang, Kerson, Statistical Mechanics, John Wiley 1987 'Preface: ... The purpose of this book is to teach statistical mechanics as an integral part of theoretical physics, a discipline that aims to describe all natural phenomena on the basis of a single unifying theory. This theory, at present, is quantum mechanics. ... Before the subject of statistical mechanics proper is presented, a brief but self contained discussion of thermodynamics and the classical kinetic theory of gases is given. The order of this development is imperative, from a pedagogical point of view, for two reasons. First, thermodynamics has successfully described a large part of macroscopic experience, which is the concern of statistical mechanics. It has done so not on the basis of molecular dynamics but on the basis of a few simple and intuitive postulates stated in everyday terms. If we first familiarize ourselves with thermodynamics, the task of statistical mechanics reduces to the explanation of thermodynamics. Second, the classical kinetic theory of gases is the only known special case in which thermodynamics can be derived nearly from first principles, i.e., molecular dynamics. A study of this special case will help us to understand why statistical mechanics works.'
Khinchin, A I, Mathematical Foundations of Information Theory (translated by P A Silvermann and M D Friedman), Dover 1957 Jacket: 'The first comprehensive introduction to information theory, this book places the work begun by Shannon and continued by McMillan, Feinstein and Khinchin on a rigorous mathematical basis. For the first time, mathematicians, statisticians, physicists, cyberneticists and communications engineers are offered a lucid, comprehensive introduction to this rapidly growing field.' 
Zemansky, Mark W, Heat and Thermodynamics, McGraw-Hill Education 1997 Amazon book review: 'This respected text deals with large-scale, easily known thermal phenomena and then proceeds to small-scale, less accessible phenomena. The wide range of mathematics used in Dittman and Zemansky's text simultaneously challenges students who have completed a course in partial differential calculus without alienating those students who have only taken a calculus-based general physics course. Examples of calculations are presented shortly after important formulas are derived. Students see the solutions of problems related to the formulas. Actual thermodynamic experiments are explained in detail. The student sees the applicability of abstract thermodynamic concepts and formulas to real situations.'

Papers

Shannon, Claude E, "The mathematical theory of communication", Bell System Technical Journal, 27, , July and October, 1948, page 379-423, 623-656. 'A Note on the Edition Claude Shannon's ``A mathematical theory of communication'' was first published in two parts in the July and October 1948 editions of the Bell System Technical Journal [1]. The paper has appeared in a number of republications since: o The original 1948 version was reproduced in the collection Key Papers in the Development of Information Theory [2]. The paper also appears in Claude Elwood Shannon: Collected Papers [3]. The text of the latter is a reproduction from the Bell Telephone System Technical Publications, a series of monographs by engineers and scientists of the Bell System published in the BSTJ and elsewhere. This version has correct section numbering (the BSTJ version has two sections numbered 21), and as far as we can tell, this is the only difference from the BSTJ version. o Prefaced by Warren Weaver's introduction, ``Recent contributions to the mathematical theory of communication,'' the paper was included in The Mathematical Theory of Communication, published by the University of Illinois Press in 1949 [4]. The text in this book differs from the original mainly in the following points: o the title is changed to ``The mathematical theory of communication'' and some sections have new headings, o Appendix 4 is rewritten, o the references to unpublished material have been updated to refer to the published material. The text we present here is based on the BSTJ version with a number of corrections.. back
Shannon, Claude E, "Communication in the Presence of Noise", Proceedings of the IEEE, 86, 2, February 1998, page 447-457. Reprint of Shannon, Claude E. "Communication in the Presence of Noise." Proceedings of the IEEE, 37 (January 1949) : 10-21. 'A method is developed for representing any communication system geometrically. Messages and the corresponding signals are points in two "function spaces," and the modulation process is a mapping of one space into the other. Using this representation, a number of results in communication theory are deduced concerning expansion and compression of bandwidth and the threshold effect. Formulas are found for the maximum rate of transmission of binary digits over a system when the signal is perturbed by various types of noise. Some of the properties of "ideal" systems which transmit this maximum rate are discussed. The equivalent number of binary digits per second of certain information sources is calculated.' . back

Links

Absolute infinite - Wikipedia, the free encyclopedia 'The Absolute Infinite is mathematician Georg Cantor's concept of an "infinity" that transcended the transfinite numbers. Cantor equated the Absolute Infinite with God. He held that the Absolute Infinite had various mathematical properties, including that every property of the Absolute Infinite is also held by some smaller object.'
Entropy - Wikipedia, the free encyclopedia 'In terms of statistical mechanics, the entropy describes the number of the possible microscopic configurations of the system. The statistical definition of entropy is the more fundamental definition, from which all other definitions and all properties of entropy follow. Although the concept of entropy was originally a thermodynamic construct, it has been adapted in other fields of study, including information theory, psychodynamics, thermoeconomics, and evolution.'
Heat death of the universe - Wikipedia, the free encyclopedia 'The heat death is a possible final state of the universe, in which it has "run down" to a state of no thermodynamic free energy to sustain motion or life. In physical terms, it has reached maximum entropy. The hypothesis of a universal heat death stems from the 1850s ideas of William Thomson (Lord Kelvin), who extrapolated the theory of heat views of mechanical energy loss in nature, as embodied in the first two laws of thermodynamics, to universal operation.'

 


