- Paperback: 506 pages
- Publisher: Cambridge University Press (15 May 2014)
- Language: English
- ISBN-10: 1107684536
- ISBN-13: 978-1107684539
- Dimensions: 13.8 x 2.4 x 21.6 cm
Information and the Nature of Reality: From Physics To Metaphysics (Canto Classics) (English) Paperback – 15 May 2014
'This is the anthology we have been waiting for … Philosophers, theologians and scientists all have their say, wrestling with the theme of God as the ultimate informational and structuring principle in the universe.' Professor Sir Brian Heap, President, European Academies Science Advisory Board, German Academy of Sciences
About the product
Many scientists regard mass and energy as the primary currency of nature. In recent years, however, the concept of information has gained importance. In this book, eminent scientists, philosophers and theologians chart various aspects of information, from quantum information to biological and digital information, in order to understand how nature works.
The most helpful customer reviews on Amazon.com
Information, like the concepts of matter and energy, has been difficult to define. According to Terrence Deacon, the definition of energy was not fully realized until it was discovered that energy is not a substance but rather a dynamic process of change that is always conserved. Just as with the concept of energy, he says, we must give up thinking of information as some "artifact" or "commodity". In the broadest sense, says John F. Haught, information can mean whatever gives form, order, pattern, or identity to something.
Today most physicists divide information into two broad categories: syntactic information and semantic information. Syntactic information is sometimes called Shannon information, after Claude Shannon, who showed that information can be quantified in terms of entropy and probability. This is a quantitative, physical definition: it describes how much information a system can carry and is not concerned with what that information means. On this view, the more information a system carries, the less entropy it contains, which also happens to be the least probable state of the system; conversely, the most probable state of a system has a high degree of entropy and carries little information. So we can think of information as a complementarity between the message and the medium: both are needed for a complete description of information. The second type, semantic information, deals with the content of the message--what it means.
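The quantitative, syntactic side described above can be made concrete with a minimal sketch of Shannon's formula, H = -Σ p·log2(p): a uniform distribution (the most uncertain case) yields maximum entropy, while a near-certain outcome yields almost none. The function name and example distributions below are illustrative, not from the book.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit per toss.
print(shannon_entropy([0.5, 0.5]))    # 1.0
# A heavily biased coin is more predictable, so lower entropy.
print(shannon_entropy([0.99, 0.01]))
# A certain outcome carries no surprise at all.
print(shannon_entropy([1.0]))
```

Note that the reviewer's phrasing inverts the sign convention used here: in Shannon's own formalism, higher entropy means more information capacity per symbol, not less.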
Paul Davies says that most physicists now believe that information, not particles and fields, is the ground of all being. From the ancient Greeks up until recent times it was assumed that the laws of physics, and their mathematical descriptions, were objective aspects of the universe cast in stone, and that it was the job of the physicist to uncover these objective truths. This idea was reinforced by monotheistic thinking, which suggested that the discovery of these objective truths was a window into the mind of God, an assumption that went unchallenged for three centuries. Davies states: "The fusion of Platonism and Monotheism created the powerful orthodox scientific concept of the laws of physics as ideal, perfect, infinitely precise, immutable, eternal, unchanging mathematical forms that reside in an abstract platonic heaven beyond space and time. All of these assumptions must be jettisoned to come to an understanding that the laws and states of the universe co-evolve."
For many--from Plato to the physicist and mathematician Roger Penrose--mathematics has been assumed to be an objective construct of the universe from which matter and information find expression, but an evolving view among physicists is that information is the basic entity of reality from which the laws of physics and matter emerge. After all, says Davies, "Laws are an informational statement." Mathematics has been successful in describing the laws of physics not because mathematics is somehow an objective aspect of the universe, but because mathematics and the laws of physics co-emerge from computations carried out since the beginning of time by the ultimate quantum computer--the universe at large.
There can be no separation between the information-processing nature of the universe and the information-processing evolution of life itself. Both the syntactic and the semantic concepts of information are involved in the interplay between organisms and their environment, in the sense that far-from-equilibrium systems (organisms) must be associated with an environment that supports the organism's condition. Both the environment (the signal medium) and the organism (the message) are needed for the co-evolution of the organism/environment system.
According to Keith Ward and Arthur Peacocke, the information contained in DNA is not semantic information, because no understanding is required for the transcription and translation processes that code for proteins. This kind of information belongs to a third category they call "shaping" or coded information, and it requires no sentience. The functioning of the parts can only be explained by how they contribute to the whole, and this is true whether we are speaking of the universe as a whole or a living organism. Since consciousness is primordial and contains all possible states, we should not look to the simple to explain the complex, but rather to the complex to explain the simple.
John Haught maintains that the idea of "God" as a designer is getting harder and harder to defend given that the universe is constantly evolving. Information is a complementarity of order and disorder; too much order is too rigid and does not allow for novelty and evolution. "If the universe or life were simply designed," says Haught, "it would be frozen in a fixed and eternally unchanging identity. Design is a dead end." Though Haught says that whether one calls such a primordial consciousness "God" is partly a matter of taste, that hasn't stopped him and other contributors to the last section of this work from making a desperate attempt to shoehorn God into the equation.
This work is an exhaustive and comprehensive treatment of the topic of information, and it greatly informed me on the subject. I would highly recommend it to anyone willing to wade through some fairly dense material in order to reach a clear understanding of the nature of information.
This review is by David Kreiter, author of "Confronting the Quantum Enigma: Albert, Niels, and John" (2011) and "Quantum Reality: A New Philosophical Perspective."
On the biological front, both Deacon's and Hofmeyer's chapters are of great interest, for they ask the great question: just what is information? Deacon focuses on the problem of content, i.e., how information--that is, a code--actually specifies content. Two machines can be synced up to send/receive and correctly encode/decode a sequence of high-low voltage blips. But in what sense is this information? Deacon labels it merely syntactic; the (semantic) question of the mapping to the content this information describes still remains, and Deacon has no particular solution.

What Deacon is discussing (if only indirectly) is actually the fundamental problem of perception. The light received from the surrounding environment--the ambient optic array--is transduced by the brain into a neural code (information). But a code, say three dots, "...", can stand for an "S" in Morse code, the three blind mice, or Assad's nose--i.e., multiple possible domains. This is the fundamental problem: what is the domain to which a code is mapped? How can the brain use a code to specify the external environment--the image of the external world (or the "content" of our perception)--without already knowing the domain, i.e., what the world looks like? The problem stated in terms of "qualia", as done by Chalmers, has simply been a misleading statement of this more general problem of the origin of the image of the external world--the content of perception.

This leads me to a major gap in this interesting book, namely the complete neglect of, nay, failure to grasp the significance of, the great theorist of perception, J. J. Gibson (The Senses Considered as Perceptual Systems) and his concept of information as residing in the invariance laws that define the transforming events of the external world. No model of the brain can ignore this fundamental, basic form of information. This structure of invariants is what the brain uses to specify (or be "specific to") the external world.
This "specific to" must be further placed within Bergson's conception that the brain is in effect a modulated reconstructive wave passing through the holographic universal field, modulated by these invariants and "specific to" a subset of the field--now an image (see Time and Memory: A Primer on the Scientific Mysticism of Consciousness). But Bergson's model requires the dynamic motion (over time) of this field to be, just as Davies wondered, non-differentiable. It is a solution to Deacon's problem of "content" and a conception of information that is revolutionary--and incapable of being handled under current notions of computation, Shannon's notion, or machine models of information. Perhaps someday the information-theory folks will discover it.