
Kindle price: EUR 28.36
includes VAT

A Farewell to Entropy: Statistical Thermodynamics Based on Information (Kindle Edition)
by Arieh Ben-Naim



Length: 411 pages · Word Wise: Enabled · Enhanced Typesetting: Enabled · Page Flip: Enabled · Language: English
  • Due to its file size, this book may take longer to download.


Product Descriptions

Short Description

The principal message of this book is that thermodynamics and statistical mechanics will benefit from replacing the unfortunate, misleading and mysterious term “entropy” with a more familiar, meaningful and appropriate term such as information, missing information or uncertainty. This replacement would facilitate the interpretation of the “driving force” of many processes in terms of informational changes and dispel the mystery that has always enshrouded entropy.

It has been 140 years since Clausius coined the term “entropy”; almost 50 years since Shannon developed the mathematical theory of “information” — subsequently renamed “entropy”. In this book, the author advocates replacing “entropy” by “information”, a term that has become widely used in many branches of science.

The author also takes a new and bold approach to thermodynamics and statistical mechanics. Information is used not only as a tool for predicting distributions but as the fundamental cornerstone concept of thermodynamics, held until now by the term “entropy”.

The topics covered include the fundamentals of probability and information theory; the general concept of information as well as the particular concept of information as applied in thermodynamics; the re-derivation of the Sackur-Tetrode equation for the entropy of an ideal gas from purely informational arguments; the fundamental formalism of statistical mechanics; and many examples of simple processes the “driving force” for which is analyzed in terms of information.

Contents:
  • Elements of Probability Theory
  • Elements of Information Theory
  • Transition from the General MI to the Thermodynamic MI
  • The Structure of the Foundations of Statistical Thermodynamics
  • Some Simple Applications

Readership: Anyone interested in the sciences, students, researchers; as well as the layman.



Product Information

  • Format: Kindle Edition
  • File size: 10730 KB
  • Print edition length: 411 pages
  • ISBN source for page count: 9812707069
  • Publisher: WSPC (18 January 2008)
  • Sold by: Amazon Media EU S.à r.l.
  • Language: English
  • ASIN: B004S06TU8
  • Text-to-Speech: Enabled
  • X-Ray:
  • Word Wise: Enabled
  • Enhanced Typesetting: Enabled
  • Average customer review: Be the first to write a review
  • Amazon Bestsellers Rank: #853,199 Paid in Kindle Store


Customer Reviews

There are no customer reviews on Amazon.de yet.

The most helpful customer reviews on Amazon.com

Amazon.com: 4.4 out of 5 stars, 9 reviews
18 of 18 customers found the following review helpful
5.0 out of 5 stars: A bridge between information theory and thermodynamics sure to inspire a new generation of developments in statistical mechanics (4 January 2009)
By Otto Normal, published on Amazon.com
Format: Paperback
Finally a book that brings information theory and thermodynamics together in a comprehensive way! Ben-Naim paves the way for a future generation of innovation in statistical thermodynamics using the tools of information theory.

The traditional understanding of entropy associates it with disorder. While this view is useful in many contexts, it fails to explain some properties of entropy. Ben-Naim leads us into identifying entropy with uncertainty, or the "missing information" of the system. Information possessed by whom? In this case, we are not talking about perception or communication. Wherever the number of states of a thermodynamic system resides, there resides the entropy.

Ben-Naim explains the so-called Gibbs paradox in a most satisfying way (Appendix O). A related phenomenon occurs when we mix two chemical species while leaving the volume and temperature unchanged. We originally have Na moles of gas A and Nb moles of gas B, each in its own container of volume V. If we now combine both substances into a single container, also of volume V, the entropy remains unchanged. If we insist on understanding entropy as disorder, the mixed container looks more disordered; yet the entropy stayed constant. This is not a paradox: in this process, the volume available for substance A and substance B to explore never changed, so the counting of states is unaltered. The missing locational information about gas A and gas B is constant for this process. This treatment applies to ideal mixtures, where the particles do not interact.
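This counting argument can be made concrete with the elementary formula for an isothermal ideal-gas volume change (a sketch of my own, not taken from the book; the one-mole amounts and unit volume are arbitrary choices):

```python
# Illustrative sketch (mine, not from the book): for an ideal gas at
# constant temperature, Delta S = n R ln(V_final / V_initial), so the
# entropy change depends only on the volume each species can explore.
import math

R = 8.314  # gas constant, J/(mol K)

def delta_S_ideal(n_mol, V_initial, V_final):
    """Isothermal entropy change of n_mol of an ideal gas."""
    return n_mol * R * math.log(V_final / V_initial)

# Ben-Naim's constant-volume "mixing": 1 mol of gas A in volume V and
# 1 mol of gas B in volume V are combined into ONE container, also of
# volume V. Each species still explores the same volume as before.
V = 1.0
dS_constant_volume = delta_S_ideal(1, V, V) + delta_S_ideal(1, V, V)
print(dS_constant_volume)  # 0.0: no entropy change, exactly as argued
```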

For processes where the particles do interact, we observe additional correlations, which reduce the missing information, a.k.a. entropy. The physical coupling of intermolecular forces translates into statistical correlations. Ben-Naim's presentation (Chap. 5) builds a further bridge between the statistical, information-theoretic understanding and thermodynamic entropy.
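How correlations lower the missing information can be seen in a toy example of my own (not from the book): for a perfectly correlated pair of binary variables, the joint MI is a full bit below the sum of the marginal MIs.

```python
# Toy illustration (mine, not from the book): correlations between two
# binary variables reduce the joint missing information below the sum
# of the marginal ones.
import math

def H(probs):
    """Shannon missing information, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Perfectly correlated pair: X = Y, each 0 or 1 with probability 1/2.
joint  = [0.5, 0.0, 0.0, 0.5]   # p(x, y) over {0,1} x {0,1}
marg_x = [0.5, 0.5]
marg_y = [0.5, 0.5]

print(H(marg_x) + H(marg_y))  # 2.0 bits if X and Y were independent
print(H(joint))               # 1.0 bit: correlation removed one full bit
```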

What I would call the jewel of the book is a rederivation of the Sackur-Tetrode equation for the entropy of an ideal gas (Sec. 5.4). We learn this equation from physical chemistry books as if set in stone, but what does it mean? The author rederives it by summing the missing information from four terms: locational uncertainty, momentum uncertainty, the quantum mechanical uncertainty principle, and the indistinguishability of the particles.
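The equation itself is easy to check numerically (my own sketch, not code from the book), using its standard form S/(Nk) = ln[V/(Nλ³)] + 5/2, with λ the thermal de Broglie wavelength:

```python
# Hedged numerical check (mine, not from the book): evaluating the
# standard Sackur-Tetrode formula S/(N k) = ln[V / (N lambda^3)] + 5/2
# for argon at 298.15 K and 1 atm.
import math

k   = 1.380649e-23    # Boltzmann constant, J/K
h   = 6.62607015e-34  # Planck constant, J s
N_A = 6.02214076e23   # Avogadro constant, 1/mol
R   = k * N_A         # gas constant, J/(mol K)

def sackur_tetrode_molar(mass_kg, T, P):
    """Molar entropy of a monatomic ideal gas, in J/(mol K)."""
    lam = h / math.sqrt(2 * math.pi * mass_kg * k * T)  # thermal wavelength
    v_per_particle = k * T / P                          # V/N for an ideal gas
    return R * (math.log(v_per_particle / lam**3) + 2.5)

m_argon = 39.948 * 1.66053907e-27  # atomic mass units -> kg
S = sackur_tetrode_molar(m_argon, 298.15, 101325.0)
print(round(S, 1))  # 154.7, close to the measured 154.8 J/(mol K)
```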

For all the bridges that Ben-Naim constructs between information theory and thermodynamics, there remain some gaps. "There is no formal proof that [the counting of states and the quantity defined by Clausius in terms of heat transfer and temperature] are identical. The validity of the relationship between the two quantities ultimately rests on the agreement between the calculated values of [entropy] and experimental data based on Clausius' definition." This state of affairs is, however, not unique to entropy, as science rests on many empirical relationships.

The book is well-written and can be used by researchers and students of undergraduate and graduate levels.
18 of 20 customers found the following review helpful
4.0 out of 5 stars: A modern approach to statistical thermodynamics (5 March 2008)
By Diego Casadei, published on Amazon.com
Format: Hardcover
In the Preface, the author states that there are many good books on statistical thermodynamics and that this is not a textbook on the subject. However, though a few other books make use of the concept of "information" in statistical thermodynamics, this seems to be the first that bases the full construction of the theory upon information. For this reason, Arieh Ben-Naim's approach is genuinely modern and deserves a careful reading.

Personally, I disagree that this book cannot be used as a textbook. Indeed, it is quite self-consistent and builds the core of the theory step by step, in such a way that any student is able to follow the arguments. It is true that it does not contain everything, but what textbook really contains everything? Maybe the only unpleasant thing for a student is the non-negligible amount of time spent commenting on the differences from other well-known references, in particular the explanations of the probable reasons why Gibbs did not reach the very same results. On the other hand, comments like these are a valuable resource for teachers.

If I had to choose one thing in this book, I would recommend enjoying the derivation of the Sackur-Tetrode equation (chapter 4): it is really beautiful and does not have the "shadows" that classical derivations suffer from. For the very first time, I should say, I think I have understood it, thanks to this book.

However, the most important point of the whole book, the real starting point of the full construction, is the following. Shannon's measure H of the missing information (MI) is a more general concept than the entropy S, which is a thermodynamic quantity defined _only_ for equilibrium states: H can be defined for _any_ probability distribution, and it turns out that H = S for the equilibrium state. Hence, though in thermodynamics only changes of S between different equilibrium states are defined, one can use the properties of H to perform derivations in a more general context. The results, when applied to thermodynamic equilibrium states, will also be valid for the entropy S.
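A minimal numerical illustration of this generality (my own sketch, not from the book): Shannon's H is defined for any distribution over W states, but it reaches its maximum, ln W, only at the uniform distribution, which is the value corresponding to the equilibrium entropy S/k for W equally probable micro-states.

```python
# Sketch (mine, not from the book): H is defined for ANY distribution,
# and over W states it is maximized by the uniform one, where H = ln W,
# matching S/k for W equally probable micro-states.
import math

def H(probs):
    """Shannon missing information, in nats."""
    return -sum(p * math.log(p) for p in probs if p > 0)

W = 4
uniform = [1.0 / W] * W
skewed  = [0.7, 0.1, 0.1, 0.1]   # an arbitrary non-equilibrium distribution

print(H(uniform))              # ln 4, about 1.386: the maximum, H = S/k
print(H(skewed) < H(uniform))  # True: any other distribution has lower H
```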

As the author explained in his introductory book Entropy Demystified: The Second Law Reduced to Plain Common Sense, the fact that S can only increase is an experimental observation. Its explanation is provided by a framework in which matter is discrete and composed of intrinsically indistinguishable particles, together with the postulate of equally probable microscopic states and the postulate that the system will be found more often in macroscopic (or "dim") states with higher probability (the latter being the sum of the probabilities of all practically indistinguishable microscopic states, under the assumption that they are all independent).

What is known as a "thermodynamic equilibrium state" is really a set of "dim states" (or "macro-states", following Gibbs) for which the measurable quantities (which are inherently macroscopic) differ only by negligible amounts, so that they are practically (though not in principle) indistinguishable. In turn, these dim states are (in principle) different because each contains all the "micro-states" (or "specific states", for Ben-Naim) that feature the very same values of the observable quantities. Under the assumption that all microscopic states are equally probable (dating back to Boltzmann and fully used by Gibbs), it turns out that the macro-states containing more micro-states are more probable, so the system will spend more time in them. The family of macro-states around the macro-state with the maximum number of micro-states is what is called the thermodynamic equilibrium state.
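The dominance of that family of macro-states can be checked by brute-force counting (a toy calculation of mine, not from the book):

```python
# Toy counting exercise (mine, not from the book): N particles in a box,
# each independently in the left or right half. The dim state with n
# particles on the left contains C(N, n) micro-states; dim states near
# n = N/2 contain almost all of them.
from math import comb

N = 1000
total = 2 ** N  # all micro-states
# micro-states whose left-half occupation lies within 5% of N/2
window = sum(comb(N, n) for n in range(450, 551))
print(window / total)  # about 0.9986: the "equilibrium" band dominates
```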

The entropy S is defined _only_ for this set of macro-states, that is, for the equilibrium state. However, Shannon's missing information H is defined for _each_ individual macro-state. That is why H is more general than S. By following these arguments, the full theory of statistical thermodynamics can be built, as you will find in the book by Arieh Ben-Naim.
18 of 21 customers found the following review helpful
2.0 out of 5 stars: As if there isn't enough confusion already... (23 August 2009)
By Themis Matsoukas, published on Amazon.com
Format: Hardcover
In the Introduction, Ben-Naim greets us with a teaser: those of us who think that the (ideal) entropy of mixing is positive have a problem. We should take our medication and read this book till we come to our senses and realize that the entropy of mixing is zero. But first things first.

Ben-Naim uses Shannon's entropy (information) to reinterpret statistical thermodynamics. This has been done before, most notably by Jaynes, who is quoted throughout the book. Ben-Naim, however, goes further to argue that all other interpretations of entropy are wrong. The very term "entropy", he argues, is part of the problem: it means nothing and should be abandoned. Even "enfometry" or "average surprisal" would be more meaningful terms. From here on the book goes back and forth between being a technical presentation of information theory and a repetitive litany of arguments about semantics. So, what about the entropy of mixing? The argument is laid out in section 6.7. It is long and somewhat rambling ("we already feel in our bones that some information has been lost forever in the process..." p. 279), but it goes something like this:

Consider the classical mixing experiment: a box divided into two parts, each filled with a different ideal gas. Remove the partition and let the system equilibrate. Ben-Naim argues that the corresponding increase of entropy arises not from mixing but from the expansion of the gases (each gas has more volume to roam). Mixing is entirely incidental, he argues, as relevant to this process as the shape of the container into which the gases expand. If we compress the mixed gases isothermally to half the volume of the mixture, their entropy becomes the same as that of the pure gases before mixing. Ergo, the entropy of mixing is zero! What the rest of us call "entropy of mixing", Ben-Naim continues, is a bad application of a bad term: "naming a quantity 'entropy of mixing' is more than naming a person or a thing by an arbitrarily chosen name. It is both inappropriate and potentially misleading" (p. 274). We wouldn't call it "entropy of squaring" if gases happened to expand in a square vessel; why then call it "entropy of mixing" if gases happen to mix as they expand? Thus goes the argument. None of this requires information theory, by the way; it can be argued with undergraduate classical thermodynamics.
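The bookkeeping behind this argument is elementary (a sketch of my own, not code from the book), again using ΔS = nR ln(V_final/V_initial) for an isothermal ideal gas:

```python
# Bookkeeping sketch for this argument (mine, not from the book): the
# expansion entropy on removing the partition is exactly cancelled by
# isothermal compression of the mixture back to the original volume.
import math

R = 8.314  # gas constant, J/(mol K)

def dS(n_mol, V_initial, V_final):
    """Isothermal entropy change of n_mol of an ideal gas."""
    return n_mol * R * math.log(V_final / V_initial)

# Remove the partition: each gas (1 mol) expands from V to 2V.
expansion = dS(1, 1.0, 2.0) + dS(1, 1.0, 2.0)
# Compress the 2 mol mixture isothermally from 2V back to V.
compression = dS(2, 2.0, 1.0)
print(round(expansion + compression, 9))  # 0.0: no residual "mixing" term
```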

Does this view advance our understanding of nature? Is mixing as incidental to the expansion experiment as Ben-Naim argues? Suppose it is. Then we should be able to reverse the state of the expanded mixture by compression. But that is not possible: the molecules of gas A would have to be compressed into the left half of the box and the molecules of gas B into the right, and this cannot be done because we don't know where the molecules are.

It is a pity that these irrelevant arguments undermine the premise of the book, which is (ought to be) to argue for an information theory of thermodynamics. Fortunately, this was done fifty years ago by Jaynes in his classic paper, Information Theory and Statistical Mechanics, Physical Review, vol. 106, p. 620, 1957, also included in the collection E.T. Jaynes: Papers on Probability, Statistics and Statistical Physics (Synthese Library).
9 of 10 customers found the following review helpful
5.0 out of 5 stars: an insightful and clear book on a controversial subject (31 October 2008)
By Alex Antonelli, published on Amazon.com
Format: Paperback | Verified Purchase
"A Farewell to Entropy: Statistical Thermodynamics Based on Information" by Arieh Ben-Naim is really a great book. One can consider it a more technical and detailed version of the wonderful little book by the same author, "Entropy Demystified". Although the concept of entropy has been around for almost 150 years, it still remains elusive and controversial. This can easily be seen from the large number of scientific articles, books, and scientific meetings that are currently dedicated to the foundations of the subject. The interpretation of the entropy of a system as missing information about the system has been around for a long time, since the works of Brillouin and Jaynes in the 1950s. However, it has been dismissed by many scientists as a subjective interpretation, although, as is wonderfully explained by Ben-Naim, these same scientists sponsored even more subjective interpretations, such as a measure of the disorder of the system. The book by Ben-Naim provides a solid and lucid explanation of missing information as a very precise and objective concept. The connection between missing information and thermodynamic entropy is also very clearly explained. The book also contains several examples that are very illuminating. I think the book should be read by anyone interested in statistical physics and the physics of complex systems.
4 of 4 customers found the following review helpful
5.0 out of 5 stars: Anyone trying to understand entropy should read this book. (12 February 2013)
By Bob Hanlon, published on Amazon.com
Format: Paperback
I'm writing my own book on the foundations of chemical engineering, in which the concept of entropy plays a significant role. In attempting to understand the inherent, fundamental meaning of entropy, I've read much of the relevant literature, starting back with Clausius himself and continuing through Boltzmann, Gibbs and Tolman. While the development of this historical context has been a fascinating experience for me, it wasn't until I started reading the works of Arieh Ben-Naim that I finally started to truly understand what both entropy and the 2nd law of thermodynamics are all about. Ben-Naim connects them using the work and insight of Claude Shannon's information theory. This was a very readable book and addressed many of the questions I had about the subject, which I appreciated. It has technical depth combined with clarity of message. This book complemented well two other Ben-Naim books on entropy that I recently read, "Discover Entropy and the Second Law of Thermodynamics" and "Entropy and the Second Law." I strongly recommend this book to anyone seeking to understand entropy.