Information Theory, Inference and Learning Algorithms [English] [Hardcover]

David J. C. MacKay
4.7 out of 5 stars (3 customer reviews)
Price: EUR 45.95 with free delivery.
  All prices include VAT.

Other editions

Edition        Amazon price
Hardcover      EUR 45.95
Paperback      --


25 September 2003
Information theory and inference, taught together in this exciting textbook, lie at the heart of many important areas of modern technology - communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics and cryptography. The book introduces theory in tandem with applications. Information theory is taught alongside practical communication systems such as arithmetic coding for data compression and sparse-graph codes for error-correction. Inference techniques, including message-passing algorithms, Monte Carlo methods and variational approximations, are developed alongside applications to clustering, convolutional codes, independent component analysis, and neural networks. Uniquely, the book covers state-of-the-art error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes - the twenty-first-century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, the book is ideal for self-learning, and for undergraduate or graduate courses. It also provides an unparalleled entry point for professionals in areas as diverse as computational biology, financial engineering and machine learning.
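The pairing of theory with practical error-correction that the description promises can be made concrete with the simplest possible code. The sketch below (an illustration, not code from the book; all function names are mine) sends each bit three times through a simulated binary symmetric channel and decodes by majority vote, the introductory repetition-code example that MacKay uses to motivate more powerful sparse-graph codes:

```python
import random

def transmit(bits, flip_prob, rng):
    """Binary symmetric channel: flip each bit with probability flip_prob."""
    return [b ^ (rng.random() < flip_prob) for b in bits]

def encode_r3(bits):
    """R3 repetition code: send every source bit three times."""
    return [b for b in bits for _ in range(3)]

def decode_r3(received):
    """Majority-vote decoding: recover each bit from its three copies."""
    return [int(sum(received[i:i + 3]) >= 2) for i in range(0, len(received), 3)]

rng = random.Random(0)
message = [rng.randint(0, 1) for _ in range(10000)]
noisy = transmit(encode_r3(message), flip_prob=0.1, rng=rng)
decoded = decode_r3(noisy)
errors = sum(m != d for m, d in zip(message, decoded))
print(f"bit error rate after decoding: {errors / len(message):.4f}")
```

Repetition cuts the residual error rate from p to roughly 3p^2 at the cost of tripling the transmission; the book's later chapters show how sparse-graph codes approach the channel capacity far more efficiently.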


Frequently bought together

Information Theory, Inference and Learning Algorithms + Bayesian Reasoning and Machine Learning + Probabilistic Graphical Models: Principles and Techniques (Adaptive Computation and Machine Learning)
Price for all three: EUR 173.85

  • Hardcover: 640 pages
  • Publisher: Cambridge University Press (25 September 2003)
  • Language: English
  • ISBN-10: 0521642981
  • ISBN-13: 978-0521642989
  • Dimensions: 25 x 20 x 4 cm
  • Average customer rating: 4.7 out of 5 stars (3 customer reviews)
  • Amazon bestseller rank: No. 26,895 in foreign-language books




'This is an extraordinary and important book, generous with insight and rich with detail in statistics, information theory, and probabilistic modeling across a wide swathe of standard, creatively original, and delightfully quirky topics. David MacKay is an uncompromisingly lucid thinker, from whom students, faculty and practitioners all can learn.' Peter Dayan and Zoubin Ghahramani, Gatsby Computational Neuroscience Unit, University College, London

'This is primarily an excellent textbook in the areas of information theory, Bayesian inference and learning algorithms. Undergraduate and postgraduate students will find it extremely useful for gaining insight into these topics; however, the book also serves as a valuable reference for researchers in these areas. Both sets of readers should find the book enjoyable and highly useful.' David Saad, Aston University

'An utterly original book that shows the connections between such disparate fields as information theory and coding, inference, and statistical physics.' Dave Forney, Massachusetts Institute of Technology

'An instant classic, covering everything from Shannon's fundamental theorems to the postmodern theory of LDPC codes. You'll want two copies of this astonishing book, one for the office and one for the fireside at home.' Bob McEliece, California Institute of Technology

'… a quite remarkable work … the treatment is specially valuable because the author has made it completely up-to-date … this magnificent piece of work is valuable in introducing a new integrated viewpoint, and it is clearly an admirable basis for taught courses, as well as for self-study and reference. I am very glad to have it on my shelves.' Robotica

'With its breadth, accessibility and handsome design, this book should prove to be quite popular. Highly recommended as a primer for students with no background in coding theory, the set of chapters on error correcting codes are an excellent brief introduction to the elements of modern sparse graph codes: LDPC, turbo, repeat-accumulate and fountain codes are described clearly and succinctly.' IEEE Transactions on Information Theory

About the product

This exciting and entertaining textbook is ideal for courses in information, communication and coding. It is an unparalleled entry point to these subjects for professionals working in areas as diverse as computational biology, data mining, financial engineering and machine learning.


Inside this book

From the first page: "In this chapter we discuss how to measure the information content of the outcome of a random experiment."
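The "information content" in the excerpt above has a standard quantitative form: the Shannon information content of an outcome with probability p is log2(1/p) bits, and entropy is its average over a distribution. A minimal sketch (function names are mine, not the book's):

```python
from math import log2

def information_content(p):
    """Shannon information content of an outcome with probability p, in bits."""
    return log2(1 / p)

def entropy(probs):
    """Entropy of a distribution: the expected information content."""
    return sum(p * information_content(p) for p in probs if p > 0)

# A fair coin flip carries exactly one bit.
print(information_content(0.5))   # 1.0
# A biased coin (90% heads) is more predictable, so it carries less than a bit.
print(entropy([0.9, 0.1]))
```

Rarer outcomes carry more information, and a maximally uncertain source maximizes entropy; these are the quantities the book's compression and coding results are built on.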



The most helpful customer reviews

2 of 2 customers found the following review helpful
5.0 out of 5 stars: simply great (21 October 2012)
Format: Hardcover | Verified purchase
The book is simply great. Its key feature is that MacKay nails things down in two clear paragraphs where other authors are fuzzy and write two pages.

This book is available for free as a PDF on the author's website. I started with the online version but eventually bought the hard copy as well: it reads better and, from my point of view, every cent is well invested.
4.0 out of 5 stars: Foundations of Machine Learning (7 May 2014)
Format: Hardcover | Verified purchase
This book is really nice and quite easy to read. I would recommend this book to every student who starts working in machine learning.
0 of 3 customers found the following review helpful
5.0 out of 5 stars: Prima (12 July 2013)
By jaz
Format: Hardcover | Verified purchase
The book is amazing, but hey, Amazon, do I really have to type twenty words? Come on. . . .
The most helpful customer reviews on (beta): 4.3 out of 5 stars (20 reviews)
46 of 47 customers found the following review helpful
5.0 out of 5 stars: Outstanding book, especially for statisticians (2 October 2007)
By Alexander C. Zorach
Format: Hardcover
I find it interesting that most of the people reviewing this book seem to be reviewing it as they would any other information theory textbook. Such a review, whether positive or critical, could not hope to give a complete picture of what this text actually is. There are many books on information theory, but what makes this book unique (and in my opinion what makes it so outstanding) is the way it integrates information theory with statistical inference. The book covers topics including coding theory, Bayesian inference, and neural networks, but it treats them all as different pieces of a unified puzzle, focusing more on the connections between these areas, and the philosophical implications of these connections, and less on delving into depth in one area or another.

This is a learning text, clearly meant to be read and understood. The presentation of topics is greatly expanded and includes much discussion, and although the book is dense, it is rarely concise. The exercises are absolutely essential to understanding the text. Although the author has made some effort to make certain chapters or topics independent, I think that this is one book for which it is best to more or less work straight through. For this reason and others, this book does not make a very good reference: occasionally nonstandard notation or terminology is used.

The biggest strength of this text, in my opinion, is on a philosophical level. It is my opinion, and in my opinion it is a great shame, that the vast majority of statistical theory and practice is highly arbitrary. This book will provide some tools to (at least in some cases) anchor your thinking to something less arbitrary. It's ironic that much of this is done within the Bayesian paradigm, something often viewed (and criticized) as being more arbitrary, not less so. But MacKay's way of thinking is highly compelling. This is a book that will not just teach you subjects and techniques, but will shape the way you think. It is one of the rare books that is able to teach how, why, and when certain techniques are applicable. It prepares one to "think outside the box".

I would recommend this book to anyone studying any of the topics covered by this book, including information theory, coding theory, statistical inference, or neural networks. This book is especially indispensable to a statistician, as there is no other book that I have found that covers information theory with an eye towards its application in statistical inference so well. This book is outstanding for self-study; it would also make a good textbook for a course, provided the course followed the development of the textbook very closely.
30 of 34 customers found the following review helpful
5.0 out of 5 stars: Good value text on a spread of interesting and useful topics (19 February 2005)
By Iain
Format: Hardcover
I am a PhD student in computer science. Over the last year and a half this book has been invaluable (and parts of it a fun diversion).

For a course I help teach, the introductions to probability theory and information theory save a lot of work. They are accessible to students with a variety of backgrounds (they understand them and can read them online). They also lead directly into interesting problems.

While I am not directly studying data compression or error correcting codes, I found these sections compelling. Incredibly clear exposition; exciting challenges. How can we ever be certain of our data after bouncing it across the world and storing it on error-prone media (things I do every day)? How can we do it without >60 hard-disks sitting in our computer? The mathematics uses very clear notation --- functions are sketched when introduced, theorems are presented alongside pictures and explanations of what's really going on.

I should note that a small number (roughly 4 or 5 out of 50) of the chapters on advanced topics are much more terse than the majority of the book. They might not be of interest to all readers, but if they are, they are probably more friendly than finding a journal paper on the same topic.

Most importantly for me, the book is a valuable reference for Bayesian methods, on which MacKay is an authority. Sections IV and V brought me up to speed with several advanced topics I need for my research.
20 of 22 customers found the following review helpful
5.0 out of 5 stars: A must have... (28 February 2005)
By Rich Turner
Format: Hardcover
Uniting information theory and inference in an interactive and entertaining way, this book has been a constant source of inspiration, intuition and insight for me. It is packed full of stuff - its contents appear to grow the more I look - but the layering of the material means the abundance of topics does not confuse.

This is _not_ just a book for the experts. However, you will need to think and interact when reading it. That is, after all, how you learn, and the book helps and guides you in this with many puzzles and problems.
10 of 11 customers found the following review helpful
5.0 out of 5 stars: A Bayesian View: Excellent Topics, Exposition and Coverage (20 November 2008)
By Edward Donahue
Format: Hardcover
I am reviewing David MacKay's 'Information Theory, Inference, and Learning Algorithms', but I haven't yet read it completely. It will be years before I finish it, since it contains the material for several advanced undergraduate or graduate courses. However, it is already on my list of favorite texts and references. It is a book I will keep going back to time after time, but don't take my word for it. According to the back cover, Bob McEliece, the author of a 1977 classic on information theory, recommends you buy two copies, one for the office and one for home. There are topics in this book I am aching to find the time to read, work through and learn.

It can be used as a textbook, as a reference book, or to fill in gaps in your knowledge of information theory and related material. MacKay outlines several courses for which it can be used, including his Cambridge course on Information Theory, Pattern Recognition and Neural Networks, a short course on information theory, and a course on Bayesian inference and machine learning. As a reference it covers topics not easily accessible in books, including a variety of modern codes (hash codes, low-density parity-check codes, digital fountain codes, and many others) and Bayesian inference techniques (maximum likelihood, Laplace's method, variational methods and Monte Carlo methods). It has interesting applications, such as information theory applied to genes and evolution and to machine learning.

It is well written, with good problems, some of which help to understand the theory while others help to apply it. Many are worked as examples, and some are especially recommended. He works to keep your attention and interest, and knows how to do it. For example, chapter titles include 'Why Have Sex' and 'Crosswords and Codebreaking'. His web site ( [...] ) is a wondrous collection of resource material, including code supporting a variety of topics in the book. The book is available online to browse, either through Google Books or via a link from his web site, but you need to have it in hand, and spend time with it, to truly appreciate it.
10 of 12 customers found the following review helpful
5.0 out of 5 stars: pretty much indispensable (26 September 2008)
By S. Matthews
Format: Hardcover
This is an unqualified classic, to shelve with the likes of 'Structure and Interpretation of Computer Programs', 'Concrete Mathematics' and 'Mathematical Methods of Classical Mechanics'. If you are involved with, or interested in, high-end data analytics, then you _need_ this.

However 'high-end data analytics' does not even begin to do the book justice, so let me try again.

This is a magnificent compendium of fascinating stuff presented in a coherent information-theoretic framework. It covers everything from how digital television data compression and CD error correction work to a detailed commentary on neural networks, and discussion of principled AI methods such as clustering, Gaussian processes and probabilistic graphical models, together with Monte Carlo techniques and a bunch of statistical physics. It even throws in a complete course in Bayesian statistics. It reads like a really good 'popular' 'science' book (I often wonder where the scare quotes should be) that doesn't bother to try to be popular.

In fact I bought this originally as bedside reading, for pleasure. It was only later that I actually used it for anything.