- Paperback: 640 pages
- Publisher: Cambridge University Press (1 November 2004)
- Language: English
- ISBN-10: 0521644445
- ISBN-13: 978-0521644440
- Average customer rating: 3 customer reviews
- Amazon Bestsellers Rank: No. 3,990,939 in foreign-language books
Information Theory, Inference and Learning Algorithms (English) Paperback – 1 November 2004
'This is an extraordinary and important book, generous with insight and rich with detail in statistics, information theory, and probabilistic modeling across a wide swathe of standard, creatively original, and delightfully quirky topics. David MacKay is an uncompromisingly lucid thinker, from whom students, faculty and practitioners all can learn.' Peter Dayan and Zoubin Ghahramani, Gatsby Computational Neuroscience Unit, University College, London
'This is primarily an excellent textbook in the areas of information theory, Bayesian inference and learning algorithms. Undergraduate and postgraduate students will find it extremely useful for gaining insight into these topics; however, the book also serves as a valuable reference for researchers in these areas. Both sets of readers should find the book enjoyable and highly useful.' David Saad, Aston University
'An utterly original book that shows the connections between such disparate fields as information theory and coding, inference, and statistical physics.' Dave Forney, Massachusetts Institute of Technology
'An instant classic, covering everything from Shannon's fundamental theorems to the postmodern theory of LDPC codes. You'll want two copies of this astonishing book, one for the office and one for the fireside at home.' Bob McEliece, California Institute of Technology
'… a quite remarkable work … the treatment is specially valuable because the author has made it completely up-to-date … this magnificent piece of work is valuable in introducing a new integrated viewpoint, and it is clearly an admirable basis for taught courses, as well as for self-study and reference. I am very glad to have it on my shelves.' Robotica
'With its breadth, accessibility and handsome design, this book should prove to be quite popular. Highly recommended as a primer for students with no background in coding theory, the set of chapters on error correcting codes are an excellent brief introduction to the elements of modern sparse graph codes: LDPC, turbo, repeat-accumulate and fountain codes are described clearly and succinctly.' IEEE Transactions on Information Theory
About the product
This exciting and entertaining textbook is ideal for courses in information, communication and coding. It is an unparalleled entry point to these subjects for professionals working in areas as diverse as computational biology, data mining, financial engineering and machine learning.
This book is available for free as a PDF on the author's website. I started with the online version but in the end bought the hard copy as well, as it reads better and, from my point of view, every cent is well invested.
The most helpful customer reviews on Amazon.com
This is a learning text, clearly meant to be read and understood. The presentation of topics is greatly expanded and includes much discussion, and although the book is dense, it is rarely concise. The exercises are absolutely essential to understanding the text. Although the author has made some effort to make certain chapters or topics independent, I think that this is one book for which it is best to more or less work straight through. For this reason and others, this book does not make a very good reference: occasionally nonstandard notation or terminology is used.
The biggest strength of this text, in my opinion, is on a philosophical level. It is my opinion, and in my opinion it is a great shame, that the vast majority of statistical theory and practice is highly arbitrary. This book will provide some tools to (at least in some cases) anchor your thinking to something less arbitrary. It's ironic that much of this is done within the Bayesian paradigm, something often viewed (and criticized) as being more arbitrary, not less so. But MacKay's way of thinking is highly compelling. This is a book that will not just teach you subjects and techniques, but will shape the way you think. It is one of the rare books that is able to teach how, why, and when certain techniques are applicable. It prepares one to "think outside the box".
I would recommend this book to anyone studying any of the topics covered by this book, including information theory, coding theory, statistical inference, or neural networks. This book is especially indispensable to a statistician, as there is no other book that I have found that covers information theory with an eye towards its application in statistical inference so well. This book is outstanding for self-study; it would also make a good textbook for a course, provided the course followed the development of the textbook very closely.
Each chapter contains a preface where the author tells you what exercises you should have done in order to be qualified to read it, and this is where I lost my patience with the book. It looks like a great self-study book, but after I spent a lot of time trying to follow the author's advice, I think the suggested exercises are too hard and the book doesn't contain enough preparation. Either you struggle with some excessively hard and time-consuming problems, or you just go to MacKay's solutions. There are many flattering reviews. I doubt the reviewers studied the book in the way it suggests. I found it much easier to study using Cover and Thomas's information theory book.
Another reason for my scepticism is this. The author makes available lectures online at "videolectures dot net" containing similar content to the book. However, the video lectures are simpler than this book. The lectures are given to undergraduates at Cambridge University. That David MacKay has to simplify the content even for these elite undergraduates accords with my guess that the book's suggested self-study routes are unrealistic.
The chapters are accessible, the language is clear, and the amount of math is just right for a student to start learning. You can start understanding the theory and mathematical rationale for the techniques without getting bogged down in pages of Greek-letter math theorems, as happens in LNCS volumes or math classes. I think the book is very well suited for people with roughly the equivalent of a bachelor's degree in EE, CS, physics or the like.
The book will get you up to date on things like Bayesian networks, error correcting codes, variational methods, stochastic optimization etc. I just wish the good doctor would add a chapter on unsupervised learning of stacked networks (basically, the recent advancements in deep learning).