on July 8, 1999
Rarely do I encounter a book of such technical quality that is also a pleasure to read. Bishop moves through sometimes difficult topics in a clear, well-motivated style that is appropriate both as an introduction and as a desktop reference on neural nets. Definitely on the "A list."
Bishop chose not to include discussions of a number of topics that might have diluted his focus on pattern recognition (for example, Hebbian learning and neural net approaches to principal components analysis). I think these choices greatly strengthened the integrity of his presentation.
I would love to see an updated edition with a discussion of recent results in statistical learning theory, kernel methods and support vector machines.
on June 21, 1999
I'd like to agree with previous reviewers. Note that you will need a good mathematical background (especially in statistics) to understand the content. However, the book is completely thorough in developing all the key concepts and really tries to give you insight into the meaning behind the equations. Its style is that of an undergraduate-level textbook, but a very well written one. To use neural nets effectively, I think you need to have at least one book like this.
on May 28, 2013
It has been a long time since 1995, and many new techniques and important developments have taken place in the field of A.I., and more concretely in machine learning. Still, this book has aged very well, for two reasons. First, it covers the fundamental techniques and concepts that every practitioner must understand and be able to make use of, such as non-parametric techniques for density estimation (e.g. kNN), dimensionality reduction (PCA), and mixture models, in addition to, of course, neural networks. Second, with its last chapter on Bayesian techniques, this book paves the way for moving on to modern techniques like deep energy models and deep belief networks.
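To give a flavor of the dimensionality-reduction material the review mentions, here is a minimal sketch of PCA via eigendecomposition of the sample covariance matrix. The data and variable names are illustrative assumptions, not taken from the book.

```python
import numpy as np

# Hedged sketch of PCA: find the direction of maximum variance
# in some synthetic, strongly correlated 2-D data.
rng = np.random.default_rng(1)
x1 = rng.normal(size=200)
# second coordinate is mostly a copy of the first, plus small noise
X = np.column_stack([x1, 0.9 * x1 + 0.1 * rng.normal(size=200)])

Xc = X - X.mean(axis=0)                 # center the data
cov = Xc.T @ Xc / (len(Xc) - 1)         # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order

pc1 = eigvecs[:, -1]                    # leading principal component
scores = Xc @ pc1                       # 1-D projection of the data
explained = eigvals[-1] / eigvals.sum() # fraction of variance captured
print(explained)
```

Because the two coordinates are almost perfectly correlated, nearly all the variance lies along the first principal component.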
The explanations are clear and pleasant to read. Properties of, and advances based on, neural networks are presented in a principled way in the context of statistical pattern recognition. The exercises are wisely chosen to ensure understanding of the presented results and of the conditions under which they were derived.
But this book goes beyond theory. A chapter is devoted to optimization techniques, i.e. the algorithms used to train neural networks in practice. After reading that chapter and going through the exercises, you will have a good understanding of conjugate gradients and BFGS.
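As a small illustration of the kind of optimizer that chapter discusses, here is a hedged sketch of the Fletcher-Reeves conjugate-gradient method on a simple quadratic objective. The matrix and vector are made up for the example; real neural-net trainers add line searches and restarts.

```python
import numpy as np

# Minimize f(w) = 0.5 w^T A w - b^T w, whose gradient is A w - b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])  # symmetric positive definite
b = np.array([1.0, 1.0])

def grad(w):
    return A @ w - b

w = np.zeros(2)
g = grad(w)
d = -g  # initial search direction = steepest descent
for _ in range(10):
    # exact line search for a quadratic: alpha = -(g.d) / (d.A.d)
    alpha = -(g @ d) / (d @ A @ d)
    w = w + alpha * d
    g_new = grad(w)
    if np.linalg.norm(g_new) < 1e-12:   # converged
        break
    beta = (g_new @ g_new) / (g @ g)    # Fletcher-Reeves update
    d = -g_new + beta * d
    g = g_new

# For an n-dimensional quadratic, CG converges in at most n steps,
# so w should now solve A w = b.
print(w)
```

Here n = 2, so two conjugate-gradient steps reach the exact minimizer, whereas plain gradient descent would zig-zag toward it.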
The chapter on how to improve generalization, either by optimizing the structure of the network or by combining multiple classifiers, is kept at an intuitive level, yet the concepts are well motivated and the few mathematical details help achieve a solid grasp of why those ideas work. As in the rest of the chapters, it explains how to carry things out in practice, i.e. how I can check whether my classifier has actually improved. By the end of the chapter the reader is familiar with the concepts of regularization (weight decay), cross-validation, and bagging.
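The weight-decay and validation ideas above can be sketched in a few lines. This is an illustrative toy (linear regression with an L2 penalty, i.e. ridge regression, and a simple hold-out set), not the book's own code; all data and names are assumptions.

```python
import numpy as np

# Synthetic regression problem: only 3 of 10 features matter.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 10))
true_w = np.zeros(10)
true_w[:3] = [2.0, -1.0, 0.5]
y = X @ true_w + rng.normal(scale=1.0, size=40)

# split into training and hold-out validation sets
X_tr, y_tr = X[:30], y[:30]
X_va, y_va = X[30:], y[30:]

def fit_ridge(X, y, lam):
    # closed form with weight decay: w = (X^T X + lam I)^(-1) X^T y
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

def mse(w, X, y):
    return np.mean((X @ w - y) ** 2)

# choose the weight-decay strength with the best validation error
lams = [0.0, 0.1, 1.0, 10.0]
errs = {lam: mse(fit_ridge(X_tr, y_tr, lam), X_va, y_va) for lam in lams}
best = min(errs, key=errs.get)
print(best, errs)
```

Full k-fold cross-validation would average the validation error over several such splits instead of using a single hold-out set, and bagging would average the predictions of models fit to bootstrap resamples.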
on June 8, 1996
Bishop cuts through the hype surrounding neural networks, and shows how they relate to standard techniques in statistical pattern recognition. He concentrates on feedforward and radial basis function networks, which are the ones used most widely in practice. This book is about as mathematical as Hertz, Krogh and Palmer ("An Introduction to the Theory of Neural Computation", 1991), but is probably easier to read, and is certainly of more use to the practitioner. A real gem!