On March 23, 2006
But how much is 'a little'?
I first encountered this book when I was a physics and astronomy major in college, a major that changed over time to include mathematics proper, then political science, then other humanities such as religious studies, history, and philosophy. Strange as it may seem, this text has been one of the few constants, helpful in almost every field. For physics and any of the natural sciences, the content of this book is essential: be it chemistry, physics, astronomy, geology, or biology, every science depends upon observation and analysis, both of which are far from perfect. Achieving ever-greater observational and analytical precision is both an art and a science in its own right, and one of the tasks of any scientist is to discover where errors might lie.
Interestingly, this also occurs in political science and sociology, economics and history, and even philosophy (logic can incorporate ideas from error analysis, as can epistemology). Error analysis is primarily a statistical tool, and those who have had statistics will find much of this familiar. The first part of the book is very simple: Taylor assumes no background, so he gives an introduction to the simple reading of charts, graphs, scales, and other such things, with plenty of examples. He talks about estimating, significant figures, fractional uncertainties, and how uncertainties accumulate. How can 2 + 2 = 5? Well, if you round to the nearest whole number, 2.49 and 2.49 will each be rounded down to 2 (under most ordinary rounding procedures), yet if the underlying calculation keeps the 'real' values, 2.49 + 2.49 in fact equals 4.98, which rounds up to 5. If you think that's confusing, you ain't seen nothing yet...
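That rounding puzzle is easy to check for yourself. Here is a minimal sketch (not from the book, just an illustration with the reviewer's numbers) showing how rounded displays can fail to add up while the underlying values stay exact:

```python
# Two measurements that each display as "2" when rounded to a whole number,
# but whose true sum rounds to 5.
a = 2.49
b = 2.49

# Each value individually rounds down to 2.
print(round(a), round(b))   # 2 2

# The full-precision sum is 4.98, which rounds up to 5.
print(round(a + b))         # 5
```

The moral, which Taylor makes early on, is that rounding should happen once, at the end of a calculation, not at every intermediate step.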
Taylor's first part concludes by looking at the basics of simple statistical analysis: standard deviations, normal distributions, justification of the mean as best estimate, and a brief introduction to the concept of confidence. Part two gets into more detailed analysis, including least-squares fitting, correlation coefficients, binomial distributions, Poisson distributions, and the chi-squared test. The mathematics requirement goes up as the chapters progress: the early chapters only require an elementary knowledge of algebra; as the text continues, knowledge of differentiation, integration, and exponential functions is necessary. A first-year course in calculus should be sufficient for easy understanding here; it is possible to get through the material without this background, but it will be more difficult.
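To give a flavor of the part-one material, here is a minimal sketch (the measurement values are invented for illustration, not taken from the book) of reporting the mean of repeated measurements as the best estimate, with the standard deviation of the mean as its uncertainty:

```python
import statistics

# Hypothetical repeated measurements of the same quantity.
measurements = [9.78, 9.82, 9.81, 9.79, 9.80]
n = len(measurements)

# The mean is the best estimate of the true value.
best = statistics.mean(measurements)

# The sample standard deviation measures the spread of individual readings;
# dividing by sqrt(N) gives the standard deviation of the mean, i.e. the
# uncertainty in the best estimate.
sd = statistics.stdev(measurements)
sdom = sd / n ** 0.5

print(f"{best:.3f} +/- {sdom:.3f}")
```

This is exactly the kind of result (value plus-or-minus uncertainty) that Taylor spends the first half of the book teaching readers to produce and justify.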
This text is designed for student self-study; it can be introduced in lectures prior to lab work, but an independent reader can also follow it easily. The book is really intended for the physical scientist, as most of the examples come from problems in optics or mechanics (physics problems). Useful, helpful, and a good introduction to error analysis.
Read and understand.
On March 29, 2000
I bought the first edition of this book as an engineering graduate student in the early '80s, and it sparked my enduring fascination with statistical methods. I'm a software engineer with a major statistical software firm now, and I still refer to my copy regularly.
When I purchased the second edition this year for my son, who is a junior in college, I doubted that it could improve on the original.
I was wrong. John Taylor has outdone himself. The new examples are superb enhancements of an already outstanding text.
On October 8, 2013
After my university's physics lab course covered only the basic concepts of error analysis, and applied even those carelessly, I spent a long time searching for a book that covers the relevant aspects properly for once.
After reading this book, all my open questions have finally been fully resolved. With understandable explanations, simple examples, good summaries, and many exercises, the book is highly recommended and interesting for every aspiring natural scientist!