Data Reduction and Error Analysis for the Physical Sciences (English) Paperback – August 1, 2002


First sentence
It is a well-established fact of scientific investigation that the first time an experiment is performed the results often bear all too little resemblance to the "truth" being sought.

Customer reviews


Most helpful customer reviews (beta) – 21 reviews
62 of 63 people found this review helpful
"Updated" classic, but still vintage '92 – January 5, 1998
By A Customer - Published on
Format: Paperback
Robinson's second edition continues the late Bevington's tradition of clear and concise writing, making this book a priceless reference for scientists. Robinson has added discussions of modern problems such as resolving closely-spaced peaks in a spectrum. The new version also adds chapters on Monte Carlo techniques and maximum-likelihood analysis, both powerful tools for data analysis made possible by better computers.
The chapter structure has been modified considerably, so those who have grown comfortable with the first edition over the past decades may not be able to find things as easily. Other than that, most of the weaknesses are computer-related. Much has changed even since 1992.
Robinson added an appendix on graphical presentation. This sounds promising but is a fairly trivial discussion of when to use linear or logarithmic axes and of the advantages of a histogram. It might be useful for a very young student, but these days playing with such things is easy in any graphing program.
Many of the computer code snippets have been removed. Most of them were only a few lines of code with lots of comment lines anyway. The codes that remain have been moved from the main text to a densely-packed appendix, which makes them more difficult to study while reading the text.
The codes themselves have been updated from old FORTRAN to a structured language, but I would have preferred C or FORTRAN 90 over the chosen PASCAL. The latter may be useful for undergraduate students, but I've never seen a PASCAL compiler in a working physics lab.
The included disk is a now-obsolete 5.25" floppy. I had to hunt for a machine that could read it and copy the files over to a 3.5" disk. The text claims repeatedly that the disk has both FORTRAN 77 and PASCAL routines on it, but my copy only has the PASCAL.
In the end, it's the textual content that is important, and this book is a fantastic basic discussion of data analysis and statistics for students and a great reference for the practicing scientist.
13 of 14 people found this review helpful
Great Book – May 22, 2002
By A Customer - Published on
Format: Paperback
I make measurements frequently and this book is great for providing the background to analyze your data.
I took undergraduate level statistics and it never really gave the practical applied background in how to analyze data. It merely presented concepts and presumed you knew how and why to apply them. This book is very good at helping you to understand the how and why.
I have read a number of other statistics books in search of the practical applied information provided in this book and did not find it in the other books.
The writing is clear and concise. There is enough background provided even for those unexposed to statistics.
I have not tried the software. Most of the formulas are easy to apply and can be implemented in simple programs or spreadsheets in very little time.
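For example, the inverse-variance weighted mean, a staple formula in books like this one, takes only a few lines. A minimal sketch (my own illustration, not code from the book's disk):

```python
import math

def weighted_mean(values, sigmas):
    """Inverse-variance weighted mean and its standard error.

    Each measurement x_i carries an uncertainty sigma_i; the weights
    are w_i = 1/sigma_i**2, and the combined uncertainty of the mean
    is 1/sqrt(sum of weights).
    """
    weights = [1.0 / s**2 for s in sigmas]
    mean = sum(w * x for w, x in zip(weights, values)) / sum(weights)
    sigma_mean = 1.0 / math.sqrt(sum(weights))
    return mean, sigma_mean

# Two measurements of the same quantity with different precision:
m, s = weighted_mean([10.0, 12.0], [1.0, 2.0])
print(round(m, 3), round(s, 3))  # prints "10.4 0.894" - the precise point dominates
```

A spreadsheet version is just as short: one column of weights, one SUMPRODUCT, one square root.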
In short, I recommend this book to anyone making measurements of any kind.
8 of 9 people found this review helpful
Buy Taylor's Book Instead – December 14, 2009
By Alex Zaharakis - Published on
Format: Paperback
Simply put, not as clear as John Taylor's error analysis book. If you want a good explanation for how and why to approach error analysis, this book does not do the job.
7 of 8 people found this review helpful
All new but just as good – July 26, 2004
By misterbeets - Published on
Format: Paperback
This book seems to have been completely rewritten by the new author, keeping only the outline of the original, and it's for the better. The writing is as careful and economical as the original: things start off slowly but quickly become difficult, so you have to master the early chapters or the rest is hopeless. It begins by considering the error in a single measurement and proceeds to estimating errors derived from curve fitting. A few nuclear decay experiments provide examples throughout, and the author insists on calculating many quantities manually, even though in practice it would never be done that way. Some background topics, like matrix algebra, appear in the appendix as well.
3 of 3 people found this review helpful
A Lot of Fundamental Mistakes – March 3, 2014
By Peter B - Published on
Format: Paperback
This book contains a lot of fundamental mistakes. I'm not talking about the typos that some people complain about; I mean genuine errors, where the corrected information can be found in fairly basic probability and statistics books. Here are some of them:

1. p. 31: The authors claim that the Lorentzian (Cauchy) distribution has mean mu. In fact, the mean is not defined. The parameter mu is the median, but not the mean (although the distribution is symmetric about it). If the importance of this fact is not clear to you, here is an example: if the Lorentzian (Cauchy) distribution had a mean, the law of large numbers would apply to it, but in fact it does not. Google the Cauchy distribution for more info.
2. p. 66 (both figures): The authors claim that the distribution of the number of points in each bin is Poisson. In fact, it is binomial. Although the binomial converges to the Poisson, the approximation is reasonable only for really small p (think of the variance of the binomial, which is (1-p) times the variance of the Poisson with lambda = n*p, so with p = 0.1 we still get a 10% difference).
3. p. 67, formula (4.32): The authors divide by the variance, which may seem intuitive, but in fact you are supposed to divide by the expected count. Since they incorrectly assume a Poisson distribution, they end up with the correct denominator n*p (lucky for them). If they divided by the variance of the correct binomial, they would get n*p*(1-p), which is wrong. If you correctly use the binomial and divide by the expected count, you get the correct denominator n*p.
4. p. 67, formula (4.33): The first part of that equation is correct, but then the authors feel the need to replace n*p with the observed count h(x), assuming that n*p is approximately h(x). You never want to do this, for two reasons:

a. The whole purpose of the chi-squared test is to check if n*p is close to h(x). If n*p is approximately h(x), you might as well write zero in the numerator.
b. The chi-squared approximation will not work well.

5. p. 66 (both figures): The bins for the chi-squared statistic should have equal p, not equal width (as is done for histograms). With equal widths the chi-squared approximation does not work well for small samples.
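Point 1 is easy to check numerically: the integral defining the mean of a standard Cauchy density never settles down as the integration limit grows. A minimal sketch (my own illustration, not tied to the book):

```python
import math

def cauchy_pdf(x):
    """Standard Cauchy (Lorentzian) density with mu = 0, gamma = 1."""
    return 1.0 / (math.pi * (1.0 + x * x))

def partial_mean_integral(T, n=100000):
    """Midpoint-rule approximation of the integral of x*f(x) from 0 to T.

    If the mean existed, this one-sided contribution would converge as
    T grows; for the Cauchy it grows like ln(T) without bound.
    """
    h = T / n
    return sum(((i + 0.5) * h) * cauchy_pdf((i + 0.5) * h) * h
               for i in range(n))

for T in (10, 100, 1000):
    print(T, round(partial_mean_integral(T), 3))
# Matches the closed form ln(1 + T^2)/(2*pi): 0.735, 1.466, 2.199 - still climbing.
```

Since the positive-tail contribution diverges, the mean is undefined, and averaging more Cauchy samples does not home in on mu the way the law of large numbers would require.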

I could continue, but I think I will stop here. This looks pretty embarrassing. The book seems to be based on what the authors thought to be correct, rather than on facts checked against other statistics books. I know it is supposed to be a practical book, but the facts should be correct, and using an incorrect formula (for example, for chi-squared) is not good for practitioners.

The explanations are also poor, as pointed out by other reviewers. One example is Section 3.3 on Error Propagation, where all formulas are presented as exact, although in fact some of them are only approximations. In other contexts the authors are careful to write that, for example, the square root of 723 is approximately 26.9. The error in the propagation formulas can of course be much cruder, but this is not discussed, other than in a general discussion of approximations in Section 3.2.
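To make the point concrete: for a product f = x*y of independent measurements, the usual first-order propagation formula misses a term that can be written down exactly. A minimal sketch with hypothetical measurement values (my own illustration, not from the book):

```python
# First-order error propagation for f = x*y versus the exact variance.
# For independent x and y:
#   first-order: var(f) ~ (mean_y*sigma_x)**2 + (mean_x*sigma_y)**2
#   exact:       var(f) = (mean_y*sigma_x)**2 + (mean_x*sigma_y)**2
#                         + (sigma_x*sigma_y)**2
mu_x, sigma_x = 10.0, 1.0   # hypothetical measurement x = 10 +/- 1
mu_y, sigma_y = 5.0, 0.5    # hypothetical measurement y = 5 +/- 0.5

approx_var = (mu_y * sigma_x)**2 + (mu_x * sigma_y)**2
exact_var = approx_var + (sigma_x * sigma_y)**2

print(approx_var, exact_var)  # prints "50.0 50.25" - close here, but not equal
```

For a pure sum f = x + y the first-order formula happens to be exact, which is why the distinction is easy to miss; for products, ratios, and anything nonlinear it is only an approximation, and the gap grows with the relative errors.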

I was sad to see that this book apparently is a classic, and I’m guessing it has been used by generations of scientists. The only good news is that students typically do not remember all that was written in their textbook.

The book is quite fascinating to me as an example of how people who know so much about probability and statistics can be so deeply confused by it. That is correct; the authors know a lot about the topic. Enough to be dangerous, I guess, as the saying goes.