At first glance, this might not look like a bioinformatics question. However, I suspect that an understanding of this area could influence my choice and use of proteomics pipeline software (e.g. mass deviation thresholds are important parameters in peptide identification programs).
My first thought is that since mass (or rather m/z) is the quantity being measured, accuracy should be stated in absolute units of m/z, i.e. thomsons (Th).
However, in practice, the relative unit ppm seems to be used instead. I find this confusing, since ppm will mean different things at different m/z values.
e.g. (taken from here; a small conversion sketch follows the list):
- 5 ppm @ m/z 300 = ±0.0015 Th
- 5 ppm @ m/z 3000 = ±0.015 Th
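Just to spell out the arithmetic I'm using above, here is a minimal Python sketch; the function names are mine, purely for illustration:

```python
def ppm_to_th(mz, tol_ppm):
    """Absolute tolerance (Th) corresponding to a relative tolerance (ppm) at a given m/z."""
    return mz * tol_ppm / 1e6

def th_to_ppm(mz, tol_th):
    """Relative tolerance (ppm) corresponding to an absolute tolerance (Th) at a given m/z."""
    return tol_th / mz * 1e6

for mz in (300.0, 3000.0):
    print(f"5 ppm @ m/z {mz:g} = ±{ppm_to_th(mz, 5):.4f} Th")
# 5 ppm @ m/z 300 = ±0.0015 Th
# 5 ppm @ m/z 3000 = ±0.0150 Th
```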
When describing the latest and greatest machines, the literature seems to stick with ppm, e.g. "Parts per million mass accuracy on an Orbitrap mass spectrometer via lock mass injection into a C-trap".
According to Gross in Mass spectrometry: a textbook:
As mass spectrometers tend to have similar absolute mass accuracies over a comparatively wide range, absolute mass accuracy represents a more meaningful way of stating mass accuracies than the more trendy use of ppm.
So, can someone perhaps shed light on why ppm seems to be preferred?
Even statistical treatments tend to use ppm where I would expect to see Th, e.g. Fig. 1 of this paper from the Mann lab, which plots the distribution of mass deviations in terms of ppm.
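For what it's worth, this is how I understand such a deviation plot to be computed, sketched in Python with made-up theoretical/observed m/z pairs (not data from the paper):

```python
# Hypothetical (theoretical, observed) m/z pairs, invented for illustration.
matches = [
    (445.1200, 445.1222),
    (1046.5418, 1046.5370),
    (2465.1989, 2465.2100),
]

for theo, obs in matches:
    dev_th = obs - theo                  # absolute deviation in Th
    dev_ppm = dev_th / theo * 1e6        # same deviation expressed in ppm
    print(f"m/z {theo:9.4f}: {dev_th:+.4f} Th, {dev_ppm:+5.1f} ppm")
```

If the instrument error really were roughly constant in Th, I would expect the ppm histogram to broaden at low m/z and tighten at high m/z, which is part of what I find puzzling about plotting in ppm.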
Thanks for your time.
UPDATE: I have cross-posted this question to the spctools-discuss Google group, which is dedicated to proteomics questions.