
Let's say I have a sensor that measures pressure over a range from $10$ to $60 \text{ mmHg}$.

This sensor has an error of $\pm 0.8 \text{ mmHg}$.

Is there a way to quantify how large this error is with respect to the range of values the sensor can sense? I want to figure out what the relative size of this error is.

Gilles

2 Answers


This sensor has an error of $\pm 0.8 \text{ mmHg}$.

More correctly, that is not an error but an uncertainty (you can find a definition of these two terms in the International Vocabulary of Metrology: error, §2.16, and uncertainty, §2.26). You don't need to prefix the uncertainty value with $\pm$.

Is there a way to quantify how large this error is with respect to the range of values the sensor can sense?

Without further specification from the manufacturer, you should conservatively assume that the uncertainty is the same along the whole range. The relative uncertainty, which is the ratio between the uncertainty and the measured value, then changes from $0.8/10\approx 8\times 10^{-2} = 8\,\%$ at the beginning of the scale to $0.8/60\approx 1.3\times 10^{-2} = 1.3\,\%$ at the full-scale value.
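As a quick illustration of that calculation, here is a minimal Python sketch (the values and variable names are just placeholders taken from the question, not part of any sensor datasheet) that evaluates the relative uncertainty at a few points across the range:

```python
# Relative uncertainty of a reading, assuming a constant absolute
# uncertainty of 0.8 mmHg over the whole 10-60 mmHg range.
uncertainty = 0.8            # mmHg, absolute uncertainty from the spec
p_min, p_max = 10.0, 60.0    # mmHg, sensor range

for pressure in (p_min, 20.0, 40.0, p_max):
    relative = uncertainty / pressure
    print(f"{pressure:5.1f} mmHg -> relative uncertainty {relative:.1%}")

# 10.0 mmHg -> relative uncertainty 8.0%
# 60.0 mmHg -> relative uncertainty 1.3%
```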

Massimo Ortolano

If you are wondering whether the error is larger at the high end of the range than at the low end, you could test this empirically: take a large number of readings at a controlled pressure, calculate a confidence interval from those readings, and repeat the procedure at several points across the range.
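For example, a minimal sketch of that calculation in Python (the readings are hypothetical and assumed roughly normally distributed; for small samples a Student's t factor would give a slightly wider interval than the z value used here):

```python
import statistics
from math import sqrt

# Hypothetical repeated readings taken at a controlled pressure of 20 mmHg.
readings = [20.3, 19.8, 20.6, 20.1, 19.7, 20.4, 20.0, 19.9]

n = len(readings)
mean = statistics.mean(readings)
s = statistics.stdev(readings)   # sample standard deviation

# Approximate 95% confidence interval for the mean (z ~ 1.96).
half_width = 1.96 * s / sqrt(n)
print(f"mean = {mean:.2f} mmHg, 95% CI = +/- {half_width:.2f} mmHg")
```

Comparing the interval widths obtained at different controlled pressures would show whether the spread of the readings changes across the range.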

John