What percent error is too high?

2 Answers
Mar 21, 2014

The acceptability of a percent error depends on the application.

Explanation:

In some cases, the measurement may be so difficult that an error of 10% or even higher is acceptable.

In other cases, a 1% error may be too high.

Most high school and introductory university instructors will accept a 5% error, but that is only a guideline.

At higher levels of study, instructors usually demand greater accuracy.
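For reference, the usual classroom formula behind that 5% guideline compares a measured value against an accepted one. A minimal sketch in Python (the function name percent_error and the example numbers are illustrative, not from the answer):

```python
def percent_error(measured: float, accepted: float) -> float:
    """Standard definition: |measured - accepted| / |accepted| * 100."""
    return abs(measured - accepted) / abs(accepted) * 100

# E.g., measuring g as 9.5 m/s^2 against the accepted 9.81 m/s^2:
print(percent_error(9.5, 9.81))  # ~3.2%, within the common 5% guideline
```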

Mar 17, 2017

It is never too high. It is what it is (if calculated properly). Whether to USE a value with a high percent error is a judgment left to the user.

Explanation:

Accuracy, Precision, and Percent Error all have to be taken together to make sense of a measurement. As a scientist and statistician, I would have to say that there is no upper limit on a “percent error”. There is only the necessary (human) judgment of whether the data it refers to are useful or not.

Accuracy and precision are inherent in measurement designs. They are whatever they are, and can only be improved by improving the device. Multiple measurements can improve the statistical estimate of a measurement, but they cannot improve the inherent measurement error. The percent error is calculated as the deviation of a measurement from the last, best fixed reference point.

For example, I may have the actual, PRIMARY standard meter rod. But without calibrated sub-intervals, I can scientifically make “accurate” measurements only to ±1 meter. I really can’t trust my eyes (especially compared to others’) to accurately define even a ¼ meter.

My 0.5-meter measurement contains error, because there is no actual 0.5-m reference mark. So, compared to my accurate meter, my measurement of 0.5 meter has a 0.5/1 * 100 = 50% error. That is pretty much the physical reality for any measurement interval. Even then, we are assuming that our visual acuity is really able to find the “middle point” between any two other marks.
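Spelled out as code (a trivial sketch; the variable names are mine, not from the answer):

```python
reading = 0.5      # eyeballed measurement, in meters
reference = 1.0    # nearest trusted calibration interval, in meters
print(reading / reference * 100)  # 50.0 -- the 0.5/1 * 100 = 50% above
```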

Precision has to do with how consistently the device delivers the same value for the same measurement; that is usually a function of the construction and use of the device. Accuracy is how close the measured value is to the “real” value; that often relates to the calibration of the device. Percent error is just a determination of how far the possible values may deviate from the “true” value, given the limitations of the measuring device and its use.
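One way to see the three ideas separately is to simulate a device with a fixed bias (poor accuracy) and random scatter (limited precision). A hedged sketch, with all numbers invented for illustration:

```python
import random

random.seed(0)  # reproducible illustration

true_value = 100.0   # the "real" value being measured
bias = 2.0           # fixed calibration offset -> limits accuracy
spread = 0.5         # random shot-to-shot scatter -> limits precision

# 1000 simulated readings from a biased, noisy device
readings = [true_value + bias + random.gauss(0, spread) for _ in range(1000)]

mean_reading = sum(readings) / len(readings)

# Averaging shrinks the random scatter (the precision of the mean improves),
# but the bias survives: repetition cannot fix calibration.
pct_error = abs(mean_reading - true_value) / true_value * 100
print(f"mean = {mean_reading:.2f} -> percent error ~ {pct_error:.2f}%")
# The percent error stays close to 2% (the bias), however many readings we take.
```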