Accuracy vs. Precision: What’s the Difference?

I remember sitting in my ninth-grade chemistry class when my teacher mentioned that the day’s lesson would include a discussion about accuracy and precision, and how both relate to making experimental measurements. I’ve always been more of a liberal-arts-minded individual, and I initially thought, "Is there really a difference between the two terms?" In fact, I even remembered using the words interchangeably in my writing for English class!

However, as I continued through more advanced science and math courses in college, and eventually joined Minitab Inc., I became tuned in to the important differences between accuracy and precision—and especially how they relate to quality improvement projects!

Assessing Variation in Your Measurement Systems

I’ve learned that if you’re starting a quality improvement project that involves collecting data to control quality or to monitor changes in your company’s processes, it’s essential that your systems for collecting measurements aren’t faulty.

After all, if you can’t trust your measurement system, then you can’t trust the data that it produces.

So what types of measurement system errors may be taking place? Here’s where accuracy and precision come into play. Accuracy refers to how close measurements are to the "true" value, while precision refers to how close measurements are to each other. In other words, accuracy describes the difference between the measurement and the part’s actual value, while precision describes the variation you see when you measure the same part repeatedly with the same device.
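To make the distinction concrete, here is a minimal sketch in Python (the true value and the repeated measurements are hypothetical, chosen only for illustration): the bias, the gap between the average measurement and the true value, quantifies accuracy, while the standard deviation of repeated measurements quantifies precision.

```python
from statistics import mean, stdev

# Hypothetical repeated measurements of one part whose true value is known.
true_value = 10.00
measurements = [10.62, 10.58, 10.61]  # same part, same device, three repeats

bias = mean(measurements) - true_value  # accuracy: how far from the truth?
spread = stdev(measurements)            # precision: how far from each other?

print(f"bias:    {bias:+.3f}  (large -> accuracy problem)")
print(f"std dev: {spread:.3f}  (large -> precision problem)")
```

Here the readings cluster tightly (small standard deviation) but sit well above the true value (large bias): a precise but inaccurate measurement system.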

Precision can be broken down further into two components:

Repeatability: The variation observed when the same operator measures the same part repeatedly with the same device.

Reproducibility: The variation observed when different operators measure the same part using the same device.
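As a deliberately simplified sketch (not the ANOVA method a full Gage R&R study uses), you could estimate the two components from hypothetical data in which three operators each measure the same part three times:

```python
from statistics import mean, stdev

# Hypothetical data: three operators each measure the same part three times.
trials = {
    "operator_A": [12.01, 12.03, 11.99],
    "operator_B": [12.10, 12.12, 12.09],
    "operator_C": [11.95, 11.97, 11.94],
}

# Repeatability: variation within each operator's own repeated measurements.
repeatability = mean(stdev(vals) for vals in trials.values())

# Reproducibility: variation between the operators' average measurements.
reproducibility = stdev(mean(vals) for vals in trials.values())

print(f"repeatability:   {repeatability:.4f}")
print(f"reproducibility: {reproducibility:.4f}")
```

In this made-up data set, each operator repeats well (low repeatability variation), but the operators disagree with one another (higher reproducibility variation), which points to differences in technique rather than a flaky device.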

It’s important to note that measurement systems can suffer from both accuracy and precision problems! A dart board can help us visualize the difference between the two concepts:

[Dart board illustration with four targets: "Accurate and Precise," "Precise but not Accurate," "Accurate but not Precise," and "Neither Accurate nor Precise."]

When Accuracy and Precision Get “Snacky”

Maybe an example can help illustrate the differences further. Let’s talk potato chips! Suppose you’re a snack foods manufacturer producing 12 oz. bags of potato chips. You test the weight of the bags using a scale that measures precisely (in other words, there is little variation in the measurements) but not accurately, weighing three samples at 13.2 oz., 13.33 oz., and 13.13 oz.

Or maybe the scale is accurate but not precise, measuring the three samples at 12.02 oz., 11.74 oz., and 12.43 oz. In this case the measurements vary more, but their average is very close to the target value of 12 oz.

Or maybe your measurements are all over the place, with samples measuring at 11.64 oz., 12.35 oz., and 13.04 oz., in which case your scale may be neither accurate nor precise.
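Running the article's three sets of sample weights through the same bias-and-spread arithmetic makes each failure mode visible (a rough sketch; with only three samples per scenario, the statistics are illustrative, not conclusive):

```python
from statistics import mean, stdev

target = 12.0  # ounces
scenarios = {
    "precise, not accurate": [13.20, 13.33, 13.13],
    "accurate, not precise": [12.02, 11.74, 12.43],
    "neither":               [11.64, 12.35, 13.04],
}

for name, weights in scenarios.items():
    print(f"{name:>22}: bias = {mean(weights) - target:+.2f} oz, "
          f"std dev = {stdev(weights):.2f} oz")
```

The first scenario shows a large bias with a small standard deviation, the second a small bias with a larger standard deviation, and the third scores poorly on both counts.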

But how can you detect these problems in your measurement system?

Evaluating Accuracy & Precision

Accuracy and precision can be evaluated with many of the measurement systems analysis tools in Minitab Statistical Software, including Gage Linearity and Bias Studies and Gage R&R Studies. These studies can reveal whether a scale needs to be recalibrated, or whether your newly hired operators are measuring ingredients consistently.

What should you do if you detect accuracy and/or precision errors? Focus on improving your measurement system before relying on your data and moving forward with your improvement project. Allow the results of your Gage R&R Study to help you decide if recalibrating a scale or conducting more training for new hires might be just what you need to get your measurement systems back on track.
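Minitab does the heavy lifting in a real study, but as a rough sketch of the bias check at the heart of a Gage Linearity and Bias Study (the reference value and readings below are hypothetical), you can measure a reference part repeatedly and test whether the average reading drifts from the known value:

```python
from math import sqrt
from statistics import mean, stdev

# Hypothetical bias check: one reference part of known value, measured n times.
reference = 12.00
readings = [12.06, 12.04, 12.08, 12.05, 12.07, 12.03]

n = len(readings)
bias = mean(readings) - reference
t_stat = bias / (stdev(readings) / sqrt(n))  # one-sample t statistic

print(f"estimated bias: {bias:+.3f} oz")
print(f"t statistic:    {t_stat:.2f}  (a large magnitude suggests recalibration)")
```

A consistent positive bias like this one points toward recalibrating the scale; large operator-to-operator differences in a Gage R&R study point toward more training instead.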


Name: S. Luko • Wednesday, August 22, 2012

The term "accuracy" is now defined as a composite of bias as well as precision. The new "accuracy" in the old sence is now called "bias". See ISO terminology definition below.

3.3.1 accuracy
closeness of agreement between a test result (3.4.1) or measurement result (3.4.2) and the true value (3.2.5)
NOTE 1 In practice, the accepted reference value (3.2.7) is substituted for the true value.
NOTE 2 The term “accuracy”, when applied to a set of test or measurement results, involves a combination of random components and a common systematic error or bias component.
NOTE 3 Accuracy refers to a combination of trueness (3.3.3) and precision (3.3.4).

Name: Marion Foster • Wednesday, August 22, 2012

Thanks for the newsletter! I am a manager of the laboratory here at Spirit AeroSystems, and in the future, via a telecom or newsletter, I would like to see some tutorials on calculating uncertainty analysis for measurements, if possible. Thanks, Marion Foster

Name: webb burgess • Wednesday, August 22, 2012

Very good description of the difference between accuracy and precision. I think it would be a good idea to include the topics of resolution and significant digits in this commentary too. This would, perhaps, prevent writing 13.2, 13.33 and 13.13. It is difficult to explain precision when the number of significant digits or resolution varies from measurement to measurement. The 13.2 should probably be recorded as 13.20. Contrary to most people's belief, the "0" is important.

Name: Carly Barry • Friday, September 7, 2012

Hi Webb - Thanks for reading and for your comment! Great idea - resolution and significant digits would be helpful topics to add to a future post!