Explaining Quality Statistics So My Boss Will Understand: Measurement Systems Analysis (MSA)

Minitab Blog Editor | 4/22/2013

Topics: Quality Improvement

As a teenaged dishwasher at a local eatery, I had a boss who'd never washed dishes in a restaurant himself. I once spent 40 minutes trying to convince him that forks and spoons should go in their holders with the business end up, while knives should go in point-down. Whatever I said, he didn't get it. We were ordered to put forks and spoons in the holders with the handles up.

The outraged wait staff soon made clear what I hadn't: you can't immediately tell the difference between a fork and a spoon when all you can see is the handle! Explaining that in the right way would have minimized wasted time and the wait staff's anger.

I knew nothing about statistics then. Now that I do, I often need to explain statistical concepts to people who don't know (and usually don't care) about the calculations and assumptions -- they just want the bottom line as fast as possible. I've found it useful to imagine how I could explain things to my old boss so that even he could understand. 

If you've got a boss who doesn't appreciate why we need to do the things we do to analyze data, maybe these thoughts will help you. 

"Why Do We Need this MSA Thing? Just Grab Some Numbers and Go." 

What's the first question you ask when you're about to run an analysis? For me, it's very simple: "Can I trust my data?" If the answer is "No," or even "I'm not sure," it's usually pointless to go any further: why spend valuable time interpreting data you can't rely on?

That's where measurement systems analysis, or MSA, comes in. MSA is a collection of methods you can use to assess your ability to collect trustworthy, reliable data--the kind of data you want to analyze.

MSA helps determine how much of the overall variation found in your data is due to variation in the measurement system itself. Factors that might affect the measurement system include data collection procedures, gages, and other test equipment, not to mention the individuals who take the measurements. It's just good sense to evaluate your measurement system before control charting, capability analysis, or any other analysis: it tells you whether your measurement system is accurate and precise, and whether your data are trustworthy.
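To make that concrete, here's the core arithmetic in a short Python sketch. The variance numbers are invented for illustration; in a real study they'd come from a Gage R&R analysis. The idea is simply that total variance splits into a part-to-part component and a measurement-system component, and the widely used %Contribution statistic is the measurement system's share of the total.

```python
# Illustrative only: in practice these variance components would come
# from an actual Gage R&R study; the numbers here are invented.
var_part_to_part = 9.0  # variation between the parts themselves
var_measurement = 1.0   # variation added by the gage and operators

var_total = var_part_to_part + var_measurement

# %Contribution: share of total variance due to the measurement system.
# A common rule of thumb treats under ~1% as excellent and over ~9% as
# unacceptable, but acceptance criteria vary by application.
pct_contribution = 100 * var_measurement / var_total
print(f"%Contribution of measurement system: {pct_contribution:.1f}%")
```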

Let's say you're overseeing production of a new line of precision screws that must meet strict specifications. The value of doing an MSA seems self-evident to you, but you've got a boss who hasn't needed to actually measure anything since grade school. "Look," he tells you, "we've got the best people using the best equipment money can buy. Don't waste time measuring how you measure stuff! Just get some quick numbers to show we're meeting the customer's requirements."

You know that even if you do have the best people and the best equipment available, you can't prove your products meet spec unless you've shown your measurements are reliable. But the customer isn't specifically asking for an MSA, just some inspection data. So how can you convince your boss it's worth the time to do an MSA first?

Getting a Handle on Measurement System Error

On the surface, measurement can seem pretty straightforward: just record the number you see, and you're done! But that's not really the case. There will always be some degree of variation or error in a measurement system, and those errors fall into two categories: accuracy and precision. "Accuracy" is how close a measurement is to the actual value being measured. "Precision" refers to the ability to measure the same part with the same device consistently.

If your measurement system is working perfectly, wonderful: you won't have problems with accuracy or precision. But most systems have one or both of these problems to some degree, and even if the system was working great a few months ago, something may have changed in the interim. The device that worked great last month might have slipped out of calibration, through accident or just plain wear and tear. You might have a gage that measures the same part consistently (it has precision), but whose measurements are wrong. Or the device might take measurements that are close to the actual value (it has accuracy), but show a lot of variation between multiple measurements of the same part. And you might have a device that records measurements that are both inaccurate and widely varying.
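The two kinds of error are easy to separate with a little code. Here's a minimal Python sketch, assuming you've measured a reference part of known size several times with the same gage; the reference value and the readings are invented for the example:

```python
import statistics

reference_value = 25.000  # known "true" size of the reference part (invented)
measurements = [25.011, 25.008, 25.013, 25.009, 25.012]  # invented repeat readings

# Accuracy: how far the average reading sits from the true value (the bias).
bias = statistics.mean(measurements) - reference_value

# Precision: how much the repeated readings scatter around their own average.
spread = statistics.stdev(measurements)

print(f"Bias (an accuracy problem if large):               {bias:+.4f}")
print(f"Standard deviation (a precision problem if large): {spread:.4f}")
```

This invented gage illustrates the first case above: its readings are consistent (a standard deviation of about 0.002), but consistently wrong (a bias of about +0.011).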

The easiest way to visualize the value of an MSA is to think of targets: the bullseye is the actual value that you're measuring, and a good measurement hits the target right in the center each time. Doing an MSA is like looking at the pattern of "shots" so you can see if and where a device is having problems, and how to adjust it if needed, as shown below.

[Figure: visualizing precision and accuracy in measurement systems analysis]

And thus far we're only considering the device being used to take the measurement. Add human beings, with all their innate potential for error and variation, doing and recording those measurements, and you can quickly see how doing data analysis without an MSA first creates boundless opportunities for disaster.

Seeing (and Measuring) Is Believing

If your boss still doesn't get it, a hands-on demonstration involving actual human beings will probably help, and can even be fun. Get a bag of your boss's favorite candy and take a couple of samples. Measure their height or width yourself, and use these as your "standards." Now you can use Minitab to create a worksheet that will help you gather data for a simple Gage R&R study.  My colleague Cody Steele detailed how to analyze this data by doing a simple gummi bear experiment with his coworkers, which serves as a good model for what you can do with your boss.
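If you want to see what such a data-collection plan looks like without opening Minitab, here's a minimal Python sketch of a crossed design; the three operators, ten candy "parts," and two replicates are assumptions you'd adjust to your own demonstration:

```python
import itertools
import random

random.seed(1)  # so the randomized run order is reproducible

operators = ["Boss", "You", "Coworker"]        # assumed measurers
parts = [f"Candy {i}" for i in range(1, 11)]   # assumed 10 sample pieces
replicates = 2                                 # each person measures each piece twice

# Crossed design: every operator measures every part, 'replicates' times each.
runs = [(op, part) for op, part in itertools.product(operators, parts)
        for _ in range(replicates)]
random.shuffle(runs)  # randomize run order to avoid time-related bias

print("Run, Operator, Part, Measurement")
for i, (op, part) in enumerate(runs, start=1):
    print(f"{i}, {op}, {part},")  # measurement column left blank to fill in
```

Randomizing the run order matters: it keeps time-related effects, like an operator getting better (or bored) as the demonstration goes on, from masquerading as measurement system variation.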

Depending on how you set up your demonstration, you are very likely to find that different measuring devices and operators can assess the same part or sample and generate different results. The more error in the measurements, the more likely it is that any decisions based on those measurements will be in error, too.

And that means doing an MSA and making sure you're getting the good measurements you want can make the difference between a product that meets customer specifications and one that suggests your whole company has a couple of screws loose. 

Look for a statistical software package that offers MSA tools to help you evaluate accuracy and precision in your measurement systems. For example, Minitab offers tools including:

  • Gage R&R Study (crossed and nested)
  • Gage Run Chart
  • Gage Linearity and Bias Study
  • Type 1 Gage Study
  • Attribute Agreement Analysis
  • Attribute Gage Study (AIAG method)
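Minitab does the heavy lifting in these tools, but if you want to demystify what a crossed Gage R&R study computes under the hood, here's a rough Python sketch using pandas and statsmodels (assumed to be available). The data are simulated, and the variance-component arithmetic is the textbook two-way ANOVA approach rather than Minitab's exact implementation:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(7)
n_parts, n_ops, n_reps = 10, 3, 2

# Simulate a crossed study: true part sizes, small per-operator biases,
# and random repeatability error on every single measurement.
part_sizes = rng.normal(25.0, 0.030, n_parts)  # real part-to-part variation
op_biases = rng.normal(0.0, 0.005, n_ops)      # reproducibility (operator) error
rows = []
for p in range(n_parts):
    for o in range(n_ops):
        for _ in range(n_reps):
            y = part_sizes[p] + op_biases[o] + rng.normal(0.0, 0.004)
            rows.append({"part": p, "operator": o, "y": y})
df = pd.DataFrame(rows)

# Two-way ANOVA with interaction, then variance components from mean squares.
model = smf.ols("y ~ C(part) * C(operator)", data=df).fit()
ms = anova_lm(model)["mean_sq"]

var_repeat = ms["Residual"]                                           # repeatability
var_inter = max((ms["C(part):C(operator)"] - var_repeat) / n_reps, 0)
var_oper = max((ms["C(operator)"] - ms["C(part):C(operator)"]) / (n_parts * n_reps), 0)
var_part = max((ms["C(part)"] - ms["C(part):C(operator)"]) / (n_ops * n_reps), 0)

var_gage = var_repeat + var_inter + var_oper  # total measurement system variance
var_total = var_gage + var_part

print(f"%Contribution, Gage R&R (measurement system): {100 * var_gage / var_total:.1f}%")
print(f"%Contribution, part-to-part:                  {100 * var_part / var_total:.1f}%")
```

The printout splits total variation between the measurement system (repeatability plus reproducibility) and genuine part-to-part differences--exactly the bottom-line breakdown your boss wants to see.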

If you don't already use it, you can try out these tools yourself with a free 30-day trial of our statistical software.