
Quality Improvement

Blog posts and articles about using statistics and data analysis to improve quality through methodologies such as Lean and Six Sigma.

When I blogged about automation back in March, I made my husband out to be an automation guru. Well, he certainly is. But what you don’t know about my husband is that while he loves to automate everything in his life, sometimes he drops the ball. He’s human; even I have to cut him a break every now and then. On the other hand, instances of hypocrisy in his behavior tend to make for a good story.... Continue Reading
Here is a scenario involving process capability that we’ve seen from time to time in Minitab's technical support department. I’m sharing the details in this post so that you’ll know where to look if you encounter a similar situation. You need to run a capability analysis. You generate the output using Minitab Statistical Software. When you look at the results, the Cpk is huge and the histogram in... Continue Reading

Design of Experiments (DOE) is the perfect tool to efficiently determine if key inputs are related to key outputs. Behind the scenes, DOE is simply a regression analysis. What’s not simple, however, is all of the choices you have to make when planning your experiment. What X’s should you test? What ranges should you select for your X’s? How many replicates should you use? Do you need center... Continue Reading
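To make the "DOE is simply a regression analysis" point concrete, here is a minimal sketch, in Python rather than Minitab and with made-up data, of a replicated two-level, two-factor factorial analyzed as an ordinary least-squares fit with main effects and an interaction:

```python
# A minimal sketch (not Minitab output) showing how a two-level, two-factor
# factorial design with coded factors reduces to an ordinary regression fit.
import numpy as np

# Hypothetical replicated design: factors A and B coded as -1/+1.
A = np.array([-1,  1, -1,  1, -1,  1, -1,  1])
B = np.array([-1, -1,  1,  1, -1, -1,  1,  1])
y = np.array([12.1, 15.3, 13.0, 19.8, 11.7, 15.0, 13.4, 20.2])  # made-up response

# Model matrix: intercept, main effects, and the A*B interaction.
X = np.column_stack([np.ones_like(A), A, B, A * B])

# Least-squares fit -- the "regression behind the scenes" of a DOE.
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
for name, c in zip(["Intercept", "A", "B", "A*B"], coeffs):
    print(f"{name:>9}: {c:6.3f}")
```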
It's been called a "demographic watershed".  In the next 15 years alone, the worldwide population of individuals aged 65 and older is projected to increase more than 60%, from 617 million to about 1 billion.1 Increasingly, countries are asking themselves: How can we ensure a high quality of care for our growing aging population while keeping our healthcare costs under control? The answer? More... Continue Reading
Earlier this month, PLOS.org published an article titled "Ten Simple Rules for Effective Statistical Practice." The 10 rules are good reading for anyone who draws conclusions and makes decisions based on data, whether you're trying to extend the boundaries of scientific knowledge or make good decisions for your business.  Carnegie Mellon University's Robert E. Kass and several co-authors devised... Continue Reading
by Matthew Barsalou, guest blogger. Control charts plot your process data to identify and distinguish between common cause and special cause variation. This is important, because identifying the different causes of variation lets you take action to make improvements in your process without over-controlling it. When you create a control chart, the software you're using should make it easy to see where... Continue Reading
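As a rough illustration of the idea, and not of any particular package's implementation, the sketch below builds an individuals chart by hand: it estimates sigma from the average moving range and flags points outside the 3-sigma limits as possible special causes. The data are invented.

```python
# A rough sketch of an individuals (I) chart, assuming the usual
# moving-range estimate of sigma (MR-bar / 1.128); a given software
# package's defaults may differ in the details.
import numpy as np

data = np.array([9.8, 10.1, 10.0, 9.9, 10.3, 10.2, 9.7, 10.0, 11.6, 10.1])  # made-up measurements

center = data.mean()
moving_range = np.abs(np.diff(data))
sigma_hat = moving_range.mean() / 1.128          # d2 constant for moving ranges of size 2

ucl = center + 3 * sigma_hat
lcl = center - 3 * sigma_hat

# Points outside the 3-sigma limits are candidates for special-cause variation.
for i, x in enumerate(data, start=1):
    flag = "  <-- investigate (possible special cause)" if x > ucl or x < lcl else ""
    print(f"obs {i:2d}: {x:5.2f}{flag}")
print(f"CL = {center:.2f}, LCL = {lcl:.2f}, UCL = {ucl:.2f}")
```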
The last thing you want to do when you purchase a new piece of software is spend an excessive amount of time getting up and running. You’ve probably been ready to use the software since, well, yesterday. Minitab has always focused on making our software easy to use, but many professional software packages do have a steep learning curve. Whatever package you’re using, here are three things you... Continue Reading
Suppose you’ve collected data on cycle time, revenue, the dimension of a manufactured part, or some other metric that’s important to you, and you want to see what other variables may be related to it. Now what? When I graduated from college with my first statistics degree, my diploma was bona fide proof that I'd endured hours and hours of classroom lectures on various statistical topics, including l... Continue Reading
This is an era of massive data. A huge amount of data is being generated from the web and from customer relations records, not to mention from the sensors used in the manufacturing industry (semiconductor, pharmaceutical, and petrochemical companies, among many others). Univariate control charts: in the manufacturing industry, critical product characteristics are routinely collected to ensure... Continue Reading
The Pareto chart is a graphic representation of the 80/20 rule, also known as the Pareto principle. If you're a quality improvement specialist, you know that the chart is named after the early 20th century economist Vilfredo Pareto, who discovered that roughly 20% of the population in Italy owned about 80% of the property at that time. You probably also know that the Pareto principle was... Continue Reading
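The arithmetic behind a Pareto chart is nothing more than sorting and accumulating. A small sketch, with invented defect categories and counts:

```python
# A quick sketch of the arithmetic behind a Pareto chart: sort defect
# categories by count and accumulate percentages. Category names and
# counts are invented for illustration.
defects = {"Scratch": 120, "Dent": 45, "Misalignment": 30,
           "Discoloration": 20, "Other": 10}

total = sum(defects.values())
cumulative = 0
for category, count in sorted(defects.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += count
    print(f"{category:<14} {count:>4}  {100 * cumulative / total:6.1f}% cumulative")
```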
When you analyze a Gage R&R study in statistical software, your results can be overwhelming. There are a lot of statistics listed in Minitab's Session Window—what do they all mean, and are they telling you the same thing? If you don't know where to start, it can be hard to figure out what the analysis is telling you, especially if your measurement system is giving you some numbers you'd think are... Continue Reading
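One way to keep the output straight: the sketch below shows how two of the headline numbers, %Contribution and %Study Variation, are derived from the same variance components, one on the variance scale and one on the standard-deviation scale. The component values are invented, not taken from any real study.

```python
# A simplified sketch of how the headline Gage R&R numbers relate to
# variance components. The values here are hypothetical; a real study
# estimates them from an ANOVA of operators x parts.
import math

var_repeatability   = 0.039  # equipment variation (hypothetical)
var_reproducibility = 0.012  # operator variation (hypothetical)
var_part_to_part    = 0.450  # part-to-part variation (hypothetical)

var_gage  = var_repeatability + var_reproducibility   # total Gage R&R
var_total = var_gage + var_part_to_part

pct_contribution = 100 * var_gage / var_total                        # ratio of variances
pct_study_var    = 100 * math.sqrt(var_gage) / math.sqrt(var_total)  # ratio of std. deviations

print(f"%Contribution (Gage R&R):    {pct_contribution:.1f}%")
print(f"%Study Variation (Gage R&R): {pct_study_var:.1f}%")
```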
There has been plenty of noisy disagreement about the state of health care in the past several years, but when you get beyond the controversies surrounding various programs and changes, a great deal of common ground exists. Everyone agrees that there's a lot of waste and inefficiency in the way we've been doing things, and that health care should be delivered as efficiently and effectively as... Continue Reading
While the roots of Lean Six Sigma and other quality improvement methodologies are in manufacturing, it’s interesting to see how other organizational functions and industries apply LSS tools successfully. Quality improvement certainly has moved far beyond the walls of manufacturing plants! For example, I recently had the opportunity to talk to Drew Mohler, a Lean Six Sigma black belt and senior... Continue Reading
When I wrote How to Calculate B10 Life with Statistical Software, I promised a follow-up blog post that would describe how to compute any “BX” lifetime. In this post I’ll follow through on that promise, and in a third blog post in this series, I will explain why BX life is one of the best measures you can use in your reliability analysis. As a refresher, B10 life refers to the time at which 10% of... Continue Reading
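For readers who want the formula behind that calculation, here is a small sketch of BX life, assuming a Weibull model whose shape and scale have already been estimated; the parameter values below are hypothetical, not from any real data set.

```python
# A small sketch of the BX calculation, assuming a fitted Weibull model.
# The shape and scale values are hypothetical.
import math

def bx_life(x_percent, shape, scale):
    """Time by which x_percent of units are expected to have failed,
    i.e. the x_percent-th percentile of a Weibull(shape, scale)."""
    return scale * (-math.log(1 - x_percent / 100.0)) ** (1.0 / shape)

shape, scale = 1.8, 5000.0   # hypothetical Weibull estimates (e.g., hours)
for x in (1, 5, 10):
    print(f"B{x} life: {bx_life(x, shape, scale):8.1f} hours")
```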
If you need to assess process performance relative to some specification limit(s), then process capability is the tool to use. You collect some accurate data from a stable process, enter those measurements in Minitab, and then choose Stat > Quality Tools > Capability Analysis/Sixpack or Assistant > Capability Analysis. Now, what about sorting the data? I’ve been asked “why does Cpk change when I... Continue Reading
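One likely culprit, sketched below with made-up data and spec limits rather than Minitab's exact computation, is that Cpk is based on a within-subgroup standard deviation, here estimated from the moving range. That estimate depends on row order, so sorting the worksheet can inflate it.

```python
# An illustrative sketch (not Minitab's implementation) of why Cpk can
# change when the worksheet is sorted: the within standard deviation,
# estimated here from the moving range, depends on the order of the rows.
# Spec limits and data are made up.
import numpy as np

lsl, usl = 9.0, 11.0
data = np.array([9.8, 10.4, 9.9, 10.6, 10.1, 9.7, 10.3, 10.0, 10.5, 9.9])

def cpk(values, lsl, usl):
    mean = values.mean()
    sigma_within = np.abs(np.diff(values)).mean() / 1.128   # moving-range estimate
    return min(usl - mean, mean - lsl) / (3 * sigma_within)

print(f"Cpk, original order: {cpk(data, lsl, usl):.2f}")
print(f"Cpk, sorted data:    {cpk(np.sort(data), lsl, usl):.2f}")  # typically inflated
```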
Any time you see a process changing, it's important to determine why. Is it indicative of a long-term trend, or is it a fad you can ignore because it will soon be gone? For example, in the 2014 NBA Finals, the San Antonio Spurs beat the two-time defending champion Miami Heat by attempting more 3-pointers (23.6 per game) than any championship team in league history. In the 2015 regular... Continue Reading
In an earlier post, I shared an overview of acceptance sampling, a method that lets you evaluate a sample of items from a larger batch of products (for instance, electronics components you've sourced from a new supplier) and use that sample to decide whether to accept or reject the entire shipment. There are two approaches to acceptance sampling. If you do it by attributes, you... Continue Reading
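For the attributes case, the accept/reject logic reduces to a binomial calculation. The sketch below, using an example plan of n = 50 and c = 2 that is purely illustrative rather than a recommendation, shows the probability of accepting a lot at several incoming defect rates:

```python
# A back-of-the-envelope sketch of attributes acceptance sampling: inspect
# n items and accept the lot if at most c are defective. The probability of
# acceptance at a given lot defect rate follows the binomial distribution.
# The plan (n = 50, c = 2) is just an example.
from math import comb

def prob_accept(n, c, p):
    """P(at most c defectives in a sample of n) at lot fraction defective p."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

n, c = 50, 2
for p in (0.01, 0.02, 0.05, 0.10):
    print(f"lot {100 * p:4.1f}% defective -> P(accept) = {prob_accept(n, c, p):.3f}")
```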
Now that we've seen how easy it is to create plans for acceptance sampling by variables, and to compare different sampling plans, it's time to see how to actually analyze the data you collect when you follow the sampling plan. If you'd like to follow along and you're not already using Minitab, please download the free 30-day trial. Collecting the Data for Acceptance Sampling by Variables: If you'll... Continue Reading
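In outline, the analysis comes down to comparing a standardized distance from the spec limit with the plan's critical distance k. The sketch below uses the 2 cm lower spec from the LED scenario in the companion post, but the measurements and the k value are invented for illustration, not taken from any real plan:

```python
# A hedged sketch of the accept/reject arithmetic for a variables plan with
# a lower spec limit: compute Z.LSL = (mean - LSL) / s and accept the lot if
# it meets the plan's critical distance k. Measurements and k are invented.
import statistics

lsl = 2.0                      # lower spec: soldering leads at least 2 cm (from the scenario)
k = 1.61                       # critical distance from the chosen plan (example value only)
leads = [2.08, 2.12, 2.05, 2.11, 2.07, 2.15, 2.09, 2.06, 2.13, 2.10]  # made-up sample (cm)

mean = statistics.mean(leads)
s = statistics.stdev(leads)
z_lsl = (mean - lsl) / s

decision = "accept" if z_lsl >= k else "reject"
print(f"mean = {mean:.3f} cm, s = {s:.3f} cm, Z.LSL = {z_lsl:.2f} -> {decision} the lot")
```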
In my last post, I showed how to use Minitab Statistical Software to create an acceptance sampling plan by variables, using the scenario of an electronics company that receives monthly shipments of LEDs that must have soldering leads that are at least 2 cm long. This time, we'll compare that plan with some other possible options. The variables sampling plan we came up with to verify the... Continue Reading
If you're just getting started in the world of quality improvement, or if you find yourself in a position where you suddenly need to evaluate the quality of incoming or outgoing products from your company, you may have encountered the term "acceptance sampling." It's a statistical method for evaluating the quality of a large batch of materials from a small sample of items, which statistical softwar... Continue Reading