
Data Analysis

Blog posts and articles with data analysis tips for quality improvement methodologies, including Six Sigma and Lean.

If you have a process that isn’t meeting specifications, using the Monte Carlo simulation and optimization tool in Companion by Minitab can help. Here’s how you, as a chemical technician for a paper products company, could use Companion to optimize a chemical process and ensure it consistently delivers a paper product that meets brightness standards. The brightness of Perfect Papyrus Company’s new... Continue Reading
As someone who has collected and analyzed real data for a living, the idea of using simulated data for a Monte Carlo simulation sounds a bit odd. How can you improve a real product with simulated data? In this post, I’ll help you understand the methods behind Monte Carlo simulation and walk you through a simulation example using Companion by Minitab. Companion by Minitab is a software platform that... Continue Reading
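To make the idea concrete, here’s a minimal pure-Python sketch of a Monte Carlo simulation. The transfer function, input distributions, and spec limits below are all made up for illustration; in a real study the model would come from a fitted regression or designed experiment, and Companion handles the simulation for you:

```python
import random

random.seed(1)

# Hypothetical process model: output = 2*x + 3*y. The transfer function,
# input distributions, and spec limits are invented for this sketch; in
# practice they would come from a fitted model of the real process.
def estimate_defect_rate(n=100_000, lsl=33.0, usl=37.0):
    defects = 0
    for _ in range(n):
        x = random.gauss(10, 0.5)  # assumed input 1: mean 10, sd 0.5
        y = random.gauss(5, 0.2)   # assumed input 2: mean 5, sd 0.2
        out = 2 * x + 3 * y        # simulated process output
        if out < lsl or out > usl:
            defects += 1
    return defects / n

print(f"simulated out-of-spec rate: {estimate_defect_rate():.2%}")
```

Drawing many random inputs and pushing them through the model gives you a distribution of outputs — and from it, an estimate of how often the process misses spec before you ever touch the real equipment.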

Choosing the right type of subgroup in a control chart is crucial. In a rational subgroup, the variability within a subgroup should capture only common-cause variation (random, short-term, the “normal” and natural variation of the process), whereas differences between subgroups are useful for detecting drifts over time (due to “special” or “assignable” causes). Variation within... Continue Reading
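Here’s a quick pure-Python sketch of the idea, using an Xbar chart on made-up measurements (the A2 = 0.577 factor is the standard SPC-table constant for subgroups of size 5): the within-subgroup ranges estimate common-cause variation and set the control limits, and the mean of a drifted subgroup then falls outside them.

```python
# Xbar chart from rational subgroups: within-subgroup ranges estimate
# common-cause variation; subgroup means reveal drift between subgroups.
A2 = 0.577  # Xbar-chart factor for subgroups of size 5 (standard SPC tables)

subgroups = [
    [10.1, 9.8, 10.0, 10.2, 9.9],    # stable
    [10.0, 10.1, 9.7, 10.3, 10.0],   # stable
    [10.5, 10.6, 10.4, 10.7, 10.5],  # drifted (special cause)
]
xbars = [sum(s) / len(s) for s in subgroups]
rbar = sum(max(s) - min(s) for s in subgroups) / len(subgroups)
xbarbar = sum(xbars) / len(xbars)
ucl = xbarbar + A2 * rbar
lcl = xbarbar - A2 * rbar
# the drifted subgroup's mean lands above the UCL
print([round(x, 2) for x in xbars], round(lcl, 3), round(ucl, 3))
```

Note how the limits come only from the within-subgroup ranges — which is exactly why the subgroups must be rational: if special-cause variation leaks into the subgroups, the limits inflate and the chart goes blind.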
Do your executives see how your quality initiatives affect the bottom line? Perhaps they would more often if they had accessible insights on the performance, and ultimately the overall impact, of improvement projects.  For example, 60% of the organizations surveyed by the American Society for Quality in their 2016 Global State of Quality study say they don’t know or don’t measure the financial... Continue Reading
Have you ever tried to install ventilated shelving in a closet?  You know: the heavy-duty, white- or gray-colored vinyl-coated wire shelving? The one that allows you to get organized, more efficient with space, and is strong and maintenance-free? Yep, that’s the one. Did I mention this stuff is strong?  As in, really hard to cut?  It seems like a simple 4-step project. Measure the closet, go the... Continue Reading
Grocery shopping. For some, it's the most dreaded household activity. For others, it's fun, or perhaps just a “necessary evil.” Personally, I enjoy it! My co-worker, Ginger, a content manager here at Minitab, opened my eyes to something that made me love grocery shopping even more: she shared the data behind her family’s shopping trips. Being something of a data nerd, I really geeked out over the... Continue Reading
If you regularly perform regression analysis, you know that R² is a statistic used to evaluate the fit of your model. You may even know the standard definition of R²: the percentage of variation in the response that is explained by the model. Fair enough. With Minitab Statistical Software doing all the heavy lifting to calculate your R² values, that may be all you ever need to know. But if you’re... Continue Reading
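For the curious, R² can be computed straight from its definition, 1 − SS(residual)/SS(total). Here’s a plain-Python sketch for a simple least-squares line fit (the data are made up):

```python
# R-squared from its definition: 1 - SS(residual) / SS(total),
# here for a simple least-squares line fit in plain Python.
def r_squared(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b1 = sxy / sxx          # fitted slope
    b0 = my - b1 * mx       # fitted intercept
    ss_res = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1 - ss_res / ss_tot  # fraction of variation explained

print(r_squared([1, 2, 3, 4, 5], [2.1, 3.9, 6.2, 7.8, 10.1]))
```

A perfect fit drives SS(residual) to zero, so R² hits 1; a model no better than the mean leaves SS(residual) equal to SS(total), so R² falls to 0.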
In Parts 1 and 2 of Gauging Gage we looked at the numbers of parts, operators, and replicates used in a Gage R&R Study and how accurately we could estimate %Contribution based on the choice for each.  In doing so, I hoped to provide you with valuable and interesting information, but mostly I hoped to make you like me.  I mean like me so much that if I told you that you were doing... Continue Reading
Earlier, I wrote about the different types of data statisticians typically encounter. In this post, we're going to look at why, when given a choice in the matter, we prefer to analyze continuous data rather than categorical/attribute or discrete data.  As a reminder, when we assign something to a group or give it a name, we have created attribute or categorical data.  If we count something, like... Continue Reading
You run a capability analysis and your Cpk is bad. Now what? First, let’s start by defining what “bad” is. In simple terms, the smaller the Cpk, the more defects you have. So the larger your Cpk is, the better. Many practitioners use a Cpk of 1.33 as the gold standard, so we’ll treat that as the gold standard here, too. Suppose we collect some data and run a capability analysis using Minitab Statisti... Continue Reading
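For reference, Cpk is the distance from the process mean to the nearest specification limit, measured in units of three standard deviations. A minimal sketch (made-up data and spec limits; a real capability study would estimate sigma from within-subgroup variation rather than the overall sample sd used here):

```python
# Cpk from its definition: min(USL - mean, mean - LSL) / (3 * sd).
# Data and spec limits are invented for illustration.
def cpk(data, lsl, usl):
    n = len(data)
    mean = sum(data) / n
    # overall sample sd; a real capability study would typically use a
    # within-subgroup estimate of sigma instead
    sd = (sum((x - mean) ** 2 for x in data) / (n - 1)) ** 0.5
    return min(usl - mean, mean - lsl) / (3 * sd)

# anything above the common 1.33 benchmark would pass the "gold standard"
print(round(cpk([9.0, 10.0, 11.0, 10.0, 10.0], lsl=7.0, usl=13.0), 3))
```

The `min` is what makes Cpk punish an off-center process: even with low spread, a mean hugging one spec limit drags Cpk down.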
In Part 1 of Gauging Gage, I looked at how adequate a sampling of 10 parts is for a Gage R&R Study and provided some advice based on the results. Now I want to turn my attention to the other two factors in the standard Gage experiment: 3 operators and 2 replicates.  Specifically, what if instead of increasing the number of parts in the experiment (my previous post demonstrated you would need... Continue Reading
by Kevin Clay, guest blogger In transactional or service processes, we often deal with lead-time data, and usually that data does not follow the normal distribution. Consider a Lean Six Sigma project to reduce the lead time required to install an information technology solution at a customer site. It should take no more than 30 days—working 10 hours per day Monday–Friday—to complete, test and... Continue Reading
"You take 10 parts and have 3 operators measure each 2 times." This standard approach to a Gage R&R experiment is so common, so accepted, so ubiquitous that few people ever question whether it is effective.  Obviously one could look at whether 3 is an adequate number of operators or 2 an adequate number of replicates, but in this first of a series of posts about "Gauging Gage," I want to look at... Continue Reading
Everyone who analyzes data regularly has the experience of getting a worksheet that just isn't ready to use. Previously I wrote about tools you can use to clean up and eliminate clutter in your data and reorganize your data.  In this post, I'm going to highlight tools that help you get the most out of messy data by altering its characteristics. Know Your Options: Many problems with data don't become... Continue Reading
In Part 1 of this blog series, I compared Six Sigma to a diamond because both are valuable, have many facets and have withstood the test of time. I also explained how the term “Six Sigma” can be used to summarize a variety of concepts, including philosophy, tools, methodology, or metrics. In this post, I’ll explain short/long-term variation and between/within-subgroup variation and how they help... Continue Reading
You've collected a bunch of data. It wasn't easy, but you did it. Yep, there it is, right there...just look at all those numbers, right there in neat columns and rows. Congratulations. I hate to ask...but what are you going to do with your data? If you're not sure precisely what to do with the data you've got, graphing it is a great way to get some valuable insight and direction. And a good graph to... Continue Reading
In my last post, I wrote about making a cluttered data set easier to work with by removing unneeded columns entirely, and by displaying just those columns you want to work with now. But too much unneeded data isn't always the problem. What can you do when someone gives you data that isn't organized the way you need it to be?   That happens for a variety of reasons, but most often it's because the... Continue Reading
Did you know the most popular diamond cut is probably the Round Brilliant Cut? The first early version of what would become the modern Round Brilliant Diamond Cut was introduced by an Italian named Vincent Peruzzi, sometime in the late 17th century.  In the early 1900s, the angles for an "ideal" diamond cut were designed by Marcel Tolkowsky. Minor changes have been made since then, but the angles... Continue Reading
B'gosh n' begorrah, it's St. Patrick's Day today! The day that we Americans lay claim to our Irish heritage by doing all sorts of things that Irish people never do. Like dye your hair green. Or tell everyone what percentage Irish you are. Despite my given name, I'm only about 15% Irish. So my Irish portion weighs about 25 pounds. It could be the portion that hangs over my belt due to excess potatoes... Continue Reading
Isn't it great when you get a set of data and it's perfectly organized and ready for you to analyze? I love it when the people who collect the data take special care to make sure to format it consistently, arrange it correctly, and eliminate the junk, clutter, and useless information I don't need.   You've never received a data set in such perfect condition, you say? Yeah, me neither. But I can... Continue Reading