Typically these pieces have an attention-grabbing headline, like "Six Sigma Initiative Fails to Save the Universe," followed by a dissection of a deployment or project that failed, usually in spectacular fashion, to achieve its goals.
"There!" the writer typically crows. "See? It's obvious Six Sigma doesn't work!" What makes these articles misleading and potentially dangerous is that there's almost always a kernel of truth to the story. Projects do fail. Deployments do go awry. Without continued monitoring, quality improvements do wane.
But none of that means you shouldn't pursue a quality strategy based on data analysis, whether you happen to call it Six Sigma or something else. It's a mistake to use an example of a failed project as a rationale for not doing data-driven quality improvement, because there are also countless stories of quality improvements that have saved businesses millions of dollars, deployments that have transformed organizations for the better, and benefits that have been sustained for years.
Business improvement trends that don't work don't last long. Six Sigma principles have been used widely in businesses worldwide for more than 30 years. That's why even though articles have tried to sound the death knell for Six Sigma for at least 15 years, we're still talking about it (and doing it) today. Done correctly, it works.
One recent article cited this plea for assistance, evidently lifted verbatim from some discussion forum, as a reason why Six Sigma is doomed to failure in the real world:
"...There is a Call Center process in which CTQ is Customer Response Time. If Customer Response Time is more than >60sec then its a defect. In this Case opportunity will be 1. Suppose there are 300 Calls and out of that 123 call Response time is >60 sec (This is defect) and opportunity is 1 then my Sigma value will be 1.727. This value came by this Formulae process Sigma = NORMSINV(1-((Total Defects) / (Total Opportunities))) + 1.5 and DPMO is 410000, this value came by this formulae Defects Per Million Opportunities (DPMO) = ((Total Defects) / (Total Opportunities)) * 1,000,000
Question is, Can anybody tell me from this case how can i get bell curve. How I should calculate USL and LSL value. How I will draw this Bell Curve in a excel. Waiting for all your master response."
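For what it's worth, the poster's arithmetic does check out; the confusion isn't in the math but in what to do with it. Here's a minimal sketch of the two quoted formulas in Python, using scipy's norm.ppf as the equivalent of Excel's NORMSINV, with the conventional 1.5-sigma short-term shift:

```python
from scipy.stats import norm

def dpmo(defects, opportunities):
    """Defects per million opportunities."""
    return defects / opportunities * 1_000_000

def process_sigma(defects, opportunities, shift=1.5):
    """Short-term sigma level: the inverse normal of the process yield,
    plus the conventional 1.5-sigma shift. Equivalent to the quoted
    Excel formula NORMSINV(1 - defects/opportunities) + 1.5.
    """
    return norm.ppf(1 - defects / opportunities) + shift

# The poster's call-center numbers: 123 of 300 calls took longer than 60 seconds.
print(dpmo(123, 300))            # 410000.0
print(process_sigma(123, 300))   # ~1.7275, the value the poster reports as 1.727
```

Of course, this only restates the poster's numbers. It says nothing about whether 60 seconds is the right threshold, or what to do about all those late calls, which brings us to the article's next point.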
The author goes on to point out, and rightly so, that all this math isn't worth much without an explanation of "what can I—as a business person—do with it?" He continues:
"Presented with information of this nature I would find it hard to glean anything useful in terms of process improvement to action. What's more, turning a process into a set of calculations to eke the last drop of efficiency from it appears counterproductive."
He then argues that, at some point, making incremental improvements will fail to yield significant returns (yeah...), that sometimes we need to scrap an entire process to achieve the greatest gains (sure, sure...), and that continuous improvement is about adaptation (no argument there...). The author wraps up by asserting that Six Sigma "is about the only thing that isn't" adapting. And since the core principles and practices of Six Sigma have remained consistent over the years, I can agree with that statement in its broadest sense.
Huh. The writer hasn't said anything I disagree with...but where's his evidence that "Six Sigma doesn't work in the real world"?
It's clear that the author is really arguing against the misapplication of data analysis to things that won't make any difference to quality, an idea any quality practitioner should concur with. But what he describes is not Six Sigma: it's using math as a substitute for squid ink, a cloud of calculations squirted out to obscure rather than to illuminate.
If you expend vast resources on an incremental process improvement project that doesn't have the potential for significant benefits, you're not really practicing quality improvement, let alone the typically more formal "Six Sigma." Moreover, the managers (or "champions") who would greenlight such a project aren't doing due diligence to make sure resources are being directed where they'll do the most good.
Here are some things that dedicated quality improvement practitioners do to make sure projects succeed:

- They select projects with the potential for significant, measurable benefits, rather than incremental tweaks that can't repay the resources they consume.
- They ask champions to vet projects up front, so resources are directed where they'll do the most good.
- They use data analysis to answer specific business questions, not as an end in itself.
- They keep monitoring results after deployment, because without continued attention, quality improvements wane.
Successful quality improvement depends on data analysis, but analyzing data aimlessly is not the same thing as doing data-driven quality improvement.
Similarly, asserting "Six Sigma doesn't work" because it's not always done properly is like saying "Pianos don't work" because they don't sound good if you play them with mittens on.
What do you think?