Guest Post: How quickly and effectively can you drive to a solution?

Minitab Guest Blogger | 30 August, 2018

Topics: Automotive, Manufacturing, Six Sigma, Continuous Improvement, Data Analysis, Quality Improvement

Bob Thomas, Technical Leader and Manager in Ford Motor Company’s Global Data Insights and Analytics organization

As a Technical Leader and Manager in Ford Motor Company’s Global Data Insights and Analytics organization, Bob Thomas is responsible for supporting data-driven analytic solution development that produces business, product and manufacturing insights.


Today, the scientific method includes a deductive element – testing a premise or hypothesis – as well as an inductive element – learning from empirical observations or data. Deductive logic has been used for millennia to describe the natural world. On the other hand, the explosion of data availability and machine learning has even led some to suggest that inductive reasoning will make deduction obsolete – “the end of science.” However, an approach that supports learning while driving toward a solution involves the sequential application of deductive and inductive reasoning: the faster one can cycle between them, the faster one can arrive at an effective solution.

Here are some key insights to drive rapidly and effectively to business solutions using the scientific method.


1. Be persistent, but not too committed to the hypothesis, premise or idea being proposed and investigated.

I can’t tell you how many times a manager has asked me to prove their hypothesis. For example, one manager thought wheel-to-hub clearance was the main cause of a vibration concern in an automobile.

To begin with, there is an awkwardness to this type of request: hypothesis testing can only attempt to falsify the null hypothesis – the devil’s-advocate position that the wheel-to-hub clearance has no effect – and even falsifying the null does not prove the manager’s hypothesis. When the data were collected and the analysis did not falsify the null hypothesis (that is, there was no reason to believe the wheel-to-hub clearance effect wasn’t “0”), the next request was to collect more data (surely that was the problem).

Unfortunately, when that didn’t help, the next demand was to work harder. Obviously, such a rigid commitment to a hypothesis, to the exclusion of any data-driven inductive reasoning, can delay finding an effective solution.
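The logic above can be made concrete with a small sketch. The clearance measurements below are entirely hypothetical (the post reports no data), and a permutation test stands in for whatever analysis the team actually ran; the point is only how a large p-value should be read.

```python
import random
import statistics

# Hypothetical wheel-to-hub clearance (mm) for vehicles with and without
# the vibration concern -- illustrative numbers only, not Ford data.
with_vibration    = [0.42, 0.39, 0.45, 0.41, 0.44, 0.40, 0.43, 0.38]
without_vibration = [0.41, 0.43, 0.40, 0.44, 0.39, 0.42, 0.45, 0.41]

observed = statistics.mean(with_vibration) - statistics.mean(without_vibration)

# Permutation test: under the null hypothesis (clearance has no effect),
# group labels are exchangeable, so shuffle them many times and count how
# often a difference at least as large as the observed one arises by chance.
random.seed(1)
pooled = with_vibration + without_vibration
n = len(with_vibration)
trials = 10_000
count = 0
for _ in range(trials):
    random.shuffle(pooled)
    diff = statistics.mean(pooled[:n]) - statistics.mean(pooled[n:])
    if abs(diff) >= abs(observed):
        count += 1
p_value = count / trials

print(f"observed difference: {observed:+.4f} mm, p-value: {p_value:.3f}")
# A large p-value means the analysis failed to falsify the null hypothesis;
# it does NOT prove the clearance has no effect -- and rerunning the same
# study with more of the same data does not change that logic.
```

With data like these, the p-value comes out large: the test gives no reason to believe the effect isn’t “0,” which is exactly the situation described above.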


2. On the other hand, don’t be exclusively committed to the data and inductive reasoning.

Today, there is a plethora of inductive reasoning tools (e.g., conventional and regularized regression, traditional machine and deep learning, and decision trees, to name a few) that help one navigate through datasets. However, any solution developed with these approaches makes the bold assumption that the system under study will remain unchanged over time.

In my experience, a run chart of the difference between what was predicted and what actually happened, in a non-trivial setting, has been one of the best ways to demonstrate that the inductively reasoned solution still applies and that the system generating the data has not changed.

For example, I once helped a team develop a model for predicting how a particular third-party source would score and rate automotive products. Fortunately, the team was convinced of the importance of graphing the difference between what the team predicted and what the third-party source reported, and they investigated large discrepancies when they occurred. Often, they learned the third-party source had changed its scoring system. Eventually, this led to formal roundtable discussions, hosted by the third-party source, to let manufacturers know of any changes to the scoring system it was considering, deleting, and/or implementing. This was quite beneficial, since the manufacturers and the third-party source shared the same goal: delivering fantastic products and informing consumers about them.
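A minimal run-chart sketch of this idea follows. The weekly scores are invented for illustration, and the rule used here – baseline residuals plus rough 3-sigma limits – is one simple way to flag “large discrepancies,” not necessarily the rule the team used.

```python
import statistics

# Hypothetical weekly scores: what the team's model predicted the
# third-party source would report vs. what it actually reported.
predicted = [8.1, 7.9, 8.3, 8.0, 8.2, 8.1, 7.8, 8.0, 8.2, 8.1]
actual    = [8.0, 8.0, 8.2, 8.1, 8.1, 8.0, 7.9, 8.1, 6.9, 7.0]

# Run-chart values: the difference between actual and predicted.
residuals = [a - p for a, p in zip(actual, predicted)]

# Use the early, stable weeks to set rough "3-sigma" limits; later points
# outside the limits suggest the system generating the data has changed
# (here, the third-party source revising its scoring system).
baseline = residuals[:6]
center = statistics.mean(baseline)
sigma = statistics.stdev(baseline)
upper, lower = center + 3 * sigma, center - 3 * sigma

for week, r in enumerate(residuals, start=1):
    flag = "  <-- investigate" if (r > upper or r < lower) else ""
    print(f"week {week:2d}: residual {r:+.2f}{flag}")
```

Running this flags the last two weeks, where the invented “actual” scores drop sharply – the kind of discrepancy that, in the story above, prompted the team to investigate and discover a scoring-system change.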


3. Knowing everything possible might work for theoretical research, but not for business.

Finally, engineers and scientists – by nature and training – try to leave as little as possible to chance before providing a solution. This can take an inordinate amount of time. In theoretical research that may be acceptable; in business it is not.

I remember a particular reliability issue a manager was trying to resolve. The team had many ideas, premises, and theories on how to increase the system’s reliability. One parameter for accelerating the life test was temperature. However, there were many ways to characterize how the system experienced temperature, and the engineers wanted to select the “best” characterization. Studying the many temperature characterizations would have taken time. In this case, any characterization of temperature would have accelerated the life test and yielded information for fundamental improvement – it did not need to be the “best” one. Understanding and quantifying uncertainty helps make this call.
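To see why any reasonable temperature characterization helps, consider the standard Arrhenius acceleration model. The activation energy and temperatures below are illustrative assumptions, not values from the project described above.

```python
import math

K_BOLTZMANN = 8.617e-5  # Boltzmann constant, eV/K
EA = 0.7                # assumed activation energy, eV (illustrative)

def acceleration_factor(t_use_c: float, t_stress_c: float) -> float:
    """Arrhenius acceleration factor between a use and a stress temperature."""
    t_use = t_use_c + 273.15
    t_stress = t_stress_c + 273.15
    return math.exp((EA / K_BOLTZMANN) * (1.0 / t_use - 1.0 / t_stress))

# Two candidate characterizations of how the system "sees" temperature.
# Neither needs to be the single "best" one: both yield substantial
# acceleration at a 105 C stress test, so the team can start learning now.
for label, t_use in [("average operating temp (40 C)", 40.0),
                     ("peak operating temp (60 C)", 60.0)]:
    af = acceleration_factor(t_use, 105.0)
    print(f"{label}: acceleration factor ~ {af:.1f}x")
```

Under either characterization the stress test compresses life by well over an order of magnitude; the characterizations disagree about the exact factor, but either one gets the test running and the learning started.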

Today’s tools and techniques, used within the scientific method, allow the analyst to better meet the challenges of a rapidly changing world. By formulating good questions and swiftly cycling between deductive hypotheses and inductive reasoning, analysts can arrive at more effective (though perhaps not perfect) solutions faster.