Minitab Blog

7 Top Talks from the Minitab Insights Conference

Written by Minitab Blog Editor | Jan 24, 2019 4:19:46 PM

 

Great stories others want to hear. Common challenges we have all faced before (or might even be facing right now). Discovering new tools in Minitab software as shared by peers. Engaging walkthroughs of finding insights in your data, and recommendations on how to act on them. All packed into a few days of learning and fun.

For the past three years, we have been honored to host all of the above at the annual Minitab Insights Conference. Minitab collects Voice of the Customer data in the form of evaluation sheets at each session, asking each audience's opinion on whether they learned something they can put into practice, how clear and organized the presentation was, how knowledgeable the presenter was on the topic, and more.

The speakers at our 2018 conference were excellent, and the data shows it! The average of all their ratings was 4.5 out of 5. Here are highlights from a few standouts in the beginner, intermediate and advanced tracks.

Beginner Track

1. Using Minitab to Hire the Right Candidate

Anthony Santillanes, Operations Manager, ProMiles Software Development Corporation

An 8.4% turnover rate for permanent hires is not bad when the industry average is 10%, but Anthony and his colleagues found their real problem in the rates they were seeing within the first 12 months (50%) and the first 90 days (38%), which included replacing an entire 10-person team over the course of four years!

With tens of thousands of dollars on the line in wages and staffing agency fees (not to mention software, training and other expenses), they set out on a path to find and keep the right candidates. They performed a strengths, weaknesses, opportunities and threats (SWOT) analysis using comments from team members bucketed into different categories, then visualized it in Minitab Statistical Software using interval plots, multivariate plots, bubble plots and more. They used the findings to improve how they hire and how they train.

“We’ve changed a team’s perception of itself and validated that using Minitab,” Anthony said. Insights Conference attendees were interested to see data-driven decisions being used in Human Resources. 

 

2. Getting Started with Big Data – a Lean Six Sigma and Quality Practitioner's Guide

Kristine Bradley, Principal, Firefly Consulting

Big data has applications everywhere – from reducing safety-related incidents in manufacturing to increasing response rates by 3% in financial services marketing (and significantly reducing mailing costs at the same time).

Kristine demonstrated how big data analytics draw some inspiration from the DMAIC process (Define, Measure, Analyze, Improve, Control) and how practitioners can expand their skills in this rapidly growing and complementary area – starting with logistic regression (already available in Minitab Statistical Software!).
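
For readers who want to get a feel for that starting point outside of Minitab as well, here is a minimal logistic regression sketch in Python using scikit-learn. The data is synthetic and stands in for the kind of mailing-response records a marketer might have; the features and sample sizes are purely illustrative.

```python
# A minimal sketch of logistic regression, the entry point suggested in the talk.
# The data here is synthetic; in practice you would use your own mailing-response records.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Synthetic stand-in for customer features and a binary "responded to mailing" outcome
X, y = make_classification(n_samples=5000, n_features=8, n_informative=4, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=1)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# Predicted response probabilities let you target the customers most likely to reply
probs = model.predict_proba(X_test)[:, 1]
print("AUC:", round(roc_auc_score(y_test, probs), 3))
```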

Attendee comments included “love the fun facts and great life application examples” and “great presentation, great pace.”

 

3. Know Your Limits: FDA Compliant Process Validation all in Minitab!

Erik Sherburne, Quality Manager, Advanced Molding Technologies

From leading the audience in the “Minitab fight song” to keep them going during one of the last sessions on a Friday, to illustrating risk assessment with the example of buying his daughters popsicles on a hot day at the carnival, Erik Sherburne was both entertaining and educational.

The US Food and Drug Administration (FDA) is on our side, he reminded everyone. By enforcing process validation, “ultimately what they’re trying to do is help companies make better product that’s safe for the market.” He shared that at an FDA conference a couple of years ago, he learned that the FDA’s most frequently cited regulatory action for production and process controls is a lack of process validation. And it’s not just about compliance – developing a robust process that works is also good business sense.

Sherburne showed how to use Minitab tools such as Measurement Systems Analysis, DOE, capability studies and orthogonal regression to validate a process. He introduced terms like Installation Qualification (IQ), which documents that equipment is functional; Characterization, which documents your company’s understanding of the process using DOE; Operational Qualification (OQ), which documents that the process is capable; and Performance Qualification (PQ), which documents that the process is repeatable.

One audience member said Sherburne is a natural presenter and showman, and he covered key details perfectly. Others said he was energetic and passionate and the presentation was one of the best and most useful for them.

 

Related Post: Learn more about process validation in detail

 

Intermediate Track

4. Using TreeNet to Reduce Wait Times at Disney Theme Parks

Fred Hazelton, Data Scientist, TouringPlans.com

TouringPlans.com uses machine learning and analytics to help families solve a “tale as old as time” — long lines at theme parks. Their Disney World crowd calendar shows how busy each Disney theme park is on every day of your trip, with predictions accurate down to 15-minute intervals. Chief Data Scientist Fred Hazelton shared how they use TreeNet in Minitab’s Salford Predictive Modeler (SPM) software to increase accuracy, understand the variables that really matter to the process and isolate the influence of one particular field.

They can tell customers which attractions tend to have more accurate times posted by Disney, and they found that, on average, your actual wait is only 65% of the displayed time! All of this comes from their own data as well as wait times submitted by visitors to the website, which keeps improving the accuracy of their predictions.
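
TreeNet is SPM’s gradient boosting implementation, so for readers who want to experiment with the general approach, here is a rough sketch using scikit-learn’s gradient boosting on synthetic data as a stand-in. The variable importances and partial dependence curve mirror the two ideas from the talk: finding the variables that really matter and isolating the influence of one particular field. The features here are placeholders, not TouringPlans.com’s real inputs.

```python
# A sketch of the idea with scikit-learn's gradient boosting standing in for TreeNet.
# The data is synthetic; real inputs would be things like date, time of day, park and weather.
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.inspection import partial_dependence

X, y = make_regression(n_samples=2000, n_features=6, n_informative=4, noise=10.0, random_state=7)

model = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05, max_depth=3, random_state=7)
model.fit(X, y)

# Which variables really matter to the predicted wait time?
for i, imp in enumerate(model.feature_importances_):
    print(f"feature {i}: importance {imp:.3f}")

# Isolate the influence of one particular field with a partial dependence curve
pd_result = partial_dependence(model, X, features=[0])
print(pd_result["average"][0][:5])  # average predicted response across the grid for feature 0
```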

One attendee said it was their favorite session so far. We like to see when Insights attendees are actually able to put what they learn to good use, so this comment from another attendee was great to see too: “Awesome! Downloaded the app ... Taking family to Disneyland in February and can't wait to share some statistical science!”   

 

5. Assessing Process Capability When Normality Tests Fail

Cheryl Pammer, Sr. Advisory User Experience Designer

Whether there are extreme outliers, mixed distributions, large sample sizes or even issues with the measurement system, nonnormal data can certainly present an obstacle – barring you from using several types of analyses.

Minitab trainer, statistical consultant and software designer Cheryl Pammer showed strategies for capability analysis when you’re facing nonnormal data. She walked us through real-life examples, including one testing for leakage in prescription bottles, where a traditional process capability analysis that assumes normality suggests the process is capable. In actuality – given that the data is not normal – that answer is not correct. Using a 3-parameter Weibull fit or a Box-Cox transformation, though, you correctly find that the process is not capable.
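
As a rough illustration of why the distribution matters, here is a small Python sketch on simulated, right-skewed data (not the data from the talk). It compares a capability index computed under a normality assumption with one based on a fitted 3-parameter Weibull and the percentile method; the spec limit and parameters are hypothetical.

```python
# A minimal sketch of nonnormal capability, assuming a one-sided upper spec limit (USL)
# on a right-skewed measurement such as leakage. Data is simulated for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
data = stats.weibull_min.rvs(1.5, loc=0.2, scale=2.0, size=200, random_state=rng)
USL = 8.0

# Naive approach: pretend the data is normal
ppu_normal = (USL - data.mean()) / (3 * data.std(ddof=1))

# Nonnormal approach: fit a 3-parameter Weibull and use the percentile (ISO) method,
# Ppu = (USL - median) / (99.865th percentile - median)
shape, loc, scale = stats.weibull_min.fit(data)
x50 = stats.weibull_min.ppf(0.5, shape, loc=loc, scale=scale)
x99865 = stats.weibull_min.ppf(0.99865, shape, loc=loc, scale=scale)
ppu_weibull = (USL - x50) / (x99865 - x50)

print(f"Ppu assuming normality: {ppu_normal:.2f}")
print(f"Ppu from 3-parameter Weibull fit: {ppu_weibull:.2f}")
```

With heavily skewed data the two indices can disagree substantially, which is exactly the trap the talk warned about.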

One attendee said they loved it: “You made it very easy to understand. Thank you!”

 

Related Post: Cheryl Pammer explains Intervals in Healthcare and Medical Devices

 

Advanced Track

6. Regression with Life Data to Save and Sustain Lives

Frank Dudzik, Engineering Manager, Baxter

Who has had a great experimental design and flawless execution, but still encountered problems with the analysis because of the data’s distribution? Or problems at the tails instead of the average, or censored response data, or all of those things at once?

Frank Dudzik of Baxter told the Insights audience how Regression with Life Data was used to help identify process controls that ensured manufacturing of life-sustaining therapies could continue at the quality level their customers expect.

Given the advantages Regression with Life Data showed over ANOVA, DOE and other types of analyses, they found it to be the right tool for them. Unlike other regression analyses, Regression with Life Data accepts censored data and can use different distributions to model the data. You can also use this analysis to estimate percentiles other than the 50th.
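
For readers curious about what happens under the hood, here is a simplified Python sketch of the core idea on simulated data: a parametric (Weibull) regression fit by maximizing a likelihood that treats censored and uncensored observations differently, then used to estimate a low percentile. The covariate, coefficients and settings are hypothetical, not Baxter’s.

```python
# A small sketch of the idea behind regression with life data: fit a Weibull model to
# right-censored failure times with a covariate via maximum likelihood. Simulated data only.
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(0)
n = 300
temp = rng.uniform(40, 80, n)                      # a hypothetical process setting
scale_true = np.exp(6.0 - 0.05 * temp)             # higher temp -> shorter life
t_fail = scale_true * rng.weibull(2.0, n)          # true failure times
t_censor = rng.uniform(20, 120, n)                 # inspection / study-end times
time = np.minimum(t_fail, t_censor)
event = (t_fail <= t_censor).astype(float)         # 1 = failure observed, 0 = censored

def neg_log_likelihood(params):
    b0, b1, log_shape = params
    shape = np.exp(log_shape)
    scale = np.exp(b0 + b1 * temp)                 # log-linear link, as in AFT-style models
    # Observed failures contribute the density; censored units contribute the survival function
    logpdf = stats.weibull_min.logpdf(time, shape, scale=scale)
    logsf = stats.weibull_min.logsf(time, shape, scale=scale)
    return -np.sum(event * logpdf + (1 - event) * logsf)

res = optimize.minimize(neg_log_likelihood, x0=[5.0, 0.0, 0.0], method="Nelder-Mead")
b0, b1, log_shape = res.x
shape = np.exp(log_shape)

# Estimate a low percentile (e.g. the 10th), often more relevant than the median for reliability
temp_of_interest = 60.0
scale_at_temp = np.exp(b0 + b1 * temp_of_interest)
t10 = stats.weibull_min.ppf(0.10, shape, scale=scale_at_temp)
print(f"Estimated 10th percentile of life at temp {temp_of_interest}: {t10:.1f}")
```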

Comments included “Great presentation! Super interesting content to propose” and “Explained a complex subject in engineering lay terms! Great job!”

 

7. Design of Experiments and Gage R&R in Forensic Science Applications

Michelle Mancenido, Assistant Professor, Arizona State University

One of our favorite aspects of Insights is seeing how Minitab software is used in areas where we have not seen it before. With Mickey Mancenido’s analysis visualizing how fingerprints degrade over time, it felt like seeing the science behind TV shows like CSI, Forensic Files and Criminal Minds.

More than 60 people gathered in this breakout session to see how Gage R&R measured the deviation between a reference image in their database and the images captured by different investigators. Establishing the chronology of events around a crime leads to forming a suspect pool, Mancenido explained, and accurately predicting the age of fingerprints left behind at a crime scene can put someone in or out of that pool by showing whether the prints were left before or after the crime happened.
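
As a rough illustration of the mechanics, here is a minimal crossed Gage R&R sketch in Python on simulated measurements, with parts standing in for fingerprint samples and operators for investigators; it uses the standard ANOVA variance-component formulas and is not the study design from the talk.

```python
# A minimal crossed Gage R&R sketch on simulated data, using two-way ANOVA
# variance components. Illustrative only.
import numpy as np

rng = np.random.default_rng(3)
n_parts, n_ops, n_reps = 10, 3, 2
part_effect = rng.normal(0, 2.0, n_parts)          # real part-to-part variation
op_effect = rng.normal(0, 0.5, n_ops)              # reproducibility (between investigators)
data = (part_effect[:, None, None] + op_effect[None, :, None]
        + rng.normal(0, 0.8, (n_parts, n_ops, n_reps)))   # repeatability noise

grand = data.mean()
part_means = data.mean(axis=(1, 2))
op_means = data.mean(axis=(0, 2))
cell_means = data.mean(axis=2)

# Mean squares from the two-way crossed ANOVA
ms_part = n_ops * n_reps * np.sum((part_means - grand) ** 2) / (n_parts - 1)
ms_op = n_parts * n_reps * np.sum((op_means - grand) ** 2) / (n_ops - 1)
ss_int = n_reps * np.sum((cell_means - part_means[:, None] - op_means[None, :] + grand) ** 2)
ms_int = ss_int / ((n_parts - 1) * (n_ops - 1))
ms_rep = np.sum((data - cell_means[..., None]) ** 2) / (n_parts * n_ops * (n_reps - 1))

# Variance components (negative estimates truncated to zero)
var_rep = ms_rep
var_int = max((ms_int - ms_rep) / n_reps, 0)
var_op = max((ms_op - ms_int) / (n_parts * n_reps), 0)
var_part = max((ms_part - ms_int) / (n_ops * n_reps), 0)

var_gage = var_rep + var_op + var_int               # total Gage R&R
total = var_gage + var_part
print(f"%Contribution of Gage R&R: {100 * var_gage / total:.1f}%")
```

A low %Contribution from Gage R&R means most of the observed variation comes from the samples themselves rather than from the measurement system, which is what you want before trusting the measurements.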

Many attendees enjoyed Mancenido’s speaking style – one remarked that she’s clearly a professional teacher, and another said she relayed the information in a very engaging manner and that they enjoyed learning about forensic science.

We were thrilled to hear another attendee discover new features they had not used before: “I've used Minitab 18 years and never knew about split plot designs!” they said. “I can definitely use this in the future and should have used it in the past.”