Of all the practical applications of big data and analytics, quality has to be near the top of the list. After all, quality has always been statistically oriented.
In the “old days,” the tools for analyzing quality were somewhat limited. You knew how many parts were produced, and you knew what percentage failed. You could relate quality problems to certain processes and facilities, but visibility into such matters was not good.
That’s changing rapidly. Global manufacturers have invested heavily in modern information platforms and manufacturing execution systems, with the result that more data than ever is now available. In some companies, factories and supply chains have been highly digitized – to the point where every process is recorded and reported digitally. As the Industrial Internet of Things continues to evolve, this trend of rich and deep information availability will only accelerate.
Given the importance of quality in today’s world, in terms of both profitability and brand integrity, now might be a good time to ask: How far along is your enterprise in leveraging this information to improve quality?
A Maturity Model for Quality Analytics
LNS Research has been writing about this topic for quite some time. One blog post challenges readers to assess their Quality Management maturity. Manufacturers can evaluate their Quality Analytics maturity in a similar way, much as firms use Enterprise Manufacturing Intelligence strategies to gain more value from their business intelligence systems.
In a recent positioning paper, LNS Research defined a four-stage framework for evaluating a manufacturer’s maturity level. Each step is a prerequisite for the next, so the framework is a measure of how far along a company is in leveraging big data for Quality Analytics.
Here, briefly, are the four stages:
- Descriptive. The adage “You can’t fix what you don’t know” certainly applies to Quality. Data analysis can give you very specific information on the nature of a problem—when it happens, where, how, at what cost, etc. It often allows you to discover problems you didn’t even know about, by identifying anomalies that can’t be seen in a simple spreadsheet. For example, Statistical Process Control (SPC) can provide insights into production processes that could be improved to elevate overall product quality.
- Diagnostic. Once you have the data, you can investigate the cause and find solutions. Root cause analysis is not new, of course, but as part of this larger framework its power grows with the ever-greater volume of data now available. Manufacturers that have reached this level of maturity have implemented enterprise genealogy tools, built global execution platforms, and can pinpoint root causes with extreme specificity—down to specific suppliers, lots, lines, personnel, and time of day. Just as importantly, they can do this much faster than before. In many cases, a root cause analysis that used to take weeks can now be done in an hour or two, greatly reducing the cost and harm of quality problems.
- Predictive. The first two stages might be considered “fire-fighting” levels. Once those issues are in hand, the next step on the maturity curve is to get ahead of problems. Predictive analytics uses sophisticated algorithms and modeling software to sift through historical data, identify trends and outcomes, and predict future events. Those events might be anything from equipment degradation (based on data about current processes and expected maintenance labor shortages) to possible supplier delays (based on social media chatter and news analysis). It all depends on the interests of the manufacturer, the quality and extent of the data, and the sophistication of the analytics tools being used. Not many manufacturers have reached this level of analytics maturity, but expect the number to grow as more companies lay the foundation in Stages 1 and 2, and as the science of predictive analytics progresses.
- Prescriptive. This stage is like Diagnostics, except applied to future problems. The idea is to get so far ahead of the game that you don’t even see problems turning up in Predictive Analytics anymore! At this stage, manufacturers are using a combination of real-time data mashups, predictive tools and diagnostics, all with an eye toward optimizing business practices in the future—tomorrow, next week, or next year. For example, a manufacturer could be constantly modeling its logistics networks, adjusting suppliers or other factors to optimize quality and efficiency. Long before new product introduction becomes a reality, a manufacturer could be at work analyzing processes and logistics, predicting outcomes and shaping an optimum business model, greatly smoothing the launch process.
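To make the Descriptive stage concrete, here is a minimal sketch of the kind of SPC calculation mentioned above: an individuals control chart that derives 3-sigma control limits from a baseline run and flags out-of-control measurements in a new run. The data, function names, and the 3-sigma rule are illustrative assumptions, not a method prescribed by LNS Research.

```python
# Minimal SPC-style check (illustrative sketch, not a vendor implementation).
from statistics import mean, stdev

def control_limits(samples):
    """Return (lower, center, upper) 3-sigma control limits from a baseline run."""
    center = mean(samples)
    sigma = stdev(samples)
    return center - 3 * sigma, center, center + 3 * sigma

def out_of_control(samples, limits):
    """Return indices of samples falling outside the control limits."""
    lower, _, upper = limits
    return [i for i, x in enumerate(samples) if x < lower or x > upper]

# Hypothetical measurements of a part dimension (mm) from a stable baseline run
baseline = [10.01, 9.98, 10.02, 10.00, 9.99, 10.01, 10.00, 9.97, 10.03, 10.00]
limits = control_limits(baseline)

# A later production run; the third reading has drifted out of control
new_run = [10.00, 10.02, 10.30, 9.99]
print(out_of_control(new_run, limits))  # flags index 2
```

Even this toy version illustrates the point of the Descriptive stage: the anomaly at index 2 is invisible in a raw spreadsheet of averages but jumps out once control limits are computed from the process's own history.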
Wikipedia calls Prescriptive Analytics the “final frontier of analytic capabilities.” Only a few, if any, manufacturers have begun to use this technology to improve quality. It will be interesting to revisit this topic a year or two from now, and see what the Quality Analytics landscape looks like.
What about your company? How far along are you on the maturity curve? I’d like to hear your thoughts in the comments below.