Dec 16 2014

Taking Lean to the Supply Chain

Most large manufacturers today have established Lean or Six Sigma programs for their own operations. It’s practically a requirement to be competitive.

But the line between a manufacturer’s “own” operations, and those of its partners, is getting fuzzier all the time. With today’s increasingly connected supply chains, demand-driven supply networks, and global operations, it only makes sense that big opportunities for performance improvement and cost reduction might be out there in the supply chain, just waiting to be discovered.

A Lean supply chain might be a good thing – or might not, depending upon your perspective. But is it realistic? Considering that Lean and Six Sigma programs are all about consistency and control, and involve cultural and operational transformation, can a manufacturer really expect to extend these practices outside of its own organization? And, at what cost?

Pundit Perspective

A new report from Gartner, “Transform Your Supply Chain to Become Demand-Driven,” cautions that creating a Lean supply chain is a journey, and won’t be easy. The authors write, “Companies striving to become demand-driven must recognize that functional integration is a prerequisite — and that it is extremely difficult to achieve. Fewer than 10% of companies that have assessed their supply chain maturity, rate it as integrated.”

Nevertheless, Gartner recommends enterprises pursue the goal, and many are starting to do just that.

Early Signs of Success

A recent article in Industry Week called “Lean into the Supply Chain” describes several examples of global manufacturers who have taken up the challenge, some with striking success:

  • Pratt & Whitney, the aircraft engine manufacturer, aims to triple jet engine production by 2020, with hundreds of suppliers. To keep control, the company has created an “Operations Command Center,” which gathers and shares information about the delivery status of 400 suppliers worldwide, with early warnings if schedules might slip.
  • USG Corp., which produces and distributes gypsum wallboard, joint compound and related construction products, has trained all 100 employees in its supply chain organization in Lean and Six Sigma; the company saved almost $10 million last year while improving operational efficiencies. Says a spokesman, “If we have a warehouse in one region with stock-out problems, we’ll involve production, transportation, logistics, etc., to solve that problem. We’ll use enterprise value stream mapping from several locations throughout the entire process.”
  • MTU America, a Rolls-Royce Power Systems and Daimler subsidiary, created a 400,000 square-foot aftermarket logistics center. They report “huge gains” in on-time delivery and productivity, but say “the biggest accomplishment has been improved customer satisfaction.”
  • Even healthcare, an industry that traditionally has lagged behind in this kind of technology, is “leaning” its supply chain. Intermountain, a non-profit healthcare system of more than 20 hospitals in the southwest United States, opened a 327,000 square-foot Supply Chain Center equipped with the “latest warehousing technology, such as a new warehouse management system, a cubing and dimensioning system, and an automated conveyor system.” They hoped to save $80 million in five years. Instead, they did it in two!
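The early-warning idea behind a command center like Pratt & Whitney’s can be boiled down to a simple scheduling check. Here is a minimal sketch in Python – the supplier names, dates and buffer value are hypothetical, and a real system would of course draw on live feeds from hundreds of suppliers:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class SupplierStatus:
    name: str
    projected_delivery: date  # supplier's current delivery estimate
    needed_by: date           # date the part is required on the line

def early_warnings(statuses, buffer_days=5):
    """Flag suppliers whose delivery margin is thinner than the buffer."""
    return [
        s.name for s in statuses
        if (s.needed_by - s.projected_delivery).days < buffer_days
    ]

statuses = [
    SupplierStatus("Alpha Castings", date(2015, 1, 10), date(2015, 1, 20)),
    SupplierStatus("Beta Forgings", date(2015, 1, 18), date(2015, 1, 20)),
]
print(early_warnings(statuses))  # → ['Beta Forgings'] (only a 2-day margin)
```

The point isn’t the code itself but the discipline: a shared view of delivery status turns a schedule slip from a surprise into a routine exception to work.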


Supporting and enabling all these efforts, of course, is ever-advancing technology that makes it possible for more and more people and operations to communicate and synchronize. As Paul Myerson, professor of supply chain management at Lehigh University, says in the article, “technology not only enables lean but it can help identify and eliminate waste by substituting information for inventory.”

In other words, success in applying Lean manufacturing methodologies across a distributed global supply chain depends heavily on visibility, control and synchronization of material flows, so that issues can be remedied quickly before they grow into large disruptions.

Based on the experience of these companies, it looks like information is a lot cheaper to store and manage than inventory. And, it certainly can be transported far more easily!

What do you think? Is your enterprise ready to “Lean into the supply chain”?


Dec 11 2014

EQMS: Take it Up Another Notch

In my previous blog post, A Compelling Case for EQMS, I discussed the shift of the EQMS (Enterprise Quality Management Software) business case from streamlining and slashing the costs of quality control processes to preventing expensive and embarrassing quality failures. According to an LNS Research report, the cost of quality failures increases exponentially as detection occurs closer in the process to the end user. In a world of complex global supply chains, tracking a quality failure to its source can be a lengthy, expensive, resource-intensive undertaking.

Three emerging technology trends promise to add value to the preventive potential of EQMS.

The Cloud

Few trends have shaken up legacy computing like the cloud, allowing organizations to delegate everything from infrastructure to software development platforms and entire software deployments to third parties. There can be agility, scalability and cost advantages to cloud computing for many organizations. For the preventive potential of cloud-based EQMS, however, one advantage stands out—extending EQMS reach across the global supply chain.

Early implementations of EQMS added value by centralizing and automating quality control processes across an entire enterprise. In an environment of complex global supply chains, Software as a Service (SaaS) based EQMS implementations can make it much easier for organizations to integrate their suppliers and other partner systems into a single EQMS deployment. It is possible to extend internal systems to partners and suppliers, but SaaS providers can make the process quicker, easier and less expensive – and even make the difference between success and failure – by taking over much of the complexity of management, security and integration. Global providers are also more likely to have datacenters closer to suppliers for better performance and reliability, and the experience and security infrastructure to make sure connections and sensitive data aren’t compromised.


Mobility

EQMS centralizes and streamlines essential quality control processes such as supplier management, Corrective and Preventive Action (CAPA), compliance management, risk management, complaint handling, change management, and auditing. Add mobile devices and mobile EQMS applications, and you have the potential for even more streamlining and workflow acceleration. When an approval is required in a complex workflow, or a quality issue or complaint needs to be handled quickly at a higher level, mobile devices and applications make it possible to reach the right parties immediately, wherever they are, on whatever device they have on hand. The sooner a quality issue is addressed, the less expensive it is likely to be. Thanks to the Bring Your Own Device (BYOD) trend, that device can even be a personal iPhone running a protected corporate EQMS client application. The same goes for partners, suppliers and, in some cases, even customers.

Big Data and Predictive Analytics

Big data is all about mining huge amounts of disparate structured and unstructured information from multiple systems to discover hidden trends and insights that would normally not be available with traditional data analysis methods. Predictive analytics aims to harness such information to predict and address issues before they have any noticeable impact, or to predict the impact of new initiatives.

EQMS platforms are perfect candidates for this type of predictive analytics, as they integrate and exchange information with many other core business systems – potentially including partner and supplier systems as well. Effective predictive analytics can be invaluable not only for addressing supply chain and regulatory issues before they have a significant financial impact, but also for gaining new insight into how quality management processes can be improved.
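As a highly simplified sketch of the idea – the supplier names, defect rates and threshold below are invented for illustration, and real predictive analytics would use far richer models and data – even a basic trend calculation over quality records can surface a supplier worth investigating before a failure reaches the customer:

```python
# Flag suppliers whose recent defect rates are trending upward, so a
# CAPA can be opened before a costly downstream failure occurs.
def trend_slope(values):
    """Least-squares slope of values over equally spaced time periods."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

def at_risk(defect_history, threshold=0.001):
    """Names of suppliers whose defect-rate slope exceeds the threshold."""
    return [name for name, rates in defect_history.items()
            if trend_slope(rates) > threshold]

history = {  # monthly defect rates (fraction of units), oldest first
    "Supplier A": [0.010, 0.011, 0.010, 0.012, 0.011],
    "Supplier B": [0.008, 0.012, 0.015, 0.019, 0.024],
}
print(at_risk(history))  # → ['Supplier B']
```

Supplier A is noisy but flat; Supplier B’s steady climb is exactly the kind of pattern that is invisible in any single inspection report but obvious once the data is pooled in one place.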

All of these technologies are in their early stages, so choosing and implementing the right solution has its challenges. However, as the market matures and business cases become more widely proven, the impact on how EQMS can operate already looks very compelling.


Dec 09 2014

Use a “Digital Twin” to Ensure Products are Built to Design

I recently published a white paper that takes a closer look at the concept of a “Digital Twin,” and how it can be used to help close the loop between production and design. What follows is a brief extract from the paper. Those interested in reading the full paper can do so here.

Much has been written about the process of transferring digitally rendered designs to the shop floor so the right product is built. The challenge is that quite often what was designed isn’t actually what gets built. Inevitably, issues occur: equipment doesn’t perform as planned, incorrect work instructions are used, or user errors occur.

The Concept

A virtual, digital equivalent to a physical product, or a Digital Twin, was introduced in 2003 at my University of Michigan Executive Course on Product Lifecycle Management (PLM). At that time, digital representations of actual physical products were relatively new and immature. The information collected about the physical product as it was being produced was limited, manually collected and mostly paper-based.

Virtual products are rich representations of products that are virtually indistinguishable from their physical counterparts. The rise of Manufacturing Execution Systems (MES) has resulted in a wealth of data that is collected and maintained on the production and form of physical products. In addition, this collection has progressed from being manually collected and paper-based, to being digital and collected by a wide variety of physical non-destructive sensing technologies.

Three Parts to the Model

The Digital Twin concept model, as illustrated above, contains three main parts: a) physical products in Real Space, b) virtual products in Virtual Space, and c) the connections of data and information that tie the virtual and real products together. In the decade since this model was introduced, there have been tremendous increases in the amount, richness and fidelity of information about both the physical and virtual products.

On the virtual side, much more information is now available. Models rich in behavioral characteristics can not only visualize a product, but can also be tested for performance capabilities. On the physical side, we now collect much more data about the physical product. Actual measurements from automated quality control stations, and data from the machines that produced the physical part, are now readily available to understand exactly what operations, at what speeds and forces, were applied.

Unifying the Virtual and Real Worlds

The amount and quality of information about the virtual and physical product have progressed rapidly in the last decade. The issue is that the two-way connection between real and virtual space has been lagging behind. Global manufacturers today either work with the physical product or with the virtual product. Historically, we have not developed the connection between the two products so that we can work with both of them simultaneously. This shortcoming, however, may soon go away.

In order to deliver the substantial benefits to be gained from this linkage between virtual and physical products, one solution is to have a Unified Repository (UR) that links the two products together. Both virtual development tools and physical collection tools could populate the Unified Repository, creating a two-way connection between the virtual and physical product.

On the virtual tool side, design and engineering would identify characteristics, such as dimensions, tolerances, torque requirements, hardness measurements, etc., and place a unique tag in the virtual model that would serve as a data placeholder for the actual physical product. Included in the tag would be the as-designed characteristic parameter.

On the physical side, these tags would be incorporated into the MES in the Bill of Process creation at the process step where they will be captured. As the processes are completed on the factory floor, the MES would output the captured characteristic to the UR.

The final step would be to incorporate this information back into the factory simulation. This would turn the factory simulation into a factory replication application. Instead of simulating what should be happening in the factory, the application would be replicating what actually was happening at each step in the factory on each product. Many interesting use cases could then be possible by leveraging this digital twin, which could then contribute to improving overall manufacturing excellence.
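The tag-and-capture flow described above can be sketched in a few lines of Python. This is a minimal illustration of the Unified Repository idea – the tag IDs, characteristics and values are hypothetical, and a real implementation would involve far richer data models and MES integration:

```python
# Sketch of a Unified Repository: the virtual (design) side registers tagged
# as-designed characteristics; the MES reports the as-built value captured at
# the matching process step; the repository can then report deviations.
class UnifiedRepository:
    def __init__(self):
        self.tags = {}      # tag_id -> (name, nominal, tolerance)
        self.measured = {}  # (serial, tag_id) -> as-built value

    def register_tag(self, tag_id, name, nominal, tolerance):
        """Called by the virtual design/engineering tools."""
        self.tags[tag_id] = (name, nominal, tolerance)

    def capture(self, serial, tag_id, value):
        """Called by the MES when the process step completes."""
        self.measured[(serial, tag_id)] = value

    def deviations(self, serial):
        """As-built values outside the as-designed tolerance for one unit."""
        out = []
        for (s, tag_id), value in self.measured.items():
            if s != serial:
                continue
            name, nominal, tol = self.tags[tag_id]
            if abs(value - nominal) > tol:
                out.append((name, nominal, value))
        return out

ur = UnifiedRepository()
ur.register_tag("T-001", "bolt torque (Nm)", 25.0, 1.5)
ur.register_tag("T-002", "bore diameter (mm)", 12.00, 0.05)
ur.capture("SN-1001", "T-001", 27.2)   # outside 25.0 +/- 1.5
ur.capture("SN-1001", "T-002", 12.02)  # within tolerance
print(ur.deviations("SN-1001"))  # → [('bolt torque (Nm)', 25.0, 27.2)]
```

The essential point is the two-way link: the same tag identifies a characteristic in the virtual model and in the Bill of Process, so every as-built measurement lands next to its as-designed counterpart.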

To read about these specific use cases, as well as further details on this concept, read the rest of the paper here.

Dr. Michael Grieves has published several books on this topic, which can be found here.


Dec 03 2014

Wearable Technology is a Natural Fit in Manufacturing


Photo credit: Thomas Hawk via photopin cc

Sometimes I think the manufacturing revolution taking place today, driven by information technology, has barely started. What if we’re only just beginning to see what the factory of the future will look like?

I use the word “see” literally, because technology like Google Glass is bound to seep into manufacturing sooner or later – probably sooner.

Consider how companies are already using this technology outside of manufacturing. For example, Virgin Atlantic has instituted an innovative customer service program at London’s Heathrow Airport, where workers wear Google Glass devices linked to the airline’s customer database. The results are either spooky or exciting, depending on how you feel about this sort of thing. Staff can use the technology to greet every customer by name, provide updates on flight information, and talk about events or the weather at the customer’s destination. The technology can even translate information into other languages. The airline says it is trying this out in response to a study showing that customers want a better flying experience.

This is just one example of how the Internet of Things might have an impact on user experiences. Whether or not this particular experiment succeeds, one thing is clear: as we evolve towards a society where the user experience is “king,” any opportunity that companies have for differentiation should be seriously considered. Which brings us back to manufacturing.

Wearables for Process Improvement

It’s easy to imagine some very exciting applications on the factory floor. Think of a quality inspector with Google Glass or other “wearable” technology walking the production line and seeing live performance metrics about the actual running process he’s looking at (somewhat like the Terminator’s visual readouts in the sci-fi movie series). The inspector could call up any relevant information, perhaps by voice command or by eye movement, to see specifications, defect patterns, or anything else desired.

Watch this Autoline Video showing BMW testing Google Glass for Quality Improvement.

Another application might be in the warehouse, where workers could see maps with locations of any item needed, as well as where to deliver it. This could reduce wasted time in moving supplies, and could greatly aid Just-in-Time manufacturing schemes.

It might even be possible to add some kind of learning capability to these applications, so the user experience becomes more and more personalized and helpful over time.

Wouldn’t you want to do business with (or work at) a company that could offer this kind of innovative experience? It would be more than a manufacturing tool—it would be a sales tool as well. A manufacturer that wanted to impress customers with its commitment to quality could have them don a pair of glasses and tour the factory, receiving live readouts on every important activity and event. Here’s our quality performance, right before your eyes!

I’m sure I’ve only scratched the surface on this topic. There’s no reason to think the manufacturing transformation now underway is anywhere close to its endpoint, or that it even has one! Every new technology that comes along, if it can improve the user experience, is a candidate for manufacturing applications. We’d better get used to the idea that today’s fad may well become tomorrow’s manufacturing advantage.




Nov 21 2014

Simulating Success in the Automotive Manufacturing Industries

Continuing from my prior post, another example of the role that digital manufacturing and simulation plays comes from the Automobili Lamborghini Advanced Composite Structures Laboratory (ACSL) at the University of Washington in Seattle (USA), which blends aerospace and automotive composite development. Working with Boeing and the US Federal Aviation Administration (FAA), the ACSL improves certification of new composite materials and structures, often based on proven virtual testing principles pioneered for Lamborghini automobiles.

ACSL and Boeing collaborated on advanced analysis methods for predicting the crash performance of the all-composite monocoque of Lamborghini’s Aventador automobile. Aventador passed its crash-test certification on the first try; previous models required two or three tests. At $1 million per crash, savings were substantial, even without factoring in time and cost saved by not building additional test vehicles. See figure above.

A Complete Paradigm Shift

While such programs go beyond industry standards in employing virtual testing, Dr. R. Byron Pipes, John Bray Distinguished Professor in the College of Engineering at Purdue University (USA), believes they don’t go far enough.

The current trend in virtual testing of new composites is only an incremental improvement, Pipes believes, not the complete paradigm shift needed to unshackle composite development. “We are still struggling with empirical-based manufacturing and (physical) testing-based certification,” he said. “It costs $100 million per material to qualify composites to fly on a new airframe. Once certified, materials changes are economically impossible.”

Dr. Pipes describes composite development today as dominated by experiments and only aided by analysis. “We have the computational power to change this paradigm and replace thousands of (physical) tests with robust multi-scale simulation of manufacturing and performance,” he said. “Only then will we enable innovations in materials composition and processing without repeated costly recertification.”

Reducing Uncertainty

Today, manufacturers physically test every element before it is assembled and every part before it goes on an airplane, contributing to unsustainable development cycles and costs. “You will never totally escape the need for (physical) testing to validate models, but we must address the issue of certainty in simulation results, or rather, how to manage uncertainty,” Dr. Pipes said. “Simulation tools can guide understanding of uncertainty in design and also how it propagates.”

Using virtual simulations, Cobham Life Support reduced destructive tests on a NASA fuel tank by 50%, saving $500,000.

To demonstrate the potential of the approach, Dr. Pipes cites the US National Nuclear Security Administration (NNSA).

Due to the US moratorium on nuclear device testing, NNSA, a division of the US Department of Energy, cannot conduct full-scale physical performance testing. “About 15 to 20 years ago, we defined a road map of what was needed to achieve simulation-based certification,” said Dr. Mark Anderson, technical advisor to the NNSA from Los Alamos National Laboratory, a US government-supported research agency. Key elements of this road map include a transition to a validated predictive capability based on multi-scale, physics-based computer simulation, and quantification of uncertainty in NNSA’s simulation tools.

Balancing Physical and Virtual

Dr. Anderson believes composites modeling can be advanced by adapting the NNSA approach. “For most industries, what would be the most appropriate is a balance between the historical testing-based approach and this simulation/uncertainty quantification based approach,” Anderson said. He notes that although significant theory has gone into composite industry models, many still use a simple mathematical description that fits empirical test data.

Uncertainty quantification (UQ) involves managing both parametric uncertainty and model-form uncertainty. “There is an investment to be made up front, both in time and money,” Dr. Anderson said. “But by building simulation capability, it is possible to reduce testing costs from $500,000 to $100,000, for example.” He notes that US-based automotive maker General Motors has used UQ in crash-test simulations and that NASA is incorporating it into the space agency’s simulation tools to aid with tests it cannot perform physically, such as reactions in a space environment or full-structure tests that are beyond the scope of its current budget.

The result is the potential for “robust design” – high performance without the overdesign needed to compensate for uncertainty. Robust design factors uncertainty directly into the model, producing designs that are less sensitive to uncertainty, with less bet-hedging overdesign.
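The parametric side of uncertainty quantification can be illustrated with a toy Monte Carlo estimate – the distributions and numbers below are invented for illustration, not drawn from any of the programs described above. Instead of a single deterministic check with a blanket safety factor, the uncertain inputs are sampled and the probability of failure is estimated directly:

```python
import random

def failure_probability(n=100_000, seed=42):
    """Monte Carlo estimate of P(stress > strength) for notional inputs."""
    rng = random.Random(seed)  # fixed seed for a reproducible estimate
    failures = 0
    for _ in range(n):
        strength = rng.gauss(500.0, 25.0)  # material strength, MPa
        stress = rng.gauss(420.0, 20.0)    # applied stress, MPa
        if stress > strength:
            failures += 1
    return failures / n

print(f"estimated failure probability: {failure_probability():.4f}")
```

Here the nominal margin (500 vs. 420 MPa) looks comfortable, yet the scatter in both inputs still yields a small but quantifiable failure probability – exactly the kind of number a robust design process can target directly, rather than burying it under a uniform overdesign factor.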


This excerpt was originally published in Dassault Systèmes’ Compass magazine, and was used with permission. Read the full article here.

