This article by Seth Earley was originally published on SmartIndustry.
Manufacturing 4.0 is exciting stuff, promising great value. But we must ask ourselves: what needs to be in place to correctly capture, organize, and act upon the data streaming through IoT-enabled devices and equipment in ever-increasing volume?
To answer that question, let’s discuss four foundational elements required for Manufacturing 4.0 and identify the capabilities that need to be developed to successfully navigate this next evolution.
Optimizing the digital workplace
The “digital workplace” holds the technology that employees work with day to day. It’s where they find answers, manage their work, and interact with colleagues. A factory worker needs to access policies, procedures, time tracking, support requests, maintenance logs, quality issues, process knowledge, troubleshooting materials, and so on. Making this information readily available and streamlining the information processes needed on the job is a necessary part of Manufacturing 4.0. A connected factory requires faster information flows, which lead to faster turnaround times, decreased downtime, better quality, and greater efficiencies.
Just as organizations map out their customer journeys (their interactions as they go about consuming products and services), the employee journey needs to be mapped out to identify bottlenecks, information and knowledge gaps, and other challenges that people encounter as they perform their jobs. Removing the “acts of heroics” needed to get the job done means less burnout and greater agility, flexibility, and efficiency. Journey mapping also surfaces where employees need access to specific information, knowledge sources, or applications that are not readily available.
Removing these friction points is a foundational element of improved connectivity and faster decision-making. Consider the new-product lifecycle and its various roles and data sources. Digital tools can assist from ideation, to design iteration, to virtual prototype development and performance measurement using a well-modeled virtual version of the product. Once the product is manufactured (using a design optimized for equipment, materials, and constraints), prototype performance can be measured with sensors and compared against the virtual model’s predictions. At each stage, collaborative, design, and modeling applications assist designers and engineers in their tasks.
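To make that comparison concrete, here is a minimal sketch, assuming hypothetical metric names, tolerances, and values, of how measured prototype data might be checked against a virtual model’s predictions so that deviations are flagged for investigation:

```python
# Minimal sketch: compare measured prototype data against a virtual model's
# predictions. Metric names, tolerances, and values are hypothetical.

predicted = {"vibration_hz": 42.0, "temp_c": 71.5, "power_kw": 3.2}
measured = {"vibration_hz": 43.1, "temp_c": 71.8, "power_kw": 3.9}
tolerance = {"vibration_hz": 0.05, "temp_c": 0.02, "power_kw": 0.10}  # relative

for metric, expected in predicted.items():
    actual = measured[metric]
    deviation = abs(actual - expected) / expected  # relative deviation
    if deviation > tolerance[metric]:
        print(f"INVESTIGATE {metric}: predicted {expected}, "
              f"measured {actual} ({deviation:.1%} off)")
    else:
        print(f"OK {metric}: within {tolerance[metric]:.0%} of prediction")
```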
Measuring ROI on digital investments
There are always demands on resources, time, and money that need to be prioritized. If there is no clear ROI on a data project, but there is a piece of equipment that can speed throughput and have a measurable impact, the organization will go with the clear ROI over a fuzzy “cost of doing business.” Getting that ROI for digital investments requires an understanding of the specific processes to be impacted and baseline measures of those processes.
Some projects may be billed as “infrastructure”; however, they should still be tied to an expected improvement in some process, even if that process is disconnected from direct ROI measures. Getting to that level of granularity means mapping data, content, and knowledge sources to the processes they support and aligning them with business outcomes.
Measures of data quality and completeness can fall under this framework as long as there is a clear connection between the data and the processes. To do this, systems must first be instrumented for baseline metrics.
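As a simple illustration, a baseline-first approach can be as lightweight as recording each supported process and its metric before and after the project. The sketch below, with hypothetical process names and numbers, shows the idea:

```python
# Sketch: tie a data project to baseline process metrics so ROI is measurable.
# Process names, metrics, and values are hypothetical.
from dataclasses import dataclass

@dataclass
class ProcessMetric:
    process: str      # business process the data project supports
    metric: str       # what is measured, with units (lower is better here)
    baseline: float   # measured before the project
    current: float    # measured after the project

    def improvement(self) -> float:
        """Relative improvement versus the baseline."""
        return (self.baseline - self.current) / self.baseline

metrics = [
    ProcessMetric("order-to-ship", "cycle time (hours)", 56.0, 41.0),
    ProcessMetric("line changeover", "downtime (minutes)", 90.0, 75.0),
]

for m in metrics:
    print(f"{m.process}: {m.metric} improved {m.improvement():.0%} vs baseline")
```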
Building an extensible enterprise architecture
This element goes by various names: reference architecture, master data, knowledge graph, ontology, data standards, and others. A dictionary of terminology and data structures is essential so that when new systems are configured or redesigned, a consistent list of approved terminology and data standards can be used to prevent interoperability problems in the future. This step can begin today with an in-flight project, by looking at business concepts that are important to the enterprise and building a gating process for approval whenever new terminology or data elements are needed.
For example, when an organization redesigns its ecommerce experience, a new product taxonomy is typically developed. That taxonomy should not be reinvented for the next project that needs product categories and classifications. Sound methods for deriving and validating a reference architecture will ensure that the correct decisions are made and that the output remains usable for the foreseeable future (with updates and maintenance, of course, using a disciplined, data-driven approach).
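A minimal sketch of such a gating process, with a hypothetical vocabulary (real implementations usually live in a master-data or taxonomy-management tool):

```python
# Sketch of a gated terminology registry: new systems draw from approved
# terms; anything unrecognized is queued for governance review first.
# The vocabulary here is hypothetical.

APPROVED_TERMS = {"pump", "motor", "reaction vessel", "heat exchanger"}
PENDING_REVIEW = set()

def request_term(term):
    """Return an approved term, or queue a new one for review."""
    normalized = term.strip().lower()
    if normalized in APPROVED_TERMS:
        return normalized
    PENDING_REVIEW.add(normalized)  # gate: not usable until approved
    raise ValueError(f"'{normalized}' is not in the approved vocabulary; "
                     "queued for review")

print(request_term("Pump"))              # resolves to the approved "pump"
try:
    request_term("centrifugal blower")   # new term: gated, not yet usable
except ValueError as err:
    print(err)
print("awaiting approval:", PENDING_REVIEW)
```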
Defining an enterprise architecture may have to happen incrementally; however, it has to be done with a holistic view of the organization’s data and application infrastructure. This approach will mitigate some of the data quality issues that would otherwise arise down the road.
A project of this scope must include measurable processes that have, or can be instrumented for, baseline metrics. For example, one continuous-process manufacturer wanted to use machine learning to optimize energy consumption. The plant had 150,000 pieces of equipment, from pumps and motors to reaction vessels and piping, all producing sensor data. The data formats varied, and there were tens of thousands of variables. That data could be ingested with normalization rules, but not every piece of equipment had a correct entry in the master equipment hierarchy. This meant that some data was orphaned: there was no way to put it into the correct context for action. The algorithm could process the data, but acting on it effectively was not practical. When an operating parameter was out of range, where was that equipment in the plant? What safety and maintenance procedures needed to be referenced and possibly updated? Who was the owner? What was the repair history?
A knowledge architecture was needed to act on and apply the results of the analysis. Once it was implemented, the measurable results were faster maintenance turnaround, quicker response to issues, optimized equipment performance, and ultimately reduced power expenses.
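A simplified sketch of the underlying pattern: each reading is resolved against the master equipment hierarchy, orphaned readings are flagged for remediation, and resolved readings pick up the context (location, owner, procedures) needed to act. All identifiers and data here are hypothetical:

```python
# Sketch: resolve sensor readings against a master equipment hierarchy so
# out-of-range values carry actionable context. IDs and data are hypothetical.

MASTER_HIERARCHY = {
    "PMP-0042": {"location": "Unit 3 / Line B", "owner": "maintenance-team-2",
                 "procedures": ["SOP-117 pump shutdown", "LOTO-9 isolation"]},
    "MTR-0815": {"location": "Unit 1 / Line A", "owner": "maintenance-team-1",
                 "procedures": ["SOP-201 motor inspection"]},
}

readings = [
    {"equipment_id": "PMP-0042", "metric": "discharge_pressure",
     "value": 9.7, "limit": 8.0},
    {"equipment_id": "XX-UNKNOWN", "metric": "temperature",
     "value": 130.0, "limit": 120.0},
]

for r in readings:
    context = MASTER_HIERARCHY.get(r["equipment_id"])
    if context is None:
        # Orphaned data: nothing to act on until the hierarchy entry is fixed.
        print(f"ORPHANED: {r['equipment_id']} has no master-hierarchy entry")
    elif r["value"] > r["limit"]:
        print(f"ALERT {r['equipment_id']} at {context['location']}: "
              f"{r['metric']} = {r['value']} exceeds {r['limit']}; "
              f"notify {context['owner']}, consult {context['procedures']}")
```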
Savings across these areas added up to millions annually in power and maintenance costs, as well as reduced downtime. In some cases, equipment failures were predicted and prevented through an orderly shutdown of the line, replacement or repair, and a restart, avoiding the problems that a failure mid-process would have caused.
Tuning the customer experience
Many B2B manufacturers are still doing business the old way: relationships, phone calls, and even paper catalogs. But that is no longer sustainable or viable. Expertise is aging out and retiring. New competencies are hard to find in the marketplace; the old career paths that built tacit knowledge are not as attractive to new generations of employees. It is simply not possible to scale a highly specialized expert. Even for low-expertise transactions, the ecommerce and web experience needs to be upgraded to attract and keep customers. Digital natives do not like to pick up the phone, and self-service needs to replace costly account reps and salespeople. Some organizations have already ceded knowledge of the customer to Amazon or other marketplaces.
The way to compete with larger rivals is to fine-tune your digital experience to align with the mental model of your customers: understand the details of how they want to interact, what they look for, how they look for it, and what additional information they need to complete their transaction.
Offload routine, repeatable, low-touch activities to digital channels, including the judicious use of virtual assistants and chatbots. There is a tremendous amount to discuss here, but you can start with “helper bots” that assist employees and call-center staff in doing their jobs, then extend to repeatable, low-complexity tasks. Maturity in this space requires maturity in knowledge processes, content, product information, and the customer journey, so there are many dependencies.
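A helper bot can start as something very simple: retrieval over existing knowledge content. The sketch below uses naive keyword overlap against a hypothetical knowledge base just to illustrate the pattern; a production bot would add real retrieval, dialogue handling, and escalation to a human:

```python
# Sketch of a minimal retrieval-style helper bot: match a question against
# existing knowledge content by keyword overlap. Content is hypothetical.

KNOWLEDGE_BASE = {
    "How do I reset the conveyor controller?":
        "Power down at panel C4, wait 30 seconds, then restart per SOP-88.",
    "Where do I log a quality issue?":
        "Use the QA portal under Operations > Quality > New Issue.",
    "How do I request maintenance?":
        "Submit a ticket in the CMMS; urgent issues go to the shift lead.",
}

def answer(question):
    """Return the best-matching answer, or escalate if nothing matches well."""
    q_words = set(question.lower().split())
    scored = [(len(q_words & set(k.lower().split())), a)
              for k, a in KNOWLEDGE_BASE.items()]
    score, best = max(scored)
    return best if score >= 3 else "No good match; routing to a human expert."

print(answer("where do I log a quality issue"))
```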
Manufacturing 4.0 is about the ubiquity of intelligence throughout our physical world. That intelligence begins as sensor data tracking performance, functionality, and usage. The sensors interact with other devices so that larger ecosystems of applications can act as a connected organism, optimized along many dimensions. Sensors in soil and on crops provide data to systems and machinery for irrigation, fertilization, and harvesting. Each of those systems provides data that, along with weather and field data, can be tuned to optimize production efficiency, minimize waste, reduce pollutant runoff, and conserve water supplies. A manufacturer of farm equipment can now extend its value proposition to production optimization, not just high-performance equipment.
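To make that data flow concrete, here is a simplified sketch of the kind of rule such a system might apply; the thresholds, field names, and values are hypothetical, and real systems use far richer models:

```python
# Sketch: combine soil-sensor readings and weather forecasts into an
# irrigation decision. Thresholds, names, and data are hypothetical.

fields = [
    {"name": "north-40", "soil_moisture_pct": 18.0, "rain_forecast_mm": 1.0},
    {"name": "south-12", "soil_moisture_pct": 24.0, "rain_forecast_mm": 12.0},
]

MOISTURE_TARGET_PCT = 22.0  # below this, the crop is water-stressed
RAIN_OFFSET_MM = 8.0        # enough forecast rain to skip irrigation

for f in fields:
    dry = f["soil_moisture_pct"] < MOISTURE_TARGET_PCT
    rain_coming = f["rain_forecast_mm"] >= RAIN_OFFSET_MM
    if dry and not rain_coming:
        print(f"{f['name']}: irrigate ({f['soil_moisture_pct']}% moisture, "
              f"only {f['rain_forecast_mm']} mm rain forecast)")
    else:
        print(f"{f['name']}: skip irrigation")
```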
Similarly, other manufacturers are selling use of equipment, including maintenance and performance guarantees, instead of the equipment itself. Ask what the overall objective of the customer is, and consider how sensors and feedback can go beyond the product to produce the outcomes that they are looking for.
Of course, this requires data-science capabilities and the ability to manage internal data, as well as the commitment to take on these massive flows of customer and performance data. Begin by understanding the current state of data and information in the organization. Start with projects that show short-term payoff but also support longer-term objectives. Build a cost-effective, sustainable path to a harmonized information environment.
This approach will provide the capabilities your organization will need to ensure its survival in the future.