Quality 4.0 is a vital part of Industry 4.0, the move toward digital transformation and recession-proofing your business – now more important than ever. In the smart factory, information plays a key role, and proper data management relies on having the correct people, processes, and technology in place. The 2022 LNS Research Quality 4.0 Analytics report found that more companies are approaching industrial transformation (IX) with the goal of finding “hard-benefit” use cases with a clear ROI in direct savings in labor, material, or equipment expenses.
Many organizations are investing time and money into data analytics, but the bulk of that investment goes into the analytics tools themselves – not into the underlying data. It’s critically important for quality, as a function, to have the right technology in place to handle all types of data at every level.
The LNS Research study also found that 58% of companies are deploying industrial analytics solutions, but only 11% have realized dramatic business impacts. There are several reasons for this, but one of the main ones is data management.
Here, we will highlight some of the key findings from the Quality 4.0 Analytics: A Data Hub Approach to Quality Management and Execution report, including the challenges presented by processes that no longer work and how an industrial data hub solves myriad issues.
Examples of Quality Management Practices and Challenges
Current quality management practices don’t always support the transition to Quality 4.0. Pursuing the high ROI, hard-benefit use cases, unlike “low-hanging fruit” use cases, requires sophisticated data connectivity and integration capabilities that many companies lack. This presents challenges for organizations looking to improve efficiency while using data to improve performance across the business. Here are some of the challenges manufacturers are facing that prevent them from attaining the dramatic business impacts from analytics.
Paper-Based Processes
Paper-based processes are, at best, outdated and, at worst, damaging to a business. Using physical documents to store data is highly inefficient. The information on them tends to be inconsistent, prone to errors, and difficult to cross-reference, and recording and revisiting this data is time-consuming. And yet, almost a quarter of the surveyed companies still use paper in their quality processes.
Digitizing quality data is challenging because it requires a shift in process, a change in data storage, and a new mindset. Just because paper-based processes have worked in the past doesn’t mean they always will. In fact, they’re almost guaranteed to fail you if they haven’t already.
File-Based Systems
Paper isn’t the only culprit regarding inefficient quality management and data collection – using spreadsheets, Word documents, PowerPoint files, and other flat file-based systems is also sub-optimal. Digital files mitigate some of the issues with physical paper documents, but they’re ill-equipped for analysis or for providing supporting information during audits or inspections.
Additionally, they can’t support the workflows necessary for quality process automation, and Quality 4.0 leaders are swiftly moving away from them as viable quality management methods.
IT and Relational Database Systems
Most commonly, organizations manage their quality data through multiple IT systems. Sometimes these disparate systems have redundant capabilities that create inefficiencies and inconsistent data. Other times applications serve different teams and require integration with the quality management system, including enterprise-level PLM, ERP, SCM, CRM systems, plant-level MES, data historians, SPC, LIMS systems, and so on.
Many businesses are familiar with these legacy systems, so they choose to continue using them. The problem is that this kind of network suffers from integration issues, poor data quality, and access restrictions, thanks to each system’s disparate, siloed nature. Fortunately, QMS software, such as ETQ Reliance, can integrate these systems, enriching quality data across all stages of the product lifecycle.
How an Industrial Data Hub Opens New Opportunities for Quality 4.0 Analytics
For organizations dedicated to improving data collection and management, a data hub approach enables companies to maintain a single source of truth across IT and OT (Operational Technology) data, avoid custom integrations, and overcome other challenges with traditional data management.
Functionalities
The LNS Quality 4.0 report explains how an industrial data hub strategy addresses data management challenges and helps Quality 4.0 teams pursue the complex hard-benefit use cases.
The report points out that an industrial data hub is complex: the hub needs several core functionalities to effectively collect, store, condition, and govern multiple data types in one location. These functionalities include:
- Orchestration: Handling batch, streaming, and intermittent data simultaneously
- Synchronization: Consolidating multiple types of data to ensure consistency over time
- Conditioning: Screening, filtering, interpolating, and other data conditioning functions
- Contextualization: Establishing relationships between the data and its metadata
- Persistence: Maintaining the relationships as source data changes
- Access: Providing access to data consumers in the format they need to consume it
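To make the conditioning and contextualization functionalities above concrete, here is a minimal sketch in Python. All names and data shapes are illustrative assumptions, not taken from the LNS report: it screens out-of-range readings, fills gaps with the mean of the nearest valid neighbors (a crude stand-in for time-based interpolation), and attaches source metadata so consumers receive data with its context.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Reading:
    """A single hypothetical sensor sample from a plant-floor source."""
    timestamp: float
    value: Optional[float]  # None marks a missing sample

def condition(readings, lo, hi):
    """Screen values outside [lo, hi], then fill each gap with the
    mean of its nearest valid neighbors."""
    cleaned = [
        r.value if r.value is not None and lo <= r.value <= hi else None
        for r in readings
    ]
    filled = []
    for i, v in enumerate(cleaned):
        if v is None:
            # Nearest valid neighbor on each side, if any.
            prev = next((cleaned[j] for j in range(i - 1, -1, -1)
                         if cleaned[j] is not None), None)
            nxt = next((cleaned[j] for j in range(i + 1, len(cleaned))
                        if cleaned[j] is not None), None)
            if prev is not None and nxt is not None:
                v = (prev + nxt) / 2
            else:
                v = prev if prev is not None else nxt
        filled.append(v)
    return filled

def contextualize(values, metadata):
    """Attach source metadata (e.g., line, machine, unit) to each value."""
    return [{"value": v, **metadata} for v in values]
```

In a real hub these steps would run continuously against streaming and batch sources; the sketch only shows the shape of the transformation each reading goes through before a consumer sees it.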
The hub creates symbiotic relationships between existing applications and data lakes, rather than containing any analytics functionalities of its own. This unlocks richer analytics on EQMS and other systems and enables businesses to manage data quality through basic visualization tools.
Use Cases
The industrial data hub enables manufacturers to pursue novel use cases by bringing together quality IT and OT data. IT-OT convergence remains a challenge in most organizations because it requires bringing quality management and quality execution together.
The data hub provides access to the right people within the necessary timeframe while enabling companies to mandate proven data governance best practices, including pushing data responsibilities as close to the data source as possible. This ensures data owners are not only custodians but also active data users.
These capabilities should support today’s quality teams to scale their analytics approach and allow them to pursue complex, hard-benefit use cases.
Solve Complex Quality 4.0 Use Cases with a Data Hub
Quality as a function spans various levels and requires many types and formats of data to perform meaningful analytics. This complexity is far beyond what traditional quality data management processes can handle. As such, a data hub approach is needed to overcome modern challenges and enable quality leaders to pursue Quality 4.0 use cases, maximizing ROI.
The benefits of an industrial data hub include unlocking the power of big data analytics for contextualized quality data from all stages of the product lifecycle. A data hub is designed to provide a single source of truth across all IT and OT systems. On the IT side, a data hub allows quality engineers to perform advanced analytics on manufacturing, quality, supplier, and customer information to reduce supplier defects, warranty accruals, and more. On the OT side, a data hub enables use cases such as dynamic work instructions for plant operators and the ability to correlate SPC data with other data points to gain insights.
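As a small illustration of the OT-side use case, correlating SPC measurements with another plant data point can start with a plain Pearson coefficient. The sketch below is a hypothetical example – the variable names and numbers are made up – showing how hub data pulled as two aligned series might be tested for a relationship between line speed and dimensional variation.

```python
import statistics

def pearson(xs, ys):
    """Pearson correlation coefficient between two aligned series."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Made-up series: machine line speed vs. SPC diameter deviation.
line_speed = [100, 110, 120, 130, 140]
deviation = [0.01, 0.02, 0.02, 0.04, 0.05]
r = pearson(line_speed, deviation)  # close to +1 suggests speed drives variation
```

A value near +1 or -1 would justify a deeper look (e.g., a designed experiment); the point is that this kind of cross-source correlation is only possible once IT and OT data live in one contextualized place.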
Industrial data hubs don’t exist to replace data management systems – they collect and aggregate data from disparate sources and prepare it for other applications to consume. This shift requires deep transformation of the underlying processes, along with support from the right people and capabilities. Additionally, data lakes need rigorous data governance processes to maximize returns.
To learn more about data hubs and their benefits, watch the webinar, Quality 4.0 Analytics: A Data Hub Approach to Quality Management and Execution, and download the LNS Research report today.