Manufacturing Analytics

Assembly Line

Automate plant maintenance using MDE with ABAP SDK for Google Cloud

📅 Date:

✍️ Authors: Manas Srivastava, Devesh Singh

🔖 Topics: Manufacturing Analytics, Cloud Computing, Data Architecture

🏢 Organizations: Google, SAP, Litmus

Analyzing production data at scale is always a challenge, especially when there's data from multiple production facilities involved, with thousands of assets in production pipelines. To help solve this challenge, our Manufacturing Data Engine is designed to help manufacturers manage end-to-end shop floor business processes.

Manufacturing Data Engine (MDE) is a scalable solution that accelerates, simplifies, and enhances the ingestion, processing, contextualization, storage, and usage of manufacturing data for monitoring, analytical, and machine learning use cases. This suite of components can help manufacturers accelerate their transformation with Google Cloudโ€™s analytics and AI capabilities.

Read more at Google Cloud Blog

How Data-Powered 3D Printers Will Change Manufacturing

📅 Date:

✍️ Author: Julie Van Der Hoop

🔖 Topics: Additive Manufacturing, 3D printing, Manufacturing Analytics

Similar to how autonomous vehicles collect and apply data to continuously improve a car's ability to drive, connected 3D printers can use collected data for artificial intelligence-powered automation. During each print job, 3D printers produce large quantities of data that are sent to and stored in the cloud. The print job data—ripe for AI, machine learning, and automation-based product features—can then be fed to algorithms, which printers and users can access through the cloud. Among other things, these data help businesses make decisions about what parts to print and how best to print them, while improving the quality of print jobs.

Read more at Quality Digest

Enhancing CTQ Parameters with Traciviss - Traceability Software

📅 Date:

🔖 Topics: Manufacturing Analytics

🏢 Organizations: Dataviss

In the dynamic landscape of the Indian automotive industry, efficiency, precision, and quality are paramount. Our client, a distinguished commercial automotive manufacturer based in Ennore, Chennai, sought to optimize their production process and enhance their critical-to-quality (CTQ) parameters—Nut Runners Availability, Torque Quality, and Vision Cot Pin Identification. To achieve this, they engaged Traciviss, an AI-driven traceability software solution, in conjunction with Rockwell PLC integration provided by MXHub Technocare. This innovative partnership aimed to streamline the production line, reduce errors, and automate the entire axle assembly process.

Read more at Dataviss Case Studies

Launch of Smart Manufacturing Cell Transforms Rochester Operations

📅 Date:

🔖 Topics: Manufacturing Analytics

🏢 Organizations: L3Harris, LightGuide, Mountz, Cognex

L3Harris is driving toward fully controlled and paced production of tactical radios with the launch of its first Smart Manufacturing Cell in its Rochester, New York, facilities, which streamlines assembly processes so the company can continue to meet customer demands and delivery schedules for critical communication devices.

The answer to the company's current and future needs was the implementation of Smart Manufacturing Cell production. SMC is an Industry 4.0-level assembly process where control technologies, such as LightGuide augmented reality, Mountz precision torque drivers and Cognex® machine vision inspection, are integrated into one common platform by WorkSmart Systems. This capability delivers a line-agnostic station where different products with the same process can be built without requiring device-specific configurations when switching between production lines. Further, the system itself collects data including who worked on a specific unit and at what time for troubleshooting and root-cause analysis of potential defects found later in internal testing.

Read more at L3Harris Newsroom

Using Industrial Automation to Monitor Vertical Farms

📅 Date:

✍️ Author: Qusi Alqarqaz

🔖 Topics: Manufacturing Analytics

🏢 Organizations: Nokia, AeroFarms

The adoption of artificial intelligence and machine learning algorithms now allows analysis of the vast amounts of data collected from sensors to enable predictive analytics. Farmers can make more informed decisions about managing crops, optimizing resource usage, and predicting yields.

AeroFarms and Nokia discussed how to build a system to monitor a vertical farm where leafy greens including arugula, bok choy, and kale are grown. A typical facility can produce more than 1 million kilograms of leafy greens annually. A 13,000-square-meter facility such as the AeroFarms one in Danville is so large that workers can't physically check all the plants. “Because the growth cycle in indoor farming is much shorter than outdoor farming, it is very important to know what's going on at all times and not to miss anything,” Klein says. “If you fail to detect something, you will miss a huge opportunity. You might be at the end of your growth cycle, and you can't take corrective measures in terms of the production yield, or the quality or quantity of produce.”

Read more at IEEE Spectrum

Transforming Semiconductor Yield Management with AWS and Deloitte

📅 Date:

🔖 Topics: Cloud Computing, Manufacturing Analytics, Data Architecture

🏭 Vertical: Semiconductor

🏢 Organizations: AWS, Deloitte

Together, AWS and Deloitte have developed a reference architecture to enable the aforementioned yield management capabilities. The architecture, shown in Figure 1, depicts how to collect, store, analyze, and act on yield-related data throughout the supply chain. The following describes how the modernized yield management architecture enables the six capabilities discussed earlier.

Read more at AWS Blogs

🧠 Data-Driven Optimization - AI, Analytics, IIoT and Oden Technologies

📅 Date:

✍️ Author: Trey Bell

🔖 Topics: Manufacturing Analytics, IIoT

🏢 Organizations: Oden Technologies

If you can predict that offline quality test in real time, so that you know, in real time, that you're making good products, it reduces the risk of improving the process in real time. We actually use that type of modeling to then prescribe the right set points for the customer to reach whatever outcome they want to achieve. If they want to lower the cost, lower the material consumption, lower energy consumption, or increase the speed, then we actually give them the input parameters that they need to use in order to get a more efficient output.

And then the last step, which is more exploratory, which we're working on now, is also generating work instructions for the operators, kind of like an AI support system for the operator. Because still, and we recognize this, the big bottleneck for a lot of manufacturers is talent. Talent is very scarce, it's very hard to hire a lot of people that can perform these processes, especially when they say that it's more of an art than a science. We can lower the barrier to entry for operators to become top performers, through recommendations, predictions and generative AI for how to achieve high performance. By enabling operators to leverage science more than art or intuition, we can really change the game in terms of how we make things.

Read more at Industrial Machinery Digest

Predictive Maintenance for Semiconductor Manufacturers with SEEQ powered by AWS

📅 Date:

✍️ Authors: Gautham Unni, Namrata Sharma, Sean Tropsa

🔖 Topics: Predictive Maintenance, Manufacturing Analytics

🏭 Vertical: Semiconductor

🏢 Organizations: AWS, Seeq

There are challenges in creating predictive maintenance models, such as siloed data, the offline nature of data processing and analytics, and having the necessary domain knowledge to build, implement, and scale models. In this blog, we will explore how using Seeq software on Amazon Web Services can help overcome these challenges.

The combination of AWS and Seeq pairs a secure cloud services platform with advanced analytics innovation. Seeq on AWS can access time series and relational data stored in AWS data services including Amazon Redshift, Amazon DynamoDB, Amazon Simple Storage Service (S3), and Amazon Athena. Once connected, engineers and other technical staff have direct access to all the data in those databases in a live streaming environment, enabling exploration and data analytics without needing to go through the steps to extract data and align timestamps whenever more data is required. As a result, monitoring dashboards and running reports can be set to auto generate and are easily shared among groups or sites. This enables balancing machine downtimes and planning ahead for maintenance without disrupting schedules or compromising yields.

Read more at AWS for Industries

Advanced Analytics at BASF with TrendMiner

📅 Date:

🔖 Topics: Manufacturing Analytics

🏢 Organizations: BASF, TrendMiner

Through an insightful case study on monitoring instrument air pressure and flare flows, Rooha Khan highlights how TrendMiner's platform effectively optimizes manufacturing processes. Witness the tangible value BASF has discovered by harnessing the capabilities of industrial data analysis and monitoring, and be prepared to embrace the transformative possibilities of digitalization.

Read more at Trendminer Webinar

Using Data Models to Manage Your Digital Twins

📅 Date:

✍️ Author: Greger Teigre Wedel

🔖 Topics: Digital Twin, Manufacturing Analytics

🏢 Organizations: Cognite

A continuously evolving industrial knowledge graph is the foundation of creating industrial digital twins that solve real-world problems. Industrial digital twins are powerful representations of the physical world that can help you better understand how your assets are impacting your operations. A digital twin is only as useful as what you can do with it, and there is never only one all-encompassing digital twin. Your maintenance view of a physical installation will need to be different from the operational view, which is different from the engineering view for planning and construction.

Read more at Cognite Blog

Manufacturing Process Optimization in Times of Adversity

📅 Date:

✍️ Author: Nicol Ritchie

🔖 Topics: Manufacturing Analytics

🏢 Organizations: dataprophet

For the current era, we can usefully define manufacturing process optimization like this:

  1. Digitally connected plant teams learning and implementing data-driven strategies that impact their manufacturing processes to minimize cost and maximize production toward peak operational efficiency.
  2. Using data-to-value technologies that integrate seamlessly with their legacy systems and progressively automate an end-to-end, continuous improvement, production loop — freeing manufacturers from a reactive troubleshooting paradigm so they can layer in further innovations toward the smart factory.

Through the above process, machine learning workflows are able to solve current generation data-readiness and production process optimization issues while future-proofing operations. By easing cost pressures and driving up revenue via data-driven production efficiencies (and with increasingly data-mature plant personnel), the C-suite is free to develop strategies with innovation managers. Together, they can combat the broader external challenges experienced by many manufacturers today.

Read more at dataprophet Blog

โญ Hunting For Hardware-Related Errors In Data Centers

๐Ÿ“… Date:

โœ๏ธ Author: Anne Meixner

๐Ÿ”– Topics: Manufacturing Analytics

๐Ÿข Organizations: Google, Meta, Synopsys

The data center computational errors that Google and Meta engineers reported in 2021 have raised concerns regarding an unexpected cause — manufacturing defect levels on the order of 1,000 DPPM. Specific to a single core in a multi-core SoC, these hardware defects are difficult to isolate during data center operations and manufacturing test processes. In fact, SDEs can go undetected for months because the precise inputs and local environmental conditions (temperature, noise, voltage, clock frequency) have not yet been applied.

For instance, Google engineers noted ‘an innocuous change to a low-level library’ started to give wrong answers for a massive-scale data analysis pipeline. They went on to write, “Deeper investigation revealed that these instructions malfunctioned due to manufacturing defects, in a way that could only be detected by checking the results of these instructions against the expected results; these are ‘silent’ corrupt execution errors, or CEEs.”

Engineers at Google further confirmed their need for internal data, “Our understanding of CEE impacts is primarily empirical. We have observations of the form, ‘This code has miscomputed (or crashed) on that core.’ We can control what code runs on what cores, and we partially control operating conditions (frequency, voltage, temperature). From this, we can identify some mercurial cores. But because we have limited knowledge of the detailed underlying hardware, and no access to the hardware-supported test structures available to chip makers, we cannot infer much about root causes.”
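The detection strategy in the quote, checking instruction results against expected results, amounts to a screening harness. A toy sketch follows; the "mercurial core" is simulated in software (real CEE hunting runs on actual hardware), and all names and numbers are invented:

```python
def reference_mul(a: int, b: int) -> int:
    """Golden result, e.g. computed on a known-good core or in software."""
    return a * b

def make_mercurial_mul(bad_operand: int):
    """Simulate a defective core that miscomputes only for one specific
    input pattern -- which is why the defect can stay silent for months."""
    def mul(a: int, b: int) -> int:
        if a == bad_operand:
            return (a * b) ^ 0x4  # silent bit flip in the result
        return a * b
    return mul

def screen(candidate, test_vectors):
    """Compare candidate results against expected results; return mismatches."""
    return [(a, b) for a, b in test_vectors
            if candidate(a, b) != reference_mul(a, b)]

mul = make_mercurial_mul(bad_operand=1729)
missed = screen(mul, [(2, 3), (7, 11), (123, 456)])  # defect never triggered
caught = screen(mul, [(1729, 1), (7, 11)])           # triggering input found
```

The point of the sketch is the asymmetry: a small screen that never applies the triggering input reports a clean core, which is exactly the "precise inputs have not yet been applied" problem the article describes.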

Read more at Semiconductor Engineering

Our connected future: How industrial data sharing can unite a fragmented world

📅 Date:

✍️ Author: Peter Herweck

🔖 Topics: Manufacturing Analytics, Data Architecture

🏢 Organizations: AVEVA

The rapid and effective development of the coronavirus vaccines has set a new benchmark for today's industries, but it is not the only one. Increasingly, savvy enterprises are starting to share industrial data strategically and securely beyond their own four walls, to collaborate with partners, suppliers and even customers.

Worldwide, almost nine out of 10 (87%) business executives at larger industrial companies cite a need for the type of connected data that delivers unique insights to address challenges such as economic uncertainty, unstable geopolitical environments, historic labor shortages, and disrupted supply chains. In fact, executives report in a global study that the most common benefits of having an open and agnostic information-sharing ecosystem are greater efficiency and innovation (48%), increasing employee satisfaction (45%), and staying competitive with other companies (44%).

Read more at AVEVA Perspectives

The future is now: Unlocking the promise of AI in industrials

📅 Date:

✍️ Authors: Kimberly Borden, Mark Huntington, Mithun Kamat, Alex Singla, Joris Wijpkema, Bill Wiseman

🔖 Topics: AI, Manufacturing Analytics, Knowledge Graph

🏢 Organizations: McKinsey

Many executives remain unsure where to apply AI solutions to capture real bottom-line impact. The result has been slow rates of adoption, with many companies taking a wait-and-see approach rather than diving in.

Rather than endlessly contemplate possible applications, executives should set an overall direction and road map and then narrow their focus to areas in which AI can solve specific business problems and create tangible value. As a first step, industrial leaders could gain a better understanding of AI technology and how it can be used to solve specific business problems. They will then be better positioned to begin experimenting with new applications.

Read more at McKinsey Insights

Manufacturing needs MVDA: An introduction to modern, scalable multivariate data analysis

📅 Date:

✍️ Author: Manuel Tejada

🔖 Topics: Manufacturing Analytics

🏢 Organizations: Quartic AI

In most settings, a qualitative/semi-quantitative process understanding exists. Through extensive experimentation and knowledge transfer, subject-matter experts (SMEs) know a generally acceptable range for distinct process parameters, which is used to define the safe operating bounds of a process. In special cases, using bivariate analysis, SMEs understand how a small number of variables (no more than five) will interact to influence outputs.

Quantitative process understanding can be achieved through a holistic analysis of all process data gathered throughout the product lifecycle, from process design and development, through qualification and engineering runs, and routine manufacturing. Data comes from time series process sensors, laboratory logbooks, batch production records, raw material COAs, and lab databases containing results of offline analysis. As a process SME, the first reaction to a dataset this complex is that any analysis should be left to those with a deep understanding of machine learning and all the other big data buzzwords. However, this is the ideal opportunity for multivariate data analysis (MVDA).

Read more at Quartic AI Blog

Solution Accelerator: Multi-factory Overall Equipment Effectiveness (OEE) and KPI Monitoring

📅 Date:

✍️ Authors: Jeffery Annor, Tarik Boukherissa, Bala Amavasai

🔖 Topics: Manufacturing Analytics

🏢 Organizations: Databricks

The Databricks Lakehouse provides an end-to-end data engineering, serving, ETL, and machine learning platform that enables organizations to accelerate their analytics workloads by automating the complexity of building and maintaining analytics pipelines through open architecture and formats. This facilitates connecting high-velocity industrial IoT data, ingested using standard protocols like MQTT, Kafka, Event Hubs, or Kinesis, to external datasets such as ERP systems, allowing manufacturers to converge their IT/OT data infrastructure for advanced analytics.

Using a Delta Live Tables pipeline, we leverage the medallion architecture to ingest data from multiple sensors in a semi-structured format (JSON) into our bronze layer, where data is replicated in its natural format. The silver layer transformations include parsing key fields from the sensor data that need to be extracted and structured for subsequent analysis, and ingesting preprocessed workforce data from ERP systems needed to complete the analysis. Finally, the gold layer aggregates sensor data using Structured Streaming stateful aggregations, calculates OT metrics such as OEE and TA (technical availability), and combines the aggregated metrics with workforce data based on shifts, allowing for IT/OT convergence.
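The gold-layer OEE calculation can be sketched in plain Python (a simplified batch stand-in for the streaming stateful aggregation; the field names are illustrative, not Databricks' schema). OEE is the product of availability, performance, and quality:

```python
from dataclasses import dataclass

@dataclass
class ShiftAggregate:
    planned_time_min: float      # scheduled production time for the shift
    runtime_min: float           # time the machine was actually running
    ideal_cycle_time_min: float  # ideal time to produce one unit
    total_count: int             # units produced
    good_count: int              # units passing quality checks

def oee(agg: ShiftAggregate) -> dict:
    """Compute OEE = Availability x Performance x Quality for one shift."""
    availability = agg.runtime_min / agg.planned_time_min
    performance = (agg.ideal_cycle_time_min * agg.total_count) / agg.runtime_min
    quality = agg.good_count / agg.total_count
    return {
        "availability": availability,
        "performance": performance,
        "quality": quality,
        "oee": availability * performance * quality,
    }

shift = ShiftAggregate(planned_time_min=480, runtime_min=420,
                       ideal_cycle_time_min=1.0, total_count=380, good_count=361)
metrics = oee(shift)
```

In the pipeline described above, the same arithmetic runs per machine and per shift over the streaming aggregates rather than over a single dataclass.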

Read more at Databricks Blog

Luxury goods manufacturer gets a handle on production capacity from FourJaw

📅 Date:

🔖 Topics: Manufacturing Analytics

🏢 Organizations: Armac Martin, FourJaw

Machine monitoring software from FourJaw has driven a 14% uplift in machine utilisation at a luxury goods manufacturer. Fast-growing brass cabinet hardware manufacturer Armac Martin used data from FourJawโ€™s machine monitoring platform to increase its production capacity and meet a surge in demand for its product range.

Armac Martin's Production Director, Rob McGrail, said: “When we were looking for a machine monitoring software supplier, a key criteria for us was not just about the ease of deployment and software functionality, but it was equally important that they were based locally, in the UK and that they had a good level of customer support, both for deployment and on-going customer success. FourJaw ticked all of these boxes”.

Read more at Manufacturing & Engineering Magazine

Using AI to increase asset utilization and production uptime for manufacturers

📅 Date:

🔖 Topics: Manufacturing Analytics

🏢 Organizations: Google, Litmus

Google Cloud created purpose-built tools and solutions to organize manufacturing data, make it accessible and useful, and help manufacturers to quickly take significant steps on this journey by reducing the time to value. In this post, we will explore a practical example of how manufacturers can use Google Cloud manufacturing solutions to train, deploy and extract value from ML-enabled capabilities to predict asset utilization and maintenance needs. The first step to a successful machine learning project is to unify necessary data in a common repository. For this, we will use Manufacturing Connect, the factory edge platform co-developed with Litmus, to connect to manufacturing assets and stream the asset telemetries to Pub/Sub.

The following scenario is based on a hypothetical company, Cymbal Materials, a fictitious discrete manufacturing company that runs 50+ factories in 10+ countries. 90% of Cymbal Materials' manufacturing processes involve milling, which is accomplished using industrial computer numerical control (CNC) milling machines. Although their factories implement routine maintenance checklists, unplanned and unknown failures happen occasionally. However, many of the Cymbal Materials factory workers lack the experience to identify and troubleshoot failures due to a labor shortage and high turnover in their factories. Hence, Cymbal Materials is working with Google Cloud to build a machine learning model that can identify and analyze failures on top of Manufacturing Connect, Manufacturing Data Engine, and Vertex AI.

Read more at Google Cloud Blog

The art of effective factory data visualization

How United Manufacturing Hub Is Introducing Open Source to Manufacturing and Using Time-Series Data for Predictive Maintenance

📅 Date:

🔖 Topics: Manufacturing Analytics, MQTT

🏢 Organizations: Timescale, United Manufacturing Hub

The United Manufacturing Hub is an open-source Helm chart for Kubernetes, which combines state-of-the-art IT/OT tools and technologies and brings them into the hands of the engineer. This allows us to standardize the IT/OT infrastructure across customers and makes the entire infrastructure easy to integrate and maintain. We typically deploy it on the edge and on-premise using k3s as light Kubernetes. In the cloud, we use managed Kubernetes services like AKS. If the customer is scaling out and okay with using the cloud, we recommend services like Timescale Cloud. We are using TimescaleDB with MQTT, Kafka, and Grafana. We have microservices to subscribe to the messages from the message brokers MQTT and Kafka and insert the data into TimescaleDB, as well as a microservice that reads out data and processes it before sending it to a Grafana plugin, which then allows for visualization.
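The subscribe-transform-insert microservice pattern described above can be sketched as a single transformation step: turn an MQTT JSON payload into a row for a TimescaleDB hypertable. The topic layout, payload fields, and table schema below are assumptions for illustration, not UMH's actual schema:

```python
import json
from datetime import datetime, timezone

def payload_to_row(topic: str, payload: bytes) -> tuple:
    """Turn an MQTT message into a (time, asset, name, value) tuple ready
    for insertion into a TimescaleDB hypertable created roughly as:
      CREATE TABLE process_value (time timestamptz, asset text,
                                  name text, value double precision);
    """
    msg = json.loads(payload)
    # Hypothetical topic convention: "umh/v1/<plant>/<line>/processValue".
    asset = "/".join(topic.split("/")[2:4])
    ts = datetime.fromtimestamp(msg["timestamp_ms"] / 1000, tz=timezone.utc)
    return (ts.isoformat(), asset, msg["name"], float(msg["value"]))

row = payload_to_row(
    "umh/v1/plantA/line3/processValue",
    b'{"timestamp_ms": 1700000000000, "name": "temperature", "value": 21.5}')
```

In the real service this function would sit inside an MQTT/Kafka consumer loop, batching rows into `INSERT`s (or `COPY`) against TimescaleDB; Grafana then queries the hypertable directly.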

We are currently positioning the United Manufacturing Hub with TimescaleDB as an open-source Historian. To achieve this, we are currently developing a user interface on top of the UMH so that OT engineers can use it and IT can still maintain it.

Read more at Timescale Blog

Leveraging Operations Data to Achieve 3%-5% Baseline Productivity Gains with Normalized KPIs

📅 Date:

✍️ Author: Steve Beamer

🔖 Topics: Manufacturing Analytics

🏢 Organizations: Element Analytics

Traditional code-based data models are too cumbersome, cost prohibitive and resource intensive to support an enterprise data model. In a code-based environment, it can take six months just to write and test the code to bring a single plantโ€™s operating data into alignment with enterprise data pipelines. By contrast, a no-code solution like the Element Unify platform allows all IT/OT/ET data sources to be quickly tagged and brought into an Asset Hierarchy. The timeframe for a single plant to bring their operating data into alignment with the enterprise data architecture and data pipelines drops from 6 months to 2 to 4 weeks.

Read more at Element Analytics Blog

Digital transformation tools improve plant sustainability and maintenance

📅 Date:

✍️ Author: Amit Patel

🔖 Topics: Manufacturing Analytics

🏢 Organizations: Emerson

Maintenance is inherent to all industrial facilities. In pneumatic systems, valves wear out over time, causing leakage that leads to excessive compressed air consumption. Some systems can have many valves, which can make identifying a faulty one challenging. Leak troubleshooting can be time-consuming and, with the ongoing labor shortage and skills gap, maintenance personnel may already be stretched thin. There may not be enough staff to keep up with what must be done, and historical knowledge may not exist. When production must stop for repairs, it can be very expensive. For mid-sized food and beverage facilities, unplanned downtime costs around $30,000 per hour.

Read more at Plant Engineering

Finding Frameworks For End-To-End Analytics

📅 Date:

✍️ Author: Anne Meixner

🔖 Topics: Manufacturing Analytics

🏭 Vertical: Semiconductor

New standards, guidelines, and consortium efforts are being developed to remove these barriers to data sharing for analytics purposes. But the amount of work required to make this happen is significant, and it will take time to establish the necessary level of trust across groups that historically have had minimal or no interactions.

For decades, test program engineers have relied upon the STDF file format, which is inadequate for today's use cases. STDF files cannot dynamically capture adaptive test limits, and they are unable to assist in real-time decisions at the ATE based upon current data and analytically derived models. In fact, most data analytic companies run a software agent on the ATE to extract data for decisions and model building. With ATE software updates, the agent often breaks, requiring the ATE vendor to fix each custom agent on every test platform. Emerging standards, TEMS and RITdb, address these limitations and enable new use cases.

But with a huge amount of data available in manufacturing settings, an API may be the best approach for sharing sensitive data from point of origin to a centralized repository, whether on-premise or in the cloud.

Read more at Semiconductor Engineering

Improving asset criticality with better decision making at the plant level

📅 Date:

✍️ Authors: Mike Brooks, Mike Strobel

🔖 Topics: Manufacturing Analytics, Simulation

🏢 Organizations: AspenTech, Cisco

The industry is beginning to see reliability, availability and maintainability (RAM) applications that integrally highlight the real constraints, including the other operational and mechanical limits. A RAM-based simulation application provides fault-tree analysis, based on actual material flows through a manufacturing process, with stage gates, inventory modeling, load sharing, standby/redundancy of equipment, operational phases, and duty cycles. In addition, a RAM application can simulate expectations of various random events such as weather, market dynamics, supply/distribution logistical events, and more. In one logistics example, a coker unit's bottom pump was thought to be undersized and constraining the unit's production. Changing the pump to a larger size did not fix the problem, because further investigation showed that insufficient trucks to carry the product away would not let the unit operate at full capacity.
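A minimal Monte Carlo sketch of the logistics example: daily output is capped by whichever is smaller, pump capacity or haul-away capacity, so upsizing the pump changes nothing while trucks are the binding constraint. All numbers are invented for illustration:

```python
import random

def simulate_throughput(pump_capacity: float, trucks_per_day: int,
                        days: int = 10_000, seed: int = 42) -> float:
    """Average daily output = min(what the unit can pump,
    what trucks can haul away). Truck availability fluctuates day to day;
    each truck is assumed to haul 50 units."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(days):
        trucks = max(0, trucks_per_day + rng.randint(-2, 2))  # daily variation
        total += min(pump_capacity, trucks * 50)
    return total / days

base        = simulate_throughput(pump_capacity=400, trucks_per_day=6)
bigger_pump = simulate_throughput(pump_capacity=500, trucks_per_day=6)  # no gain
more_trucks = simulate_throughput(pump_capacity=400, trucks_per_day=9)  # gain
```

Because hauling never exceeds 400 units/day in the base scenario, the larger pump yields exactly the same average throughput, while adding trucks raises it; this is the kind of constraint a RAM simulation surfaces before money is spent on the wrong fix.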

Read more at Plant Engineering

Renault Group and Atos launch a unique service to collect large-scale manufacturing data and accelerate Industry 4.0

📅 Date:

🔖 Topics: Manufacturing Analytics

🏢 Organizations: Renault, Atos

Renault Group and Atos launch ID@scale (Industrial Data @ Scale), a new service for industrial data collection to support manufacturing companies in their digital journey towards Industry 4.0. “ID@S” (Industrial Data @ Scale) will allow manufacturers to collect and structure data from industrial equipment at scale to improve operational excellence and product quality. Developed by the car manufacturer and already in operation within its factories, ID@scale is now industrialized, modularized and commercialized by the digital leader Atos.

More than 7,500 pieces of equipment are connected, with standardized data models representing over 50 different manufacturing processes from screwdriving to aluminum injection, including car frame welding, machining, painting, stamping, in addition to new manufacturing processes for electric motors and batteries. Renault Group is already saving 80 million euros per year and aims to deploy this solution across the remainder of its 35 plants, connecting over 22,000 pieces of equipment, by 2023 to generate savings of 200 million euros per year.

Read more at Atos Press Release

Advanced analytics improve process optimization

📅 Date:

🔖 Topics: Manufacturing Analytics

🏢 Organizations: Seeq

With advanced analytics, the engineers collaborated with data scientists to create a model comparing the theoretical and operational valve-flow coefficient of one control valve. Conditions in the algorithm were used to identify periods of valve degradation in addition to past failure events. By reviewing historical data, the SMEs determined the model would supply sufficient notification time to deploy maintenance resources so repairs could be made prior to failure.

Read more at Plant Engineering

Batch Optimization using

Aarbakke + Cognite | Boosting production, maintenance, and quality

Battery Analytics: The Game Changer for Energy Storage

📅 Date:

🔖 Topics: Manufacturing Analytics

🏢 Organizations: TWAICE

Battery analytics refers to getting more out of the battery using software – not only during operation, but also when selecting the right battery cell or designing the overall system. For now, the focus will be on the possibilities to optimize the in-field operation of battery storage systems.

The TWAICE cloud analytics platform provides insights and solutions based on field data. The differentiation factor is the end-to-end approach with analytics at its heart. After processing and mapping the data, the platform analytics layer runs different analytical algorithms, electrical, thermal and aging models as well as machine learning models. This variety of analytical approaches is the key to balance data input quality differences and is also the basis for the wide and expanding range of solutions.

Read more at TWAICE Blog

Where And When End-To-End Analytics Works

📅 Date:

✍️ Author: Anne Meixner

🔖 Topics: Manufacturing Analytics

🏭 Vertical: Semiconductor

To control a wafer factory operation, engineering teams rely on process equipment and inspection statistical process control (SPC) charts, each representing a single parameter (i.e., univariate). With the complexities of some processes, the interactions between multiple parameters (i.e., multivariate) can result in yield excursions. This is when engineers leverage data to make decisions on subsequent fab or metrology steps to improve yield and quality.

“When we look at fab data today, we're doing that same type of adaptive learning,” McIntyre said. “If I start seeing things that don't fit my expected behavior, they could still be okay by univariate control, but they don't fit my model in a multi-variate sense. I'll work toward understanding that new combination. For instance, in a specific equipment my pump down pressure is high, but my gas flow is low and my chamber is cold, relatively speaking, and all (parameters) individually are in spec. But I've never seen that condition before, so I need to determine if this new set of process conditions has an impact. I send that material to my metrology station. Now, if that inline metrology data is smack in the center, I can probably disregard the signal.”

Read more at SemiEngineering

The Hidden Factory: How to Expose Waste and Capacity on the Shop Floor

📅 Date:

✍️ Author: Bill Bither

🔖 Topics: Manufacturing Analytics

🏢 Organizations: MachineMetrics

Without accurate production data, managers simply cannot hope to find the hidden waste on the shop floor. While strict manual data collection methods can take job shops only so far, the sophisticated manufacturer is leveraging solutions that collect, aggregate, and standardize production data autonomously. With this data in hand, accurate benchmarks can be set (they may be quite surprising), and areas of hidden capacity, as well as waste generators, can be far more easily identified.

Read more at MachineMetrics Blog

How to Use Data in a Predictive Maintenance Strategy

📅 Date:

✍️ Author: Lucinda Reynolds

🔖 Topics: Manufacturing Analytics, Asset Performance Management

🏢 Organizations: Uptake

Free-text and label correction engines are a solution to clean up missing or inconsistent work order and parts order data. Pattern recognition algorithms can replace missing items such as funding center codes. They can also fix work order (WO) descriptions to match the work actually performed. This often yields a 15% shift in root-cause binning compared with non-corrected WO and parts data.
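
As a rough illustration of label correction, even a simple fuzzy match against a canonical vocabulary can repair typo-ridden free-text descriptions. This sketch uses Python's standard-library difflib; the labels and cutoff are hypothetical, and a production engine would use far richer pattern-recognition models than this.

```python
from difflib import get_close_matches

# Canonical failure-mode labels (hypothetical examples, not Uptake's taxonomy).
CANONICAL = ["bearing replacement", "seal leak repair", "motor rewind",
             "belt replacement", "lubrication"]

def correct_label(free_text):
    """Map a noisy free-text work order description to the closest canonical label."""
    match = get_close_matches(free_text.lower().strip(), CANONICAL, n=1, cutoff=0.5)
    return match[0] if match else "unclassified"

print(correct_label("Bearng replacemnt"))  # typo-ridden entry still resolves
print(correct_label("seal leek repair"))
```

Entries that match nothing above the similarity cutoff fall through to "unclassified" for human review, which is how corrected data can shift root-cause binning without silently inventing labels.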

With programmable logic controller-generated threshold alarms (like an alarm that is generated when a single sensor exceeds a static value), "nuisance" alarms are often generated and then ignored. These false alarms quickly degrade the culture of an operating staff as their focus is shifted away from finding the underlying problem that is causing the alarm. In time, these distractions threaten the health of the equipment, as teams focus on making the alarm stop rather than addressing the issue.
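
A common first step against nuisance alarms is to require persistence before raising and hysteresis before clearing, so a one-sample spike never fires. A minimal sketch (the thresholds are illustrative, not from the article):

```python
def persistent_alarm(samples, limit, clear_limit, min_consecutive):
    """Raise an alarm only after the signal stays above `limit` for
    `min_consecutive` samples; clear it only below `clear_limit` (hysteresis)."""
    alarms, over, active = [], 0, False
    for i, x in enumerate(samples):
        if not active:
            over = over + 1 if x > limit else 0
            if over >= min_consecutive:
                active = True
                alarms.append(i)
        elif x < clear_limit:
            active, over = False, 0
    return alarms

# The single-sample spike at index 2 (a classic nuisance alarm) is ignored;
# only the sustained excursion starting at index 5 trips the alarm, once.
readings = [70, 71, 95, 70, 72, 96, 97, 98, 99, 71, 70]
print(persistent_alarm(readings, limit=90, clear_limit=80, min_consecutive=3))  # [7]
```

The design choice matters culturally as much as technically: operators only see alarms that survive the persistence filter, so each one is worth investigating.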

Read more at Uptake Blog

Toward smart production: Machine intelligence in business operations

📅 Date:

✍️ Authors: Duane S. Boning, Vijay D'Silva, Pete Kimball, Bruce Lawler, Retsef Levi, Ingrid Millan

🔖 Topics: Manufacturing Analytics, Machine Intelligence

🏢 Organizations: McKinsey, Vistra, MIT

Our research looked at five different ways that companies are using data and analytics to improve the speed, agility, and performance of operational decision making. This evolution of digital maturity begins with simple tools, such as dashboards to aid human decision making, and ends with true MI, machines that can adjust their own performance autonomously based on historical and real-time data.

Read more at McKinsey Insights

Connecting an Industrial Universal Namespace to AWS IoT SiteWise using HighByte Intelligence Hub

📅 Date:

✍️ Authors: Michael Brown, Aron Semle, John Harrington, Rajesh Gomatam, Scott Robertson

🔖 Topics: Manufacturing Analytics, Partnership

🏢 Organizations: HighByte, AWS

Merging industrial and enterprise data across multiple on-premises deployments and industrial verticals can be challenging. This data comes from a complex ecosystem of industrial-focused products, hardware, and networks from various companies and service providers. The result is data silos and isolated systems that propagate a one-to-one integration strategy.

HighByte Intelligence Hub addresses this challenge. It is a middleware solution for a universal namespace that helps you build scalable, modern industrial data pipelines in AWS. It also allows users to collect data from various sources, add context to the data being collected, and transform it into a format that other systems can understand.

Read more at AWS Blog

Rub-A-Dub-Dub...It's All About the Data Hub

📅 Date:

🔖 Topics: Manufacturing Analytics

🏢 Organizations: LNS Research

If these terms leave you more confused than when you started reading, join the club. I am an OT guy, and so much of this was new to me. And it's another reason to have a good IT/OT architect on your team. The bottom line is that these terms support the various perspectives that must be addressed in connecting and delivering data, from architecture and patterns to services and translation layers. Remember, we are not just talking about time-series or hierarchical asset data. Data such as time, events, alarms, units of work, units of production time, materials and material flows, and people can all be contextualized. And this is the tough nut to crack as the new OT Ecosystem operates in multiple modes, not just transactional as we find in the back office.

Read more at LNS Research Blog

How to Build Scalable Data and AI Industrial IoT Solutions in Manufacturing

📅 Date:

✍️ Authors: Bala Amavasai, Vamsi Krishna Bhupasamudram, Ashwin Voorakkara

🔖 Topics: IIoT, manufacturing analytics

🏢 Organizations: Databricks, Tredence

Unlike traditional data architectures, which are IT-based, in manufacturing there is an intersection between hardware and software that requires an OT (operational technology) architecture. OT has to contend with processes and physical machinery. Each component and aspect of this architecture is designed to address a specific need or challenge when dealing with industrial operations.

The Databricks Lakehouse Platform is ideally suited to manage large amounts of streaming data. Built on the foundation of Delta Lake, you can work with the large quantities of data streams delivered in small chunks from multiple sensors and devices, providing ACID compliance and eliminating the job failures common in traditional warehouse architectures. The Lakehouse platform is designed to scale with large data volumes. Manufacturing produces multiple data types, both semi-structured (JSON, XML, MQTT, etc.) and unstructured (video, audio, PDF, etc.), which the platform pattern fully supports. By merging all these data types onto one platform, only one version of the truth exists, leading to more accurate outcomes.

Read more at Databricks Blog

How to Reduce Tool Failure with CNC Tool Breakage Detection

📅 Date:

🔖 Topics: computer numerical control, manufacturing analytics

🏢 Organizations: MachineMetrics

There are several active technologies used in CNC machining that enable manufacturers to realize these benefits. The type of system used for tooling breakage detection may consist of one or more of the following technologies.

These systems are often tied to production monitoring systems and, ideally, IIoT platforms that can analyze tooling data in the cloud to better predict breakages in the future. One innovation in the area of non-contact technologies is the use of high-frequency data to diagnose, predict, and avoid failures. This technology is sensorless and uses instantaneous real-time data pulled at an extremely high rate to build accurate tool failure detection models.
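
One simple way to exploit high-frequency load data is a rolling-baseline deviation test: a snapped tool shows up as an abrupt shift that the recent history cannot explain. This is only a crude stand-in for the sensorless models described in the article, and every signal value below is simulated.

```python
import math
import random
from collections import deque

def detect_breakage(load_samples, window=50, z_threshold=6.0):
    """Flag indices where the load deviates sharply from the rolling baseline."""
    buf = deque(maxlen=window)
    events = []
    for i, x in enumerate(load_samples):
        if len(buf) == buf.maxlen:
            mean = sum(buf) / len(buf)
            std = math.sqrt(sum((v - mean) ** 2 for v in buf) / len(buf)) or 1e-9
            if abs(x - mean) / std > z_threshold:
                events.append(i)
        buf.append(x)
    return events

# Simulated signal: steady cutting load around 40 with small noise, then a
# sudden drop toward zero, the signature of a snapped tool leaving the cut.
random.seed(1)
signal = [40 + random.gauss(0, 0.5) for _ in range(200)] + [2.0] * 10
print(detect_breakage(signal))
```

The higher the sampling rate, the sooner the deviation crosses the threshold, which is why high-frequency data shortens the window between a breakage and a machine stop.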

Read more at MachineMetrics Blog

Sight Machine, NVIDIA Collaborate to Turbocharge Manufacturing Data Labeling

📅 Date:

🔖 Topics: manufacturing analytics

🏢 Organizations: Sight Machine, NVIDIA

The collaboration connects Sight Machine's manufacturing data foundation with NVIDIA's AI platform to break through the last bottleneck in the digital transformation of manufacturing: preparing raw factory data for analysis. Sight Machine's manufacturing intelligence will guide NVIDIA machine learning software running on NVIDIA GPU hardware to process two or more orders of magnitude more data at the start of digital transformation projects.

Accelerating data labeling will enable Sight Machine to quickly onboard large enterprises with massive data lakes. It will automate and accelerate work and lead to even faster time to value. While similar automated data mapping technology is being developed for specific data sources or well documented systems, Sight Machine is the first to use data introspection to automatically map tags to models for a wide variety of plant floor systems.

Read more at Cision PR Newswire

Machining cycle time prediction: Data-driven modelling of machine tool feedrate behavior with neural networks

📅 Date:

✍️ Authors: Chao Sun, Javier Dominguez-Caballero, Rob Ward, Sabino Ayvar-Soberanis, David Curtis

🔖 Topics: manufacturing analytics

🏢 Organizations: University of Sheffield

Accurate prediction of machining cycle times is important in the manufacturing industry. Usually, Computer-Aided Manufacturing (CAM) software estimates the machining times using the commanded feedrate from the toolpath file with basic kinematic settings. Typically, these methods do not account for toolpath geometry or toolpath tolerance and therefore underestimate the machining cycle times considerably. Removing the need for machine-specific knowledge, this paper presents a data-driven feedrate and machining cycle time prediction method by building a neural network model for each machine tool axis. In this study, datasets composed of the commanded feedrate, nominal acceleration, toolpath geometry and the measured feedrate were used to train a neural network model. Validation trials using a representative industrial thin-wall structure component on a commercial machining center showed that this method estimated the machining time with more than 90% accuracy. This method showed that neural network models have the capability to learn the behavior of a complex machine tool system and predict cycle times. Further integration of the methods will be critical in the implementation of digital twins in Industry 4.0.
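
The underestimation the authors describe can be seen even with a basic kinematic model: dividing path length by commanded feedrate ignores the acceleration and deceleration around short segments. Below is a hypothetical comparison (this is not the paper's neural-network method, just an illustration of why the naive estimate is optimistic).

```python
import math

def naive_time(segments, feedrate):
    """CAM-style estimate: total path length divided by commanded feedrate."""
    return sum(segments) / feedrate

def trapezoidal_time(segments, feedrate, accel):
    """Per-segment trapezoidal velocity profile: accelerate from rest, cruise,
    then decelerate to rest at each boundary (a deliberately pessimistic model
    of the corner slowdowns a real controller performs)."""
    total = 0.0
    for length in segments:
        d_ramp = feedrate ** 2 / accel  # distance spent accelerating + decelerating
        if length >= d_ramp:
            total += 2 * feedrate / accel + (length - d_ramp) / feedrate
        else:  # segment too short: triangular profile, feedrate never reached
            total += 2 * math.sqrt(length / accel)
    return total

# Many short segments (mm), feedrate in mm/s, acceleration in mm/s^2.
segments = [5.0] * 100
f, a = 50.0, 500.0
print(naive_time(segments, f))           # optimistic estimate: 10.0 s
print(trapezoidal_time(segments, f, a))  # roughly twice as long here
```

On toolpaths dominated by short segments the commanded feedrate is rarely reached, which is why learning the measured feedrate behavior per axis, as the paper does, recovers most of the missing time.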

Read more at Science Direct

How the Cloud is Changing the Role of Metadata in Industrial Intelligence

📅 Date:

🔖 Topics: Manufacturing Analytics

🏢 Organizations: Uptake

Right now though, many companies have trouble seeing that context in existing datasets. Much of that difficulty owes to the original design of operational technology (OT) systems like supervisory control and data acquisition (SCADA) systems or data historians. Today, the story around the collection of data in OT systems is much the same. Each of these descriptive points about the data could paint a more holistic view of asset performance.

As many process businesses turn to a data lake strategy to leverage the value of their data, the preservation of metadata in the movement of OT data to their cloud environment represents a significant opportunity to optimize the maintenance, productivity, sustainability, and safety of critical assets. The loss of metadata has been among the most severe limiting factors in the value of OT data. By one estimate, industrial businesses are losing out on 20-30 percent of the value of their data from regular compression of metadata or losses in their asset hierarchy models. With an expertise shortage sweeping across process-intensive operations, many companies will need to digitize and conserve institutional knowledge, beginning with their own data.

Read more at Uptake Blog

Automation Within Supply Chains: Optimizing the Manufacturing Process

Is Clip A โ€˜Slackโ€™ For Factories?

📅 Date:

✍️ Author: Marco Annunziata

🔖 Topics: digital transformation, IIoT, manufacturing analytics

🏢 Organizations: Clip Automation

Clip aims to bring data gathering and analytics, information sharing, and collaboration onto a single platform. The system connects all intelligent industrial equipment in a production facility, together with workers who can access all information and adjust operations through computers and portable devices.

It's an ambitious undertaking, one that requires guaranteeing a very high degree of interoperability to ensure that people, machines and processes can communicate with each other seamlessly, and that all key systems such as Material Requirements Planning (MRP), Enterprise Resource Planning (ERP) and others can directly access up-to-date information from machines and processes. This higher level of automation, if implemented right, can unlock a new level of efficiency for manufacturing companies.

Read more at Forbes

Build a Complete Analytics Pipeline to Optimize Smart Factory Operations

2021 Assembly Plant of the Year: GKN Drives Transformation With New Culture, Processes and Tools

📅 Date:

✍️ Author: Austin Weber

🔖 Topics: manufacturing analytics

🏭 Vertical: Automotive

🏢 Organizations: GKN Automotive

All-wheel drive (AWD) technology has taken the automotive world by storm in recent years, because of its ability to effectively transfer power to the ground. Today, many sport utility vehicles use AWD for better acceleration, performance, safety and traction in all kinds of driving conditions. GKN's state-of-the-art ePowertrain assembly plant in Newton, NC, supplies AWD systems to BMW, Ford, General Motors and Stellantis facilities in North America and internationally. The 505,000-square-foot facility operates multiple assembly lines that mass-produce more than 1.5 million units annually.

"Areas of improvement include a first-time-through tracking dashboard tailored to each individual line and shift that tracks each individual failure mode," says Tim Nash, director of manufacturing engineering. "We use this tool to monitor improvements and progress on a daily basis.

"Overhaul of process control limits has been one of our biggest achievements," claims Nash. "By setting tighter limits for assembly operations such as pressing and screwdriving, we are able to detect and reject defective units in station vs. a downstream test operation. This saves both time and scrap related to further assembly of the defective unit."

"When we started on our turnaround journey, our not-right-first-time rate was about 26 percent," adds Smith. "Today, it averages around 6 percent. A few years ago, cost of non-quality was roughly $23 million annually vs. less than $3 million today."

Read more at Assembly

Digital Transformation in the Beverage Manufacturing and Bottling

How W Machine Uses FactoryWiz Machine & Equipment Monitoring

Industry 4.0 and the Automotive Industry

📅 Date:

✍️ Author: John Sprovieri

🔖 Topics: 5G, augmented reality, manufacturing analytics, predictive maintenance

🏭 Vertical: Automotive

🏢 Organizations: Audi, BMW, SEAT SA, Grupo Sese

"It takes about 30 hours to manufacture a vehicle. During that time, each car generates massive amounts of data," points out Robert Engelhorn, director of the Munich plant. "With the help of artificial intelligence and smart data analytics, we can use this data to manage and analyze our production intelligently. AI is helping us to streamline our manufacturing even further and ensure premium quality for every customer. It also saves our employees from having to do monotonous, repetitive tasks."

One part of the plant that is already seeing benefits from AI is the press shop, which turns more than 30,000 sheet metal blanks a day into body parts for vehicles. Each blank is given a laser code at the start of production so the body part can be clearly identified throughout the manufacturing process. This code is picked up by BMW's iQ Press system, which records material and process parameters, such as the thickness of the metal and oil layer, and the temperature and speed of the presses. These parameters are related to the quality of the parts produced.

Read more at Assembly

Big Data Analytics in Electronics Manufacturing: is MES the key to unlocking its true potential?

📅 Date:

✍️ Author: Bruno Pinto

🔖 Topics: manufacturing execution system, manufacturing analytics, surface mount technology

🏭 Vertical: Computer and Electronic

🏢 Organizations: Critical Manufacturing

In a modern SMT fab, every time a stencil is loaded or a squeegee makes a pass, data is generated. Every time a nozzle picks and places a component, data is generated. Every time a camera records a component or board inspection image, data is generated. The abundance of data in the electronics industry is a result of the long-existing and widespread process automation and proliferation of sensors, gauges, meters and cameras, which capture process metrics, equipment data and quality data.

In SMT and electronics, the main challenge isn't the availability of data but rather the ability to look at the data generated from the process as a whole: making sense of the data pertaining to each shop floor transaction, using it to generate information from a single point of truth instead of disparate, unconnected point solutions, and applying the resulting insight to decisions that ultimately improve process KPIs, OEE, productivity, yield, compliance and quality.

Read more at Critical Manufacturing Blog

2021 IW Best Plants Winner: IPG Tremonton Wraps Up a Repeat IW Best Plants Win

📅 Date:

✍️ Authors: Ryan Secard, Peter Fretty

🔖 Topics: digital transformation, manufacturing analytics

🏭 Vertical: Plastics and Rubber

🏢 Organizations: Intertape Polymer Group, Sight Machine

"If you wrapped it and just wound it straight, it would look like a record, with peaks and valleys," says Richardson. So instead, the machines rotate horizontally, like two cans of pop on turntables. Initially, IPG used a gauge that indicated whether the film was too thick or too thin. "That was OK," says Richardson, "but it didn't get us the information we needed."

Working with an outside company, IPG Tremonton upgraded the gauge to one that could quantify the thickness of the cut plastic in real time as the machine operates.

The benefits of the tinkering were twofold. First, the upgrade gave operators the ability to correct deviations on the fly. Second, "we found that we had some variations between a couple of our machines," Richardson says. Using the new gauge on both machines revealed that one of them was producing film "a few percentage points thicker" than its twin. "We [were] basically giving away free product," Richardson recalled. The new sensor gave IPG the information it needed to label film more accurately.

Read more at Industry Week

AWS IoT SiteWise Edge Is Now Generally Available for Processing Industrial Equipment Data on Premises

📅 Date:

🔖 Topics: manufacturing analytics, edge computing

🏢 Organizations: AWS

With AWS IoT SiteWise Edge, you can organize and process your equipment data in the on-premises SiteWise gateway using AWS IoT SiteWise asset models. You can then read the equipment data locally from the gateway using the same application programming interfaces (APIs) that you use with AWS IoT SiteWise in the cloud. For example, you can compute metrics such as Overall Equipment Effectiveness (OEE) locally for use in a production-line monitoring dashboard on the factory floor.
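
For example, the OEE metric mentioned above multiplies availability, performance, and quality, each computable from locally read counters. A minimal sketch with invented shift numbers (the field names are illustrative, not SiteWise API names):

```python
def oee(planned_time, run_time, ideal_cycle_time, total_count, good_count):
    """OEE = availability x performance x quality, from locally read counters."""
    availability = run_time / planned_time
    performance = (ideal_cycle_time * total_count) / run_time
    quality = good_count / total_count
    return availability * performance * quality

# One shift: 480 planned minutes, 400 minutes actually running,
# 0.5 min ideal cycle time, 700 parts produced, 680 of them good.
print(round(oee(480, 400, 0.5, 700, 680), 3))  # -> 0.708
```

Because each input is available at the gateway, this kind of metric can feed a factory-floor dashboard without a round trip to the cloud.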

Read more at AWS News Blog

Transforming quality and warranty through advanced analytics

📅 Date:

🔖 Topics: manufacturing analytics, quality assurance

🏢 Organizations: McKinsey

For companies seeking to improve financial performance and customer satisfaction, the quickest route to success is often a product-quality transformation that focuses on reducing warranty costs. Quality problems can be found across all industries, and even the best companies can have weak spots in their quality systems. These problems can lead to accidents, failures, or product recalls that harm the company's reputation. They also create the need for prevention measures that increase the total cost of quality. The ultimate outcomes are often poor customer satisfaction that decreases top-line growth, and additional costs that damage bottom-line profitability.

To transform quality and warranty, leading industrial companies are combining traditional tools with the latest in artificial-intelligence (AI) and machine-learning (ML) techniques. The combined approach allows these manufacturers to reduce the total cost of quality, ensure that their products perform, and meet customer expectations. The impact of a well-designed and rigorously executed transformation thus extends beyond cost reduction to include higher profits and revenues as well.

Read more at McKinsey

Survey: Data Analytics in the Chemical Industry

📅 Date:

✍️ Author: Allison Buenemann

🔖 Topics: manufacturing analytics

🏭 Vertical: Chemical

🏢 Organizations: Seeq

Seeq recently conducted a poll of chemical industry professionals (process engineers, mechanical and reliability engineers, production managers, chemists, research professionals, and others) to get their take on the state of data analytics and digitalization. Some of the responses confirmed behaviors we've witnessed first-hand in recent years: the challenges of organizational silos and workflow inefficiencies, and a common set of high-value use cases across organizations. Other responses surprised us; read on to see why.

Read more at Seeq

Early And Fine Virtual Binning

📅 Date:

✍️ Author: Noam Brousard

🔖 Topics: quality assurance, manufacturing analytics

🏭 Vertical: Semiconductor

🏢 Organizations: proteanTecs

ProteanTecs enables manufacturers to bin chips virtually, in a straightforward and inexpensive way based on Deep Data. By using a combination of tiny on-chip test circuits called "Agents" and sophisticated AI software, chip makers can find relationships between any chip's internal behavior and the parameters measured during the standard characterization process. Those relationships can be used to measure similar chips' internal characteristics at wafer sort to precisely predict how chips would perform during Final Test, even before the wafer is scribed.
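
Conceptually, virtual binning amounts to fitting a model from wafer-sort agent readouts to final-test outcomes and then applying it before final test. The sketch below uses a plain one-variable least-squares fit on synthetic data; proteanTecs' actual approach uses many agents and AI models, and every number here is invented.

```python
import random

random.seed(42)

# Hypothetical wafer-sort "agent" readout (say, a normalized on-chip
# ring-oscillator delay) and the final-test speed metric it tracks.
agent = [random.uniform(0.8, 1.2) for _ in range(300)]
fmax = [3.0 - 1.5 * a + random.gauss(0, 0.02) for a in agent]

# Ordinary least squares fit: fmax ~ b0 + b1 * agent.
n = len(agent)
ma, mf = sum(agent) / n, sum(fmax) / n
b1 = (sum((a - ma) * (f - mf) for a, f in zip(agent, fmax))
      / sum((a - ma) ** 2 for a in agent))
b0 = mf - b1 * ma

def predicted_bin(agent_reading, fast_cutoff=1.65):
    """Assign a speed bin at wafer sort from the predicted final-test fmax."""
    return "fast" if b0 + b1 * agent_reading >= fast_cutoff else "slow"

print(round(b1, 2))         # recovered slope, close to the true -1.5
print(predicted_bin(0.85))  # low delay: predicted fast part
print(predicted_bin(1.15))  # high delay: predicted slow part
```

Once the relationship is characterized on a training population, only the cheap agent measurements are needed at wafer sort, which is what makes binning before final test economical.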

Read more at Semiconductor Engineering

AI Solution for Operational Excellence

📅 Date:

🔖 Topics: Manufacturing Analytics, Cloud Computing

🏢 Organizations: Falkonry, AWS

Falkonry Clue is a plug-and-play solution for predictive production operations that identifies and addresses operational inefficiencies from operational data. It is designed to be used directly by operational practitioners, such as production engineers, equipment engineers or manufacturing engineers, without requiring the assistance of data scientists or software engineers.

Read more at AWS Marketplace

Efficiency of production plants: how to track, manage and resolve micro-stops

📅 Date:

✍️ Author: Claudio Vivante

🔖 Topics: Manufacturing Analytics

Why are the micro-stops listed above not tracked by companies? Conversations with many entrepreneurs and maintenance managers show that everyone is aware of the problem but underestimates the impact of these stops on overall production efficiency. These stoppages are almost never justified by the operators, because the personnel at the machine are busy reaching their production targets and therefore do not consider it important to stop and justify the micro-stops. How often do you hear people say that the time it takes to justify downtime is greater than the machine downtime itself!
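
Tracking micro-stops usually starts from the cycle timestamps the machine already emits: any gap longer than the ideal cycle but too short to have been logged as justified downtime is a candidate micro-stop. A minimal sketch with invented thresholds:

```python
def find_micro_stops(cycle_end_times, ideal_gap, micro_max):
    """Flag gaps between consecutive cycle completions that exceed the ideal
    cycle gap but are too short to be logged as justified downtime."""
    stops = []
    for prev, cur in zip(cycle_end_times, cycle_end_times[1:]):
        gap = cur - prev
        if ideal_gap < gap <= micro_max:
            stops.append((prev, gap - ideal_gap))
    return stops

# Timestamps in seconds: a part normally completes every ~10 s; gaps over
# 12 s but under 120 s are treated as untracked micro-stops.
t = [0, 10, 20, 55, 65, 75, 85, 130, 140]
print(find_micro_stops(t, ideal_gap=12, micro_max=120))  # [(20, 23), (85, 33)]
```

Because detection is automatic, no operator has to stop and justify anything; the lost seconds accumulate into a report that makes the true cost of micro-stops visible.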

Read more at Fabbrica Digitale