Luxury goods manufacturer gets a handle on production capacity from FourJaw
Machine monitoring software from FourJaw has driven a 14% uplift in machine utilisation at a luxury goods manufacturer. Fast-growing brass cabinet hardware manufacturer Armac Martin used data from FourJaw’s machine monitoring platform to increase its production capacity and meet a surge in demand for its product range.
Armac Martin’s Production Director, Rob McGrail, said: “When we were looking for a machine monitoring software supplier, a key criterion for us was not just ease of deployment and software functionality; it was equally important that the supplier was based locally, in the UK, and offered a good level of customer support, both for deployment and ongoing customer success. FourJaw ticked all of these boxes.”
Using AI to increase asset utilization and production uptime for manufacturers
Google Cloud created purpose-built tools and solutions to organize manufacturing data, make it accessible and useful, and help manufacturers quickly take significant steps on this journey by reducing the time to value. In this post, we will explore a practical example of how manufacturers can use Google Cloud manufacturing solutions to train, deploy and extract value from ML-enabled capabilities to predict asset utilization and maintenance needs. The first step to a successful machine learning project is to unify the necessary data in a common repository. For this, we will use Manufacturing Connect, the factory edge platform co-developed with Litmus Automation, to connect to manufacturing assets and stream asset telemetry to Pub/Sub.
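Before a telemetry stream can feed Pub/Sub, each asset sample has to be serialized into a message payload. The sketch below shows one plausible way to shape such a payload; the field names (`machine_id`, `timestamp_ms`, `readings`) are illustrative assumptions, not the Manufacturing Connect schema.

```python
import json
import time

def build_telemetry_message(machine_id: str, readings: dict) -> bytes:
    """Shape one asset-telemetry sample as a JSON payload.

    Field names here are illustrative assumptions, not the
    Manufacturing Connect contract.
    """
    payload = {
        "machine_id": machine_id,
        "timestamp_ms": int(time.time() * 1000),
        "readings": readings,  # e.g. {"spindle_rpm": 8200.0, "load_pct": 63.5}
    }
    return json.dumps(payload, sort_keys=True).encode("utf-8")

msg = build_telemetry_message("cnc-mill-07", {"spindle_rpm": 8200.0, "load_pct": 63.5})
print(msg.decode("utf-8"))
```

In a real pipeline, the returned bytes would be handed to a Pub/Sub publisher client as the message body.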
The following scenario is based on a hypothetical company, Cymbal Materials, a fictitious discrete manufacturing company that runs 50+ factories in 10+ countries. 90% of Cymbal Materials’ manufacturing processes involve milling, which is accomplished using industrial computer numerical control (CNC) milling machines. Although its factories implement routine maintenance checklists, unplanned and unknown failures still happen occasionally. Moreover, many Cymbal Materials factory workers lack the experience to identify and troubleshoot failures because of labor shortages and high turnover in their factories. Hence, Cymbal Materials is working with Google Cloud to build a machine learning model that can identify and analyze failures on top of Manufacturing Connect, Manufacturing Data Engine, and Vertex AI.
How United Manufacturing Hub Is Introducing Open Source to Manufacturing and Using Time-Series Data for Predictive Maintenance
The United Manufacturing Hub is an open-source Helm chart for Kubernetes, which combines state-of-the-art IT/OT tools and technologies and brings them into the hands of the engineer. This allows us to standardize the IT/OT infrastructure across customers and makes the entire infrastructure easy to integrate and maintain. We typically deploy it on the edge and on-premise using k3s as light Kubernetes. In the cloud, we use managed Kubernetes services like AKS. If the customer is scaling out and okay with using the cloud, we recommend services like Timescale Cloud. We are using TimescaleDB with MQTT, Kafka, and Grafana. We have microservices to subscribe to the messages from the message brokers MQTT and Kafka and insert the data into TimescaleDB, as well as a microservice that reads out data and processes it before sending it to a Grafana plugin, which then allows for visualization.
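A subscriber microservice like the ones described needs to turn each broker message into a row for a TimescaleDB hypertable. The sketch below shows one plausible topic-to-row mapping; the topic layout, payload fields, and table schema are assumptions for illustration, not the United Manufacturing Hub contract.

```python
import json

def message_to_row(topic: str, payload: bytes):
    """Parse an assumed topic layout 'umh/v1/<site>/<machine>/<tag>'
    plus a JSON payload into an insert-ready tuple."""
    _, _, site, machine, tag = topic.split("/")
    body = json.loads(payload)
    return (body["timestamp_ms"], site, machine, tag, float(body["value"]))

row = message_to_row(
    "umh/v1/aachen/press-01/temperature",
    b'{"timestamp_ms": 1700000000000, "value": 87.4}',
)
# The tuple would then feed a parameterized INSERT, e.g.:
# INSERT INTO process_value (ts, site, machine, tag, value)
# VALUES (%s, %s, %s, %s, %s)
print(row)
```

Batching many such tuples per INSERT is the usual way to keep up with high-frequency MQTT or Kafka streams.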
We are currently positioning the United Manufacturing Hub with TimescaleDB as an open-source Historian. To achieve this, we are currently developing a user interface on top of the UMH so that OT engineers can use it and IT can still maintain it.
Leveraging Operations Data to Achieve 3%-5% Baseline Productivity Gains with Normalized KPIs
Traditional code-based data models are too cumbersome, cost-prohibitive and resource-intensive to support an enterprise data model. In a code-based environment, it can take six months just to write and test the code to bring a single plant’s operating data into alignment with enterprise data pipelines. By contrast, a no-code solution like the Element Unify platform allows all IT/OT/ET data sources to be quickly tagged and brought into an asset hierarchy. The timeframe for a single plant to bring its operating data into alignment with the enterprise data architecture and data pipelines drops from six months to two to four weeks.
Digital transformation tools improve plant sustainability and maintenance
Maintenance is inherent to all industrial facilities. In pneumatic systems, valves wear out over time, causing leakage that leads to excessive compressed air consumption. Some systems can have many valves, which can make identifying a faulty one challenging. Leak troubleshooting can be time-consuming and, with the ongoing labor shortage and skills gap, maintenance personnel may already be stretched thin. There may not be enough staff to keep up with what must be done, and historical knowledge may not exist. When production must stop for repairs, it can be very expensive. For mid-sized food and beverage facilities, unplanned downtime costs around $30,000 per hour.
Finding Frameworks For End-To-End Analytics
New standards, guidelines, and consortium efforts are being developed to remove these barriers to data sharing for analytics purposes. But the amount of work required to make this happen is significant, and it will take time to establish the necessary level of trust across groups that historically have had minimal or no interactions.
For decades, test program engineers have relied upon the STDF file format, which is inadequate for today’s use cases. STDF files cannot dynamically capture adaptive test limits, and they are unable to assist in real-time decisions at the ATE based upon current data and analytically derived models. In fact, most data analytic companies run a software agent on the ATE to extract data for decisions and model building. With ATE software updates, the agent often breaks, requiring the ATE vendor to fix each custom agent on every test platform. Emerging standards, TEMS and RITdb, address these limitations and enable new use cases.
But with a huge amount of data available in manufacturing settings, an API may be the best approach for sharing sensitive data from point of origin to a centralized repository, whether on-premise or in the cloud.
Improving asset criticality with better decision making at the plant level
The industry is beginning to see reliability, availability and maintainability (RAM) applications that integrally highlight the real constraints, including the other operational and mechanical limits. A RAM-based simulation application provides fault-tree analysis, based on actual material flows through a manufacturing process, with stage gates, inventory modeling, load sharing, standby/redundancy of equipment, operational phases, and duty cycles. In addition, a RAM application can simulate expectations of various random events such as weather, market dynamics, supply/distribution logistical events, and more. In one logistics example, a coker unit’s bottom pump was thought to be undersized and constraining the unit’s production. Changing the pump to a larger size did not fix the problem: further investigation showed there were insufficient cars on the train to carry the product away, and that bottleneck alone kept the unit from operating at full capacity.
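The coker example above is the kind of interaction a RAM simulation surfaces: equipment reliability and logistics jointly cap throughput. Below is a minimal Monte Carlo sketch of that idea (not any vendor's RAM engine): a unit fails with exponentially distributed time-to-failure, takes a fixed repair time, and can additionally be blocked waiting for product offtake. All rates and probabilities are illustrative assumptions.

```python
import random

def simulate_availability(mtbf_h: float, repair_h: float, horizon_h: float,
                          offtake_ok_prob: float, seed: int = 42) -> float:
    """Fraction of the horizon the unit spends producing."""
    rng = random.Random(seed)
    t, uptime = 0.0, 0.0
    while t < horizon_h:
        run = rng.expovariate(1.0 / mtbf_h)      # time until next failure
        run = min(run, horizon_h - t)
        uptime += run
        t += run
        if t >= horizon_h:
            break
        down = repair_h
        while rng.random() > offtake_ok_prob:    # wait for product offtake
            down += 24.0                         # one more day blocked
        t += down
    return uptime / horizon_h

# One year, 500 h MTBF, 24 h repairs, 80% chance offtake is ready each attempt.
print(round(simulate_availability(500.0, 24.0, 8760.0, 0.8), 3))
```

Sweeping `offtake_ok_prob` separately from `mtbf_h` shows whether a bigger pump or more railcars moves the availability number, which is exactly the question the example raises.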
Renault Group and Atos launch a unique service to collect large-scale manufacturing data and accelerate Industry 4.0
Renault Group and Atos launch ID@scale (Industrial Data @ Scale), a new service for industrial data collection to support manufacturing companies in their digital journey towards Industry 4.0. “ID@S” (Industrial Data @ Scale) will allow manufacturers to collect and structure data from industrial equipment at scale to improve operational excellence and product quality. Developed by the car manufacturer and already in operation within its factories, ID@scale is now industrialized, modularized and commercialized by the digital leader Atos.
More than 7,500 pieces of equipment are connected, with standardized data models representing over 50 different manufacturing processes from screwdriving to aluminum injection, including car frame welding, machining, painting, stamping, in addition to new manufacturing processes for electric motors and batteries. Renault Group is already saving 80 million euros per year and aims to deploy this solution across the remainder of its 35 plants, connecting over 22,000 pieces of equipment, by 2023 to generate savings of 200 million euros per year.
Advanced analytics improve process optimization
With advanced analytics, the engineers collaborated with data scientists to create a model comparing the theoretical and operational valve-flow coefficient of one control valve. Conditions in the algorithm were used to identify periods of valve degradation in addition to past failure events. By reviewing historical data, the SMEs determined the model would supply sufficient notification time to deploy maintenance resources so repairs could be made prior to failure.
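A comparison like the one described, theoretical versus operational valve-flow coefficient, can be sketched with the standard liquid sizing relation Cv = Q * sqrt(SG / dP) (US units: gpm, psi). This is a generic illustration, not the site's actual model; the numbers and the degradation threshold are assumptions.

```python
from math import sqrt

def operational_cv(flow_gpm: float, sg: float, dp_psi: float) -> float:
    """Back-calculate Cv from measured flow, specific gravity, and pressure drop."""
    return flow_gpm * sqrt(sg / dp_psi)

def degradation_ratio(cv_operational: float, cv_rated: float) -> float:
    """Ratios drifting above 1 can indicate seat wear or erosion
    (the valve passes more flow than its rating predicts)."""
    return cv_operational / cv_rated

cv_op = operational_cv(flow_gpm=120.0, sg=1.0, dp_psi=16.0)   # -> 30.0
print(cv_op, degradation_ratio(cv_op, cv_rated=25.0))
```

Trending the ratio over historical data, as the SMEs did, is what turns this single-point calculation into advance notice of failure.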
Batch Optimization using Quartic.ai
Aarbakke + Cognite | Boosting production, maintenance, and quality
Battery Analytics: The Game Changer for Energy Storage
Battery analytics refers to getting more out of the battery using software, not only during operation but also when selecting the right battery cell or designing the overall system. Here, the focus is on the possibilities for optimizing the in-field operation of battery storage systems.
The TWAICE cloud analytics platform provides insights and solutions based on field data. The differentiating factor is the end-to-end approach with analytics at its heart. After processing and mapping the data, the platform’s analytics layer runs different analytical algorithms: electrical, thermal and aging models as well as machine learning models. This variety of analytical approaches is the key to balancing differences in data input quality and is also the basis for the wide and expanding range of solutions.
Where And When End-To-End Analytics Works
To control a wafer factory operation, engineering teams rely on process equipment and inspection statistical process control (SPC) charts, each representing a single parameter (i.e., univariate). With the complexities of some processes, the interactions between multiple parameters (i.e., multivariate) can result in yield excursions. This is when engineers leverage data to make decisions on subsequent fab or metrology steps to improve yield and quality.
“When we look at fab data today, we’re doing that same type of adaptive learning,” McIntyre said. “If I start seeing things that don’t fit my expected behavior, they could still be okay by univariate control, but they don’t fit my model in a multi-variate sense. I’ll work toward understanding that new combination. For instance, in a specific equipment my pump down pressure is high, but my gas flow is low and my chamber is cold, relatively speaking, and all (parameters) individually are in spec. But I’ve never seen that condition before, so I need to determine if this new set of process conditions has an impact. I send that material to my metrology station. Now, if that inline metrology data is smack in the center, I can probably disregard the signal.”
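The condition McIntyre describes, every parameter individually in spec but the combination never seen before, is the classic univariate-versus-multivariate gap. The sketch below illustrates it with a hand-rolled two-variable Mahalanobis distance; the pressure/flow data are invented for illustration and are not fab data.

```python
def mahalanobis_2d(x, y, xs, ys):
    """Mahalanobis distance of point (x, y) from the history (xs, ys),
    using an explicit 2x2 inverse covariance."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((a - mx) ** 2 for a in xs) / (n - 1)
    syy = sum((b - my) ** 2 for b in ys) / (n - 1)
    sxy = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / (n - 1)
    det = sxx * syy - sxy * sxy
    dx, dy = x - mx, y - my
    # quadratic form with the inverse of the 2x2 covariance matrix
    return ((dx * dx * syy - 2 * dx * dy * sxy + dy * dy * sxx) / det) ** 0.5

# Historical runs: pressure and gas flow normally move together.
hist_p = [10.0, 10.5, 11.0, 11.5, 12.0, 12.5, 13.0]
hist_f = [20.2, 20.8, 22.1, 22.9, 24.2, 24.8, 26.0]

# High pressure with LOW flow: each value is inside its univariate range,
# but the pair breaks the historical correlation.
d = mahalanobis_2d(13.0, 20.0, hist_p, hist_f)
print(round(d, 2))
```

A large distance here is the multivariate signal that would route the material to metrology, even though no univariate SPC chart would alarm.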
The Hidden Factory: How to Expose Waste and Capacity on the Shop Floor
Without accurate production data, managers simply cannot hope to find the hidden waste on the shop floor. While strict manual data collection methods can take job shops only so far, the sophisticated manufacturer is leveraging solutions that collect, aggregate, and standardize production data autonomously. With this data in hand, accurate benchmarks can be set (they may be quite surprising), and areas of hidden capacity, as well as waste generators, can be far more easily identified.
How to Use Data in a Predictive Maintenance Strategy
Free-text and label correction engines are a solution to clean up missing or inconsistent work order and parts order data. Pattern recognition algorithms can replace missing items such as funding center codes. They can also fix work order (WO) descriptions to match the work actually performed. This can often yield a 15% shift in root-cause binning over non-corrected WO and parts data.
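As a toy version of label correction, noisy free-text fields can be snapped onto a controlled vocabulary with stdlib fuzzy matching. The vocabulary, cutoff, and examples below are illustrative assumptions; production engines use far richer pattern recognition than this.

```python
import difflib

# Assumed controlled vocabulary of failure-mode labels (illustrative only).
VALID_FAILURE_MODES = ["bearing failure", "seal leak", "motor overheat",
                       "belt wear", "sensor fault"]

def correct_label(raw: str, vocabulary=VALID_FAILURE_MODES, cutoff=0.6):
    """Return the closest vocabulary label, or None to route the record
    to manual review."""
    match = difflib.get_close_matches(raw.strip().lower(), vocabulary,
                                      n=1, cutoff=cutoff)
    return match[0] if match else None

print(correct_label("bearing failre"))   # typo in the work order text
print(correct_label("qqqq"))             # no plausible match
```

Re-binning corrected labels against the originals is how a shift in root-cause distribution, like the 15% figure above, would be measured.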
With programmable logic controller-generated threshold alarms (like an alarm that is generated when a single sensor exceeds a static value), “nuisance” alarms are often generated and then ignored. These false alarms quickly degrade the culture of an operating staff as their focus is shifted away from finding the underlying problem that is causing the alarm. In time, these distractions threaten the health of the equipment, as teams focus on making the alarm stop rather than addressing the issue.
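One common fix for the nuisance alarms described above is a persistence (debounce) filter: the signal must stay above the limit for several consecutive samples before the alarm raises, so single-sample spikes are ignored. The threshold, window, and signal below are illustrative assumptions.

```python
def persistent_alarm(samples, limit, n_consecutive):
    """Yield True at each sample where the alarm should be active:
    only after n_consecutive samples in a row exceed the limit."""
    run = 0
    for value in samples:
        run = run + 1 if value > limit else 0
        yield run >= n_consecutive

signal = [4.8, 5.2, 4.9, 5.3, 5.4, 5.6, 5.5, 4.7]
flags = list(persistent_alarm(signal, limit=5.0, n_consecutive=3))
print(flags)
```

Note the lone excursion to 5.2 never alarms, while the sustained run above 5.0 does; that is exactly the spike-versus-trend distinction a bare static threshold cannot make.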
Toward smart production: Machine intelligence in business operations
Our research looked at five different ways that companies are using data and analytics to improve the speed, agility, and performance of operational decision making. This evolution of digital maturity begins with simple tools, such as dashboards to aid human decision making, and ends with true MI, machines that can adjust their own performance autonomously based on historical and real-time data.
Rub-A-Dub-Dub...It's All About the Data Hub
If these terms leave you more confused than when you started reading, join the club. I am an OT guy, and so much of this was new to me. And it’s another reason to have a good IT/OT architect on your team. The bottom line is that these terms support the various perspectives that must be addressed in connecting and delivering data, from architecture and patterns to services and translation layers. Remember, we are not just talking about time-series or hierarchical asset data. Data such as time, events, alarms, units of work, units of production time, materials and material flows, and people can all be contextualized. And this is the tough nut to crack as the new OT Ecosystem operates in multiple modes, not just transactional as we find in the back office.
How to Reduce Tool Failure with CNC Tool Breakage Detection
There are several active technologies used in CNC machining that enable manufacturers to realize these benefits. The type of system used for tooling breakage detection may consist of one or more of the following technologies.
Such systems are often tied to production monitoring systems and, ideally, IIoT platforms that can analyze tooling data in the cloud to better predict breakages in the future. One innovation in the area of non-contact technologies is the use of high-frequency data to diagnose, predict and avoid failures. This technology is sensorless and uses instantaneous real-time data sampled at an extremely high rate to build accurate tool failure detection models.
Sight Machine, NVIDIA Collaborate to Turbocharge Manufacturing Data Labeling
The collaboration connects Sight Machine’s manufacturing data foundation with NVIDIA’s AI platform to break through the last bottleneck in the digital transformation of manufacturing – preparing raw factory data for analysis. Sight Machine’s manufacturing intelligence will guide NVIDIA machine learning software running on NVIDIA GPU hardware to process two or more orders of magnitude more data at the start of digital transformation projects.
Accelerating data labeling will enable Sight Machine to quickly onboard large enterprises with massive data lakes. It will automate and accelerate work and lead to even faster time to value. While similar automated data mapping technology is being developed for specific data sources or well documented systems, Sight Machine is the first to use data introspection to automatically map tags to models for a wide variety of plant floor systems.
Machining cycle time prediction: Data-driven modelling of machine tool feedrate behavior with neural networks
Accurate prediction of machining cycle times is important in the manufacturing industry. Usually, Computer-Aided Manufacturing (CAM) software estimates machining times using the commanded feedrate from the toolpath file and basic kinematic settings. Typically, these methods do not account for toolpath geometry or toolpath tolerance and therefore underestimate machining cycle times considerably. Removing the need for machine-specific knowledge, this paper presents a data-driven feedrate and machining cycle time prediction method that builds a neural network model for each machine tool axis. In this study, datasets composed of the commanded feedrate, nominal acceleration, toolpath geometry and the measured feedrate were used to train a neural network model. Validation trials using a representative industrial thin-wall structure component on a commercial machining center showed that this method estimated the machining time with more than 90% accuracy. This method showed that neural network models have the capability to learn the behavior of a complex machine tool system and predict cycle times. Further integration of these methods will be critical to the implementation of digital twins in Industry 4.0.
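The underestimation the abstract describes can be seen with a simple kinematic comparison (a sketch, not the paper's neural network model): a naive estimate assumes the axis runs at the commanded feedrate for the whole segment, while a trapezoidal velocity profile must spend time accelerating and decelerating. Segment length, feedrate, and acceleration below are illustrative.

```python
def naive_time(length_mm: float, feed_mm_s: float) -> float:
    """CAM-style estimate: constant commanded feedrate over the segment."""
    return length_mm / feed_mm_s

def trapezoidal_time(length_mm: float, feed_mm_s: float,
                     accel_mm_s2: float) -> float:
    """Time under a symmetric trapezoidal velocity profile."""
    d_ramp = feed_mm_s ** 2 / accel_mm_s2        # accel + decel distance
    if d_ramp >= length_mm:                      # feedrate never reached
        return 2.0 * (length_mm / accel_mm_s2) ** 0.5
    t_ramp = 2.0 * feed_mm_s / accel_mm_s2
    return t_ramp + (length_mm - d_ramp) / feed_mm_s

seg = 50.0        # mm segment
feed = 100.0      # mm/s commanded feedrate
acc = 500.0       # mm/s^2 axis acceleration limit
print(naive_time(seg, feed), trapezoidal_time(seg, feed, acc))
```

On short, geometry-dense toolpaths the axis rarely reaches the commanded feedrate at all, which is why per-axis models trained on measured feedrate close the gap.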
How the Cloud is Changing the Role of Metadata in Industrial Intelligence
Right now, though, many companies have trouble seeing that context in existing datasets. Much of that difficulty owes to the original design of operational technology (OT) systems like supervisory control and data acquisition (SCADA) systems or data historians. Today, the story around the collection of data in OT systems is much the same. Such descriptive points about the data could paint a more holistic view of asset performance.
As many process businesses turn to a data lake strategy to leverage the value of their data, the preservation of metadata in the movement of OT data to their cloud environment represents a significant opportunity to optimize the maintenance, productivity, sustainability, and safety of critical assets. The loss of metadata has been among the most severe limiting factors in the value of OT data. By one estimate, industrial businesses are losing out on 20-30 percent of the value of their data from regular compression of metadata or losses in their asset hierarchy models. With an expertise shortage sweeping across process-intensive operations, many companies will need to digitize and conserve institutional knowledge, beginning with their own data.
Automation Within Supply Chains: Optimizing the Manufacturing Process
Is Clip A ‘Slack’ For Factories?
Clip aims to bring data gathering and analytics, information sharing, and collaboration onto a single platform. The system connects all intelligent industrial equipment in a production facility, together with workers who can access all information and adjust operations through computers and portable devices.
It’s an ambitious undertaking, one that requires guaranteeing a very high degree of interoperability to ensure that people, machines and processes can communicate with each other seamlessly, and that all key systems such as Material Requirements Planning (MRP), Enterprise Resource Planning (ERP) and others can directly access up-to-date information from machines and processes. This higher level of automation, if implemented right, can unlock a new level of efficiency for manufacturing companies.
Build a Complete Analytics Pipeline to Optimize Smart Factory Operations
2021 Assembly Plant of the Year: GKN Drives Transformation With New Culture, Processes and Tools
All-wheel drive (AWD) technology has taken the automotive world by storm in recent years, because of its ability to effectively transfer power to the ground. Today, many sport utility vehicles use AWD for better acceleration, performance, safety and traction in all kinds of driving conditions. GKN’s state-of-the-art ePowertrain assembly plant in Newton, NC, supplies AWD systems to BMW, Ford, General Motors and Stellantis facilities in North America and internationally. The 505,000-square-foot facility operates multiple assembly lines that mass-produce more than 1.5 million units annually.
“Areas of improvement include a first-time-through tracking dashboard tailored to each individual line and shift that tracks each individual failure mode,” says Tim Nash, director of manufacturing engineering. “We use this tool to monitor improvements and progress on a daily basis.
“Overhaul of process control limits has been one of our biggest achievements,” claims Nash. “By setting tighter limits for assembly operations such as pressing and screwdriving, we are able to detect and reject defective units in station vs. a downstream test operation. This saves both time and scrap related to further assembly of the defective unit.”
“When we started on our turnaround journey, our not-right-first-time rate was about 26 percent,” adds Smith. “Today, it averages around 6 percent. A few years ago, cost of non-quality was roughly $23 million annually vs. less than $3 million today.”
Digital Transformation in the Beverage Manufacturing and Bottling
How W Machine Uses FactoryWiz Machine & Equipment Monitoring
Industry 4.0 and the Automotive Industry
“It takes about 30 hours to manufacture a vehicle. During that time, each car generates massive amounts of data,” points out Robert Engelhorn, director of the Munich plant. “With the help of artificial intelligence and smart data analytics, we can use this data to manage and analyze our production intelligently. AI is helping us to streamline our manufacturing even further and ensure premium quality for every customer. It also saves our employees from having to do monotonous, repetitive tasks.”
One part of the plant that is already seeing benefits from AI is the press shop, which turns more than 30,000 sheet metal blanks a day into body parts for vehicles. Each blank is given a laser code at the start of production so the body part can be clearly identified throughout the manufacturing process. This code is picked up by BMW’s iQ Press system, which records material and process parameters, such as the thickness of the metal and oil layer, and the temperature and speed of the presses. These parameters are related to the quality of the parts produced.
Big Data Analytics in Electronics Manufacturing: is MES the key to unlocking its true potential?
In a modern SMT fab, every time a stencil is loaded or a squeegee makes a pass, data is generated. Every time a nozzle picks and places a component, data is generated. Every time a camera records a component or board inspection image, data is generated. The abundance of data in the electronics industry is a result of the long-existing and widespread process automation and proliferation of sensors, gauges, meters and cameras, which capture process metrics, equipment data and quality data.
In SMT and electronics, the main challenge isn’t the availability of data but rather the ability to look at the data generated by the process as a whole: making sense of the data behind each shop floor transaction, using it to generate information from a single point of truth instead of disparate, unconnected point solutions, and applying the resulting insight to decisions that ultimately improve process KPIs, OEE, productivity, yield, compliance and quality.
2021 IW Best Plants Winner: IPG Tremonton Wraps Up a Repeat IW Best Plants Win
“If you wrapped it and just wound it straight, it would look like a record, with peaks and valleys,” says Richardson. So instead, the machines rotate horizontally, like two cans of pop on turntables. Initially, IPG used a gauge that indicated whether the film was too thick or too thin. “That was OK,” says Richardson, “but it didn’t get us the information we needed.”
Working with an outside company, IPG Tremonton upgraded the gauge to one that could quantify the thickness of the cut plastic in real time as the machine operates.
The benefits of the tinkering were twofold. First, the upgrade gave operators the ability to correct deviations on the fly. Second, “we found that we had some variations between a couple of our machines,” Richardson says. Using the new gauge on both machines revealed that one of them was producing film “a few percentage points thicker” than its twin. “We [were] basically giving away free product,” Richardson recalled. The new sensor gave IPG the information it needed to label film more accurately.
AWS IoT SiteWise Edge Is Now Generally Available for Processing Industrial Equipment Data on Premises
With AWS IoT SiteWise Edge, you can organize and process your equipment data in the on-premises SiteWise gateway using AWS IoT SiteWise asset models. You can then read the equipment data locally from the gateway using the same application programming interfaces (APIs) that you use with AWS IoT SiteWise in the cloud. For example, you can compute metrics such as Overall Equipment Effectiveness (OEE) locally for use in a production-line monitoring dashboard on the factory floor.
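The OEE metric mentioned above is conventionally the product of availability, performance, and quality. The sketch below shows that computation in plain Python (the shift numbers are illustrative assumptions, not SiteWise API calls):

```python
def oee(planned_min: float, downtime_min: float, ideal_cycle_s: float,
        total_count: int, good_count: int) -> float:
    """Overall Equipment Effectiveness = Availability x Performance x Quality."""
    run_min = planned_min - downtime_min
    availability = run_min / planned_min                     # uptime share
    performance = (ideal_cycle_s * total_count) / (run_min * 60.0)
    quality = good_count / total_count                       # first-pass yield
    return availability * performance * quality

# One 8-hour shift: 60 min down, 30 s ideal cycle, 700 made, 665 good.
value = oee(planned_min=480, downtime_min=60, ideal_cycle_s=30.0,
            total_count=700, good_count=665)
print(round(value, 3))
```

In a SiteWise deployment this formula would live in an asset model's metric definition and be evaluated locally on the gateway, so the dashboard keeps working even when the factory's cloud link drops.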
Transforming quality and warranty through advanced analytics
For companies seeking to improve financial performance and customer satisfaction, the quickest route to success is often a product-quality transformation that focuses on reducing warranty costs. Quality problems can be found across all industries, and even the best companies can have weak spots in their quality systems. These problems can lead to accidents, failures, or product recalls that harm the company’s reputation. They also create the need for prevention measures that increase the total cost of quality. The ultimate outcomes are often poor customer satisfaction that decreases top-line growth, and additional costs that damage bottom-line profitability.
To transform quality and warranty, leading industrial companies are combining traditional tools with the latest in artificial-intelligence (AI) and machine-learning (ML) techniques. The combined approach allows these manufacturers to reduce the total cost of quality, ensure that their products perform, and meet customer expectations. The impact of a well-designed and rigorously executed transformation thus extends beyond cost reduction to include higher profits and revenues as well.
Survey: Data Analytics in the Chemical Industry
Seeq recently conducted a poll of chemical industry professionals—process engineers, mechanical and reliability engineers, production managers, chemists, research professionals, and others—to get their take on the state of data analytics and digitalization. Some of the responses confirmed behaviors we’ve witnessed first-hand in recent years: the challenges of organizational silos and workflow inefficiencies, and a common set of high-value use cases across organizations. Other responses surprised us; read on to see why.
AI Solution for Operational Excellence
Falkonry Clue is a plug-and-play solution for predictive production operations that identifies and addresses operational inefficiencies from operational data. It is designed to be used directly by operational practitioners, such as production engineers, equipment engineers or manufacturing engineers, without requiring the assistance of data scientists or software engineers.
Efficiency of production plants: how to track, manage and resolve micro-stops
Why are the micro-stops listed above not tracked by companies? Conversations with many entrepreneurs and maintenance managers show that everyone is aware of the problem but underestimates the impact of these stops on overall production efficiency. These stoppages are almost never justified by the operators, because the personnel at the machine are busy reaching their production targets and therefore do not consider it important to stop and log the micro-stops. How often do you hear people say that the time needed to justify the downtime is greater than the machine downtime itself!
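This is why micro-stops are best detected automatically rather than logged by operators: any gap between consecutive part completions that exceeds the nominal cycle time but falls short of a "real stoppage" threshold can be flagged without anyone touching a keyboard. The thresholds and timestamps below are illustrative assumptions.

```python
def find_micro_stops(timestamps_s, nominal_cycle_s, major_stop_s):
    """Return (start_time, lost_seconds) for every gap between part
    completions that is longer than 1.5x the nominal cycle but shorter
    than the major-stoppage threshold."""
    stops = []
    for prev, cur in zip(timestamps_s, timestamps_s[1:]):
        gap = float(cur - prev)
        if nominal_cycle_s * 1.5 < gap < major_stop_s:
            stops.append((prev, round(gap - nominal_cycle_s, 1)))
    return stops

# Part-completion times (s): one 35 s hiccup hides among normal 10 s cycles,
# and the final 310 s gap is a major stoppage, tracked separately.
ts = [0, 10, 20, 55, 65, 75, 385]
print(find_micro_stops(ts, nominal_cycle_s=10, major_stop_s=300))
```

Summing the lost seconds per shift gives the hidden-efficiency number that the operators, reasonably, never had time to write down.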