Machine Learning (ML)

Assembly Line

Data-driven model improves accuracy in predicting EV battery degradation

📅 Date:

🔖 Topics: Machine Learning

🏢 Organizations: Nissan, Microsoft


Rising carbon emissions have significantly challenged sustainable development in recent years, prompting global efforts to implement carbon reduction policies and achieve long-term carbon neutrality. A crucial step in this transition involves the recycling and reuse of power batteries, which are assessed for their state-of-health (SoH) and then repaired or restructured for reuse in smaller-sized electric vehicles (EVs), energy storage systems, and smart streetlights. This process not only extends battery life but also maximizes their residual value. However, accurately assessing this value is complex. To address this, Microsoft Research collaborated with Nissan Motor Corporation to develop a new machine learning method that predicts battery degradation with an average error rate of just 0.94%, significantly bolstering Nissan's battery recycling efforts.

Atsushi Ohma, Expert Leader of the EV System Laboratory at Nissan, noted that EVs and their batteries currently have an average lifecycle of about 10 years, with material mining and manufacturing accounting for approximately 50% of their CO2 emissions. Nissan aims to extend the lifecycle of EVs and batteries to more than 15 years, reducing CO2 emissions. To achieve this, the company hopes to leverage technologies like AI and big data to drive innovation in battery and electric vehicle development.

Read more at Microsoft Research

Leveraging Data for Growth

📅 Date:

✍️ Author: Tony Maiorana

🔖 Topics: Machine Learning

🏭 Vertical: Chemical

🏢 Organizations: Citrine Informatics


Citrine offers an AI platform designed to enable chemists and materials scientists to develop better products in less time. In big tech companies, data is abundant and armies of data scientists are on hand to use it, largely because software margins are huge and these companies have been growing like crazy (maybe not forever). Chemical companies are very different. Data is relatively scarce because experiments take time to conduct, lab space is needed, and the people doing the experiments are doing more than just product development; they are also supporting the existing business. Citrine essentially allows R&D people to become data scientists through a no-code platform.

Citrine Informatics means you don't have to hire a data scientist or two; instead, it allows someone like me (not a data scientist) to build my own models for whatever system I'm working on. By working on the model yourself, instead of through a data scientist, you can incorporate your expertise directly and iterate quickly. In polymeric products where formulation is essential for product development, like polyurethane foams or waterborne emulsions, I think this approach is the way.

Read more at The Polymerist

Accelerate Semiconductor machine learning initiatives with Amazon Bedrock

📅 Date:

✍️ Author: Michael Wallner

🔖 Topics: Generative AI, Machine Learning

🏭 Vertical: Semiconductor

🏢 Organizations: AWS


Manufacturing processes generate large amounts of sensor data that can be used for analytics and machine learning models. However, this data may contain sensitive or proprietary information that cannot be shared openly. Synthetic data allows the distribution of realistic example datasets that preserve the statistical properties and relationships in the real data, without exposing confidential information. This enables more open research and benchmarking on representative data. Additionally, synthetic data can augment real datasets to provide more training examples for machine learning algorithms to generalize better. Data augmentation with synthetic manufacturing data can help improve model accuracy and robustness. Overall, synthetic data enables sharing, enhanced research abilities, and expanded applications of AI in manufacturing while protecting data privacy and security.
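
The post centers on Amazon Bedrock, but the core idea is easy to see in miniature: fit a joint distribution to real sensor data and sample new rows from it, so the synthetic set keeps the means and cross-correlations without copying any record. A toy sketch with made-up sensor columns:

```python
# Toy illustration of statistics-preserving synthetic sensor data. This is not
# the Bedrock workflow from the post, only the underlying idea.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for proprietary readings (columns: temperature, pressure, flow).
real = rng.normal(loc=[80.0, 2.5, 12.0], scale=[5.0, 0.2, 1.5], size=(500, 3))
real[:, 2] += 0.5 * (real[:, 0] - 80.0)  # inject a temperature-flow correlation

mu = real.mean(axis=0)
cov = np.cov(real, rowvar=False)

# Synthetic rows drawn from the fitted joint distribution.
synthetic = rng.multivariate_normal(mu, cov, size=500)

print(np.corrcoef(real[:, 0], real[:, 2])[0, 1])       # real correlation
print(np.corrcoef(synthetic[:, 0], synthetic[:, 2])[0, 1])  # preserved
```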

Read more at AWS for Industry

American is using machine learning to keep its hubs moving this holiday season

📅 Date:

🔖 Topics: Machine Learning

🏢 Organizations: American Airlines


American developed Smart Gating technology so our aircraft spend less time waiting on the tarmac and customers have more time to make their connections. The tool was developed by Americanโ€™s Information Technology and Operations teams to reduce gate conflicts, ease ramp congestion and shorten taxi times. Itโ€™s one of many ways we are using innovative technology to drive a more reliable and efficient operation.

Read more at American Newsroom

Advancements in Predicting the Fatigue Lifetime of Structural Adhesive Joints

📅 Date:

🔖 Topics: Machine Learning, Physics-informed neural network

🏢 Organizations: Citrine Informatics, Siemens, Fraunhofer IFAM


While physics-based models offer the highest accuracy for analyzing these joints, they require meticulous parameter calibration for every new adhesive. For example, consider a fatigue test on a structural adhesive joint with 10 million cycles at a frequency of 10 Hz. These tests are demanding and time-consuming, taking over 10 days to complete. Adding to the challenge is the need for numerous data points to construct a comprehensive fatigue design curve, a fundamental aspect of structural analysis. Given the need to optimize both efficiency and accuracy, engineers and researchers need and pursue innovative solutions.

One path to a solution is the integration of Artificial Intelligence (AI) and Machine Learning (ML) into materials science. Recognized for its ability to address complex problems through learning from existing knowledge, AI provides a promising avenue for structural modeling by generating mathematical expressions that capture the interplay of various parameters. We expect that this rationale also applies to the structural modelling of the fatigue behavior of structural adhesive joints, which is the subject of our ongoing research.

This showcase exemplifies our commitment to revolutionizing materials selection and fatigue life prediction for adhesive joints. Leveraging the Citrine Platform [2], we seamlessly apply machine learning methods to integrate experimental datasets with physics-based modeling (based on stress concentration factors). This innovative approach not only significantly elevates the precision of fatigue predictions but also enables the precise selection of optimal adhesives for bonded structures, factoring in various material and geometrical properties, as well as usage conditions.
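
As a rough illustration of the hybrid idea (not Citrine's actual pipeline), a physics-derived quantity such as a stress concentration factor can be fed to an ML regressor alongside the raw joint parameters, so the model learns fatigue life on top of known mechanics rather than from scratch. All feature names and data below are hypothetical:

```python
# Hedged sketch: ML regression on top of a physics-based feature (K_t).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
n = 200
overlap = rng.uniform(5, 25, n)        # joint overlap length, mm (hypothetical)
thickness = rng.uniform(0.5, 3.0, n)   # adherend thickness, mm
stress_amp = rng.uniform(2, 20, n)     # stress amplitude, MPa
k_t = 1.0 + 2.0 * thickness / overlap  # toy stress concentration factor

# Toy Basquin-like ground truth: life falls with effective stress K_t * S.
log_cycles = 12.0 - 3.0 * np.log10(k_t * stress_amp) + rng.normal(0, 0.1, n)

X = np.column_stack([overlap, thickness, stress_amp, k_t])
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, log_cycles)
print(model.predict([[15.0, 1.5, 10.0, 1.2]]))  # predicted log10(cycles to failure)
```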

Read more at Citrine Blog

Development of ultra-fast computing method for powder mixing process

📅 Date:

✍️ Authors: Naoki Kishida, Hideya Nakamura, Shuji Ohsaki, Satoru Watano

🔖 Topics: Powder Mixing, Machine Learning, Simulation

🏭 Vertical: Chemical

🏢 Organizations: Osaka Metropolitan University


Powder mixing is an important operation in many industries. Numerical simulations using the discrete element method (DEM) have been widely used to analyze powder-mixing processes. However, one of the current limitations of the DEM simulation is its high computational cost. Recently, approaches that combine machine learning models and numerical simulations have attracted considerable attention for high-speed computing. However, there has been no research on high-speed computing methods for powder mixing that account for individual particle motions. Here, we propose an original machine learning model, namely, a recurrent neural network with stochastically calculated random motion (RNNSR), which enables a long-time-scale powder mixing simulation with low computational cost and high accuracy. The RNNSR is designed to learn individual particle dynamics with periodicity from short-term DEM simulation results and predict powder mixing for a longer period. The RNNSR combines a recurrent neural network and a stochastic model to predict both convective and diffusive mixing. The simulation results obtained using the RNNSR were quite similar to those obtained using the DEM in terms of the degree of powder mixing, particle velocity, and granular temperature. It was also demonstrated that the RNNSR has the capability of ultrafast computing in powder-mixing simulations. In conclusion, we demonstrated the effectiveness of the RNNSR for ultrafast computation of the powder mixing process.
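
The RNNSR implementation is the authors'; purely to make the architecture concrete, here is a minimal sketch of the two ingredients it combines: a recurrent network for the learned, periodic (convective) part of a particle's motion, plus an additive stochastic term for the diffusive part. Sizes and scales below are guesses, not the paper's values:

```python
# Minimal sketch of the RNNSR idea, not the authors' implementation.
import torch
import torch.nn as nn

class RNNSRSketch(nn.Module):
    def __init__(self, hidden=32, sigma=0.01):
        super().__init__()
        self.gru = nn.GRU(input_size=3, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 3)
        self.sigma = sigma  # magnitude of the stochastic (diffusive) velocity

    def forward(self, vel_seq):
        out, _ = self.gru(vel_seq)      # learn the deterministic dynamics
        v_next = self.head(out[:, -1])  # predicted convective velocity
        return v_next + self.sigma * torch.randn_like(v_next)  # add random motion

model = RNNSRSketch()
dem_window = torch.randn(8, 50, 3)  # 8 particles, 50 DEM steps, 3-D velocities
print(model(dem_window).shape)      # torch.Size([8, 3])
```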

Read more at Chemical Engineering Journal

โ˜๏ธ๐Ÿง  Automated Cloud-to-Edge Deployment of Industrial AI Models with Siemens Industrial Edge

📅 Date:

✍️ Authors: Johann Bruckner, Johannes Kupser, Yvonne Quacken, Bruno Quintas, Helge Aufderheide

🔖 Topics: Cloud-to-Edge Deployment, Data Architecture, Edge Computing, Machine Learning, MQTT

🏢 Organizations: Siemens, AWS


Due to the sensitive nature of OT systems, a cloud-to-edge deployment can become a challenge. Specialized hardware devices are required, strict network protection is applied, and security policies are in place. Data can only be pulled by an intermediate factory IT system from where it can be deployed to the OT systems through highly controlled processes.

The following solution describes the "pull" deployment mechanism using AWS services and the Siemens Industrial AI software portfolio. The deployment process is enabled by three main components, the first of which is the Siemens AI Software Development Kit (AI SDK). After a model is created by a data scientist on Amazon SageMaker and stored in the SageMaker model registry, this SDK allows users to package a model in a format suitable for edge deployment using Siemens Industrial Edge. The second component, and the central connection between cloud and edge, is the Siemens AI Model Manager (AI MM). The third component is the Siemens AI Inference Server (AIIS), a specialized and hardened AI runtime environment running as a container on Siemens IEDs deployed on the shopfloor. The AIIS receives the packaged model from AI MM and is responsible for loading, executing, and monitoring ML models close to the production lines.
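
The Siemens AI SDK and AI Model Manager are proprietary, but the cloud side of such a "pull" flow can be sketched with standard AWS calls: look up the newest approved package in the SageMaker model registry and download its artifact for packaging. The group name and paths below are placeholders:

```python
# Hedged sketch of the registry-lookup step of a "pull" deployment.
import boto3

sm = boto3.client("sagemaker")

latest = sm.list_model_packages(
    ModelPackageGroupName="shopfloor-quality-models",  # placeholder group name
    ModelApprovalStatus="Approved",
    SortBy="CreationTime",
    SortOrder="Descending",
    MaxResults=1,
)["ModelPackageSummaryList"][0]

detail = sm.describe_model_package(ModelPackageName=latest["ModelPackageArn"])
artifact_s3_uri = detail["InferenceSpecification"]["Containers"][0]["ModelDataUrl"]

bucket, key = artifact_s3_uri.removeprefix("s3://").split("/", 1)
boto3.client("s3").download_file(bucket, key, "model.tar.gz")
# From here, the AI SDK would package model.tar.gz for the AI Inference Server.
```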

Read more at AWS Blogs

Bringing Scalable AI to the Edge with Databricks and Azure DevOps

📅 Date:

✍️ Authors: Andres Urrutia, Howard Wu, Nicole Lu, Bala Amavasai

🔖 Topics: Cloud-to-Edge Deployment, Machine Learning, Cloud Computing, Edge Computing

🏢 Organizations: Databricks, Microsoft


The ML-optimized runtime in Databricks contains popular ML frameworks such as PyTorch, TensorFlow, and scikit-learn. In this solution accelerator, we will build a basic Random Forest ML model in Databricks that will later be deployed to edge devices to execute inferences directly on the manufacturing shop floor. The focus is essentially the deployment of the ML model built on Databricks to edge devices.
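
A minimal sketch of that flow, assuming MLflow (which Databricks uses for model tracking): train a Random Forest, log it as an artifact, then load it on the edge device for local scoring. Features and the model URI are placeholders:

```python
# Hedged sketch of train-in-Databricks, score-at-the-edge with MLflow.
import mlflow
import mlflow.sklearn
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 4))            # e.g. temperature, vibration, ...
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # toy pass/fail label

rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

with mlflow.start_run():
    mlflow.sklearn.log_model(rf, artifact_path="model")

# On the edge device (after the artifact has been synced down):
# model = mlflow.sklearn.load_model("models:/shopfloor-rf/Production")
# model.predict(sensor_window)
```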

Read more at Databricks Blog

Machine Learning Platform at Walmart

📅 Date:

✍️ Author: Thomas Vengal

🔖 Topics: Machine Learning, Cloud Computing

🏢 Organizations: Walmart


Walmart is the world's largest retailer, and it handles a huge volume of products, distribution, and transactions through its physical stores and online stores. Walmart has a highly optimized supply chain that runs at scale to offer its customers shopping at the lowest price. In the process, Walmart accumulates a huge amount of valuable information from its everyday operations. This data is used to build Artificial Intelligence (AI) solutions to optimize and increase efficiencies of operations and customer experience at Walmart. In this paper, we provide an overview of the guiding principles, technology architecture, and integration of various tools within Walmart and from the open-source community in building the Machine Learning (ML) Platform. We present multiple ML use cases at Walmart and show how their solutions leverage this ML Platform. We then discuss the business impact of having a scalable ML platform and infrastructure, reflect on lessons learnt building and operating an ML platform and future work for it at Walmart.

Read more at Walmart Global Tech Blog

Hierarchical ensemble deep learning for data-driven lead time prediction

📅 Date:

✍️ Authors: Ayse Aslan, Gokula Vasantha, Hanane El-Raoui, John Quigley, Jack Hanson, Jonathan Corney, Andrew Sherlock

🔖 Topics: Forecasting, Machine Learning


This paper focuses on data-driven prediction of lead times for product orders based on the real-time production state captured at the arrival instants of orders in make-to-order production environments. In particular, we consider a sophisticated manufacturing system where a large number of measurements about the production state are available (e.g. sensor data). In response to this complex prediction challenge, we present a novel ensemble hierarchical deep learning algorithm comprised of three deep neural networks. One of these networks acts as a generalist, while the other two function as specialists for different products. Hierarchical ensemble methods have previously been successfully utilised in addressing various multi-class classification problems. In this paper, we extend this approach to encompass the regression task of lead time prediction. We demonstrate the suitability of our algorithm in two separate case studies. The first case study uses one of the largest manufacturing datasets available, the Bosch production line dataset. The second case study uses synthetic datasets generated from a reliability-based model of a multi-product, make-to-order production system, inspired by the Bosch production line. In both case studies, we demonstrate that our algorithm provides high-accuracy predictions and significantly outperforms selected benchmarks including the single deep neural network. Moreover, we find that prediction accuracy is significantly higher in the synthetic dataset, which suggests that there is complexity (i.e. subtle interactions) in industrial manufacturing processes that are not easily reproduced in artificial models.
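
To make the hierarchy concrete, here is a toy sketch: one generalist regressor trained on all orders, two specialists trained per product family, and a simple blend at prediction time. The paper's combination rule may differ; this only illustrates the structure:

```python
# Toy generalist-plus-specialists ensemble for lead time regression.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 10))                # production-state features
product = rng.integers(0, 2, 600)             # two product families
y = X[:, 0] * (1 + product) + rng.normal(0, 0.1, 600)  # toy lead times

generalist = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000).fit(X, y)
specialists = [
    MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000).fit(
        X[product == p], y[product == p]
    )
    for p in (0, 1)
]

def predict(x_row, p):
    x_row = x_row.reshape(1, -1)
    # Illustrative blend: average the generalist and the matching specialist.
    return 0.5 * (generalist.predict(x_row)[0] + specialists[p].predict(x_row)[0])

print(predict(X[0], product[0]))
```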

Read more at The International Journal of Advanced Manufacturing Technology

IBM and AWS partnering to transform industrial welding with AI and machine learning

📅 Date:

🔖 Topics: Welding, Machine Learning, Quality Assurance, Sensor Fusion

🏢 Organizations: IBM, AWS


IBM Smart Edge for Welding on AWS utilizes audio and visual capturing technology developed in collaboration with IBM Research. Using visual and audio recordings taken at the time of the weld, state-of-the-art artificial intelligence and machine learning models analyze the quality of the weld. If the quality does not meet standards, alerts are sent, and remediation action can take place without delay.

The solution substantially reduces the time between detection and remediation of defects, as well as the number of defects on the manufacturing line. By leveraging a combination of optical, thermal, and acoustic insights during the weld inspection process, two key manufacturing personas, the weld technician and the process engineer, can better determine whether a welding discontinuity may result in a defect that will cost time and money.

Read more at IBM Blog

A simpler method for learning to control a robot

📅 Date:

✍️ Author: Adam Zewe

🔖 Topics: Machine Learning

🏢 Organizations: MIT, Stanford


Researchers from MIT and Stanford University have devised a new machine-learning approach that could be used to control a robot, such as a drone or autonomous vehicle, more effectively and efficiently in dynamic environments where conditions can change rapidly.

The researchers' approach incorporates certain structure from control theory into the process for learning a model in such a way that leads to an effective method of controlling complex dynamics, such as those caused by impacts of wind on the trajectory of a flying vehicle. With this structure, they can extract a controller directly from the dynamics model, rather than using data to learn an entirely separate model for the controller.

The researchers also found that their method was data-efficient, meaning it achieved high performance even with limited data. For instance, it could effectively model a highly dynamic rotor-driven vehicle using only 100 data points. Methods that used multiple learned components saw their performance drop much faster with smaller datasets.
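
The paper's structure is richer than this, but the simplest instance of "extract a controller directly from the learned dynamics" is fitting linear dynamics by least squares and solving a Riccati equation for an LQR gain, with no separately learned controller. A sketch on synthetic transitions:

```python
# Sketch: fit dynamics from data, then derive the controller from the model.
import numpy as np
from scipy.linalg import solve_discrete_are

rng = np.random.default_rng(0)
A_true, B_true = np.array([[1.0, 0.1], [0.0, 0.9]]), np.array([[0.0], [0.1]])

# A small dataset of random transitions (100 points, in the spirit of the
# article's data-efficiency claim).
X = rng.normal(size=(100, 2)); U = rng.normal(size=(100, 1))
Xn = X @ A_true.T + U @ B_true.T + 0.01 * rng.normal(size=(100, 2))

# Least-squares fit of [A B] from data.
Theta, *_ = np.linalg.lstsq(np.hstack([X, U]), Xn, rcond=None)
A_hat, B_hat = Theta[:2].T, Theta[2:].T

# Controller extracted directly from the learned model via a Riccati solve.
Q, R = np.eye(2), np.eye(1)
P = solve_discrete_are(A_hat, B_hat, Q, R)
K = np.linalg.solve(R + B_hat.T @ P @ B_hat, B_hat.T @ P @ A_hat)
print("LQR gain:", K)
```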

Read more at MIT News

Using ML For Improved Fab Scheduling

📅 Date:

✍️ Author: Katherine Derbyshire

🔖 Topics: Production Planning, Machine Learning

🏭 Vertical: Semiconductor

🏢 Organizations: GlobalFoundries


The exact number of available tools for each step varies as tools are taken offline for maintenance or repairs. Some steps, like diffusion furnaces, consolidate multiple lots into large batches. Some sequences, like photoresist processing, must adhere to stringent time constraints. Lithography cells must match wafers with the appropriate reticles. Lot priorities change continuously. Even the time needed for an individual process step may change, as run-to-run control systems adjust recipe times for optimal results.

At the fab level, machine learning can support improved cycle time prediction and capacity planning. At the process cell or cluster tool level, it can inform WIP scheduling decisions. In between, it can facilitate better load balancing and order dispatching. As a first step, though, all of these applications need accurate models of the fab environment, which is a difficult problem.

The GlobalFoundries group demonstrated the effectiveness of neural network methods for time constraint tunnel dispatching. The relationship between input parameters and cycle time is complex and non-linear. As discussed above, machine learning methods are especially useful in situations like this, where statistical data is available but exact modeling is difficult.

Read more at Semiconductor Engineering

Digital twins for the rapid startup of manufacturing processes: a case study in PVC tube extrusion

📅 Date:

✍️ Authors: Enrico Bovo, Marco Sorgato, Giovanni Lucchetta

🔖 Topics: Digital Twin, Machine Learning

🏢 Organizations: University of Padova


In this work, a soft sensor-based digital twin (DT) was developed to reduce the startup time in manufacturing plastic tubes and enable real-time product quality monitoring, i.e., the weight per unit length and the inner and outer diameters of the tube. An experimental campaign was conducted on a real tube extrusion line using three polyvinyl chloride (PVC) compounds and different process conditions, and machine learning regression algorithms were trained and tested to create the models of the extruder and the extrusion die the DT is based on. The characterization of the considered material, whose properties were given as input to the digital models, was carried out according to a procedure based only on the data collected by the production line. The DT was tested for the startup of the production of a single-layer tube and achieved the specified customer requirements (thickness and weight) in a few minutes. The proposed solution thus proved to be a valuable tool for reducing the setup time and increasing the efficiency of the process.

Read more at The International Journal of Advanced Manufacturing Technology

🖨️ Visual quality control in additive manufacturing: Building a complete pipeline

📅 Date:

✍️ Authors: Marko Nikolic, Ilya Katsov, Aleksandar Bozic

🔖 Topics: Additive Manufacturing, Quality Assurance, Machine Learning, Anomaly Detection

🏢 Organizations: Grid Dynamics


In this article, we share a reference implementation of a VQC pipeline for additive manufacturing that detects defects and anomalies on the surface of printed objects using depth-sensing cameras. We show how we developed an innovative solution to synthetically generate point clouds representing variations on 3D objects, and propose multiple machine learning models for detecting defects of different sizes. We also provide a comprehensive comparison of different architectures and experimental setups. The complete reference implementation is available in our git repository.

The main objective of this solution is to develop an architecture that can effectively learn from a sparse dataset and detect defects on a printed object by checking its surface each time a new layer is added. To address the challenge of acquiring a sufficient quantity of defect anomaly data for accurate ML model training, the proposed approach leverages synthetic data generation. The controlled nature of the additive manufacturing process reduces the presence of unaccounted exogenous variables, making synthetic data a valuable resource for initial model training. In addition, we hypothesize that by deliberately inducing overfitting of the model on good examples, the model will become more accurate in predicting the presence of anomalies/defects. To achieve this, we generate a number of normal examples with introduced noise in a ratio that matches the defect occurrence expected during the manufacturing process. For instance, if the fault ratio is 10 to 1, we generate 10 similar normal examples for every 1 defect example. Hence, the pipeline for initial training consists of two modules: the synthetic generation module and the module for training anomaly detection models.
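
As a toy stand-in for the synthetic generation module (real clouds come from depth cameras and CAD geometry), the sketch below samples flat printed layers with sensor-like noise and occasionally induces a dent, keeping roughly the 10:1 normal-to-defect ratio mentioned above:

```python
# Toy synthetic point-cloud generator with a controlled normal:defect ratio.
import numpy as np

rng = np.random.default_rng(0)

def layer_cloud(defect: bool, n: int = 2000) -> np.ndarray:
    xy = rng.uniform(0, 50, size=(n, 2))     # mm, extent of the printed layer
    z = 0.2 + rng.normal(0, 0.01, n)         # nominal layer height + sensor noise
    if defect:
        dent = np.linalg.norm(xy - [25, 25], axis=1) < 3.0
        z[dent] -= 0.1                       # 0.1 mm depression as the defect
    return np.column_stack([xy, z])

clouds = [layer_cloud(defect=(i % 11 == 0)) for i in range(110)]  # 10 normal : 1 defect
labels = [int(i % 11 == 0) for i in range(110)]
print(sum(labels), "defect examples of", len(labels))
```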

Read more at Grid Dynamics Blog

The right tool for the right job – ML and Design of Experiments

📅 Date:

✍️ Author: Stephen Warde

🔖 Topics: Machine Learning, Design of Experiments

🏢 Organizations: Intellegens


Typical statistical DOE software assumes that the response of experimental outputs to inputs is linear, or at best quadratic. ML makes no such assumption. Its models learn from the data provided even when that data contains complex, non-linear relationships. So ML can model difficult multi-component systems where cross-correlations would not be accounted for by other DOE approaches.

Standard DOE methods usually require you to vary only a limited number of inputs at any one time in your experimental design. With ML, you don't have to identify which inputs are most important (thus potentially building bias into your design). You can ask the ML to explore all of the inputs simultaneously and it will find those that are most significant.
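
The claim is easy to demonstrate on synthetic data: a response dominated by an interaction term is invisible to a first-order linear fit but is picked up by an ML model trained on the same experiments. No DOE product is being reproduced here:

```python
# Linear model vs. ML on a response with a strong cross-term.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 3))                              # three factors
y = X[:, 0] + 2.0 * X[:, 1] * X[:, 2] + rng.normal(0, 0.05, 200)   # interaction

print(cross_val_score(LinearRegression(), X, y, cv=5).mean())      # poor R^2
print(cross_val_score(RandomForestRegressor(random_state=0), X, y, cv=5).mean())
```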

Read more at Intellegens Blog

The Impact Of Machine Learning On Chip Design

🧠 Data-Driven Wind Farm Control via Multiplayer Deep Reinforcement Learning

📅 Date:

✍️ Authors: Hongyang Dong, Xiaowei Zhao

🔖 Topics: Machine Learning, Reinforcement Learning

🏢 Organizations: University of Warwick


This brief proposes a novel data-driven control scheme to maximize the total power output of wind farms subject to strong aerodynamic interactions among wind turbines. The proposed method is model-free and has strong robustness, adaptability, and applicability. Particularly, distinct from the state-of-the-art data-driven wind farm control methods that commonly use the steady-state or time-averaged data (such as turbines' power outputs under steady wind conditions or from steady-state models) to carry out learning, the proposed method directly mines in-depth the time-series data measured at turbine rotors under time-varying wind conditions to achieve farm-level power maximization. The control scheme is built on a novel multiplayer deep reinforcement learning method (MPDRL), in which a special critic–actor–distractor structure, along with deep neural networks (DNNs), is designed to handle the stochastic feature of wind speeds and learn optimal control policies subject to a user-defined performance metric. The effectiveness, robustness, and scalability of the proposed MPDRL-based wind farm control method are tested by prototypical case studies with a dynamic wind farm simulator (WFSim). Compared with the commonly used greedy strategy, the proposed method leads to clear increases in farm-level power generation in case studies.

Read more at IEEE Transactions on Control Systems Technology

This AI Hunts for Hidden Hoards of Battery Metals

📅 Date:

✍️ Author: Josh Goldman

🔖 Topics: Machine Learning

🏭 Vertical: Mining

🏢 Organizations: KoBold Metals, Stanford University


The mining industry's rate of successful exploration (meaning the number of big deposit discoveries found per dollar invested) has been declining for decades. At KoBold, we sometimes talk about "Eroom's law of mining." As its reversed name suggests, it's like the opposite of Moore's law. In accordance with Eroom's law of mining, the number of ore deposits discovered per dollar of capital invested has decreased by a factor of 8 over the last 30 years. (The original Eroom's law refers to a similar trend in the cost of new pharmaceutical discoveries.)

Our exploration program in northern Quebec provides a good case study. We began by using machine learning to predict where we were most likely to find nickel in concentrations significant enough to be worth mining. We train our models using any available data on a region's underlying physics and geology, and supplement the results with expert insights from our geologists. In Quebec, the models pointed us to land less than 20 km from currently operating mines.

Over the course of the summer in Quebec, we drilled 10 exploration holes, each more than a kilometer away from the last. Each drilling location was determined by combining the results from our predictive models with the expert judgment of our geologists. In each instance, the collected data indicated we'd find conductive bodies in the right geologic setting (possible minable ore deposits, in other words) below the surface. Ultimately, we hit nickel-sulfide mineralization in 8 of the 10 drill holes, easily 10 times better than the industry average for similarly isolated drill holes.

Read more at IEEE Spectrum

HAYAT HOLDING uses Amazon SageMaker to increase product quality and optimize manufacturing output, saving $300,000 annually

📅 Date:

✍️ Author: Neslihan Erdogan

🔖 Topics: Machine Learning, Cloud Computing, Edge Computing

🏢 Organizations: HAYAT HOLDING, AWS, Deloitte


In this post, we share how HAYAT HOLDING, a global player with 41 companies operating in different industries (including HAYAT, the world's fourth-largest branded diaper manufacturer, and KEAS, the world's fifth-largest wood-based panel manufacturer), collaborated with AWS to build a solution that uses Amazon SageMaker Model Training, Amazon SageMaker Automatic Model Tuning, and Amazon SageMaker Model Deployment to continuously improve operational performance, increase product quality, and optimize manufacturing output of medium-density fiberboard (MDF) wood panels.

Quality prediction using ML is powerful but requires effort and skill to design, integrate with the manufacturing process, and maintain. With the support of AWS Prototyping specialists and AWS Partner Deloitte, HAYAT HOLDING built an end-to-end pipeline. Product quality prediction and adhesive consumption recommendation results can be observed by field experts through dashboards in near-real time, resulting in a faster feedback loop. Laboratory results indicate a significant impact equating to savings of $300,000 annually, reducing their carbon footprint in production by preventing unnecessary chemical waste.

Read more at AWS Blog

Better spinach through AI: Tokyo startup automates seedling selection

📅 Date:

✍️ Author: Mai Kitagawa

🔖 Topics: Machine Learning

🏭 Vertical: Agriculture

🏢 Organizations: Farmship, Pi Material Design


A Japanese agricultural startup has developed technology that uses artificial intelligence to assess the growth and potential of spinach seedlings, aiming to reduce food loss by increasing yields and efficiency.

The AI system has two parts. The first uses photographs to estimate the height, width and weight of seedlings grown in plant factories. The other predicts future growth using an index developed by Farmship. The first eliminates seedlings that are obviously not growing well, and the other narrows the remaining seedlings to only superior ones, making harvesting easier. In trials, the ratio of seedlings that grew properly increased to 80%, from 54% using standard methods. This corresponds to a 17% harvest increase.

Read more at Nikkei Asia

Industrial defect detection at the edge

CAD-based data augmentation and transfer learning empowers part classification in manufacturing

📅 Date:

✍️ Authors: Patrick Ruediger-Flore, Moritz Glatt, Marco Hussong, Jan C. Aurich

🔖 Topics: Computer-aided Design, Machine Learning, Transfer Learning

🏢 Organizations: Institute for Manufacturing Technology and Production Systems


Especially in manufacturing systems with small batches or customized products, as well as in remanufacturing and recycling facilities, there is a wide variety of part types that may be previously unseen. It is crucial to accurately identify these parts based on their type for traceability or sorting purposes. One approach that has shown promising results for this task is deep learning-based image classification, which can classify a part based on its visual appearance in camera images. However, this approach relies on large labeled datasets of real-world images, which can be challenging to obtain, especially for parts manufactured for the first time or whose appearance is unknown. To overcome this challenge, we propose generating highly realistic synthetic images based on photo-realistically rendered computer-aided design (CAD) data. Using this commonly available source, we aim to reduce the manual effort required for data generation and preparation and improve the classification performance of deep learning models using transfer learning. In this approach, we demonstrate the creation of a parametric rendering pipeline and show how it can be used to train models for a 30-class classification problem with typical engineering parts in an industrial use case. We also demonstrate how our method's entropy gain improves the classification performance in various deep image classification models.

Read more at The International Journal of Advanced Manufacturing Technology

AI: how it's delivering sharper route planning

📅 Date:

✍️ Author: Karen Kwon

🔖 Topics: Machine Learning

🏭 Vertical: Aerospace

🏢 Organizations: Alaska Airlines, Air Space Intelligence


Creating a route requires a dispatcher to answer a host of questions such as: "What is the wind today?", "What is the best altitude for this flight?" and "Is there any military training?" Before the Flyways software, the 100 or so dispatchers at the NOC had to find answers to these questions by visiting multiple websites. These included FAA websites designed specifically for dispatchers, but that information was available only as strings of text that were hard to read.

Having decided to focus on the aviation industry, the team started spending an obscene amount of time at the NOC in an effort to understand how dispatching works and to create a user-friendly product, one that a real dispatcher could seamlessly operate when under pressure. Alaska Airlines' employees would joke that the team was basically camping in their operations center with sleeping bags, Buckendorf says.

Flyways improves itself further by learning from a human dispatcher's acceptance or rejection of its recommendations. When the dispatcher dismisses a suggestion, Flyways asks why: Was it because of the weather? Was the route putting an airplane uncomfortably close to somewhere it shouldn't be? The idea is that Flyways learns from those decisions and evolves, though certain data points need to be filtered out so that the software does not simply emulate human dispatchers' choices, stifling innovation.

Read more at Aerospace America

📦 How AWS used ML to help Amazon fulfillment centers reduce downtime by 70%

📅 Date:

✍️ Author: Sharon Goldman

🔖 Topics: Machine Learning, Machine Health

🏢 Organizations: AWS, Amazon


The retail leader has announced that it uses Amazon Monitron, an end-to-end machine learning (ML) system that detects abnormal behavior in industrial machinery, launched in December 2020, to provide predictive maintenance. As a result, Amazon has reduced unplanned downtime at the fulfillment centers by nearly 70%, which helps deliver more customer orders on time.

Monitron receives automatic temperature and vibration measurements every hour, detecting potential failures within hours, compared with 4 weeks for the previous manual techniques. In the year and a half since the fulfillment centers began using it, it has helped avoid about 7,300 confirmed issues at 88 fulfillment center sites around the world.
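
Monitron's internals are not public; the generic pattern its description implies looks something like the following: fit an anomaly detector on hourly temperature and vibration readings from healthy operation, then flag departures for maintenance. All numbers are invented:

```python
# Generic condition-monitoring sketch, not Amazon Monitron itself.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
healthy = np.column_stack([
    rng.normal(60.0, 2.0, 24 * 30),   # a month of hourly temperatures, °C
    rng.normal(4.0, 0.3, 24 * 30),    # vibration RMS, mm/s
])
detector = IsolationForest(random_state=0).fit(healthy)

new_reading = np.array([[71.0, 6.2]])   # hotter and rougher than usual
print(detector.predict(new_reading))    # -1 -> anomalous, raise an alert
```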

Read more at VentureBeat

Closed-loop fully-automated frameworks for accelerating materials discovery

📅 Date:

🔖 Topics: Machine Learning, Materials Science

🏢 Organizations: Citrine Informatics, Carnegie Mellon, MIT


Our work shows that a fully-automated closed-loop framework driven by sequential learning can accelerate the discovery of materials by up to 10-25x (or a reduction in design time by 90-95%) when compared to traditional approaches. We show that such closed-loop frameworks can lead to enormous improvement in researcher productivity in addition to reducing overall project costs. Overall, these findings present a clear value proposition for investing in closed-loop frameworks and sequential learning in materials discovery and design enterprises.
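
A bare-bones version of the sequential-learning loop such frameworks automate: fit a model with uncertainty, pick the candidate it is most optimistic about, run that "experiment", and refit. The objective below is a placeholder for a real materials property:

```python
# Minimal closed-loop sequential learning sketch (upper-confidence-bound picks).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(0)

def measure(x):                 # hidden property landscape (placeholder)
    return float(np.sin(3 * x) + 0.1 * rng.normal())

candidates = np.linspace(0, 2, 200).reshape(-1, 1)   # design space
X = list(candidates[::50])                           # a few seed experiments
y = [measure(x[0]) for x in candidates[::50]]

for round_ in range(10):
    gp = GaussianProcessRegressor(normalize_y=True).fit(np.array(X), np.array(y))
    mean, std = gp.predict(candidates, return_std=True)
    pick = candidates[np.argmax(mean + std)]         # most promising candidate
    X.append(pick)
    y.append(measure(pick[0]))                       # "run" the experiment

print("best found:", max(y))
```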

Read more at Citrine Informatics Blog

UVA Research Team Detects Additive Manufacturing Defects in Real-Time

📅 Date:

✍️ Author: Tao Sun

🔖 Topics: Additive Manufacturing, Machine Learning, Laser Powder Bed Fusion

🏢 Organizations: University of Virginia, Carnegie Mellon, University of Wisconsin


Introduced in the 1990s, laser powder bed fusion, or LPBF, uses metal powder and lasers to 3-D print metal parts. But porosity defects remain a challenge for fatigue-sensitive applications like aircraft wings. Some porosity is associated with the deep, narrow vapor depressions known as keyholes.

"By integrating operando synchrotron x-ray imaging, near-infrared imaging, and machine learning, our approach can capture the unique thermal signature associated with keyhole pore generation with sub-millisecond temporal resolution and 100% prediction rate," Sun said. In developing their real-time keyhole detection method, the researchers also advanced the way a state-of-the-art tool, operando synchrotron x-ray imaging, can be used. Utilizing machine learning, they additionally discovered two modes of keyhole oscillation.

Read more at UVA Engineering News

AI farming tool from BASF finds fertile ground in Japan's rice country

📅 Date:

✍️ Author: Taito Kurose

🔖 Topics: Machine Learning

🏭 Vertical: Agriculture

🏢 Organizations: BASF, Yamazaki Rice


Yamazaki Rice, based near Tokyo in Saitama prefecture, began using BASF's Xarvio Field Manager system this year with five workers on about 100 hectares of land.

Xarvio provides real-time analysis informed by satellite and weather data. Automated maps customize the amount of fertilizer recommended for each section of the farm. The data is fed to GPS-equipped farm equipment. The AI gives daily suggestions that Yamazaki Rice's president said helped improve yields by up to 25% in some fields. Xarvio's machine learning covers more than 10 years of crop data as well as scientific papers worldwide.

Read more at Nikkei Asia

How a universal model is helping one generation of Amazon robots train the next

📅 Date:

✍️ Author: Sean O'Neill

🔖 Topics: Robot Arm, Machine Learning, Warehouse Automation

🏢 Organizations: Amazon


In short, building a dataset big enough to train a demanding machine learning model requires time and resources, with no guarantee that the novel robotic process you are working toward will prove successful. This became a recurring issue for Amazon Robotics AI. So this year, work began in earnest to address the data scarcity problem. The solution: a "universal model" able to generalize to virtually any package segmentation task.

To develop the model, Meeker and her colleagues first used publicly available datasets to give their model basic classification skills, such as being able to distinguish boxes or packages from other things. Next, they honed the model, teaching it to distinguish between many types of packaging in warehouse settings (from plastic bags to padded mailers to cardboard boxes of varying appearance) using a trove of training data compiled by the Robin program and half a dozen other Amazon teams over the last few years. This dataset comprised almost half a million annotated images.

The universal model now includes images of unpackaged items, too, allowing it to perform segmentation across a greater diversity of warehouse processes. Initiatives such as multimodal identification, which aims to visually identify items without needing to see a barcode, and the automated damage detection program are accruing product-specific data that could be fed into the universal model, as well as images taken on the fulfillment center floor by the autonomous robots that carry crates of products.

Read more at Amazon Science

Automated Optical Inspection

Machine-Learning-Enhanced Simulation Could Reduce Energy Costs in Materials Production

📅 Date:

🔖 Topics: Sustainability, Machine Learning

🏢 Organizations: Argonne National Laboratory, 3M


Thanks to a new computational effort being pioneered by the U.S. Department of Energy's (DOE) Argonne National Laboratory in conjunction with 3M and supported by the DOE's High Performance Computing for Energy Innovation (HPC4EI) program, researchers are finding new ways to dramatically reduce the amount of energy required for melt blowing the materials needed in N95 masks and other applications.

Currently, the process used to create a nozzle to spin nonwoven materials produces a very high-quality product, but it is quite energy intensive. Approximately 300,000 tons of melt-blown materials are produced annually worldwide, requiring roughly 245 gigawatt-hours per year of energy, approximately the amount generated by a large solar farm. By using Argonne supercomputing resources to pair computational fluid dynamics simulations and machine-learning techniques, the Argonne and 3M collaboration sought to reduce energy consumption by 20% without compromising material quality.

Because the process of making a new nozzle is very expensive, the information gained from the machine-learning model can equip material manufacturers with a way to narrow down to a set of optimal designs. "Machine-learning-enhanced simulation is the best way of cheaply getting at the right combination of parameters like temperatures, material composition, and pressures for creating these materials at high quality with less energy," Blaiszik said.

Read more at AZO Materials

Machine learning facilitates "turbulence tracking" in fusion reactors

📅 Date:

🔖 Topics: Machine Learning, Nuclear

🏢 Organizations: MIT


A multidisciplinary team of researchers is now bringing tools and insights from machine learning to aid this effort. Scientists from MIT and elsewhere have used computer-vision models to identify and track turbulent structures that appear under the conditions needed to facilitate fusion reactions.

Monitoring the formation and movements of these structures, called filaments or "blobs," is important for understanding the heat and particle flows exiting from the reacting fuel, which ultimately determines the engineering requirements for the reactor walls to meet those flows. However, scientists typically study blobs using averaging techniques, which trade details of individual structures in favor of aggregate statistics. Individual blob information must be tracked by marking them manually in video data.

The researchers built a synthetic video dataset of plasma turbulence to make this process more effective and efficient. They used it to train four computer vision models, each of which identifies and tracks blobs. They trained the models to pinpoint blobs in the same ways that humans would.

When the researchers tested the trained models using real video clips, the models could identify blobs with high accuracy, more than 80 percent in some cases. The models were also able to effectively estimate the size of blobs and the speeds at which they moved.

Read more at MIT News

Machine learning-aided engineering of hydrolases for PET depolymerization

📅 Date:

✍️ Authors: Hongyuan Lu, Daniel J. Diaz, Natalie J. Czarnecki, Congzhi Zhu, Wantae Kim, Raghav Shroff, Daniel J. Acosta, Bradley R. Alexander, Hannah O. Cole, Yan Zhang, Nathaniel A. Lynd, Andrew D. Ellington, Hal S. Alper

🔖 Topics: Sustainability, Machine Learning


Plastic waste poses an ecological challenge [1-3] and enzymatic degradation offers one potentially green and scalable route for polyester waste recycling [4]. Poly(ethylene terephthalate) (PET) accounts for 12% of global solid waste [5], and a circular carbon economy for PET is theoretically attainable through rapid enzymatic depolymerization followed by repolymerization or conversion/valorization into other products [6-10]. Application of PET hydrolases, however, has been hampered by their lack of robustness to pH and temperature ranges, slow reaction rates and inability to directly use untreated postconsumer plastics [11]. Here, we use a structure-based, machine learning algorithm to engineer a robust and active PET hydrolase. Our mutant and scaffold combination (FAST-PETase: functional, active, stable and tolerant PETase) contains five mutations compared to wild-type PETase (N233K/R224Q/S121E from prediction and D186H/R280A from scaffold) and shows superior PET-hydrolytic activity relative to both wild-type and engineered alternatives [12] between 30 and 50 °C and a range of pH levels. We demonstrate that untreated, postconsumer PET from 51 different thermoformed products can all be almost completely degraded by FAST-PETase in 1 week. FAST-PETase can also depolymerize untreated, amorphous portions of a commercial water bottle and an entire thermally pretreated water bottle at 50 °C. Finally, we demonstrate a closed-loop PET recycling process by using FAST-PETase and resynthesizing PET from the recovered monomers. Collectively, our results demonstrate a viable route for enzymatic plastic recycling at the industrial scale.

Read more at Nature

CircularNet: Reducing waste with Machine Learning

📅 Date:

✍️ Authors: Robert Little, Umair Sabir

🔖 Topics: Sustainability, Machine Learning, Convolutional Neural Network

🏢 Organizations: Google


The facilities where our waste and recyclables are processed are called "Material Recovery Facilities" (MRFs). Each MRF processes tens of thousands of pounds of our societal "waste" every day, separating valuable recyclable materials like metals and plastics from non-recyclable materials. A key inefficiency within the current waste capture and sorting process is the inability to identify and segregate waste into high quality material streams. The accuracy of the sorting directly determines the quality of the recycled material; for high-quality, commercially viable recycling, the contamination levels need to be low. Even though the MRFs use various technologies alongside manual labor to separate materials into distinct and clean streams, the exceptionally cluttered and contaminated nature of the waste stream makes automated waste detection challenging to achieve, and the recycling rates and the profit margins stay at undesirably low levels.

Enter what we call "CircularNet", a set of models that lowers barriers to AI/ML tech for waste identification and all the benefits this new level of transparency can offer. Our goal with CircularNet is to develop a robust and data-efficient model for waste/recyclables detection, which can support the way we identify, sort, manage, and recycle materials across the waste management ecosystem.

Read more at Tensorflow Blog

Lufthansa increases on-time flights by wind forecasting with Google Cloud ML

📅 Date:

✍️ Author: Anant Nawalgaria

🔖 Topics: Machine Learning, Forecasting

🏢 Organizations: Lufthansa, Google


The magnitude and direction of wind significantly impacts airport operations, and Lufthansa Group Airlines are no exception. A particularly troublesome kind is called BISE: it is a cold, dry wind that blows from the northeast to southwest in Switzerland, through the Swiss Plateau. Its effects on flight schedules can be severe, such as forcing planes to change runways, which can create a chain reaction of flight delays and possible cancellations. In Zurich Airport, in particular, BISE can potentially reduce capacity by up to 30%, leading to further flight delays and cancellations, and to millions in lost revenue for Lufthansa (as well as dissatisfaction among their passengers).

Machine learning (ML) can help airports and airlines to better anticipate and manage these types of disruptive weather events. In this blog post, we'll explore an experiment Lufthansa did together with Google Cloud and its Vertex AI Forecast service, accurately predicting BISE hours in advance, with more than 40% relative improvement in accuracy over internal heuristics, all within days instead of the months it often takes to do ML projects of this magnitude and performance.

Read more at Google Cloud Blog

Improving Yield With Machine Learning

📅 Date:

✍️ Author: Laura Peters

🔖 Topics: Machine Learning, Convolutional Neural Network, ResNet

🏭 Vertical: Semiconductor

🏢 Organizations: KLA, Synopsys, CyberOptics, Macronix


Machine learning is becoming increasingly valuable in semiconductor manufacturing, where it is being used to improve yield and throughput.

Synopsys engineers recently found that a decision tree deep learning method can classify 98% of defects and features at 60X faster retraining time than traditional CNNs. The decision tree utilizes 8 CNNs and ResNet to automatically classify 12 defect types with images from SEM and optical tools.

Macronix engineers showed how machine learning can expedite new etch process development in 3D NAND devices. Two parameters are particularly important in optimizing the deep trench slit etch: bottom CD and depth of polysilicon etch recess, also known as the etch stop.

KLA engineers, led by Cheng Hung Wu, optimized the use of a high landing energy e-beam inspection tool to capture defects buried as deep as 6 µm in a 96-layer ONON stacked structure following deep trench etch. The e-beam tool can detect defects that optical inspectors cannot, but only if operated with high landing energy to penetrate deep structures. With this process, KLA was looking to develop an automated detection and classification system for deep trench defects.

Read more at Semiconductor Engineering

AI-Powered Verification

📅 Date:

🔖 Topics: Machine Learning

🏭 Vertical: Semiconductor

🏢 Organizations: Agnisys, Cadence


"We see AI as a disruptive technology that will in the long run eliminate, and in the near term reduce, the need for verification," says Anupam Bakshi, CEO and founder of Agnisys. "We have had some early successes in using machine learning to read user specifications in natural language and directly convert them into SystemVerilog Assertions (SVA), UVM testbench code, and C/C++ embedded code for test and verification."

There is nothing worse than spending time and resources to not get the desired result, or for it to take longer than necessary. "In formal, we have multiple engines, different algorithms that are working on solving any given property at any given time," says Pete Hardee, director for product management at Cadence. "In effect, there is an engine race going on. We track that race and see for each property which engine is working. We use reinforcement learning to set the engine parameters in terms of which engines I'm going to use and how long to run those to get better convergence on the properties that didn't converge the first time I ran it."
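
Cadence's implementation is internal, but the pattern Hardee describes is essentially a multi-armed bandit over engines: track which engine converges properties most often and allocate future runs accordingly. An epsilon-greedy toy version, with made-up engine names and success rates:

```python
# Toy bandit over formal engines; names and probabilities are invented.
import random

engines = ["bdd", "sat", "interpolation", "pdr"]
wins = {e: 1.0 for e in engines}     # optimistic prior: one success each
tries = {e: 1.0 for e in engines}

def solved(engine: str) -> bool:     # stand-in for "engine converged property"
    rates = {"bdd": 0.2, "sat": 0.5, "interpolation": 0.3, "pdr": 0.6}
    return random.random() < rates[engine]

for prop in range(500):
    if random.random() < 0.1:                              # explore
        e = random.choice(engines)
    else:                                                  # exploit best rate
        e = max(engines, key=lambda k: wins[k] / tries[k])
    tries[e] += 1
    wins[e] += solved(e)

print({e: round(wins[e] / tries[e], 2) for e in engines})  # learned preferences
```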

Read more at Semiconductor Engineering

Batch Optimization using Quartic.ai

Using machine learning techniques in wine quality testing

📅 Date:

✍️ Author: Dario Rodriguez

🔖 Topics: Predictive Quality, Machine Learning

🏭 Vertical: Beverage

🏢 Organizations: Thermo Fisher


The Profiling capability from Thermo Scientific™ SampleManager™ LIMS software provides an innovative way for laboratories to predict test results using historical data and novel machine learning (ML)-based techniques. For example, a food and beverage company might apply the Profiling capability to enable supervised learning in the food production process. In this case, SampleManager LIMS would use historical data to gain an understanding of the critical variables that determine whether a product is safe for consumers. This holistic approach considers not only the values of the individual critical variables themselves, but also the relationships between them. If a sample were to be flagged as failing, the system would alert stakeholders in advance to issue adjustments or investigations to avoid any risk to finished products.

In a wine production facility, the result of the "Quality Test" is of utmost importance. The laboratory has great flexibility and control over the testing process, so they could use the Profiling capability to redefine the order of the standard tests conducted on a wine sample.

Read more at Thermo Fisher Blog

Ericsson's next-gen AI-driven network dimensioning solution

📅 Date:

✍️ Authors: Marcial Gutierrez, Sleeba Paul Puthenpurakel, Shrihari Vasudevan

🔖 Topics: Machine Learning

🏢 Organizations: Ericsson


Resource requirement estimation, often referred to as dimensioning, is a crucial activity in the telecommunications industry. Network dimensioning is an integral part of the Ericsson Sales Process when engaging with a prospective customer – find out more about our approach to network dimensioning and the critical importance of accuracy.

The telco dimensioning problem can be conceived as a regression problem from an AI/ML perspective. The proposed solution is Bayesian Regression, which proved to be more robust to multi-collinearity of features. Additionally, our approach allows the incorporation of domain knowledge into the modeling (for example, in the form of priors, bounds and constraints), to avoid dropping network features that are critical for the domain and interpretability requirements, from a model's trustworthiness perspective.
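
As a small illustration of why that choice helps (not Ericsson's model), Bayesian regression shrinks coefficients and stays stable when features are nearly collinear, where ordinary least squares goes wild. scikit-learn's BayesianRidge does not expose the domain priors and bounds mentioned above; a probabilistic-programming library would be used for those:

```python
# Bayesian regression vs. OLS under near-perfect feature collinearity.
import numpy as np
from sklearn.linear_model import BayesianRidge, LinearRegression

rng = np.random.default_rng(0)
users = rng.uniform(1e3, 1e5, 300)
sessions = 1.5 * users + rng.normal(0, 1.0, 300)   # nearly collinear feature
X = np.column_stack([users, sessions])
y = 0.002 * users + rng.normal(0, 5, 300)          # toy resource requirement

print(LinearRegression().fit(X, y).coef_)   # unstable under collinearity
print(BayesianRidge().fit(X, y).coef_)      # shrunk, more stable estimates
```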

Read more at Ericsson Blog

Decentralized learning and intelligent automation: the key to zero-touch networks?

📅 Date:

✍️ Authors: Selim Ickin, Hannes Larsson, Hassam Riaz, Xiaoyu Lan, Caner Kilinc

🔖 Topics: AI, Machine Learning, Federated Learning


Decentralized learning and the multi-armed bandit agent… It may sound like the sci-fi version of an old western. But could this dynamic duo hold the key to efficient distributed machine learning – a crucial factor in the realization of zero-touch automated mobile networks? Let's find out.

Next-generation autonomous mobile networks will be complex ecosystems made up of a massive number of decentralized and intelligent network devices and nodes – network elements that may be both producing and consuming data simultaneously. If we are to realize our goal of fully automated zero-touch networks, new approaches to training artificial intelligence (AI) models need to be developed to accommodate these complex and diverse ecosystems.
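
The simplest concrete form of that decentralized training is federated averaging: each node trains on its local data and only model parameters travel, never raw data. A toy sketch (the bandit-based agent selection the post discusses is not shown):

```python
# Toy weight-averaging sketch of decentralized learning.
import numpy as np

rng = np.random.default_rng(0)

def local_fit(X, y):
    # Closed-form linear regression as a stand-in for local training.
    return np.linalg.lstsq(X, y, rcond=None)[0]

local_weights = []
for node in range(4):                          # four network nodes
    X = rng.normal(size=(100, 3))              # node-local measurements
    y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(0, 0.1, 100)
    local_weights.append(local_fit(X, y))      # only weights leave the node

global_w = np.mean(local_weights, axis=0)      # aggregate weights, not data
print(global_w)                                # approaches [1.0, -2.0, 0.5]
```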

Read more at Ericsson Blog

How Drishti empowers deep learning in manufacturing

📅 Date:

🔖 Topics: Machine Learning

🏢 Organizations: Drishti


During his talk at the MLDS Conference, 'New developments in Deep Learning for unlikely industries', Shankar outlined Drishti's industrial applications of AI in manufacturing. The company leverages deep learning and computer vision to automate the analysis of factory floor videos. Essentially, the company has installed cameras on assembly lines that capture videos on which the company runs object detection, anomaly detection and action recognition. Then, the data is sent to industrial engineers to improve the line.

Read more at Analytics India Magazine

Fingerprinting liquids for composites

📅 Date:

🔖 Topics: Metrology, Machine Learning

🏢 Organizations: Collo, Kiilto


Collo uses electromagnetic sensors and edge analytics to optimize resin degassing, mixing, infusion, polymerization and cure as well as monitoring drift from benchmarked process parameters and enabling in-situ process control.

"So, the solution we are offering is real-time, inline measurement directly from the process," says Järveläinen. "Our system then converts that data into physical quantities that are understandable and actionable, like rheological viscosity, and it helps to ensure high-quality liquid processes and products. It also allows optimizing the processes. For example, you can shorten mixing time because you can clearly see when mixing is complete. So, you can improve productivity, save energy and reduce scrap versus less optimized processing."

Read more at Composites World

Why AI software companies are betting on small data to spot manufacturing defects

📅 Date:

✍️ Author: Kate Kaye

🔖 Topics: Machine Learning, Visual Inspection, Defect Detection

🏢 Organizations: Landing AI, Mariner


The deep-learning algorithms that have come to dominate many of the technologies consumers and businesspeople interact with today are trained and improved by ingesting huge quantities of data. But because product defects show up so rarely, most manufacturers don't have millions, thousands or even hundreds of examples of a particular type of flaw they need to watch out for. In some cases, they might only have 20 or 30 photos of a windshield chip or small pipe fracture, for example.

Because labeling inconsistencies can trip up deep-learning models, Landing AI aims to alleviate the confusion. The company's software has features that help isolate inconsistencies and assist teams of inspectors in coming to agreement on taxonomy. "The inconsistencies in labels are pervasive," said Ng. "A lot of these problems are fundamentally ambiguous."

Read more at Protocol

How pioneering deep learning is reducing Amazon's packaging waste

📅 Date:

✍️ Author: Sean O'Neill

🔖 Topics: Machine Learning, Computer Vision, Convolutional Neural Network, Sustainability, E-commerce

🏢 Organizations: Amazon


Fortunately, machine learning approaches, particularly deep learning, thrive on big data and massive scale, and a pioneering combination of natural language processing and computer vision is enabling Amazon to home in on using the right amount of packaging. These tools have helped Amazon drive change over the past six years, reducing per-shipment packaging weight by 36% and eliminating more than a million tons of packaging, equivalent to more than 2 billion shipping boxes.

"When the model is certain of the best package type for a given product, we allow it to auto-certify it for that pack type," says Bales. "When the model is less certain, it flags a product and its packaging for testing by a human." The technology is currently being applied to product lines across North America and Europe, automatically reducing waste at a growing scale.

Read more at Amazon Science

Transfer learning with artificial neural networks between injection molding processes and different polymer materials

📅 Date:

✍️ Authors: Yannik Lockner, Christian Hopmann, Weibo Zhao

🔖 Topics: Artificial Intelligence, Machine Learning

🏭 Vertical: Plastics and Rubber

🏢 Organizations: RWTH Aachen University


Finding appropriate machine setting parameters in injection molding remains a difficult task due to the highly nonlinear process behavior. Artificial neural networks are a well-suited machine learning method for modelling injection molding processes; however, it is costly and therefore industrially unattractive to generate a sufficient amount of process samples for model training. Therefore, transfer learning is proposed as an approach to reuse already collected data from different processes to supplement a small training data set. Process simulations for the same part and 60 different materials of 6 different polymer classes are generated by design of experiments. After feature selection and hyperparameter optimization, finetuning is proposed as a transfer learning technique to adapt from one or more polymer classes to an unknown one. The results illustrate a higher model quality for small datasets and selectively higher asymptotes for the transfer learning approach in comparison with the base approach.
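
A minimal finetuning sketch in the paper's spirit, with placeholder sizes and features: pretrain a network on abundant simulated data from known polymer classes, then freeze the early layers and retrain only the head on a small dataset from the unseen class:

```python
# Hedged finetuning sketch; architecture and data are placeholders.
import torch
import torch.nn as nn

net = nn.Sequential(
    nn.Linear(8, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 1),
)

def fit(model, X, y, epochs=200):
    opt = torch.optim.Adam([p for p in model.parameters() if p.requires_grad], lr=1e-3)
    for _ in range(epochs):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(X), y)
        loss.backward()
        opt.step()

# Pretrain on the large source dataset (other polymer classes).
X_src, y_src = torch.randn(5000, 8), torch.randn(5000, 1)
fit(net, X_src, y_src)

# Freeze everything but the head, then finetune on the small target set.
for p in net[:-1].parameters():
    p.requires_grad = False
X_tgt, y_tgt = torch.randn(40, 8), torch.randn(40, 1)
fit(net, X_tgt, y_tgt)
```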

Read more at ScienceDirect

Artificial intelligence optimally controls your plant

๐Ÿ“… Date:

๐Ÿ”– Topics: energy consumption, reinforcement learning, machine learning, industrial control system

๐Ÿข Organizations: Siemens


Until now, heating systems have mainly been controlled individually or via a building management system. Building management systems follow a preset temperature profile, meaning they always try to adhere to predefined target temperatures. The temperature in a conference room changes in response to environmental influences like sunlight or the number of people present. Simple (PI or PID) controllers are used to make constant adjustments so that the measured room temperature is as close to the target temperature values as possible.

We believe that the best alternative is learning a control strategy by means of reinforcement learning (RL). Reinforcement learning is a machine learning method that needs no explicitly labeled learning target. Instead, an "agent" with as complete a knowledge of the system state as possible learns the manipulated-variable changes that maximize a "reward" function defined by humans. Using reinforcement learning algorithms, the agent, meaning the control strategy, can be trained from both current and recorded system data. This requires measurements of the manipulated-variable changes that were carried out, of the resulting changes to the system state over time, and of the variables needed to calculate the reward.
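
As a toy illustration of that loop, here is tabular Q-learning on a one-room thermal model. The dynamics, discretization, and reward are invented for illustration and are far simpler than Siemens' system.

```python
# Toy RL sketch: learn a heating policy that keeps a room near 21 degC.
import random

ACTIONS = [-1.0, 0.0, 1.0]   # decrease, hold, or increase heating power
TARGET = 21.0                # target room temperature (degC)
q = {}                       # Q-table: (discretized temp, action) -> value

def step(temp, action, outside=10.0):
    """Invented room dynamics: heating input minus losses to the outside."""
    return temp + 0.5 * action - 0.05 * (temp - outside)

alpha, gamma, eps = 0.1, 0.9, 0.1
for _ in range(3000):                         # training episodes
    temp = random.uniform(15.0, 25.0)
    for _ in range(30):
        s = round(temp)
        if random.random() < eps:             # explore
            a = random.choice(ACTIONS)
        else:                                 # exploit current knowledge
            a = max(ACTIONS, key=lambda x: q.get((s, x), 0.0))
        temp = step(temp, a)
        reward = -abs(temp - TARGET)          # penalize deviation from target
        s2 = round(temp)
        best_next = max(q.get((s2, x), 0.0) for x in ACTIONS)
        old = q.get((s, a), 0.0)
        q[(s, a)] = old + alpha * (reward + gamma * best_next - old)

# After training, the greedy action in a cold room should be to heat (+1.0).
print(max(ACTIONS, key=lambda x: q.get((18, x), 0.0)))
```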

Read more at Siemens Ingenuity

Quality prediction of ultrasonically welded joints using a hybrid machine learning model

๐Ÿ“… Date:

โœ๏ธ Authors: Patrick G. Mongan, Eoin P. Hinchy, Noel P. ODowd, Conor T. McCarthy

๐Ÿ”– Topics: machine learning, genetic algorithm, welding

๐Ÿข Organizations: Confirm Smart Manufacturing Research Centre, University of Limerick


Ultrasonic metal welding has advantages over other joining technologies due to its low energy consumption, rapid cycle time, and ease of process automation. The ultrasonic welding (USW) process is very sensitive to process parameters, and thus it can be difficult to consistently produce strong joints. There is significant interest from the manufacturing community in understanding these parameter interactions. Machine learning is one method that can be exploited to better understand the complex interactions of USW input parameters. In this paper, the lap shear strength (LSS) of USW Al 5754 joints is investigated using an off-the-shelf Branson Ultraweld L20. First, a 3³ full factorial parametric study using ANOVA is carried out to examine the effects of three USW input parameters (weld energy, vibration amplitude and clamping pressure) on LSS. Following this, a high-fidelity predictive hybrid genetic algorithm-artificial neural network (GA-ANN) model is trained using the input parameters and the addition of process data recorded during welding (peak power).
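
The 3³ full factorial design is simple to generate programmatically; in the sketch below the three parameter levels are hypothetical placeholders, not the paper's settings.

```python
# Enumerate every combination of three levels for three welding parameters:
# a 3**3 = 27-run full factorial design. Level values are illustrative.
from itertools import product

weld_energy = [500, 750, 1000]   # J (hypothetical levels)
amplitude = [30, 40, 50]         # um
pressure = [2.0, 2.5, 3.0]       # bar

runs = list(product(weld_energy, amplitude, pressure))
print(len(runs))                 # 27 experimental runs
for e, a, p in runs[:3]:
    print(f"energy={e} J, amplitude={a} um, pressure={p} bar")
```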

Read more at ScienceDirect

Machine learning predictions of superalloy microstructure

๐Ÿ“… Date:

โœ๏ธ Authors: Patrick L Taylor, Gareth Conduit

๐Ÿ”– Topics: machine learning, materials science

๐Ÿข Organizations: University of Cambridge, Intellegens


Gaussian process regression machine learning with a physically informed kernel is used to model the phase compositions of nickel-base superalloys. The model delivers good predictions for laboratory and commercial superalloys. Additionally, unlike the traditional CALPHAD method, the model predicts the phase composition with uncertainties.
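
A minimal sketch of the key property, Gaussian process predictions with uncertainties, using scikit-learn with a generic kernel; the paper uses a physically informed kernel and real superalloy data, so everything below is a placeholder.

```python
# Gaussian process regression returns a predictive mean AND a standard
# deviation for each point. Kernel and data here are generic placeholders.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (40, 2))     # e.g. two hypothetical composition features
y = np.sin(4 * X[:, 0]) + 0.5 * X[:, 1] + 0.05 * rng.standard_normal(40)

gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gpr.fit(X, y)
mean, std = gpr.predict(X[:5], return_std=True)
for m, s in zip(mean, std):
    print(f"predicted value: {m:.3f} +/- {s:.3f}")
```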

Read more at ScienceDirect

Graph-based semi-supervised random forest for rotating machinery gearbox fault diagnosis

๐Ÿ“… Date:

โœ๏ธ Authors: Shaozhi Chen, Rui Yang, Maiying Zhong

๐Ÿ”– Topics: Random Forest, Machine Learning, Machine Health

๐Ÿข Organizations: Shandong University of Science and Technology, Xiโ€™an Jiaotong-Liverpool University


Random forest (RF) is an effective method for diagnosing faults in rotating machinery. However, improving diagnosis accuracy with insufficient labeled samples remains one of the main challenges. Motivated by this problem, an improved RF algorithm based on graph-based semi-supervised learning (GSSL) and decision trees is proposed in this paper to improve classification accuracy when labeled samples are scarce. The unlabeled samples are annotated by the GSSL and verified by the decision tree. The trained improved RF model is applied to fault diagnosis for a rotating machinery gearbox. The effectiveness of the proposed algorithm is verified via hardware experiments using a wind turbine drivetrain diagnostics simulator (WTDDS). The results show that the proposed algorithm achieves better classification accuracy than conventional methods in gearbox fault diagnosis. This study contributes to the improvement of machine learning methods for settings with insufficient and unlabeled samples.
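
To illustrate the overall idea, the sketch below pseudo-labels unlabeled samples with scikit-learn's graph-based LabelSpreading and then trains a random forest on the result. It is a simplified stand-in: the paper's method additionally verifies the pseudo-labels with a decision tree.

```python
# Graph-based semi-supervised pseudo-labeling followed by a random forest.
# Synthetic data; only 30 of 300 samples keep their labels (-1 = unlabeled).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.semi_supervised import LabelSpreading

X, y = make_classification(n_samples=300, n_features=10, n_classes=3,
                           n_informative=5, random_state=0)
y_partial = y.copy()
y_partial[30:] = -1                        # hide most labels

gssl = LabelSpreading(kernel="knn", n_neighbors=7).fit(X, y_partial)
pseudo = gssl.transduction_                # labels propagated over the graph

rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, pseudo)
print("accuracy vs. true labels:", rf.score(X, y))
```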

Read more at Control Engineering Practice

Hybrid machine learning-enabled adaptive welding speed control

๐Ÿ“… Date:

โœ๏ธ Authors: Joseph Kershaw, Rui Yu, YuMing Zhang, Peng Wang

๐Ÿ”– Topics: machine learning, robot welding, convolutional neural network

๐Ÿข Organizations: University of Kentucky


This research presents a preliminary study on developing appropriate Machine Learning (ML) techniques for real-time welding quality prediction and adaptive welding speed adjustment for GTAW welding at a constant current. To collect the data needed to train the hybrid ML models, two cameras monitor the welding process: one camera (available in practical robotic welding) records the top-side weld pool dynamics, and a second camera (unavailable in practical robotic welding, but applicable for training purposes) records the back-side bead formation. Given these two data sets, correlations can be discovered through a convolutional neural network (CNN), which excels at image characterization. With the CNN, top-side weld pool images can be analyzed to predict the back-side bead width during active welding control.
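
The regression setup can be pictured with a minimal CNN sketch: a top-side weld pool image in, a predicted back-side bead width out. The layers and shapes below are illustrative assumptions, not the paper's architecture.

```python
# Tiny CNN regressor: grayscale weld pool image -> scalar bead width.
import torch
import torch.nn as nn

class BeadWidthCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),       # global pooling to a 32-d vector
        )
        self.head = nn.Linear(32, 1)       # regression: bead width in mm

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

model = BeadWidthCNN()
frames = torch.randn(4, 1, 64, 64)         # a batch of synthetic camera frames
print(model(frames).shape)                 # torch.Size([4, 1])
```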

Read more at Science Direct

Fabs Drive Deeper Into Machine Learning

๐Ÿ“… Date:

โœ๏ธ Author: Anne Meixner

๐Ÿ”– Topics: machine learning, machine vision, defect detection, convolutional neural network

๐Ÿญ Vertical: Semiconductor

๐Ÿข Organizations: GlobalFoundries, KLA, SkyWater Technology, Onto Innovation, CyberOptics, Hitachi, Synopsys


For the past couple of decades, semiconductor manufacturers have relied on computer vision, which is one of the earliest applications of machine learning in semiconductor manufacturing. Referred to as Automated Optical Inspection (AOI), these systems use signal processing algorithms to identify macro and micro physical deformations.

Defect detection provides a feedback loop for fab processing steps. Wafer test results produce bin maps (good or bad die), which also can be analyzed as images. Their granularity is far coarser than the pixel-level data from an optical inspection tool, yet wafer test maps can reveal patterns that match the splatters generated during lithography and the scratches produced by handling, defects that AOI systems can miss. Thus, wafer test maps give useful feedback to the fab.
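
The bin-map-as-image idea is easy to picture: lay per-die pass/fail results on a grid and look for spatial patterns. A synthetic sketch:

```python
# Arrange per-die pass/fail bins on a wafer grid and scan for a
# scratch-like streak. The wafer data here is synthetic.
import numpy as np

rng = np.random.default_rng(1)
wafer = rng.random((20, 20)) > 0.05      # True = good die, ~5% random fails
wafer[10, 3:15] = False                  # inject a scratch-like streak

fail_map = ~wafer
rows_with_streaks = np.where(fail_map.sum(axis=1) >= 5)[0]
print("suspect scratch rows:", rows_with_streaks)   # likely -> [10]
```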

Read more at Semiconductor Engineering

Adoption of machine learning technology for failure prediction in industrial maintenance: A systematic review

๐Ÿ“… Date:

โœ๏ธ Authors: Joerg Leukel, Julian Gonzalez, Martin Riekert

๐Ÿ”– Topics: machine learning, predictive maintenance

๐Ÿข Organizations: University of Hohenheim


Failure prediction is the task of forecasting whether a material system of interest will fail at a specific point in time in the future. This task attains significance for strategies of industrial maintenance, such as predictive maintenance. For solving the prediction task, machine learning (ML) technology is increasingly being used, and the literature provides evidence for the effectiveness of ML-based prediction models. However, the state of recent research and the lessons learned are not well documented. Therefore, the objective of this review is to assess the adoption of ML technology for failure prediction in industrial maintenance and synthesize the reported results. We conducted a systematic search for experimental studies in peer-reviewed outlets published from 2012 to 2020. We screened a total of 1,024 articles, of which 34 met the inclusion criteria.

Read more at ScienceDirect

Accelerating the Design of Automotive Catalyst Products Using Machine Learning

๐Ÿ“… Date:

โœ๏ธ Authors: Tom Whitehead, Flora Chen, Christopher Daly, Gareth Conduit

๐Ÿ”– Topics: generative design, machine learning

๐Ÿญ Vertical: Automotive

๐Ÿข Organizations: Intellegens, Johnson Matthey


The design of catalyst products to reduce harmful emissions is currently an intensive process of expert-driven discovery, taking several years to develop a product. Machine learning can accelerate this timescale, leveraging historic experimental data from related products to guide which new formulations and experiments will enable a project to most directly reach its targets. We used machine learning to accurately model 16 key performance targets for catalyst products, enabling detailed understanding of the factors governing catalyst performance and realistic suggestions of future experiments to rapidly develop more effective products. The proposed formulations are currently undergoing experimental validation.

Read more at Ingenta Connect

Getting Industrial About The Hybrid Computing And AI Revolution

๐Ÿ“… Date:

โœ๏ธ Author: Jeffrey Burt

๐Ÿ”– Topics: IIoT, machine learning, reinforcement learning

๐Ÿญ Vertical: Petroleum and Coal

๐Ÿข Organizations: Beyond Limits


Beyond Limits is applying techniques such as deep reinforcement learning (DRL), using a framework to train a reinforcement learning agent to make optimal sequential recommendations for placing wells. It also draws on reservoir simulations and novel deep convolutional neural networks. The agent takes in the data and learns from the various iterations of the simulator, allowing it to reduce the number of possible combinations of moves after each decision is made. By remembering what it learned from previous iterations, the system can more quickly whittle the choices down to the single best answer.

Read more at The Next Platform

Real-World ML with Coral: Manufacturing

๐Ÿ“… Date:

โœ๏ธ Author: Michael Brooks

๐Ÿ”– Topics: edge computing, AI, machine learning, computer vision, convolutional neural network, Tensorflow, worker safety

๐Ÿข Organizations: Coral


For over 3 years, Coral has been focused on enabling privacy-preserving Edge ML with low-power, high-performance products. We've released many examples and projects designed to help you quickly accelerate ML for your specific needs. One of the most common requests we get from customers after they explore the Coral models and projects is: How do we move to production?

  • Worker Safety - Performs generic person detection (powered by COCO-trained SSDLite MobileDet) and then runs a simple algorithm to detect bounding-box collisions to see if a person is in an unsafe region (a minimal version of this check is sketched after this list).
  • Visual Inspection - Performs apple detection (using the same COCO-trained SSDLite MobileDet from Worker Safety) and then crops the frame to the detected apple and runs a retrained MobileNetV2 that classifies fresh vs rotten apples.
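
A minimal version of the bounding-box collision check from the Worker Safety demo might look like the following; the coordinates and zone definition are illustrative assumptions, not Coral's example code.

```python
# Flag any person detection whose bounding box overlaps an unsafe region.
def boxes_overlap(a, b):
    """Axis-aligned boxes given as (x_min, y_min, x_max, y_max)."""
    return not (a[2] <= b[0] or b[2] <= a[0] or   # disjoint horizontally
                a[3] <= b[1] or b[3] <= a[1])     # disjoint vertically

UNSAFE_ZONE = (300, 0, 640, 480)                  # e.g. right half of the frame

person_boxes = [(50, 100, 150, 400), (320, 120, 420, 460)]
for box in person_boxes:
    if boxes_overlap(box, UNSAFE_ZONE):
        print(f"ALERT: person at {box} inside unsafe region")
```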

Read more at TensorFlow Blog

The Machine Economy is Here: Powering a Connected World

๐Ÿ“… Date:

โœ๏ธ Author: Megan Doyle

๐Ÿ”– Topics: IIoT, machine learning, blockchain

๐Ÿข Organizations: Flexon Technology, Allied Vision


In combination with the real-time data produced by IoT, blockchain and ML applications are disrupting B2B companies across various industries from healthcare to manufacturing. Together, these three fundamental technologies create an intelligent system where connected devices can "talk" to one another. However, machines are still unable to conduct transactions with each other.

This is where distributed ledger technology (DLT) and blockchain come into play. Cryptocurrencies and smart contracts (self-executing contracts between buyers and sellers on a decentralized network) make it possible for autonomous machines to transact with one another on a blockchain.

Devices participating in M2M transactions can be programmed to make purchases based on individual or business needs. Human error was a cause for concern in the past; machine learning algorithms provide reliable, trusted data and continue to learn and improve, becoming smarter each day.

Read more at IoT For All

How to integrate AI into engineering

๐Ÿ“… Date:

โœ๏ธ Author: Jos Martin

๐Ÿ”– Topics: machine learning

๐Ÿข Organizations: MathWorks


Most of the focus on AI is on the model itself, which drives engineers to dive quickly into the modelling aspect of AI. After a few starter projects, engineers learn that AI is not just modelling, but rather a complete set of steps that includes data preparation, modelling, simulation and test, and deployment.

Read more at The Engineer

Visual Inspection AI: a purpose-built solution for faster, more accurate quality control

๐Ÿ“… Date:

โœ๏ธ Authors: Mandeep Wariach, Thomas Reinbacher

๐Ÿ”– Topics: cloud computing, computer vision, machine learning, quality assurance

๐Ÿข Organizations: Google


The Google Cloud Visual Inspection AI solution automates visual inspection tasks using a set of AI and computer vision technologies that enable manufacturers to transform quality control processes by automatically detecting product defects.

We built Visual Inspection AI to meet the needs of quality, test, manufacturing, and process engineers who are experts in their domain, but not in AI. Because it combines ease of use with a focus on priority use cases, customers are realizing significant benefits compared to general-purpose machine learning (ML) approaches.

Read more at Google Cloud Blog

Machine Learning Keeps Rolling Bearings on the Move

๐Ÿ“… Date:

โœ๏ธ Author: Rehana Begg

๐Ÿ”– Topics: machine learning, vibration analysis, predictive maintenance, bearing

๐Ÿข Organizations: Osaka University


Rolling bearings are essential components in automated machinery with rotating elements. They come in many shapes and sizes, but are essentially designed to carry a load while minimizing friction. In general, the design consists of two rings separated by rolling elements (balls or rollers). The rings can rotate relative to each other with very little friction.

The ability to accurately predict the remaining useful life of the bearings under defect progression could reduce unnecessary maintenance procedures and prematurely discarded parts without risking breakdown, reported scientists from the Institute of Scientific and Industrial Research and NTN Next Generation Research Alliance Laboratories at Osaka University.

The scientists have developed a machine learning method that combines convolutional neural networks and Bayesian hierarchical modeling to predict the remaining useful life of rolling bearings. Their approach is based on the measured vibration spectrum.
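
Such an approach starts from a vibration spectrum, which is straightforward to compute from an accelerometer trace. A synthetic NumPy sketch (the signal and defect frequency are invented for illustration):

```python
# Compute a vibration spectrum via FFT and locate the dominant peak.
import numpy as np

fs = 10_000                                    # sample rate (Hz)
t = np.arange(0, 1.0, 1 / fs)
signal = (np.sin(2 * np.pi * 157 * t)          # hypothetical defect frequency
          + 0.3 * np.random.default_rng(0).standard_normal(t.size))

spectrum = np.abs(np.fft.rfft(signal)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)
print(f"dominant peak at {freqs[spectrum.argmax()]:.0f} Hz")   # ~157 Hz
```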

Read more at Machine Design

Tree Model Quantization for Embedded Machine Learning Applications

๐Ÿ“… Date:

โœ๏ธ Author: Leslie J. Schradin

๐Ÿ”– Topics: edge computing, machine learning

๐Ÿข Organizations: Qeexo


Compressed tree-based models are worth considering for embedded machine learning applications, particularly with quantization as the compression technique. Quantization can shrink models significantly at the cost of a slight loss in model fidelity, freeing room on the device for other programs.
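
As a rough illustration of that trade-off, the sketch below stores a decision tree's split thresholds as 8-bit integers; the scaling scheme is a simple assumption for illustration, not Qeexo's implementation.

```python
# Quantize a decision tree's split thresholds from float64 to uint8 and
# measure the resulting threshold error.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X, y)

internal = tree.tree_.children_left != -1     # leaf nodes carry a dummy threshold
th = tree.tree_.threshold[internal]           # float64 split thresholds

lo, hi = X.min(), X.max()
scale = 255.0 / (hi - lo)
q = np.round((th - lo) * scale).astype(np.uint8)   # 8-bit codes
dequantized = q / scale + lo                       # values compared on-device

print(f"max threshold error after int8 quantization: "
      f"{np.abs(dequantized - th).max():.4f}")
```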

Read more at Qeexo

The realities of developing embedded neural networks

๐Ÿ“… Date:

โœ๏ธ Author: Tony King-Smith

๐Ÿ”– Topics: edge computing, machine learning, AI

๐Ÿข Organizations: AImotive


With any embedded software destined for deployment in volume production, an enormous amount of effort goes into the code once the implementation of its core functionality has been completed and verified. This optimization phase is all about minimizing memory, CPU and other resources needed so that as much as possible of the software functionality is preserved, while the resources needed to execute it are reduced to the absolute minimum possible.

This process of creating embedded software from lab-based algorithms enables production engineers to cost-engineer software functionality into a mass-production ready form, requiring far cheaper, less capable chips and hardware than the massive compute datacenter used to develop it. However, it usually requires the functionality to be frozen from the beginning, with code modifications only done to improve the way the algorithms themselves are executed. For most software, that is fine: indeed, it enables a rigorous verification methodology to be used to ensure the embedding process retains all the functionality needed.

However, when embedding NN-based AI algorithms, that can be a major problem. Why? Because by freezing the functionality from the beginning, you are removing one of the main ways in which the execution can be optimized.

Read more at Embedded

Google Cloud and Seagate: Transforming hard-disk drive maintenance with predictive ML

๐Ÿ“… Date:

โœ๏ธ Authors: Nitin Aggarwal, Rostam Dinyari

๐Ÿ”– Topics: machine learning, predictive maintenance

๐Ÿญ Vertical: Computer and Electronic

๐Ÿข Organizations: Google, Seagate


At Google Cloud, we know first-hand how critical it is to manage HDDs in operations and preemptively identify potential failures. We are responsible for running some of the largest data centers in the world; any misses in identifying these failures at the right time can potentially cause serious outages across our many products and services. In the past, when a disk was flagged for a problem, the main option was to repair the problem on site using software. But this procedure was expensive and time-consuming. It required draining the data from the drive, isolating the drive, running diagnostics, and then re-introducing it to traffic.

That's why we teamed up with Seagate, our HDD original equipment manufacturer (OEM) partner for Google's data centers, to find a way to predict frequent HDD problems. Together, we developed a machine learning (ML) system, built on top of Google Cloud, to forecast the probability of a recurring failing disk, that is, a disk that fails or has experienced three or more problems in 30 days.
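
That label definition can be expressed as a rolling-window count. A hedged pandas sketch with hypothetical event data (not Google's or Seagate's schema):

```python
# Mark a drive as "recurring failing" when it logs 3+ problems in 30 days.
import pandas as pd

events = pd.DataFrame({
    "drive_id": ["d1", "d1", "d1", "d2"],
    "date": pd.to_datetime(["2021-01-02", "2021-01-10",
                            "2021-01-25", "2021-02-01"]),
    "problem": 1,                              # one row per problem event
})

counts = (events.set_index("date")
                .groupby("drive_id")["problem"]
                .rolling("30D").sum())         # problems per 30-day window
recurring = counts[counts >= 3].reset_index()["drive_id"].unique()
print(recurring)                               # ['d1']
```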

Read more at Google Cloud Blog

Ford's Ever-Smarter Robots Are Speeding Up the Assembly Line

๐Ÿ“… Date:

โœ๏ธ Author: Will Knight

๐Ÿ”– Topics: AI, machine learning, robotics

๐Ÿญ Vertical: Automotive

๐Ÿข Organizations: Ford, Symbio Robotics


At a Ford Transmission Plant in Livonia, Michigan, the station where robots help assemble torque converters now includes a system that uses AI to learn from previous attempts how to wiggle the pieces into place most efficiently. Inside a large safety cage, robot arms wheel around grasping circular pieces of metal, each about the diameter of a dinner plate, from a conveyor and slot them together.

The technology allows this part of the assembly line to run 15 percent faster, a significant improvement in automotive manufacturing where thin profit margins depend heavily on manufacturing efficiencies.

Read more at WIRED

Start-ups Powering New Era of Industrial Robotics

๐Ÿ“… Date:

โœ๏ธ Author: James Falkoff

๐Ÿ”– Topics: robotics, automated guided vehicle, machine learning

๐Ÿญ Vertical: Machinery

๐Ÿข Organizations: Ready Robotics, ArtiMinds, Realtime Robotics, RIOS, Vicarious


Much of the bottleneck to achieving automation in manufacturing relates to limitations in the current programming model of industrial robotics. Programming is done in languages proprietary to each robotic hardware OEM, languages "straight from the 80s," as one industry executive put it.

There are a limited number of specialists who are proficient in these languages. Given the rarity of the expertise involved, as well as the time it takes to program a robot, robotics application development typically costs three times as much as the hardware for a given installation.

Read more at Robotics Business Review

Multi-Task Robotic Reinforcement Learning at Scale

๐Ÿ“… Date:

โœ๏ธ Authors: Karol Hausman, Yevgen Chebotar

๐Ÿ”– Topics: reinforcement learning, robotics, AI, machine learning

๐Ÿข Organizations: Google


For general-purpose robots to be most useful, they would need to be able to perform a range of tasks, such as cleaning, maintenance and delivery. But training even a single task (e.g., grasping) using offline reinforcement learning (RL), a trial-and-error learning method in which the agent trains on previously collected data, can take thousands of robot-hours, in addition to the significant engineering needed to enable autonomous operation of a large-scale robotic system. Thus, the computational costs of building general-purpose everyday robots using current robot learning methods become prohibitive as the number of tasks grows.

Read more at Google AI Blog

Intelligent edge management: why AI and ML are key players

๐Ÿ“… Date:

โœ๏ธ Authors: Fetahi Wuhib, Mbarka Soualhia, Carla Mouradian, Wubin Li

๐Ÿ”– Topics: AI, machine learning, edge computing, anomaly detection

๐Ÿข Organizations: Ericsson


What will the future of network edge management look like? We explain how artificial intelligence and machine learning technologies are crucial for intelligent edge computing and the management of future-proof networks. What's required, and what are the building blocks needed to make it happen?

Read more at Ericsson

Using Machine Learning to identify operational modes in rotating equipment

๐Ÿ“… Date:

โœ๏ธ Author: Frederik Wartenberg

๐Ÿ”– Topics: anomaly detection, vibration analysis, machine learning

๐Ÿข Organizations: Viking Analytics


Vibration monitoring is key to condition-based maintenance of rotating equipment such as engines, compressors, turbines, pumps, generators, blowers, and gearboxes. However, periodic route-based vibration monitoring programs are not enough to prevent breakdowns, as they normally offer only a narrow view of the machines' condition.

Adding Machine Learning algorithms to this process makes it scalable, as it allows the analysis of historic data from equipment. One of the benefits is being able to identify operational modes and help maintenance teams to understand if the machine is operating in normal or abnormal conditions.
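
As a simple illustration of mode identification, the sketch below clusters two generic vibration features with k-means; the features and data are invented and do not represent Viking Analytics' method.

```python
# Cluster vibration features (RMS level, dominant frequency) into
# operational modes with k-means. Data is synthetic.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
idle = np.column_stack([rng.normal(0.2, 0.05, 100), rng.normal(25, 2, 100)])
nominal = np.column_stack([rng.normal(1.0, 0.10, 100), rng.normal(50, 2, 100)])
overload = np.column_stack([rng.normal(2.5, 0.20, 100), rng.normal(48, 2, 100)])
features = np.vstack([idle, nominal, overload])    # columns: [rms_g, dominant_hz]

modes = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
print(np.bincount(modes))      # three operational modes, ~100 samples each
```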

Read more at Viking Analytics Blog

Amazonโ€™s robot arms break ground in safety, technology

๐Ÿ“… Date:

โœ๏ธ Author: Alan S. Brown

๐Ÿ”– Topics: AI, machine learning, robotics, palletizer, robotic arm, worker safety

๐Ÿข Organizations: Amazon


Robin, one of the most complex stationary robot arm systems Amazon has ever built, brings many core technologies to new levels and acts as a glimpse into the possibilities of combining vision, package manipulation and machine learning, said Will Harris, principal product manager of the Robin program.

Those technologies can be seen when Robin goes to work. As soft mailers and boxes move down the conveyor line, Robin must break the jumble down into individual items. This is called image segmentation. People do it automatically, but for a long time, robots only saw a solid blob of pixels.

Read more at Amazon Science

AI In Inspection, Metrology, And Test

๐Ÿ“… Date:

โœ๏ธ Authors: Susan Rambo, Ed Sperling

๐Ÿ”– Topics: AI, machine learning, quality assurance, metrology, nondestructive test

๐Ÿญ Vertical: Semiconductor

๐Ÿข Organizations: CyberOptics, Lam Research, Hitachi, FormFactor, NuFlare, Advantest, PDF Solutions, eBeam Initiative, KLA, proteanTecs, Fraunhofer IIS


"The human eye can see things that no amount of machine learning can," said Subodh Kulkarni, CEO of CyberOptics. "That's where some of the sophistication is starting to happen now. Our current systems use a primitive kind of AI technology. Once you look at the image, you can see a problem. And our AI machine doesn't see that. But then you go to the deep learning kind of algorithms, where you have very serious Ph.D.-level people programming one algorithm for a week, and they can detect all those things. But it takes them a week to program those things, which today is not practical."

That's beginning to change. "We're seeing faster deep-learning algorithms that can be more easily programmed," Kulkarni said. "But the defects also are getting harder to catch by a machine, so there is still a gap. The biggest bang for the buck is not going to come from improving cameras or projectors or any of the equipment that we use to generate optical images. It's going to be interpreting optical images."

Read more at Semiconductor Engineering

How To Measure ML Model Accuracy

๐Ÿ“… Date:

โœ๏ธ Author: Bryon Moyer

๐Ÿ”– Topics: machine learning

๐Ÿข Organizations: Ansys, Brainome, Cadence, Flex Logix, Synopsys, Xilinx


Machine learning (ML) is about making predictions about new data based on old data. The quality of any machine-learning algorithm is ultimately determined by the quality of those predictions.

However, there is no one universal way to measure that quality across all ML applications, and that has broad implications for the value and usefulness of machine learning.
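
A small example makes the point concrete: on an imbalanced defect dataset, standard metrics can tell very different stories about the same model. The data below is synthetic.

```python
# Why "accuracy" alone is not a universal quality measure.
from sklearn.metrics import (accuracy_score, precision_score,
                             recall_score, roc_auc_score)

y_true = [0] * 95 + [1] * 5          # 5% defect rate
y_pred = [0] * 100                   # a model that never flags a defect
y_score = [0.1] * 95 + [0.4] * 5     # its raw scores

print("accuracy :", accuracy_score(y_true, y_pred))                  # 0.95
print("recall   :", recall_score(y_true, y_pred, zero_division=0))   # 0.0
print("precision:", precision_score(y_true, y_pred, zero_division=0))
print("ROC AUC  :", roc_auc_score(y_true, y_score))                  # 1.0
```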

Read more at Semiconductor Engineering

Go beyond machine learning to optimize manufacturing operations

๐Ÿ“… Date:

โœ๏ธ Author: Andrew Silberfarb

๐Ÿ”– Topics: machine learning

๐Ÿข Organizations: SRI International


Machine learning depends on vast amounts of data to make inferences. However, sometimes the amount of data needed by machine-learning algorithms is simply not available. SRI International has developed a system called Deep Adaptive Semantic Logic (DASL) that uses adaptive semantic reasoning to fill in the data gaps. DASL integrates bottom-up data-driven modeling with top-down theoretical reasoning in a symbiotic union of innovative machine learning and knowledge guided inference. The system brings experts and data together to make better, more informed decisions.

Read more at Automation Alley

Adversarial training reduces safety of neural networks in robots

๐Ÿ“… Date:

โœ๏ธ Author: @BenDee983

๐Ÿ”– Topics: AI, robotics, machine learning


A more fundamental problem, also confirmed by Lechner and his coauthors, is the lack of causality in machine learning systems. As long as neural networks focus on learning superficial statistical patterns in data, they will remain vulnerable to different forms of adversarial attacks. Learning causal representations might be the key to protecting neural networks against adversarial attacks. But learning causal representations itself is a major challenge and scientists are still trying to figure out how to solve it.
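
The canonical demonstration of this vulnerability is the fast gradient sign method (FGSM). A minimal PyTorch sketch on a stand-in model (not a robot's actual perception network):

```python
# FGSM: perturb the input in the direction that increases the loss.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))  # toy classifier
loss_fn = nn.CrossEntropyLoss()

x = torch.rand(1, 1, 28, 28, requires_grad=True)   # stand-in input image
y = torch.tensor([3])                              # its true label

loss = loss_fn(model(x), y)
loss.backward()                                    # populates x.grad

eps = 0.05                                         # perturbation budget
x_adv = (x + eps * x.grad.sign()).clamp(0, 1)      # adversarial example
print((x_adv - x).abs().max())                     # <= eps, yet can flip the label
```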

Read more at VentureBeat

What Walmart learned from its machine learning deployment

๐Ÿ“… Date:

โœ๏ธ Author: Katie Malone

๐Ÿ”– Topics: cloud computing, machine learning

๐Ÿข Organizations: Walmart


As more businesses turn to automation to realize business value, retail's wide variety of ML use cases can provide insights into how to overcome challenges associated with the technology. The goal should be trying to solve a problem by using ML as a tool to get there, Kamdar said.

For example, Walmart uses an ML model to optimize the timing and pricing of markdowns, and to examine real estate data to find places to cut costs, according to executives on an earnings call in February.

Read more at Supply Chain Dive

AI project to 'pandemic-proof' NHS supply chain

๐Ÿ“… Date:

๐Ÿ”– Topics: natural language processing, machine learning

๐Ÿข Organizations: Vamstar


With the ability to analyse NHS and global procurement data from previous supply contracts, the platform will aim to allow NHS buyers to evaluate the credibility and capability of suppliers to fulfil their orders. Each supplier would have a real-time 'risk rating' with information on the goods and services they supply.

Researchers at Sheffield University's Information School are said to be developing Natural Language Processing (NLP) methods for the automated reading and extraction of data from large amounts of contract tender data held by the NHS and other European healthcare providers.

Read more at The Engineer

How Machine Learning Techniques Can Help Engineers Design Better Products

๐Ÿ“… Date:

๐Ÿ”– Topics: machine learning, generative design

๐Ÿข Organizations: Altair


By leveraging field predictive ML models, engineers can explore more options without the use of a solver when designing different components and parts, saving time and resources. This ultimately produces higher-quality results that can then be used to make more informed decisions throughout the design process.
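
The surrogate-model idea can be sketched in a few lines: fit a regressor to a handful of expensive solver runs, then screen thousands of candidate designs without calling the solver. Everything below, including the stand-in "solver", is illustrative and not Altair's implementation.

```python
# Train a surrogate on 50 solver runs, then screen 10,000 candidate designs.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def expensive_solver(thickness, rib_count):
    """Stand-in for a physics solver: returns a hypothetical max stress."""
    return 1000 / (thickness * (1 + 0.1 * rib_count))

rng = np.random.default_rng(3)
X = np.column_stack([rng.uniform(1, 5, 50), rng.integers(0, 10, 50)])
y = np.array([expensive_solver(t, r) for t, r in X])   # 50 "solver" runs

surrogate = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
candidates = np.column_stack([rng.uniform(1, 5, 10_000),
                              rng.integers(0, 10, 10_000)])
best = candidates[surrogate.predict(candidates).argmin()]
print("promising design (thickness, ribs):", best)
```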

Read more at Altair Engineering

Introducing Amazon SageMaker Reinforcement Learning Components for open-source Kubeflow pipelines

๐Ÿ“… Date:

โœ๏ธ Authors: Alex Chung, Kyle Saltmarsh, Leonard O'Sullivan, Matthew Rose, Nicholas Therkelsen-Terry, Nicholas Thomson, Ragha Prasad, Sahika Genc,

๐Ÿ”– Topics: AI, machine learning, robotics

๐Ÿข Organizations: AWS, Max Kelsen, Universal Robots, Woodside Energy


Woodside Energy uses AWS RoboMaker with Amazon SageMaker Kubeflow operators to train, tune, and deploy reinforcement learning agents to their robots to perform manipulation tasks that are repetitive or dangerous.

Read more at AWS Blog

Leveraging AI and Statistical Methods to Improve Flame Spray Pyrolysis

๐Ÿ“… Date:

โœ๏ธ Author: Stephen J. Mraz

๐Ÿ”– Topics: AI, machine learning, materials science

๐Ÿญ Vertical: Chemical

๐Ÿข Organizations: Argonne National Laboratory


Flame spray pyrolysis has long been used to make small particles that can be used as paint pigments. Now, researchers at Argonne National Laboratory are refining the process to make smaller, nano-sized particles of various materials that can make nano-powders for low-cobalt battery cathodes, solid state electrolytes and platinum/titanium dioxide catalysts for turning biomass into fuel.

Read more at Machine Design

Way beyond AlphaZero: Berkeley and Google work shows robotics may be the deepest machine learning of all

๐Ÿ“… Date:

โœ๏ธ Author: @TiernanRayTech

๐Ÿ”– Topics: AI, machine learning, robotics, reinforcement learning

๐Ÿข Organizations: Google


With no well-specified rewards and state transitions that take place in a myriad of ways, training a robot via reinforcement learning represents perhaps the most complex arena for machine learning.

Read more at ZDNet

AWS Announces General Availability of Amazon Lookout for Vision

๐Ÿ“… Date:

๐Ÿ”– Topics: cloud computing, computer vision, machine learning, quality assurance

๐Ÿข Organizations: AWS, Basler, Dafgards, General Electric


AWS announced the general availability of Amazon Lookout for Vision, a new service that analyzes images using computer vision and sophisticated machine learning capabilities to spot product or process defects and anomalies in manufactured products. By employing a machine learning technique called "few-shot learning," Amazon Lookout for Vision is able to train a model for a customer using as few as 30 baseline images. Customers can get started quickly using Amazon Lookout for Vision to detect manufacturing and production defects (e.g. cracks, dents, incorrect color, irregular shape, etc.) in their products and prevent those costly errors from progressing down the operational line and from ever reaching customers. Together with Amazon Lookout for Equipment, Amazon Monitron, and AWS Panorama, Amazon Lookout for Vision provides industrial and manufacturing customers with the most comprehensive suite of cloud-to-edge industrial machine learning services available. With Amazon Lookout for Vision, there is no up-front commitment or minimum fee, and customers pay by the hour for their actual usage to train the model and detect anomalies or defects using the service.

Read more at Business Wire

Rearranging the Visual World

๐Ÿ“… Date:

โœ๏ธ Authors: Andy Zeng, Pete Florence

๐Ÿ”– Topics: AI, machine learning, robotics

๐Ÿข Organizations: Google


Transporter Nets use a novel approach to 3D spatial understanding that avoids reliance on object-centric representations, making them general for vision-based manipulation but far more sample efficient than benchmarked end-to-end alternatives. As a consequence, they are fast and practical to train on real robots. We are also releasing an accompanying open-source implementation of Transporter Nets together with Ravens, our new simulated benchmark suite of ten vision-based manipulation tasks.

Read more at Google AI Blog

Artificial Intelligence: Driving Digital Innovation and Industry 4.0

๐Ÿ“… Date:

โœ๏ธ Author: @ralph_ohr

๐Ÿ”– Topics: AI, machine learning

๐Ÿข Organizations: Siemens


Intelligent AI solutions can analyze high volumes of data generated by a factory to identify trends and patterns which can then be used to make manufacturing processes more efficient and reduce their energy consumption. Employing Digital Twin-enabled representations of a product and the associated process, AI is able to recognize whether the workpiece being manufactured meets quality requirements. This is how plants are constantly adapting to new circumstances and undergoing optimization with no need for operator input. New technologies are emerging in this application area, such as Reinforcement Learning, a topic that has not been deployed on a broad scale up to now. It can be used to automatically ascertain correlations between production parameters, product quality and process performance by learning through 'trial and error', and thereby dynamically tuning the parameter values to optimize the overall process.

Read more at Siemens Ingenuity

Edge-Inference Architectures Proliferate

๐Ÿ“… Date:

โœ๏ธ Author: Bryon Moyer

๐Ÿ”– Topics: AI, machine learning, edge computing

๐Ÿญ Vertical: Semiconductor

๐Ÿข Organizations: Cadence, Hailo, Google, Flex Logix, BrainChip, Synopsys, GrAI Matter, Deep Vision, Maxim Integrated


What makes one AI system better than another depends on a lot of different factors, including some that aren't entirely clear.

The new offerings exhibit a wide range of structure, technology, and optimization goals. All must be gentle on power, but some target wired devices while others target battery-powered devices, giving different power/performance targets. While no single architecture is expected to solve every problem, the industry is in a phase of proliferation, not consolidation. It will be a while before the dust settles on the preferred architectures.

Read more at Semiconductor Engineering

Pushing The Frontiers Of Manufacturing AI At Seagate

๐Ÿ“… Date:

โœ๏ธ Author: Tom Davenport

๐Ÿ”– Topics: AI, machine learning, predictive maintenance, quality assurance

๐Ÿญ Vertical: Computer and Electronic

๐Ÿข Organizations: Seagate


Big data, analytics and AI are widely used in industries like financial services and e-commerce, but are less likely to be found in manufacturing companies. With some exceptions like predictive maintenance, few manufacturing firms have marshaled the amounts of data and analytical talent to aggressively apply analytics and AI to key processes.

Seagate Technology, an over $10B manufacturer of data storage and management solutions, is a prominent counter-example to this trend. It has massive amounts of sensor data in its factories and has been using it extensively over the last five years to ensure and improve the quality and efficiency of its manufacturing processes.

Read more at Forbes

Building effective IoT applications with tinyML and automated machine learning

๐Ÿ“… Date:

โœ๏ธ Authors: Rajen Bhatt, Tina Shyuan

๐Ÿ”– Topics: IIoT, machine learning

๐Ÿข Organizations: Qeexo


The convergence of IoT devices and ML algorithms enables a wide range of smart applications and enhanced user experiences, which are made possible by low-power, low-latency, and lightweight machine learning inference, i.e., tinyML.

Read more at Embedded

Advanced Technologies Adoption and Use by U.S. Firms: Evidence from the Annual Business Survey

๐Ÿ“… Date:

โœ๏ธ Authors: Nikolas Zolas, Zachary Kroff, Erik Brynjolfsson, Kristina McElheran, David N. Beede, Cathy Buffington, Nathan Goldschlag, Lucia Foster, Emin Dinlersoz

๐Ÿ”– Topics: AI, augmented reality, cloud computing, machine learning, Radio-frequency identification, robotics


While robots are usually singled out as a key technology in studies of automation, the overall diffusion of robotics use and testing is very low across firms in the U.S. The use rate is only 1.3% and the testing rate is 0.3%. These levels correspond relatively closely with patterns found in the robotics expenditure question in the 2018 ASM. Robots are primarily concentrated in large, manufacturing firms. The distribution of robots among firms is highly skewed, and the skewness in favor of larger firms can have a disproportionate effect on the economy that is otherwise not obvious from the relatively low overall diffusion rate of robots. The least-used technologies are RFID (1.1%), Augmented Reality (0.8%), and Automated Vehicles (0.8%). Looking at the pairwise adoption of these technologies in Table 14, we find that use of Machine Learning and Machine Vision are most coincident. We find that use of Automated Guided Vehicles is closely associated with use of Augmented Reality, RFID, and Machine Vision.

Read more at National Bureau of Economic Research

How Instacart fixed its A.I. and keeps up with the coronavirus pandemic

๐Ÿ“… Date:

โœ๏ธ Author: @JonathanVanian

๐Ÿ”– Topics: COVID-19, demand planning, machine learning

๐Ÿข Organizations: Instacart


Like many companies, online grocery delivery service Instacart has spent the past few months overhauling its machine-learning models because the coronavirus pandemic has drastically changed how customers behave.

Starting in mid-March, Instacart's all-important technology for predicting whether certain products would be available at specific stores became increasingly inaccurate. The accuracy of a metric used to evaluate how many items are found at a store dropped to 61% from 93%, tipping off the Instacart engineers that they needed to re-train their machine learning model that predicts an item's availability at a store. After all, customers could get annoyed being told one thing, that the item they wanted was available, when in fact it wasn't, resulting in products never being delivered. 'A shock to the system' is how Instacart's machine learning director Sharath Rao described the problem to Fortune.
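
The kind of monitoring that catches such drift can be as simple as a rolling "found rate" with an alert threshold. The sketch below mirrors the article's numbers but is an illustration, not Instacart's system.

```python
# Track a rolling found rate for availability predictions and alert
# when it falls below a retraining threshold. All values are illustrative.
from collections import deque

WINDOW, THRESHOLD = 1000, 0.80
recent = deque(maxlen=WINDOW)      # 1 = a predicted-available item was found
alerted = False

def record(found: bool) -> None:
    global alerted
    recent.append(int(found))
    rate = sum(recent) / len(recent)
    if not alerted and len(recent) == WINDOW and rate < THRESHOLD:
        alerted = True
        print(f"found rate {rate:.0%} fell below {THRESHOLD:.0%}: re-train the model")

for _ in range(900):
    record(True)                   # the ~93% era
for _ in range(400):
    record(False)                  # pandemic-era behavior shift trips the alert
```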

Read more at Fortune (Paid)