Design of a Ni-based superalloy for laser repair applications using probabilistic neural network identification
A neural network framework is used to design a new Ni-based superalloy that surpasses the performance of IN718 for laser-blown-powder directed-energy-deposition repair applications. Current high-performance engineering alloys commonly suffer from issues when processed using additive manufacturing methods, including cracking, porosity, elemental segregation, and anisotropy. The computational method reported here enables the identification of new alloy compositions that have the highest likelihood of simultaneously satisfying a range of target properties, including criteria specific to additive manufacturing. The efficacy of this method is demonstrated with the design of a new alloy more amenable to laser-blown-powder directed-energy-deposition. The method may be readily extended to the optimization of other alloy types and process methods.
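The core idea of scoring candidates by their likelihood of meeting all targets at once can be sketched as follows. This is an illustrative toy, not the paper's actual model: the property names, numbers, and the assumption that each predictor returns an independent Gaussian (mean, standard deviation) are all hypothetical.

```python
import math

def prob_exceeds(mean, std, target):
    """P(property > target) for a Gaussian predictive distribution."""
    z = (mean - target) / std
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def likelihood_of_success(predictions, targets):
    """Joint probability that every target is met, assuming independence."""
    p = 1.0
    for name, (mean, std) in predictions.items():
        p *= prob_exceeds(mean, std, targets[name])
    return p

# Illustrative numbers only (not real alloy data):
preds = {"yield_strength": (1100.0, 60.0),   # MPa
         "creep_life":     (900.0, 120.0),   # hours
         "printability":   (0.8, 0.1)}       # crack-resistance index
targets = {"yield_strength": 1000.0, "creep_life": 800.0, "printability": 0.7}
print(round(likelihood_of_success(preds, targets), 3))
```

A design search then simply ranks candidate compositions by this joint probability, which naturally trades off strong performance on one target against marginal performance on another.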
How Volkswagen and Google Cloud are using machine learning to design more energy-efficient cars
Volkswagen strives to design beautiful, performant, and energy-efficient vehicles. This entails an iterative process in which designers work through many design drafts, evaluating each, integrating the feedback, and refining. For example, a vehicle’s drag coefficient—its resistance to air—is one of the most important factors in energy efficiency. Thus, getting estimates of the drag coefficient for several designs helps the designers experiment and converge toward more energy-efficient solutions. The cheaper and faster this feedback loop is, the more it enables the designers.
This joint research effort between Volkswagen and Google has produced promising results with the help of the Vertex AI platform. In this first milestone, the team successfully brought recent AI research results a step closer to practical application for car design. The first iteration of the algorithm can produce a drag coefficient estimate with an average error of just 4%, within a second. While not quite as accurate as a physical wind tunnel test, a 4% average error is enough to narrow a large selection of design candidates to a small shortlist. And given how quickly the estimates appear, this is a substantial improvement over the existing methods, which take days or weeks. With the algorithm we have developed, designers can run more efficiency tests, submit more candidates, and iterate toward richer, more effective designs in a small fraction of the time previously required.
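The surrogate-model pattern behind this kind of result can be illustrated in a few lines. This is a minimal sketch, not Volkswagen's actual system (which uses deep learning on 3D geometry): here a simple ridge regression is fit on hypothetical (shape features → drag coefficient) pairs that stand in for precomputed CFD or wind-tunnel results, after which each new query costs microseconds instead of days.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: 200 past designs, 5 geometric features each,
# with a made-up "ground truth" drag relationship plus measurement noise.
X = rng.uniform(0.0, 1.0, size=(200, 5))
true_cd = 0.25 + 0.1 * X[:, 0] - 0.05 * X[:, 1] + 0.02 * X[:, 2]
y = true_cd + rng.normal(0.0, 0.002, size=200)

# Ridge regression in closed form: w = (X'X + lam*I)^-1 X'y
Xb = np.hstack([X, np.ones((200, 1))])          # append a bias column
lam = 1e-3
w = np.linalg.solve(Xb.T @ Xb + lam * np.eye(6), Xb.T @ y)

def predict_cd(features):
    """Estimate the drag coefficient of a new design instantly."""
    return float(np.append(features, 1.0) @ w)

candidate = np.array([0.3, 0.6, 0.5, 0.2, 0.9])
print(f"estimated Cd: {predict_cd(candidate):.3f}")
```

The same loop applies regardless of the regressor: train once on slow, expensive simulations, then use the fast approximation to screen many candidates and send only the shortlist back to the expensive tool.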
JITX Launches General Availability And Announces $12M Series A From Sequoia Capital
Today we’re announcing the general availability of JITX, along with a $12M Series A round led by Sequoia Capital, with participation from Y Combinator, Funders Club, and Liquid 2.
Hardware engineers need a credible way out of the trap they find themselves in. JITX helps by letting them write code that automates their engineering process. To get ahead they can’t just do one design after another – they need reusable code that designs hardware for them. To illustrate this point: a software engineer can upload code to GitHub and thousands of people can reuse that code in their own projects. Using a traditional hardware design flow, each one of those thousands of engineers would have to re-design and re-analyze the same circuit to make sure the design will behave correctly in their product. JITX brings the productivity of software to hardware.
At the same time, we were working with enterprise design teams like Northrop Grumman. It turns out that they also needed JITX to address some specific problems. Like everyone else, their biggest challenge is finding and retaining skilled engineers. There just aren’t enough experts to go around, and even entry-level positions are getting harder to fill (it turns out new EE graduates are more interested in AI than drafting circuit boards). So they use JITX as a way to make their existing experts more productive. They get a lot of value out of checking designs automatically – a manual derating analysis on a complex FPGA board can take months, but JITX automates the whole procedure. They are also excited about using code as a more efficient way to coordinate across different teams in the organization. At the end of our iteration process we were quickly designing boards at the limit of what traditional factories could build (our thanks to Gerry Partida for an 8/4 stacked microvia with sub-70um trace and space!). For example, we built a silicon validation board that included 2,500 pins in a complex 300um grid.
Sim2Real AI Helps Robots Think Outside The Box
At Ambi Robotics, our robotic systems learn how to handle diverse items using data generated by advanced simulation. We fine-tune our simulations to the performance of our sensors, our robots, and variations on the items our robots will handle. Our simulations run extremely fast, hundreds of times faster than robots training in the physical world, so we can train our robots overnight. This is what enables our solutions to work reliably from day one.
How IGESTEK Produces 40% Lighter Automotive Parts
Autonomous Design Automation: How Far Are We?
As an industry, we will refine the different levels of Autonomous Design Automation further over the years to come. Eventually, combining the different steps of the flow with AI/ML will unlock even greater productivity improvements. How long will it be until designers can define a function in a higher-level language like SysML and have it autonomously implemented as a hardware/software system, based on their requirements, after AI/ML-controlled design-space exploration?
Improving PPA In Complex Designs With AI
The goal of chip design has always been to optimize power, performance, and area (PPA), but results can vary greatly even with the best tools and highly experienced engineering teams. AI works best in design when the problem is clearly defined in a way that AI can understand. So an IC designer must first determine whether there is a problem that can be tied to a system’s ability to adapt, learn, and generalize knowledge and rules, and then apply that knowledge to an unfamiliar scenario.
Calculating the best shapes for things to come
Maximizing the performance and efficiency of structures—everything from bridges to computer components—can now be achieved at the design stage with a new algorithm developed by researchers at the University of Michigan and Northeastern University. It’s an advancement likely to benefit a host of industries where costly and time-consuming trial-and-error testing is necessary to determine the optimal design. As an example, look at the current U.S. infrastructure challenge—a looming $2.5 trillion backlog that will need to be addressed with taxpayer dollars.
Generative Design for Milling Lightweights EV Motorbike Part
Generative design software uses a set of user-input parameters and constraints to develop efficient part designs. These shapes are often organic forms no human would design on their own, and in its earliest years generative design was limited to additive manufacturing and the production methods it enables. Not long after Lightning and Autodesk developed their first iteration of the generatively designed motorcycle swing arm, Autodesk updated its solver to support milling and other conventional manufacturing methods. Design candidates generated for milling generally cannot reach the same level of optimization as their AM siblings, but they are much easier to manufacture while still reducing the weight of the part.
What Is Generative Design, and How Can It Be Used in Manufacturing?
The primary use case of generative design in manufacturing is to automatically generate design options that are pre-validated to meet the requirements you’ve established. That can be especially important for efficient manufacturing: sometimes a part or tool must fit into an entrenched workflow or pipeline—methodologically or physically—as part of a larger device or process.
Rolls-Royce Finds New-Engine Benefits in Old Test Data
The goal, according to Peter Wehle, head of innovation, research and testing at RRD, is to use this information to reduce new-engine weight and mass, while maintaining structural integrity.
Both parties are hopeful that using ML and AI will significantly reduce the number of sensors needed to obtain present and future data, thereby saving RRD millions of euros annually. According to Mahalingam, the software lets engineers choose the data they want from a data silo, select the algorithms they want to employ and decide whether or not they want to use a neural network to train an ML model.
Wehle notes that the disruptive tool is based on the interaction between a communication endpoint of the engine simulation and neighboring points. It carefully analyzes the effects of loads on physical structures.
Accelerating the Design of Automotive Catalyst Products Using Machine Learning
The design of catalyst products to reduce harmful emissions is currently an intensive process of expert-driven discovery, taking several years to develop a product. Machine learning can accelerate this timescale, leveraging historic experimental data from related products to guide which new formulations and experiments will enable a project to most directly reach its targets. We used machine learning to accurately model 16 key performance targets for catalyst products, enabling detailed understanding of the factors governing catalyst performance and realistic suggestions of future experiments to rapidly develop more effective products. The proposed formulations are currently undergoing experimental validation.
How Machine Learning Techniques Can Help Engineers Design Better Products
By leveraging field-predictive ML models, engineers can explore more options without running a solver when designing different components and parts, saving time and resources. This ultimately produces higher-quality results that can be used to make more informed decisions throughout the design process.
Evolutionary Algorithms: How Natural Selection Beats Human Design
An evolutionary algorithm, which is a subset of evolutionary computation, can be defined as a “population-based metaheuristic optimization algorithm.” These nature-inspired algorithms evolve populations of experimental solutions through numerous generations by using the basic principles of evolutionary biology such as reproduction, mutation, recombination, and selection.
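The principles named above—selection, recombination, and mutation—fit in a short, self-contained sketch. This toy evolves bit-strings toward the all-ones string (the classic "OneMax" benchmark); the genome length, population size, and rates are arbitrary illustrative choices.

```python
import random

random.seed(42)
GENOME_LEN, POP_SIZE, GENERATIONS, MUT_RATE = 32, 40, 60, 1.0 / 32

def fitness(genome):
    return sum(genome)  # number of 1-bits; the optimum is GENOME_LEN

def tournament(pop):
    """Selection: the fitter of two random individuals gets to reproduce."""
    a, b = random.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    """Recombination: one-point crossover of two parent genomes."""
    cut = random.randrange(1, GENOME_LEN)
    return p1[:cut] + p2[cut:]

def mutate(genome):
    """Mutation: flip each bit with a small probability."""
    return [bit ^ 1 if random.random() < MUT_RATE else bit for bit in genome]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    population = [mutate(crossover(tournament(population), tournament(population)))
                  for _ in range(POP_SIZE)]

best = max(population, key=fitness)
print(f"best fitness: {fitness(best)}/{GENOME_LEN}")
```

Real engineering applications replace `fitness` with a simulation or analysis of a candidate design, but the generational loop is exactly this: select the fit, recombine them, perturb the offspring, repeat.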