👷 How smart manufacturing can alter safety standards
An essential component of smart manufacturing is the ability to automate hazardous activities. The evolution of automation in manufacturing has been a game-changer, and it is a key reason workplace injuries have steadily declined over the years.
Rolls-Royce offers an example: the adoption of cutting-edge Industry 4.0 technologies is helping create a safer working environment for employees. 3D visualisation software gives employees a better understanding of their workplace, including potential hazards; machine learning helps monitor compliance with personal protective equipment (PPE) requirements; and robotic arms are taking over tasks once considered dangerous, such as furnace operations, reducing the need for manual labour.
New NVIDIA IGX Platform Helps Create Safe, Autonomous Factories of the Future
NVIDIA today introduced the IGX edge AI computing platform for secure, safe autonomous systems. IGX brings together hardware with programmable safety extensions, commercial operating-system support and powerful AI software — enabling organizations to safely and securely deliver AI in support of human-machine collaboration. The all-in-one platform enables next-level safety, security and perception for use cases in healthcare, as well as in industrial edge AI.
Stray Magnetic Fields and Safety
Bunting is at the heart of the electrification program of the world’s vehicles. By producing magnetising systems that allow for genuinely error-free assembly and providing 100% inline testing, Bunting supports automotive and aerospace programmes looking to create new modes of travel and improve efficiency. Bunting is proud to support this sustainable engineering activity to provide a greater level of energy security for all future generations.
During the development of these electric machine magnetising systems, one of the most common questions our customers ask our engineers is: “How can we keep our staff safe from stray electromagnetic fields?”
TELUS: Solving for workers’ safety with edge computing and 5G
Together with Google Cloud, we have been leveraging MEC and 5G to develop a workers’ safety application in our Edmonton Data Center. On-premises video analytics cameras screen manufacturing facilities and verify compliance with the safety requirements for operating heavy-duty machinery. The CCTV (closed-circuit television) cameras we used are cost-effective and easier to deploy than RTLS (real-time location services) solutions that detect worker proximity to avoid collisions. This is a positive, proactive step toward steadily improving workplace safety. For example, if a worker’s hand is close to a drill, the drill press will not bore holes in any surface until the video analytics camera detects that the hand has left the safety zone.
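The interlock behavior described here (machine disabled while the safety zone is occupied) can be sketched in a few lines. This is a minimal illustration, not TELUS’s actual implementation: the class name, the `zone_clear` signal, and the debounce threshold are all assumptions. Re-enabling after several consecutive clear frames is a common way to avoid chattering on noisy per-frame detections.

```python
class DrillInterlock:
    """Sketch of a fail-safe interlock driven by video analytics.

    Disabling is immediate on any detection inside the safety zone;
    re-enabling requires a run of consecutive clear frames so noisy
    detections do not toggle the machine on and off.
    """

    def __init__(self, clear_frames_required: int = 10):
        self.clear_frames_required = clear_frames_required
        self.clear_streak = 0
        self.enabled = False

    def update(self, zone_clear: bool) -> bool:
        if not zone_clear:
            # A hand (or anything else) in the zone stops the drill at once.
            self.clear_streak = 0
            self.enabled = False
        else:
            self.clear_streak += 1
            if self.clear_streak >= self.clear_frames_required:
                self.enabled = True
        return self.enabled
```

In use, each camera frame’s analytics result feeds `update()`; the drill’s enable line follows the returned value, so a single occupied frame cuts power while recovery takes a deliberate run of clear frames.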
Boeing Bionics Allow Teammates to Suit up for Safety
In Boeing’s commercial division, the exoskeleton vest is in use or planned for use as personal protective equipment in the 737, 767, 777 and 787 Dreamliner programs. Teams at a number of Boeing sites have tested the vest since 2018. It is rolling out as an innovative enterprise standard tool designed to lessen the pressure mechanics bear as they work repetitive jobs at chest level and above.
“When you activate the vest, it’s somewhere between 5 to 18 pounds (2 to 8 kilograms) offloaded from the wearer,” said Dr. Christopher Reid, a Boeing engineer and Associate Technical Fellow who specializes in ergonomics and wearable technology. “It reduces the stress on the shoulders and ultimately reduces injuries.”
How AMRs change the safety equation
Manufacturers and buyers soon wanted clear safety standards for AMRs from organizations like A3, asking: “What guidance can you provide through a standard to help us assess the safety of these devices?” Wise was on the committee that created R15.08-1-2020, the new safety standard for industrial mobile robots. In April, Fetch announced full conformance with the standard.
As for safety standards, Universal Robots follows the ISO standard published in 2011 (ISO 10218-1), which corresponds to Part 1 of ANSI/RIA R15.06. She noted that European companies like Universal Robots tend to have higher safety requirements, given the European Directives.
Hyundai Motor Group x Boston Dynamics Factory Safety Service Robot
Real-World ML with Coral: Manufacturing
For over 3 years, Coral has been focused on enabling privacy-preserving Edge ML with low-power, high performance products. We’ve released many examples and projects designed to help you quickly accelerate ML for your specific needs. One of the most common requests we get after exploring the Coral models and projects is: How do we move to production?
- Worker Safety - Performs generic person detection (powered by COCO-trained SSDLite MobileDet) and then runs a simple algorithm to detect bounding box collisions to see if a person is in an unsafe region.
- Visual Inspection - Performs apple detection (using the same COCO-trained SSDLite MobileDet from Worker Safety) and then crops the frame to the detected apple and runs a retrained MobileNetV2 that classifies fresh vs rotten apples.
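The Worker Safety example’s “bounding box collision” step reduces to an axis-aligned rectangle intersection test. The sketch below is illustrative only: the zone coordinates, confidence threshold, and detection tuple layout are assumptions, not the Coral project’s actual code.

```python
def boxes_overlap(a, b):
    """Axis-aligned bounding-box intersection test.

    Boxes are (xmin, ymin, xmax, ymax) tuples in pixel coordinates.
    Returns True if the two rectangles share any area.
    """
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

# Hypothetical unsafe region beside a machine (pixel coordinates).
UNSAFE_ZONE = (400, 100, 640, 480)

def person_in_unsafe_region(detections, threshold=0.5):
    """detections: list of (label, score, box) from a person detector,
    e.g. the output of an SSDLite MobileDet model after decoding."""
    return any(
        label == "person" and score > threshold
        and boxes_overlap(box, UNSAFE_ZONE)
        for label, score, box in detections
    )
```

A detection whose box crosses the zone boundary, such as `("person", 0.9, (350, 200, 450, 400))`, trips the check even though most of the person is outside the zone, which is the conservative behavior you want for safety.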
Augmented reality becomes actual reality
When applied to electrical power distribution across a wide range of businesses and industries, AR has the potential to greatly increase power availability, electrical safety, and efficiency. Here’s why:
- Availability: AR helps organizations optimize operations and maximize continuity for better productivity and profitability
- Safety: AR helps to reduce the risk of occupational injuries and fatalities
- Efficiency: AR helps reduce the total cost of ownership by offering more accessible and effective training
AI Vision for Monitoring Applications in Manufacturing and Industrial Environments
In traditional industrial and manufacturing environments, monitoring worker safety, enhancing operator efficiency, and improving quality assurance were manual, labor-intensive tasks. Today, AI-enabled machine vision replaces many of these inefficient operations for greater reliability, safety, and efficiency. This article explores how deploying AI smart cameras makes further performance improvements possible, since the data that powers AI machine vision comes from the camera itself.
Collaboration requires presence sensing
The challenge of automation has always been to keep people safe while trying to produce more product in the same footprint. The faster a machine runs, the more physical space is required to guarantee that, if something goes wrong, the machine has enough time to come to a complete and safe stop before potentially making contact with humans or other machines around it. Traditionally, this would involve a physical cage around the piece of automation. This cage could take the form of a frame with either polycarbonate or expanded steel (fence) panels.
Made to physically prevent a person from getting too close, these guarding systems also take up a lot of real estate. For this reason, they are not well suited to a cobot application, where we don’t want the new automated device taking up any more space than the human it is replacing.
The technology required to respond to this need for an ever tighter operating envelope has advanced dramatically, especially over the past two or three years. While we will delve into that momentarily, it is important to note that the robot manufacturers, in addition to coming up with new ways to sense the presence of people in proximity to the robot, have had to come up with ways to safely limit the range of operation to be inside the normal operating range of the robot.
Amazon’s robot arms break ground in safety, technology
Robin, one of the most complex stationary robot arm systems Amazon has ever built, brings many core technologies to new levels and acts as a glimpse into the possibilities of combining vision, package manipulation and machine learning, said Will Harris, principal product manager of the Robin program.
Those technologies can be seen when Robin goes to work. As soft mailers and boxes move down the conveyor line, Robin must break the jumble down into individual items. This is called image segmentation. People do it automatically, but for a long time, robots only saw a solid blob of pixels.