It’s arrived: Commoditization for industrial process control
The commoditization of industrial process control has brought technological advancements that expand the boundaries of modern manufacturing right to the computing edge. Traditionally, administrators had to walk out to a control system, USB stick in hand, and apply updates manually. Today, thanks to the combined work of Intel Corporation, Schneider Electric, and Red Hat, manufacturers can run an edge-ready, software-defined industrial control system that relieves that manual burden: it runs on commodity hardware and a commodity operating system and uses commodity automation techniques.
IOTA Data Preservation Implementation for Industrial Automation and Control Systems
Blockchain 3.0, an advanced iteration of blockchain technology, has emerged with diverse applications across sectors such as identity authentication, logistics, medical care, and Industry 4.0/5.0. Notably, the integration of blockchain with industrial automation and control systems (IACS) holds immense potential in this evolving landscape. As IACS gain popularity alongside the widespread adoption of 5G networks, Internet of Things (IoT) devices are becoming integral nodes within the blockchain network, facilitating decentralized communication and verification and paving the way for a fully decentralized network. This paper showcases the implementation and execution results of data preservation from IACS to IOTA, a prominent distributed ledger technology. The findings demonstrate the practical application of IOTA in securely preserving IACS data, and the presented numerical results validate the effectiveness and feasibility of leveraging IOTA for seamless data preservation, ensuring data integrity, confidentiality, and transparency. By adopting IOTA’s innovative approach based on a Directed Acyclic Graph (DAG), the paper contributes to the advancement of blockchain technology in the domain of Industry 4.0/5.0.
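The paper's IOTA client code is not reproduced here, but the core mechanism, anchoring a digest of IACS process data in a DAG ledger so its integrity can be verified later, can be sketched in a few lines. Everything below (the `TangleStub` class, the record fields, the parent-selection scheme) is invented for illustration; a real deployment would attach the digest through an IOTA client library rather than this stand-in.

```python
import hashlib
import json

def digest(record: dict) -> str:
    """Deterministic SHA-256 digest of a process-data record."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

class TangleStub:
    """Stand-in for a DAG ledger: each message approves earlier messages (its parents)."""
    def __init__(self):
        self.messages = {}  # message id -> (payload digest, parent ids)

    def attach(self, payload_digest: str, parents: tuple) -> str:
        msg_id = hashlib.sha256((payload_digest + "".join(parents)).encode()).hexdigest()
        self.messages[msg_id] = (payload_digest, parents)
        return msg_id

    def verify(self, msg_id: str, record: dict) -> bool:
        """Check a record against the digest anchored in the ledger."""
        stored_digest, _ = self.messages[msg_id]
        return stored_digest == digest(record)

# A control-system record is hashed locally; only the digest is anchored.
tangle = TangleStub()
reading = {"tag": "FIC-101.PV", "value": 42.7, "ts": "2023-05-01T12:00:00Z"}
msg = tangle.attach(digest(reading), parents=("genesis", "genesis"))

print(tangle.verify(msg, reading))                     # → True  (intact record)
print(tangle.verify(msg, {**reading, "value": 99.9}))  # → False (tampered record)
```

Because only a digest leaves the plant, the scheme preserves integrity and transparency without exposing the raw process values themselves.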
In-Depth Analysis of Cyber Threats to Automotive Factories
We found that Ransomware-as-a-Service (RaaS) operations, such as Conti and LockBit, are active in the automotive industry. These operations are characterized by stealing confidential data from the target organization before encrypting its systems, forcing automakers to face both halted factory operations and public exposure of intellectual property (IP). For example, Continental (a major automotive parts manufacturer) was attacked in August, with some IT systems accessed. It immediately took response measures, restoring normal operations and cooperating with external cybersecurity experts to investigate the incident. In November, however, LockBit claimed on its data leak website to hold 40 TB of Continental’s data, offering to return it for a ransom of $40 million.
Previous studies on automotive factories mainly focus on the general issues in the OT/ICS environment, such as difficulty in executing security updates, knowledge gaps among OT personnel regarding security, and weak vulnerability management. In light of this, TXOne Networks has conducted a detailed analysis of common automotive factory digital transformation applications to explain how attackers can gain initial access and link different threats together into a multi-pronged attack to cause significant damage to automotive factories.
In the study of industrial robots, controllers sometimes enable universal remote connection services (such as FTP or web services) or manufacturer-defined APIs to give operators convenient robot control through the Control Station. However, we found that most robot controllers do not enable any authentication mechanism by default and, in many cases, cannot enable one at all. This allows attackers lurking in the factory to execute arbitrary operations on robots directly, using tools released by the robot manufacturers themselves. In Digital Twin applications, attackers lurking in the factory can likewise exploit vulnerabilities in simulation devices to execute malicious code against their models. When a Digital Twin’s model is attacked, the generated simulation environment can no longer maintain congruency with the physical environment. A tampered model may show no obvious malicious behavior, which is a serious problem because the damage can go undetected and unfixed for a long time. Engineers may unknowingly continue using the damaged Digital Twin, leading to inaccurate research and development or to incorrect factory decisions based on false information, which can result in greater financial losses than a ransomware attack.
In a World First, Yokogawa’s Autonomous Control AI Is Officially Adopted for Use at an ENEOS Materials Chemical Plant
ENEOS Materials Corporation (formerly the elastomers business unit of JSR Corporation) and Yokogawa Electric Corporation (TOKYO: 6841) announce they have reached an agreement that Factorial Kernel Dynamic Policy Programming (FKDPP), a reinforcement learning-based AI algorithm, will be officially adopted for use at an ENEOS Materials chemical plant. This agreement follows a successful field test in which this autonomous control AI demonstrated a high level of performance while controlling a distillation column at this plant for almost an entire year. This is the first example in the world of reinforcement learning AI being formally adopted for direct control of a plant.
Over a consecutive 35-day (840-hour) period, from January 17 to February 21, 2022, this field test initially confirmed that the AI solution could control distillation operations that were beyond the capabilities of existing control methods (PID control/APC) and had previously necessitated manual valve control based on the judgments of experienced plant personnel. Following a scheduled plant shutdown for maintenance and repairs, the field test resumed and has continued to the present date. It has conclusively shown that this solution can control the complex conditions needed to maintain product quality and keep liquids in the distillation column at an appropriate level, while making maximum possible use of waste heat as a heat source. In so doing, it has stabilized quality, achieved high yield, and saved energy.
Flexible, Low-Cost Water Monitoring with Edge I/O
Using the CODESYS control engine on the network’s main controller, an Opto 22 groov EPIC, the team configured each station as a remote I/O point, wrote polling logic, and defined appropriate alarm limits. CODESYS is the team’s preferred control platform because it allows them to use all the IEC languages where they are most appropriate. Typically, they use Structured Text (ST) for math and time calculations, Function Block (FB) for the main program routine, and Ladder (LD) when they need to orchestrate a specific sequence of actions.
Jared’s team decided to flip their approach. Instead of scanning all the remote I/O at high resolution from the main controller, they connected these three sites to an MQTT broker using the modules’ native MQTT publishing capabilities. They chose to use HiveMQ’s cloud-native MQTT broker, which allows 100 MQTT clients to communicate for free, keeping maintenance costs down for the district.
With the exception of the HiveMQ broker, all of this functionality—control engine, HMI server, Node-RED, MQTT publish-subscribe communication, device security—runs on the groov devices and does not require a Windows PC or external server for data or communication.
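As a sketch of the publishing side of such a setup, the snippet below builds a topic and JSON payload for one remote station. The topic namespace, tag names, and broker hostname are all invented for illustration (the article does not specify the district's actual payload format), and the commented-out lines show how such a payload could be published with the Eclipse paho-mqtt client.

```python
import json
import time

def build_message(site: str, readings: dict) -> tuple:
    """Build an MQTT topic and JSON payload for one remote I/O station.
    The topic layout and field names here are illustrative only."""
    topic = f"waterdistrict/{site}/telemetry"
    payload = json.dumps({"ts": int(time.time()), "readings": readings})
    return topic, payload

topic, payload = build_message("liftstation-1",
                               {"wet_well_level_ft": 3.2, "pump1_running": True})
print(topic)  # → waterdistrict/liftstation-1/telemetry

# Publishing with the Eclipse paho-mqtt client (pip install paho-mqtt);
# hostname and credentials below are placeholders, not real endpoints.
# import paho.mqtt.client as mqtt
# client = mqtt.Client()
# client.tls_set()                                   # cloud brokers require TLS
# client.username_pw_set("USER", "PASSWORD")
# client.connect("YOUR-CLUSTER.hivemq.cloud", 8883)
# client.publish(topic, payload, qos=1)
```

Publishing by exception in this report-only style is what lets the stations share one broker connection instead of being polled at high resolution by the main controller.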
The impact of new technologies on automation and digitalization system architectures
Despite advances in telecom and computing technologies, automation architectures have not changed much: many individual components have advanced, but their underlying architectures have largely remained the same. New system architectures for automation should be adopted that better fit new industry operational and business challenges.
Even when new automation systems use updated technology, their architectures are too vertical. Field devices connect through a telecom infrastructure to SCADA/DCS, which is tied to a historian, which in turn connects to applications and business intelligence systems, stacked in a totem-like fashion. This raises problems: diverse user interfaces, many interfaces between layers, and multiple databases. New architectures must be flatter, with fewer database layers and fewer interfaces between levels. The entire system should be based on a unified digital platform that concentrates most of the functionality in fewer levels. This digital platform must also be flexible enough to leverage existing instruments, automation devices, and systems as much as possible.
Control System Process Data Archiving: Strategy and Methodology
Data stored in a historian server is useful for analysis and troubleshooting. Learn about data archiving techniques for control systems, what types of data are stored, and the tools that make it happen. Real-time process data in an EMS/DMS/SCADA system is stored in a relational database that holds only the current values of each object’s measurements and statuses, so it retains no process history. Historical data stored in the archiving/historian server can be viewed through reports, trends/curves, and single-line diagrams on the operator workstation.
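One widely used archiving technique, not specific to any vendor, is exception (deadband) reporting: a new sample is written to the historian only when it differs from the last archived value by more than a configured deadband, which drastically cuts storage for slowly moving signals. A minimal sketch with invented sample values:

```python
def archive_with_deadband(samples, deadband):
    """Archive a (timestamp, value) sample only when it moves more than
    `deadband` away from the last archived value (exception reporting)."""
    archived = []
    for ts, value in samples:
        if not archived or abs(value - archived[-1][1]) > deadband:
            archived.append((ts, value))
    return archived

# Six raw scans of a mostly flat signal; only the meaningful moves survive.
raw = [(0, 50.0), (1, 50.1), (2, 50.05), (3, 51.2), (4, 51.25), (5, 49.0)]
print(archive_with_deadband(raw, deadband=0.5))
# → [(0, 50.0), (3, 51.2), (5, 49.0)]
```

Production historians typically combine this with a swinging-door or similar compression pass, but the deadband filter above is the first line of defense against archiving noise.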
Control systems evolve to meet enterprise and operational needs
For decades, selection of plant control and reliability technologies was frequently a matter of convenience. Individual plants across an organization selected technology based on price and local technical preference, often resulting in a wide variety of technologies across the enterprise. At the time, such decisions were convenient, reliable, and cost-effective. However, new, modern technologies — coupled with the need for increased sustainability and market agility — have changed the paradigm, driving a shift in the way engineers design automation solutions.
As the many layers of the Purdue model of industrial engineering have flattened in the cloud and edge computing age, connectivity has become more important. Today, forward-thinking process manufacturers are making automation decisions with an enterprise IT mindset, moving away from a collection of local systems to a single system that is deployed everywhere. In doing so, they unlock the capacity for improved data democratization, fleet optimization, and improved personnel productivity.
Feds Uncover a ‘Swiss Army Knife’ for Hacking Industrial Control Systems
On Wednesday, the Department of Energy, the Cybersecurity and Infrastructure Security Agency, the NSA, and the FBI jointly released an advisory about a new hacker toolset potentially capable of meddling with a wide range of industrial control system equipment. More than any previous industrial control system hacking toolkit, the malware contains an array of components designed to disrupt or take control of the functioning of devices, including programmable logic controllers (PLCs) that are sold by Schneider Electric and OMRON and are designed to serve as the interface between traditional computers and the actuators and sensors in industrial environments. Another component of the malware is designed to target Open Platform Communications Unified Architecture (OPC UA) servers—the computers that communicate with those controllers.
Dragos says the malware can hijack target devices, disrupt or prevent operators from accessing them, permanently brick them, or even use them as a foothold to give hackers access to other parts of an industrial control system network. Sergio Caltagirone of Dragos notes that while the toolkit, which Dragos calls “Pipedream,” appears to specifically target Schneider Electric and OMRON PLCs, it does so by exploiting underlying software in those PLCs known as Codesys, which is used far more broadly across hundreds of other types of PLCs. This means the malware could easily be adapted to work in almost any industrial environment. “This toolset is so big that it’s basically a free-for-all,” Caltagirone says. “There’s enough in here for everyone to worry about.”
A Framework for Enhancing the Interoperability of Information across a Plant
Since it is becoming increasingly difficult for a single vendor to meet diversifying user requirements by itself, interoperability among multi-vendor components and control systems, such as distributed control systems (DCS) and programmable logic controllers (PLC), has been improved by adopting open industrial communication protocols. However, these protocols, and the information generated, stored, and transferred through them, are not fully compatible with each other. Accordingly, the open platform communications unified architecture (OPC UA) and related international standards are attracting attention from many vendors and users as a key to high interoperability. This paper introduces how OPC UA improves interoperability among plant components and systems and describes Yokogawa’s prospects.
This paper introduced FITS and OPC UA FX as standard technologies related to OPC UA. Conventionally, a plant operation system is built by stacking various specialized elements. Such systems are expected to be integrated vertically and horizontally through industry-level interoperability standards, including OPC UA. As a result, the functional hierarchy will flatten, and diverse components and systems will cooperate with each other regardless of vendor or application. Yokogawa focuses on interoperability in the cooperative domain, as discussed in this paper, and is actively participating in the standardization of FITS, OPC UA FX, and IEC/IEEE 60802.
What is the future of Control Systems? The Evolution of Control Systems.
Artificial intelligence optimally controls your plant
Until now, heating systems have mainly been controlled individually or via a building management system. Building management systems follow a preset temperature profile, meaning they always try to adhere to predefined target temperatures. The temperature in a conference room changes in response to environmental influences like sunlight or the number of people present. Simple (PI or PID) controllers are used to make constant adjustments so that the measured room temperature is as close to the target temperature values as possible.
We believe the best alternative is learning a control strategy by means of reinforcement learning (RL). Reinforcement learning is a machine learning method with no explicitly prescribed learning target. Instead, an “agent” with as complete a knowledge of the system state as possible learns the manipulated-variable changes that maximize a “reward” function defined by humans. Using reinforcement learning algorithms, the agent, that is, the control strategy, can be trained from both current and recorded system data. This requires measurements of the manipulated-variable changes that have been carried out, of the resulting changes to the system state over time, and of the variables needed to calculate the reward.
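The loop described above, an agent observing the system state, trying manipulated-variable changes, and maximizing a human-defined reward, can be sketched with tabular Q-learning on a toy room model. The thermal model, state discretization, and hyperparameters below are invented for illustration and are far simpler than anything used in a real building controller.

```python
import random

random.seed(0)

SETPOINT = 21.0
ACTIONS = [0.0, 0.5, 1.0]          # heater power levels (manipulated variable)
ALPHA, GAMMA, EPS = 0.3, 0.9, 0.1  # learning rate, discount, exploration

def step(temp, power):
    """Toy room model: heat input vs. loss to a 15 °C environment."""
    return temp + 2.0 * power - 0.1 * (temp - 15.0)

def bucket(temp):
    return int(round(temp))        # discretize the continuous state

Q = {}
def q(s, a):
    return Q.get((s, a), 0.0)

# Train: epsilon-greedy action choice, reward = closeness to setpoint.
temp = 17.0
for _ in range(5000):
    s = bucket(temp)
    if random.random() < EPS:
        a = random.choice(ACTIONS)
    else:
        a = max(ACTIONS, key=lambda x: q(s, x))
    temp = step(temp, a)
    r = -abs(temp - SETPOINT)
    best_next = max(q(bucket(temp), x) for x in ACTIONS)
    Q[(s, a)] = q(s, a) + ALPHA * (r + GAMMA * best_next - q(s, a))

# Run the learned greedy policy and see where the temperature settles.
temp = 17.0
for _ in range(100):
    a = max(ACTIONS, key=lambda x: q(bucket(temp), x))
    temp = step(temp, a)
print(round(temp, 1))
```

The reward function encodes the control objective directly, which is what lets the same recipe be retargeted to comfort, energy cost, or any weighted combination.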
Why resources companies are looking to evented APIs
Resources companies that want to get the most value from their data will process it the instant that it is created. The longer that data is left unprocessed, the more it diminishes in value. Operational excellence can be driven by evented APIs that can produce, detect, consume, and react to events occurring within the technology ecosystem.
Evented APIs can be applied to our example use case to deliver an autonomous feedback loop that incorporates smarter decision making in real-time.
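A minimal in-process sketch of that produce/detect/consume/react pattern is below; the `EventBus` class, event names, and tag values are invented, and a production system would use a broker or event gateway rather than this toy.

```python
from collections import defaultdict

class EventBus:
    """Minimal in-process event bus: handlers react the moment an event fires,
    rather than polling for changes."""
    def __init__(self):
        self.handlers = defaultdict(list)

    def on(self, event_type, handler):
        self.handlers[event_type].append(handler)

    def emit(self, event_type, payload):
        for handler in self.handlers[event_type]:
            handler(payload)

bus = EventBus()
actions = []

# React to a high-vibration event the instant it is produced,
# closing the feedback loop without waiting on a batch process.
bus.on("vibration.high", lambda e: actions.append(f"slow pump {e['asset']}"))
bus.emit("vibration.high", {"asset": "P-204", "mm_s": 11.3})
print(actions)  # → ['slow pump P-204']
```

The value claim in the text maps directly onto this structure: the reaction runs at the moment of data creation, when the data is worth the most.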
Evolving control systems are key to improved performance
For decades, the control system was constrained by physical hardware: hardwired input/output (I/O) layouts, connected controllers and structured architectures including dedicated networks and server configurations. Now, the lower cost of processing power and sensing, the evolution of network and wireless infrastructure, and distributed architectures (including the cloud) are unlocking new opportunities in control systems. Additionally, emerging standards for plug-and-produce, such as advanced physical layer (APL) and modular type package (MTP) interfaces, will drive significant changes in the way plants design and use control systems over the next decade.
Evolution of control systems with artificial intelligence
Control systems have continuously evolved over decades, and artificial intelligence (AI) technologies are helping advance the next generation of some control systems.
The proportional-integral-derivative (PID) controller can be interpreted as a layering of capabilities: the proportional term reacts to the current error signal, the integral term homes in on the setpoint by eliminating steady-state offset, and the derivative term can minimize overshoot.
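That layering is easy to see in code. Below is a minimal discrete-time PID driving a toy first-order process; the gains and the process model are illustrative, not tuned values from any real loop.

```python
class PID:
    """Textbook PID: P reacts to the error, I removes steady-state offset,
    D damps overshoot. Gains here are illustrative only."""
    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = None

    def update(self, measurement, dt):
        error = self.setpoint - measurement
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive a self-regulating first-order process toward a setpoint of 50.
pid = PID(kp=2.0, ki=0.5, kd=0.1, setpoint=50.0)
pv = 20.0
for _ in range(200):
    out = pid.update(pv, dt=0.1)
    pv += 0.1 * (out - 0.5 * pv)   # toy process: gain minus self-regulation
print(round(pv, 1))
```

The integral term is what lets the loop hold the setpoint exactly: at steady state the proportional and derivative terms go to zero, and the accumulated integral supplies the sustained controller output the process needs.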
Although the controls ecosystem may present a complex web of interrelated technologies, it can also be simplified by viewing it as ever-evolving branches of a family tree. Each control system technology offers its own characteristics not available in prior technologies. For example, feed forward improves PID control by predicting controller output, and then uses the predictions to separate disturbance errors from noise occurrences. Model predictive control (MPC) adds further capabilities to this by layering predictions of future control action results and controlling multiple correlated inputs and outputs. The latest evolution of control strategies is the adoption of AI technologies to develop industrial controls.
Sensor Fusion: The Swiss Army Knife of Digitalization
With the proper communication protocols and network architecture in place, smart sensor technology and the data it provides can be the bulwark on which digital transformation is built.
If industrial control systems are the brains of a plant, then sensors are its eyes and ears. Simply put, without sensors there would be nothing for SCADA, DCS, or PLCs to respond to. That’s why increasingly intelligent or ‘smart’ sensors packing more onboard processing power, the ability to monitor new variables, and digital communication capabilities are playing such an important role in helping plant operators and enterprise level planners alike to see better and respond to problems with more finesse.
Cooperation between Control Technology and AI Technology to Improve Plant Operation
As the manufacturing industry is shifting its production model from mass production to the production of multiple products in small or variable quantities, more sophisticated operation of production equipment is required. Yokogawa has a unique approach to this problem, which was adopted by the New Energy and Industrial Technology Development Organization (NEDO). This paper describes details of this NEDO project and its achievements, as well as a study on the effective use of AI technology, which is another theme of this project.
In the NEDO project, to create this time-series model, we used effective nonlinear methods: the multilayer perceptron (MLP), BiLSTM, and QRNN. To verify whether the time-series model could reproduce the behavior of the target process, we evaluated its accuracy index and obtained correlation coefficients greater than 0.7. In addition, we used the model to solve the optimization problem and automatically calculate the optimal control parameters (PID values).
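The project's plant data and exact model configurations are not public, but the general recipe, fit a nonlinear model to windowed time-series data and check the correlation of its predictions, can be sketched on synthetic data. The series, window length, network size, and learning rate below are invented for illustration, and a plain one-hidden-layer MLP in NumPy stands in for the MLP/BiLSTM/QRNN models used in the project.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic nonlinear signal standing in for (non-public) plant data.
t = np.arange(500)
series = np.sin(0.1 * t) + 0.5 * np.sin(0.05 * t) ** 2

WINDOW = 8  # predict the next value from the previous 8 samples
X = np.stack([series[i:i + WINDOW] for i in range(len(series) - WINDOW)])
y = series[WINDOW:]

# One hidden layer, trained with plain batch gradient descent on MSE.
W1 = rng.normal(0.0, 0.5, (WINDOW, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, (16, 1));      b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, (h @ W2 + b2).ravel()

lr, losses = 0.1, []
for _ in range(2000):
    h, pred = forward(X)
    err = pred - y
    losses.append(float(np.mean(err ** 2)))
    # Backpropagate through both layers (dh uses W2 before it is updated).
    g2 = h.T @ err[:, None] / len(y)
    dh = (err[:, None] @ W2.T) * (1.0 - h ** 2)
    g1 = X.T @ dh / len(y)
    W2 -= lr * g2; b2 -= lr * err.mean()
    W1 -= lr * g1; b1 -= lr * dh.mean(axis=0)

# Accuracy index: correlation between predictions and the true series.
_, pred = forward(X)
corr = float(np.corrcoef(pred, y)[0, 1])
print(losses[-1] < losses[0], round(corr, 2))
```

Once such a surrogate model reproduces the process well enough, it can serve as the plant stand-in inside an optimization loop, which is the role the project's model plays when computing optimal PID values.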