



Integrated antennas, Power over Ethernet, and IEC 62443: With the outdoor cellular routers of the Cellulink series, you can simplify your planning and installation effort and benefit from high cybersecurity at the same time. For optimum cellular connectivity in your application.
❯ For more information, visit phoenixcontact.com/cellulink

Industrial AI is entering a decisive new phase—one where it stops being an experimental add-on and becomes the core operating system of modern industry. The next decade will be defined by AI-driven autonomy, hyper-connected systems, and a shift from reactive operations to predictive, self-optimizing ecosystems. This transformation is already visible in the rapid digitization of factories, the rise of IIoT, and the massive global investment in AI infrastructure.
AI as the Brain of the Industrial Internet of Things: The Industrial Internet of Things has moved from trend to standard practice, and AI is becoming its intelligence layer. Connected machines generate torrents of real-time data, and AI systems increasingly interpret that data to optimize production, reduce downtime, and enhance safety.
As edge computing matures, more AI will run directly on devices near the production line, enabling ultra-low-latency decision-making. This distributed intelligence reduces reliance on centralized systems and allows factories to operate with greater autonomy.
Hybrid Edge–Cloud Intelligence: Industrial AI will increasingly operate across hybrid architectures. Edge devices will handle immediate, safety-critical decisions, while cloud platforms will manage long-horizon optimization, simulation, and fleet-wide learning. This hybrid model enables digital twins that simulate entire plants, cross-site optimization, and centralized analytics with localized execution.
AI‑Driven Operations: Manufacturers are moving toward “lights-out” operations—facilities that run with minimal human intervention. AI will orchestrate scheduling, inventory, energy usage, and maintenance. Robotics, guided by AI vision and planning systems, will handle increasingly complex tasks.
AI as a Macro‑Economic Force: AI is no longer just a technology trend—it is a structural economic force. Global investment in AI-related infrastructure is projected to exceed $2.9 trillion by 2028, driven largely by data center expansion and industrial adoption. This scale of investment signals a shift: AI is becoming as fundamental to industrial growth as electricity or computing once were.
Industrial AI is evolving from a set of tools into a full industrial nervous system—one that senses, decides, and optimizes at scale. The companies that thrive will be those that embrace hybrid architectures, invest in workforce transformation, and treat AI not as a project but as a foundational capability.
Al Presher


The next issue of Industrial Ethernet magazine will be published in May/June 2026. Deadline for editorial: May 15, 2026. Advertising deadline: May 15, 2026.
Editor: Al Presher, editor@iebmedia.com
Advertising: info@iebmedia.com Tel.: +1 585-598-6627
Free Subscription: iebmedia.com/subscribe Published by
OPC Foundation and CEN/CENELEC support Europe’s Digital Product Passport vision with open, interoperable standards for scalable and trusted data exchange.

OPC FOUNDATION ANNOUNCED A LIAISON Agreement with the European Committee for Standardization (CEN) and the European Committee for Electrotechnical Standardization (CENELEC), marking an important step forward in the standardization of Digital Product Passport (DPP) solutions in Europe.
Through this liaison, the OPC Foundation will actively contribute to the work of CEN/CENELEC JTC 24 – Digital Product Passport – Framework and System, providing technical expertise and input through direct participation in the relevant working groups. The cooperation focuses on aligning data modeling technologies and fostering interoperable, system-agnostic and open-source DPP solutions that scale seamlessly from embedded devices to cloud and enterprise environments. The collaboration supports the development of interoperable DPP solutions that ensure trusted, semantically rich data exchange, regulatory compliance, and scalable deployments across sector-specific data spaces. By combining the European standardization framework with OPC UA’s proven interoperability technologies, the
liaison contributes to a harmonized and future-proof approach to Digital Product Passports.
Although OPC UA is widely known for secure information transport in automation, its true strength lies in its rich, semantic information modeling capabilities. Through the liaison with CEN/CENELEC JTC 24, OPC UA will be used and extended with standardized Digital Product Passport data models and interfaces, supporting system-agnostic and interoperable DPP implementations.
Commenting on the cooperation, Bmstr. Ing. Otto Handle, Convenor of CEN/CENELEC JTC 24 WG4 – Digital Product Passport – Interoperability, stated: “One aim of the standardization work of JTC 24 is interoperability beyond all system borders. The standardization request of the European Commission clearly states that the DPP shall be system agnostic and vendor independent. Integration and interoperability with the full DPP system shall be possible for any software and any system according to these standards. Therefore, we highly appreciate and welcome OPC Foundation in the Liaison and look forward to the modelling of DPPs with OPC UA and integration of the specified JTC24 API and data models, as well as shop floor integration with OPC UA-enabled embedded systems as
DPP data sources.”
Erich Barnstedt, Microsoft, liaison officer for the OPC Foundation said: “This liaison ensures that Digital Product Passports (DPPs) are built on truly open, standardized, and interoperable foundations. By contributing the very successful OPC UA modelling capabilities to DPPs, including an open-source reference implementation, we help enable seamless integration of DPP data across vendors, platforms, and system layers. What is remarkable is the timeline in which this was possible: From initial concepts to finished implementation, the OPC Foundation only took 2 weeks.”
The Liaison Agreement is established for an unlimited period, contingent on the OPC Foundation’s continued participation in the work of JTC 24. It reflects a shared ambition to advance open, interoperable, and internationally aligned standards that support the European Union’s digital transformation and sustainability goals.
By working together, CEN, CENELEC, and the OPC Foundation reaffirm their commitment to ensuring that Digital Product Passports are built on robust, open standards — enabling innovation, interoperability, and long-term value across industries.
News report by OPC Foundation.
TwinCAT Machine Learning: AI is simply integrated into the control level
AI models as a function block in the PLC: AI as a component of the control code
real-time execution on the standard control IPC: in sync with motion, sequential logic, vision, and much more
acceleration of complex AI models: Beckhoff IPC with NVIDIA GPU and interface from the PLC
automated training of AI models: AI model creation that doesn't require AI expertise
open interface for trained AI models (ONNX): trained AI with interoperability
AI model lifecycle management: model updates without compilation, stop, and restart


OPC UA for standardized, secure communications; MQTT as a lightweight publish/subscribe protocol ideal for IoT and edge devices; and AI to leverage data from OPC UA and MQTT-enabled devices for predictive maintenance, real-time quality control and adaptive learning: together, a powerful combination.

OPC UA, MQTT, AND AI TECHNOLOGIES ARE transforming manufacturing networks into hyper-connected, intelligent systems that enable real-time data exchange and decision-making. These advancements promise scalable, efficient, and more sustainable industrial operations.
For this special report, Industrial Ethernet reached out to industry experts for their views on how these technologies are reshaping the future of hyperconnected factory networks.
Standardized secure communication: OPC UA provides a vendor-neutral, secure and scalable communication framework that supports interoperability from shop floor devices to cloud platforms, enabling future-proof smart manufacturing.
Efficient real-time data transmission: MQTT offers a lightweight publish/subscribe protocol ideal for IoT and edge devices, ensuring reliable, low-bandwidth data exchange and simplifying cloud integration for advanced analytics.
AI driven optimization and autonomy: AI leverages data from OPC UA and MQTT-enabled devices to deliver predictive maintenance, real-time quality control, autonomous system operation, and adaptive learning that continuously optimizes manufacturing processes.
Transformative industrial impact: Together, these technologies enable predictive maintenance, dynamic process optimization, and autonomous systems that increase efficiency, scalability, sustainability, and competitiveness, while addressing integration, cybersecurity, and scalability challenges in modern industrial environments.
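The publish/subscribe decoupling these points describe can be sketched with a minimal in-process broker in Python. This is a toy illustration, not the MQTT protocol itself: `MiniBroker`, the topic names, and the callback are all hypothetical, and a real deployment would use a broker such as Mosquitto with a client library like paho-mqtt.

```python
# Minimal in-process sketch of MQTT-style publish/subscribe topic matching.
# Hypothetical illustration only; a real system would use an MQTT broker.

def topic_matches(filter_: str, topic: str) -> bool:
    """Match an MQTT topic filter ('+' = one level, '#' = remainder)."""
    f_parts, t_parts = filter_.split("/"), topic.split("/")
    for i, f in enumerate(f_parts):
        if f == "#":
            return True                      # '#' matches everything below
        if i >= len(t_parts):
            return False
        if f != "+" and f != t_parts[i]:
            return False
    return len(f_parts) == len(t_parts)

class MiniBroker:
    def __init__(self):
        self.subscriptions = []              # (filter, callback) pairs

    def subscribe(self, topic_filter, callback):
        self.subscriptions.append((topic_filter, callback))

    def publish(self, topic, payload):
        for topic_filter, callback in self.subscriptions:
            if topic_matches(topic_filter, topic):
                callback(topic, payload)

broker = MiniBroker()
received = []
# One analytics consumer sees every machine's temperature, at any site.
broker.subscribe("site/+/machine/+/temperature",
                 lambda t, p: received.append((t, p)))
broker.publish("site/berlin/machine/m1/temperature", 71.5)
broker.publish("site/berlin/machine/m1/pressure", 2.1)   # filtered out
```

Because producers publish to topics rather than to consumers, new analytics or AI subscribers can be added without touching the devices, which is the decoupling property the experts below return to repeatedly.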
AI is helping to unlock more value from operational data.
“Enterprise manufacturing networks are evolving rapidly as manufacturers look to unlock more value from their operational data,” Joseph Biondo, Sr. Program Manager at Rockwell Automation told Industrial Ethernet recently. “Today, roughly 95% of manufacturers say they have already invested in AI or machine learning or plan to within the next five years. In manufacturing environments, AI is already helping improve quality control, optimize processes and strengthen cybersecurity. And over time, it will help manufacturers shift from automated operations to more autonomous ones.”
“On the connectivity side, OPC UA and MQTT are enabling more scalable data architectures across the plant and enterprise. OPC UA has long been used to move machine data to manufacturing execution systems (MES) for data collection, historians and the like,” Biondo said. “MQTT is newer to the operations technology (OT) environment, but it’s lightweight, more ‘enterprise-centric’ and familiar to software developers. As a result, we’re seeing more machinery/equipment OEMs and technology suppliers adapt MQTT to simplify and standardize data communications between equipment and enterprise systems.”
Biondo said that AI is enabling manufacturers to analyze and act on massive volumes of production data in ways that simply weren’t possible before. This deeper level of insight creates new possibilities at every level of production.
One area that’s top of mind today is quality. About three in four OEMs say they view AI or machine learning as critical to designing quality into equipment, and half of manufacturers say they plan to use AI and machine learning for quality control.
AI-enabled technologies already exist to help manufacturers improve if not revolutionize quality control. For example, inspection solutions that use AI and machine learning can identify subtle anomalies and notify production teams so they can detect issues earlier and respond faster.
From a connectivity standpoint, technologies like OPC UA and MQTT help standardize how machine data is structured and shared. This requires an agreed-upon set of datapoints that correspond to machine/equipment metrics, so information can be reliably collected and analyzed. For example, energy metrics such as electricity usage in each machine state need to be mapped to an output in the controller, and the data source(s) need to be defined down to the tags in the controller and any other devices (e.g. variable frequency drives) that are drawing electric current. This is a key part of data communications, regardless of the protocol, and often it is not clearly defined, especially if the communications are being retrofitted on existing equipment.
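As a sketch of that kind of agreed-upon datapoint definition, the fragment below pins logical metrics to controller tags and aggregates energy per machine state. All source names, tag addresses, and values are invented for illustration, not taken from any real controller.

```python
# Sketch of an agreed-upon datapoint map: logical metrics are pinned to
# concrete controller tags so collection is unambiguous. The sources, tag
# addresses, and sample values below are hypothetical examples.

DATAPOINT_MAP = {
    "machine_state":  {"source": "PLC1",  "tag": "DB10.State"},
    "line_current_a": {"source": "VFD1",  "tag": "Par.3-00"},   # drive current
    "energy_kwh":     {"source": "Meter", "tag": "MB.40001"},
}

def energy_by_state(samples):
    """Aggregate incremental energy readings (kWh) per machine state."""
    totals = {}
    for state, delta_kwh in samples:
        totals[state] = round(totals.get(state, 0.0) + delta_kwh, 3)
    return totals

# (state, incremental kWh) pairs collected via the mapped tags
samples = [("Running", 1.2), ("Idle", 0.1), ("Running", 1.3), ("Idle", 0.2)]
print(energy_by_state(samples))   # {'Running': 2.5, 'Idle': 0.3}
```

The point of the explicit map is exactly what the paragraph argues: once every metric names its source device and tag, the same collection code works whether the transport underneath is OPC UA, MQTT, or a retrofit gateway.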
MQTT offers advantages in industrial networks because it’s lightweight and resilient, particularly with unreliable networks. That makes it well suited for non-hardwired networks.
“Manufacturers are investing heavily in AI because it has the power to transform multiple areas of production. In fact, AI is now the second largest technology investment priority for many manufacturers, behind only cloud or software-as-a-service technologies. Quality is certainly a candidate as has been discussed, but AI can also help address needs like process optimization,” Biondo said.
“We’ve seen manufacturers achieve measurable results with these types of solutions. In one example, a frozen foods maker deployed model predictive control to improve throughput, yield and energy usage,” he said. “Now, they’re in a multi-year rollout of the solution at sites across five continents.”
He added that AI can also help strengthen cybersecurity. By analyzing data across multiple sources, AI can detect anomalies that may indicate a threat, filter out false alarms and automate parts of the response and remediation process.
Historically, Biondo said there were “house protocols,” like Modbus, that were opened up to more technology suppliers. Technologies like OPC UA and MQTT were designed from the ground up to be open, which helps simplify integration across complex environments. MQTT especially comes more from the consumer software world, so its applicability is more known and supported on the IT/enterprise side.
“AI is already transforming production operations, and its impact will accelerate as it’s combined with other technologies like industrial robots to create self-learning and self-optimizing machines that make autonomous production possible,” Biondo said.
“In this new environment, automation systems can orchestrate production and self-optimize all on their own. Operations management systems can self-organize schedules and resources,” he said. “Intelligent motion technology and autonomous mobile robots can move materials and products efficiently all the way to packaging. And maintenance systems can self-diagnose equipment and recommend work orders.”
“This shift represents the next big evolution in manufacturing – from automated production to autonomous production,” he added. “It has the potential to improve efficiency, enhance safety and help manufacturers operate more sustainably. And AI makes it possible.”
A connected data foundation for trusted and contextualized industrial data.
According to Paul Pereda, Engineering Manager-Systems Operations for Yokogawa, OPC UA, MQTT, and AI are key enablers of open, secure, and scalable enterprise manufacturing networks from Yokogawa’s perspective.
“OPC UA serves as the foundation for trusted and contextualized industrial data, ensuring interoperability, data integrity, and cybersecurity across multivendor environments,” Pereda said. “MQTT complements this by enabling efficient, event-driven data distribution from the plant floor to enterprise and cloud systems, supporting modern IT/OT convergence architectures.”
Pereda said that these technologies are already embedded in Yokogawa’s portfolio.
Solutions such as the Collaborative Information Server (CI Server) leverage open standards to aggregate, contextualize, and manage data from distributed control systems, safety instrumented systems, field devices, and third party applications. By breaking down data silos, CI Server makes
operational information readily available for enterprise applications, analytics platforms, and digital transformation initiatives—while preserving control system integrity.
Building on this connected data foundation, AI and advanced analytics play a critical role in turning information into actionable insights. Applied across operations and asset management, AI enables early anomaly detection, predictive maintenance, and performance optimization. Together, these capabilities support Yokogawa’s vision of resilient, data-driven, and increasingly autonomous manufacturing operations, helping customers improve reliability, efficiency, and long-term operational sustainability.
“These technologies provide concrete technical advantages that are enabling a new class of IIoT applications beyond what traditional monitoring and dashboard centric systems can deliver today,” Pereda added.
“OPC UA enables IIoT success by providing semantic data models, standardized information structures, and built-in security. Unlike typical applications that rely on raw tags or flat data streams, OPC UA preserves engineering context—such as asset relationships, units, states, and events—making data immediately usable for advanced analytics, AI models, and cross-system integration, all without extensive custom engineering,” he said.
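The contrast Pereda draws between a flat tag and a contextualized node can be sketched in a few lines of Python. This is a simplified illustration of the idea, not the actual OPC UA address-space API; the `Node` fields, tag name, and asset names are assumptions made for the example.

```python
from dataclasses import dataclass, field

# Simplified sketch of a raw tag versus an OPC UA-style contextualized node.
# Field names are illustrative, not the OPC UA specification.

raw_tag = ("T_401", 71.5)   # flat value: no units, no asset, no meaning

@dataclass
class Node:
    browse_name: str
    value: float
    engineering_unit: str
    parent_asset: str                       # asset relationship
    state: str = "Normal"
    children: list = field(default_factory=list)

temperature = Node(
    browse_name="BearingTemperature",
    value=71.5,
    engineering_unit="degC",
    parent_asset="Pump_P401",
)

def usable_without_custom_engineering(node) -> bool:
    """An analytics pipeline can consume the node if context travels with it."""
    return bool(node.engineering_unit and node.parent_asset)

print(usable_without_custom_engineering(temperature))   # True
```

With the raw tuple, a downstream AI model would need a hand-maintained lookup to learn what "T_401" means; with the node, the unit and asset relationship arrive alongside the value, which is the "context travels with the data" property described above.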
He explained that the key is that MQTT delivers event-driven, scalable, and bandwidth-efficient data distribution, which typical polling-based applications cannot achieve efficiently. Its publish/subscribe architecture decouples data producers from consumers, enabling large-scale, multi-site IIoT deployments and real-time data sharing across enterprise and cloud platforms with minimal overhead.
AI, when applied to contextualized and reliable OT data, provides benefits that go far beyond today’s rule-based alarms and historical trending. AI supports early anomaly detection, predictive failure identification, and prescriptive recommendations, allowing systems to learn normal behavior and detect subtle degradation patterns that are impractical to capture with static thresholds.
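A minimal sketch of that contrast between a static threshold and a learned baseline, with invented readings and a simple 3-sigma rule standing in for a real model:

```python
import statistics

# Toy sketch: learn "normal" from history, then flag subtle drift that a
# coarse static alarm limit would miss. All numbers are illustrative.

def fit_baseline(history):
    """Learn the normal operating band from known-good samples."""
    return statistics.mean(history), statistics.stdev(history)

def is_anomaly(value, mean, stdev, k=3.0):
    return abs(value - mean) > k * stdev

# Vibration readings during known-good operation
history = [2.0, 2.1, 1.9, 2.0, 2.2, 2.0, 1.8, 2.1]
mean, stdev = fit_baseline(history)

STATIC_LIMIT = 5.0          # a coarse rule-based alarm threshold
reading = 2.9               # subtle degradation, far below the static limit
print(reading > STATIC_LIMIT)            # False: the static rule misses it
print(is_anomaly(reading, mean, stdev))  # True: the learned baseline flags it
```

The same idea scales up: production models replace the mean/stdev pair with multivariate or sequence models, but the principle of comparing against learned normal behavior rather than a fixed limit is identical.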
Together, these technologies shift IIoT from isolated visibility tools to scalable, intelligent, and operationally integrated systems, enabling predictive, resilient, and increasingly autonomous manufacturing operations aligned with Yokogawa’s digital transformation vision.
“The prospects for applying OPC UA, MQTT, and AI in industry are strong and accelerating, as manufacturers move from isolated digital pilots toward enterprise-scale, operationally embedded IIoT
solutions. The combined use of OPC UA and MQTT is increasingly becoming a standard architectural pattern, where OPC UA provides structured, secure, and contextualized OT data, and MQTT enables efficient distribution of that data across plants, enterprises, and cloud environments,” Pereda said.
As these architectures mature, IIoT solutions are expected to expand beyond monitoring and reporting into closed-loop optimization, remote operations, and decision support. The availability of standardized data models and scalable messaging enables faster deployment of applications across multiple sites, reducing engineering effort and improving consistency in global operations.
AI will further amplify the impact by enabling systems to continuously learn from operational data, supporting predictive maintenance, process optimization, and operator decision assistance. Rather than relying on static rules or historical analysis, future solutions will increasingly support adaptive and prescriptive capabilities that improve reliability, energy efficiency, and safety.
“Overall, these developments are expected to shift IIoT from a supporting role to a core enabler of resilient, sustainable, and increasingly autonomous industrial operations, aligning with Yokogawa’s long-term vision for IT/OT convergence and digital transformation,” Pereda said.
Pereda said that these technologies directly address several long-standing challenges faced by automation and control engineers.
One of the most significant challenges is system fragmentation and data silos across distributed control systems, programmable logic controllers, supervisory control and data acquisition (SCADA) systems, historians, and enterprise systems. OPC UA addresses this challenge by providing a standardized, secure, and semantically rich framework that preserves engineering context, reducing custom interfaces and manual data mapping efforts.
Another key challenge is scalability and efficient data distribution. Traditional polling-based architectures and point-to-point integrations become complex and costly as systems expand. MQTT’s publish/subscribe model simplifies system expansion, supports event-driven architectures, and enables reliable data sharing across sites and cloud environments with minimal engineering overhead.
Engineers also face increasing pressure to support advanced analytics and AI, often on brownfield systems never designed for these use cases. By combining OPC UA’s contextualized data with scalable data pipelines, AI can be applied more effectively for anomaly detection, predictive maintenance, and operational optimization, reducing reliance on static rules and manual tuning.
Looking forward, the ongoing impact of these technologies will be a shift in the engineer’s role, from maintaining isolated control systems to designing resilient, data-centric architectures that support continuous optimization, remote operations, and progressive autonomy. This transition will improve engineering efficiency while enabling safer, more reliable, and more sustainable industrial operations.
“The impact of OPC UA, MQTT, and AI will continue to grow as these technologies become the default foundation for industrial digital architectures rather than optional add-ons. Looking ahead, their combined use will accelerate the shift from isolated automation systems toward enterprise-connected, data-centric operations, enabling faster deployment of IIoT solutions across multiple plants and regions,” Pereda said.
“In the near term, OPC UA and MQTT will further standardize how OT data is structured, secured, and distributed, significantly reducing integration complexity for automation and control engineers,” he added. “This will enable organizations to scale digital initiatives more predictably, while maintaining cybersecurity and system reliability.”
Pereda’s perspective is that, over the next three years, AI is expected to move from experimental pilots to operationally embedded capabilities. Rather than replacing control systems, AI will increasingly augment them, supporting predictive
maintenance, process optimization, energy efficiency, and operator decisions. These AI applications will rely heavily on high-quality, contextualized OT data to deliver trustworthy and explainable results suitable for industrial environments.
“The anticipated impact is a gradual transition toward more autonomous, resilient, and sustainable manufacturing operations, one where engineers spend less time managing data and alarms and more time designing systems that continuously learn, adapt, and optimize performance, fully aligned with Yokogawa’s long-term vision for IT/OT convergence and industrial autonomy,” Pereda concluded.
Connecting machines, plants and IT systems in a secure, interoperable and scalable way.
“In 2026, OPC UA, MQTT and AI will be the central building blocks of modern manufacturing networks because they connect machines, plants and IT systems in a secure, interoperable and scalable way. OPC UA provides semantically structured and standardized communication, which has been further strengthened by the latest version, IEC 62541-5:2026,” Arno Martin Fast, Senior Specialist PLCnext Technology for Phoenix Contact GmbH told Industrial Ethernet recently.
“This updated information model architecture enables more detailed data models, better extensibility and greater interoperability between systems. This forms the basis for networked production architectures in which machines from different manufacturers can work together efficiently,” Fast said.
He explained that MQTT complements these capabilities with lightweight publish/subscribe messaging, which is particularly suitable for large, distributed and cloud-based production environments. Due to its scalability and robustness, MQTT supports thousands of devices and enables real-time distribution of telemetry data across multi-site networks.
AI becomes a key driver in these networks by leveraging structured data from OPC UA and high-frequency event streams from MQTT for predictive analytics, process optimization and autonomous decision-making.
The result is a flexible, intelligent network that significantly increases efficiency, transparency and adaptability.
“The combination of OPC UA, MQTT and AI offers significant technical advantages over traditional automation technologies that make new IIoT applications viable. Thanks to its highly structured information model, including defined objects, variables and events, OPC UA enables semantic interoperability that is now much more comprehensive and precise than with classic fieldbuses or proprietary protocols. This facilitates cross-manufacturer integration and significantly reduces individual interface developments.
MQTT offers technical advantages in terms of scalability, network load and robustness. Its publish/subscribe principle ensures that systems only receive relevant data, while Quality of Service (QoS) levels ensure reliable transmission of messages, including OPC UA-defined structures. This is particularly advantageous in environments with many sensors, mobile devices or cloud connections, scenarios that classic SCADA architectures can only support to a limited extent.
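The QoS behavior mentioned here can be illustrated with a toy model of MQTT QoS 1 ("at least once") semantics, where a sender retransmits until it sees an acknowledgement. The function below is a sketch of the handshake's consequences, not a protocol implementation, and the retry limit is an invented parameter.

```python
# Toy model of MQTT QoS 1 ("at least once"): the sender retransmits until
# acknowledged, so a lost PUBACK yields a duplicate delivery at the broker.

def publish_qos1(deliver, ack_lost_on_first_try=False, max_retries=5):
    """deliver() -> True if the message reached the broker.
    Returns how many copies the broker ended up receiving."""
    deliveries = 0
    for attempt in range(max_retries):
        if deliver():
            deliveries += 1
            if attempt == 0 and ack_lost_on_first_try:
                continue          # PUBACK lost: sender must retransmit
            return deliveries     # acknowledged, done
    return deliveries

# Happy path: one delivery. Lost ack: the broker receives the message twice.
print(publish_qos1(lambda: True))                              # 1
print(publish_qos1(lambda: True, ack_lost_on_first_try=True))  # 2
```

This is why QoS 1 consumers are usually written to be idempotent: reliability is guaranteed, but uniqueness is not (QoS 2 adds the extra handshake that removes duplicates, at a bandwidth cost).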
AI extends these technical foundations by processing data intelligently: predictive maintenance, self-optimizing control algorithms and real-time anomaly detection are applications that were hardly achievable with previous technologies.”
“The prospects for the use of OPC UA, MQTT and AI in industry are extremely positive, as they form the basis for the next stage in the development of industrial digitalization. Industrial standards such as OPC UA are clearly recommended by associations such as ZVEI and VDMA as key technologies of Industry 4.0, particularly because of their interoperability and their ability to network machines and systems across manufacturers. This makes it easier for companies to invest in modular production environments and accelerates the transformation towards flexible manufacturing networks,” Fast said. At the same time, MQTT-based architectures are gaining in importance as companies increasingly use cloud-based services, global production networks and decentralized sensor technology. MQTT is ideal for this and supports new use cases such as mobile robotics, remote monitoring or cross-location data aggregation.
Fast said that “AI is seen as the most important amplifier of these technologies. It enables data-based decisions, automated optimization and predictive maintenance.”
The expected effects include greater production flexibility, less downtime, better quality, less manual intervention and the trend towards largely autonomous production systems.

“These technologies pose new technical and organizational challenges for automation and control technology experts. OPC UA brings with it complex semantic information models, certificate management and new security architectures that require in-depth understanding and additional qualifications. The introduction requires more modeling skills instead of pure programming skills, which means a significant change in skills,” Fast added.
“MQTT opens up new communication architectures that are more networked, cloud-oriented and event-based than traditional automation technology. Technicians need to familiarize themselves with broker architectures, topic structures, QoS mechanisms and edge computing concepts, areas that have typically been assigned to IT, while OT focused on fieldbuses and serial communication.”
He added that AI brings the biggest challenge: it requires data understanding, model evaluation, monitoring, edge AI integration and a basic understanding of algorithms. As AI is increasingly used in control-related areas, the boundaries between OT and IT are becoming increasingly blurred.
In the long term, these technologies will change job roles: from classic control technology toward data, integration and system architects in hybrid OT/IT environments.
“In the next years, OPC UA, MQTT and AI will have a much greater impact on industrial manufacturing than they do today. OPC UA will form the semantic basis for fully interoperable machine parks through further standardized companion specifications,” Fast said. “The possibility of using OPC UA’s standardized mechanisms to manage devices, such as performing software updates or using the OPC UA GDS to manage device certificates, is also a big advantage.”
His view is that MQTT will continue to gain in importance due to its scalability and cloud suitability, especially in the globally networked machine environment. It enables faster data flows, finer monitoring structures and cross-regional optimization of production networks.
AI modules are already showing that they can predict failures weeks in advance and independently optimize processes. Over the next three years, AI will be more deeply integrated into edge and control environments, enabling autonomous decisions in real time. This includes self-adapting parameters, automatic quality control, intelligent resource utilization and continuous process improvement.
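As a toy illustration of one of those capabilities, a self-adapting parameter, the loop below nudges a machine parameter against an observed error signal. The update rule, gain, and numbers are invented for the sketch and do not represent a real control algorithm.

```python
# Toy sketch of a self-adapting parameter: a proportional update nudges a
# machine parameter toward the value that reduces observed error.
# Gain, starting value, and the error sequence are illustrative only.

def adapt(param, observed_error, gain=0.5):
    """Shift the parameter against the error, proportionally."""
    return param - gain * observed_error

param = 10.0
errors = [2.0, 1.0, 0.5, 0.25]      # shrinking error as the loop converges
for e in errors:
    param = adapt(param, e)
print(round(param, 3))   # 8.125
```

Real deployments replace the hand-written rule with a learned model and add the monitoring and guardrails Fast alludes to, but the closed-loop shape (observe, update, re-apply) is the same.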
“This makes production more efficient and more resilient. Companies that integrate these technologies early on create a sustainable competitive advantage,” Fast said.

Software technologies working together on industrial networks.
Keith McNab, director of control and automation software for Emerson’s machine automation solutions business, and Daniel Smith, senior product manager for Emerson’s machine automation solutions business, teamed up to respond on how OPC UA, MQTT and AI are shaping and enabling the evolution of enterprise manufacturing networks in 2026 and beyond.
“OPC UA, MQTT, and AI are all technologies working together on industrial networks to bring more value to the data created during operations, greatly benefiting users. OPC UA elevates and secures data; instead of simply reporting a basic value with a timestamp and engineering units, OPC UA securely delivers contextualized information to provide more value and greater understanding,” McNab said. “MQTT then defines how to move that contextualized information at scale via a lightweight framework. This capability provides a seamless path to drive valuable information to powerful AI models that can interpret the information to deliver insights that help teams make better decisions.”
McNab and Smith said that IoT applications typically work in an environment where there exists equipment from many different vendors. When those different pieces of equipment can all intercommunicate, critical applications work more easily and effectively. OPC UA’s companion specifications allow different vendors to have varied equipment that all speak the same, consistent language. Modern protocols also simplify setup by creating an effective plug-and-play environment. Systems can automatically discover and auto-configure devices, allowing them to participate in the communication network automatically.
“First and foremost, this increased flexibility gives users a better view into the process as a whole because they can collect data from many sources and bring it together to make better decisions via a wide array of tools, including AI models. In addition, the consistency offered by a shared language makes tools like AI agents far easier to create,” Smith said.
He added that AI also empowers teams to add an intelligence layer to communication, allowing the network to be more adaptive and self-optimizing. The system can make parameterization changes based on real time
data to make communications more effective or address issues that are detected by the AI.
“These solutions create a lot more functionality at the edge that will enable many edge-to-cloud technologies, including cloud AI. The edge will be a normalizing factor, bringing in many disparate protocols and transforming the data they transmit into a semantically rich OPC UA information model and then using MQTT to communicate at scale to the cloud,” McNab said. “The normalized OPC information model contains the intent of the data that can be shared with enterprise solutions such as AI tools. Cloud applications can be programmed more efficiently when teams know the type definitions they must be programmed against.”
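The edge-normalization pattern McNab describes, raw protocol data transformed into a semantically rich model and then serialized for MQTT, can be sketched as follows. The register numbers, type definition, asset name, and topic layout are all hypothetical examples.

```python
import json

# Sketch of the edge-normalization pattern: raw, protocol-specific values are
# mapped into a semantically rich payload, then serialized for publication.
# Register numbers, type names, and the topic layout are hypothetical.

# Raw Modbus-style registers from a legacy device (flat, context-free)
raw_registers = {40001: 715, 40002: 1}      # temperature x10, run flag

TYPE_DEFINITION = {                         # the "intent" of the data
    "type": "PumpType",
    "fields": {
        40001: ("temperature", "degC", lambda v: v / 10.0),
        40002: ("running", "bool", bool),
    },
}

def normalize(registers, typedef, asset_id):
    """Turn flat registers into a typed, unit-annotated payload."""
    payload = {"asset": asset_id, "type": typedef["type"]}
    for reg, (name, unit, convert) in typedef["fields"].items():
        payload[name] = {"value": convert(registers[reg]), "unit": unit}
    return payload

payload = normalize(raw_registers, TYPE_DEFINITION, "Pump_P401")
topic = f"site/plant1/asset/{payload['asset']}"
message = json.dumps(payload)               # ready to publish at scale
print(topic)   # site/plant1/asset/Pump_P401
```

Because every asset of the same type emits the same structure, a cloud application can be programmed once against the type definition, which is the "programmed against known type definitions" efficiency the quote points to.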
He added that modernization is important, but legacy devices will remain in the mix for a long time. If teams can start gathering data better and can collect it in a contextualized format by slowly moving content from all devices onto the edge and standardizing it, they can do a lot more modeling rather than just pulling data into a chart.
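As a rough illustration of the edge-normalization idea, the following sketch maps two invented vendor payload formats onto one shared structure. A real deployment would target an OPC UA information model rather than a plain dictionary, and the vendor field names here are made up:

```python
# Hypothetical vendor payload shapes; each adapter translates one
# vendor's raw format into the common model used downstream.

def from_vendor_a(raw: dict) -> dict:
    # Vendor A reports {"tag": ..., "val": ..., "unit": ...}
    return {"name": raw["tag"], "value": raw["val"], "units": raw["unit"]}

def from_vendor_b(raw: dict) -> dict:
    # Vendor B reports {"id": ..., "reading": ..., "eu": ...}
    return {"name": raw["id"], "value": raw["reading"], "units": raw["eu"]}

ADAPTERS = {"vendorA": from_vendor_a, "vendorB": from_vendor_b}

def normalize(source: str, raw: dict) -> dict:
    """Edge-gateway step: every device ends up speaking the same model."""
    return ADAPTERS[source](raw)

readings = [
    normalize("vendorA", {"tag": "Pump01.Flow", "val": 12.7, "unit": "m3/h"}),
    normalize("vendorB", {"id": "Pump02.Flow", "reading": 11.9, "eu": "m3/h"}),
]
# Downstream consumers (cloud analytics, AI models) see one consistent shape.
print(all(r["units"] == "m3/h" for r in readings))  # True
```

Adding support for another legacy device then means writing one more adapter at the edge, not reworking every cloud application that consumes the data.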
Moreover, that cloud connectivity allows other critical data to come back from the cloud, such as weather forecasts, pricing information, feedstock availability, and more, to help teams better optimize their operations in real time.
“Control engineers have many legacy solutions in the field. Most users have 10 to 12 different manufacturers of automation, speaking many different languages and following many different standards for how they each do the same thing,” Smith said. “The move toward OPC UA, MQTT, and even AI will help them slowly transition and simplify their data. They will get more data out of the devices they have today, which allows them to make their processes more efficient and simpler to set up.”
“We’ve had predictive analytics and maintenance applications running in the cloud for some time now, but to get them configured, teams would have to tap an application engineer who would spend hours trying to map the data from the field to the data that resides or is consumed in the cloud,” he added. “All that mapping was manual and took a lot of effort. Now, since we’ve standardized and normalized that information at the edge, nearly all of that manual effort is eliminated. It’s almost entirely automatic.”
In addition, everything going from the edge to the cloud needs to be secure. Previously, teams would have to bolt on complex security solutions, but today, security is inherent in the protocols, making it much simpler to protect communication.
Replacing failed equipment is much easier as well because many of these devices using modern protocols can be auto-discovered and auto-configured when they are plugged into the network. This gets the plant up and running much faster so any fallout from forced outages is minimized.
“Adaptability is the core benefit of AI. The intention of AI is to be able to take a model and build on top of it without having to manually create code each time. It simplifies the overall model process,” McNab said.
He said that the integration of modern protocols like OPC UA and MQTT with AI is enabling the creation of critical optimization tools—many of which we can’t even imagine at this point. With contextualized data, an AI model can learn what is happening in the factory. The improved connections between data are making AI’s consumption of data much more efficient and raising the ceiling on what we can accomplish with optimization tools.
“AI also unlocks continuous optimization, not just for process but for networks as well. It can continually increase network efficiency and might at some point even tell the devices what types of information are needed in the payload for increased adaptability,” Smith said. “In coming years, AI might also automatically detect and predict network failures and make those networks more tolerant to failures in the future. AI can also reduce engineering time by automatically generating logic from P&IDs and specifications.”

Emerson’s PACSystems™ industrial PCs and controllers bring high-performance AI capabilities right to the edge, combining advanced computing, built-in cybersecurity, and scalable industrial connectivity like OPC UA and MQTT to ensure secure data flow, uninterrupted control, and scalable performance.
Enterprise manufacturing networks fuel hyper-connected factories.
According to Raj Rajendra, portfolio sales specialist at Siemens Industry, Inc., “OPC UA, MQTT, and AI are revolutionizing enterprise manufacturing networks by enabling seamless communication, real-time data exchange, and intelligent decision-making.”
Here is Rajendra’s view of the technology landscape and how the combination of OPC UA, MQTT, and AI is expected to make an impact.
OPC UA ensures standardized, secure, and scalable communication across devices and systems, from the shop floor to the cloud. Its interoperability supports future-proof connectivity and compliance with global standards, making it a cornerstone for smart manufacturing.
MQTT, a lightweight protocol, excels in transmitting real-time data with minimal bandwidth, ideal for IoT and edge devices. It enables structured data exchange through unified namespaces and simplifies cloud integration for advanced analytics and monitoring.
AI processes data from OPC UA- and MQTT-enabled devices to deliver predictive analytics, optimize maintenance, and enable autonomous systems. It enhances decision-making by analyzing large datasets, driving smarter and faster operations.
“These technologies will create hyper-connected factories in the coming years with seamless integration of devices, systems, and cloud platforms,” Rajendra said. “AI will drive real-time process optimization, improving efficiency and reducing waste. Together, OPC UA, MQTT, and AI will enable scalable, energy-efficient, and sustainable manufacturing practices, ensuring factories are adaptive and future-ready.”
Rajendra said that OPC UA, MQTT, and AI offer distinct technical advantages enabling new IIoT applications beyond traditional systems.
OPC UA provides semantic information models, delivering data context (e.g., instrument and equipment diagnostic data, in addition to process data) and eliminating manual mapping for AI. Its built-in security safeguards OT-IT system connections, while platform independence fosters vendor-neutral architectures. Complex data structures allow sophisticated remote interactions, enabling digital twin synchronization.
MQTT offers a lightweight, publish/ subscribe model, decoupling data producers from consumers for massive scalability. Its efficiency suits resource-constrained devices and unreliable networks, extending IIoT reach. QoS ensures reliable data delivery, and topic-based filtering reduces network traffic.
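The decoupling and topic-based filtering described above can be sketched with a toy in-memory broker. This mimics MQTT's `+` and `#` wildcard semantics for illustration only; a real system would use an MQTT client library and broker:

```python
def topic_matches(filter_: str, topic: str) -> bool:
    """MQTT-style topic filtering: '+' matches one level, '#' the rest."""
    f, t = filter_.split("/"), topic.split("/")
    for i, part in enumerate(f):
        if part == "#":          # multi-level wildcard: matches everything below
            return True
        if i >= len(t):
            return False
        if part != "+" and part != t[i]:
            return False
    return len(f) == len(t)

class TinyBroker:
    """In-memory stand-in for a broker, showing producer/consumer decoupling."""
    def __init__(self):
        self.subscriptions = []  # (topic filter, callback) pairs

    def subscribe(self, filter_: str, callback):
        self.subscriptions.append((filter_, callback))

    def publish(self, topic: str, payload):
        # Publishers never know who consumes: that decoupling is what
        # lets the pattern scale to many devices.
        for filter_, callback in self.subscriptions:
            if topic_matches(filter_, topic):
                callback(topic, payload)

broker = TinyBroker()
received = []
broker.subscribe("plant1/+/temperature", lambda t, p: received.append((t, p)))
broker.publish("plant1/line3/temperature", 72.4)   # delivered
broker.publish("plant1/line3/pressure", 2.1)       # filtered out by the topic filter
print(received)  # [('plant1/line3/temperature', 72.4)]
```

Because consumers subscribe to topic patterns rather than to specific devices, adding a thousand more sensors changes nothing for existing subscribers, which is the scalability property the article highlights.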
AI excels at pattern recognition and anomaly detection in big data, powering predictive maintenance and real-time quality control, and its adaptive learning optimizes processes continuously. Predictive modeling enables proactive decision-making, while autonomous control facilitates self-optimizing systems. AI also enhances human-machine interaction through intelligent assistance.
Together, these technologies transform industrial operations from reactive and manual to interconnected, proactive, intelligent, and autonomous, defining the success of modern IIoT.
Anticipated impact
Rajendra said that the prospects for OPC UA, MQTT, and AI in industry are transformative, enabling smarter, more adaptive, and sustainable operations.
Applications: These technologies will drive predictive maintenance by collecting real-time data via OPC UA and MQTT, and by using AI to predict equipment failures, reducing downtime and costs. Real-time optimization will be achieved through AI dynamically adjusting production processes, improving efficiency and minimizing waste. Autonomous systems powered by AI will enable self-optimizing machines, reducing human intervention and increasing productivity. Additionally, OPC UA and MQTT will facilitate seamless integration between edge devices and cloud platforms, enabling advanced analytics and decision-making.
Anticipated Impact: These solutions will significantly increase efficiency by streamlining operations and reducing resource consumption. They will enhance scalability, allowing systems to adapt to changing production demands, and support sustainability by optimizing energy usage and reducing carbon footprints. Furthermore, industries will benefit from faster innovation cycles, improved product quality, and enhanced competitiveness.
“By leveraging these technologies, industries can achieve higher levels of performance, flexibility, and sustainability,” Rajendra said.
Challenges Addressed: System integration across multi-vendor devices is simplified with OPC UA’s standardized communication. Real-time data processing is enhanced by MQTT’s lightweight protocol and AI’s ability to analyze large datasets efficiently. Scalability challenges are resolved as OPC UA and MQTT support architectures that grow from edge devices to enterprise systems.
Cybersecurity is strengthened with OPC UA’s encrypted communication and AI’s advanced threat detection. Additionally, AI provides actionable insights through predictive analytics, helping engineers optimize processes and reduce downtime.
Ongoing Impact: These technologies will accelerate innovation with the adoption of autonomous systems and advanced analytics, while sustainability goals will be supported through optimized energy usage and reduced environmental impact. By addressing these challenges, OPC UA, MQTT, and AI empower engineers to build smarter, more secure, and future-ready industrial systems.
“OPC UA, MQTT, and AI are set to transform manufacturing by enabling smarter, more efficient, and adaptive operations,” Rajendra said. “Over the next three years, AI will play a pivotal role in driving automation, predictive capabilities, and real-time optimization.”
Impact of OPC UA and MQTT: These technologies will enhance connectivity by enabling seamless communication across devices, systems, and cloud platforms, creating fully integrated manufacturing ecosystems. They will support scalable architectures, enabling manufacturers to adapt to changing demands, while also improving cybersecurity through encrypted communication and secure data exchange.
Anticipated Impact of AI: AI will revolutionize manufacturing by furthering predictive maintenance capabilities, along with reducing downtime and maintenance costs. It will drive autonomous systems, allowing machines to self-optimize and reduce human intervention. AI will also support sustainability by optimizing energy usage and reducing carbon footprints. Additionally, AI will accelerate innovation, improving product development cycles and manufacturing agility.

OPC UA, MQTT, and AI address key challenges for automation and control engineers, including system integration, real-time data processing, scalability, and cybersecurity. These technologies drive greater efficiency by streamlining operations and enabling predictive maintenance, as well as enhancing adaptability, which facilitates response to changing production needs and market demands.
“Together, these technologies will create more efficient, secure, and sustainable manufacturing environments, ensuring that factories remain competitive in an increasingly digitalized and connected world,” he concluded.
Al Presher, Editor, Industrial Ethernet
Both the current and future "State of Factory Connectivity" are in the midst of a tremendous boost from the AI technology revolution. AI is expected to significantly contribute to factory connectivity innovations over the next 1-3 years, with its greatest impact on factory networking within the next five years.

THE CURRENT STATE OF FACTORY CONNECTIVITY is strong, and getting stronger. AI is providing an impetus to help reshape how connected systems and people operate. AI-powered tools can empower operators with more informed decision-making and help solve specific challenges in areas like quality, energy usage and cybersecurity. AI is also helping lead us toward a future of autonomous operations, where cost, efficiency, safety and resilience can be optimized by intelligent, self-learning systems.
For this special report, the Industrial Ethernet magazine gathered the perspectives of industry experts on what they see as both the current state and the future of manufacturing networks. Here is what these experts had to say about the megatrends shaping industrial networking.
Enabling AI driven analytics, digital twins and software defined industrial automation.
“The current state of factory connectivity in smart manufacturing is built on a secure, high-performance, and resilient wired and wireless network foundation. This infrastructure connects diverse industrial devices, enabling innovations like AI-driven analytics, digital twins, and software-defined industrial automation,” Vivek Bhargava, Product Marketing Manager, Cisco Industrial IoT, told the Industrial Ethernet magazine recently.
“Wireless technologies such as Wi-Fi 6/6E and Cisco Ultra-Reliable Wireless Backhaul (URWB) provide the bandwidth, low latency, and mobility needed for applications like Automated Guided Vehicles (AGVs) and Autonomous Mobile Robots (AMRs), enhancing factory flexibility,” Bhargava said.
He added that innovations such as software-centric manufacturing with virtualization of IPCs, HMIs, and PLCs, and cloud-based analytics are being utilized to improve production efficiency, flexibility, and time to market. Networks are also being called upon for increased visibility, zero-trust policy enforcements, and for remote access.
Manufacturers are investing heavily in network modernization, digital transformation, AI, and cybersecurity to scale smart manufacturing. The evolution of factory connectivity is toward a resilient, scalable, and secure IT/OT network architecture that enables operational excellence, sustainability, and competitive advantage in Industry 4.0 environments.
“Key trends pushing factory connectivity ahead include the integration of AI-powered processes, shop-floor virtualization, industrial mobility, and robust security – technologies that enable flexible, scalable, and secure manufacturing environments,” Bhargava said. Industrial mobility needs for AGVs, AMRs, and mobile tooling, as well as workers’ devices such as laptops and tablets, are driving the need for wireless networks on the shop floor, requiring both Wi-Fi 6/6E and a more reliable, higher-bandwidth industrial wireless solution such as Cisco URWB.
“Building accurate AI inferencing models and digital twins requires large-scale, high-speed, resilient connectivity that can collect and transport large amounts of data in near real-time between production assets, edge compute, data centers, and cloud environments. Similarly, shop-floor virtualization requires low-latency, low-jitter networks that can tunnel layer-2 machine control protocol traffic between machines and their controllers, which may now be located several miles away rather than next to the machines they control,” he added.
For wired and wireless networks alike, factory networks must implement industrial cybersecurity measures, including deep visibility into assets, multilevel granular segmentation into zones and conduits following ISA/IEC 62443 and zero-trust principles, to safeguard data from unauthorized access while enabling secure communication across IT and OT domains.
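The zones-and-conduits principle from ISA/IEC 62443 can be reduced to a default-deny policy check. The zone names and policy below are invented for illustration; real segmentation is enforced by firewalls and network equipment, not application code:

```python
# Hypothetical zone model: traffic is only permitted between zones that
# share an explicitly defined conduit; everything else is denied.
ALLOWED_CONDUITS = {
    ("cell_zone", "supervisory_zone"),   # controllers -> SCADA
    ("supervisory_zone", "dmz"),         # historians -> IT via a DMZ
}

def is_permitted(src_zone: str, dst_zone: str) -> bool:
    """Default-deny: any flow without a defined conduit is blocked,
    which is the zero-trust posture the standard encourages."""
    return (src_zone, dst_zone) in ALLOWED_CONDUITS

print(is_permitted("cell_zone", "supervisory_zone"))  # True
print(is_permitted("cell_zone", "dmz"))               # False: no direct conduit
```

The key design point is the default: instead of enumerating what to block, the policy enumerates the few flows that are allowed, so a compromised device in one zone cannot reach arbitrary assets in another.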
Engineering challenges
Innovations in factory connectivity address key engineering challenges such as low-latency real-time requirements, resiliency, scalability and demand variability, security and data privacy, and operational complexity.
Low-latency, low-jitter requirements are handled by using Time-Sensitive Networking mechanisms such as Frame Preemption (the IEEE 802.3br and IEEE 802.1Qbu standards), which address the needs of control applications that require real-time communications. With this technique, transmission of a lower-priority frame is suspended in favor of higher-priority control traffic and resumed once the express traffic has been sent.
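The preemption behavior can be illustrated with a toy simulation. Fragment sizes and byte counts below are arbitrary, and this models only the interleaving idea, not the real 802.3br framing:

```python
# Toy model of frame preemption: a long preemptable frame is suspended at
# a fragment boundary when an express (high-priority) frame arrives, then
# its remaining fragments are transmitted afterwards.

def transmit(preemptable_size, express_arrival, express_size, fragment=64):
    """Return the on-wire sequence of (frame_type, bytes) fragments."""
    wire, sent = [], 0
    express_pending = True
    while sent < preemptable_size:
        chunk = min(fragment, preemptable_size - sent)
        wire.append(("preemptable", chunk))
        sent += chunk
        # An express frame that has arrived interrupts at the next
        # fragment boundary rather than waiting for the whole frame.
        if express_pending and sent >= express_arrival:
            wire.append(("express", express_size))
            express_pending = False
    return wire

log = transmit(preemptable_size=200, express_arrival=100, express_size=80)
print(log)
# The express frame goes out mid-way; the preemptable frame's remaining
# fragments follow afterwards.
```

Without preemption, the express frame would have had to wait for all 200 bytes of the preemptable frame; with it, the worst-case wait shrinks to roughly one fragment, which is where the latency and jitter improvement comes from.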
Factory networks must be resilient to ensure continuous operation and prevent costly downtime in industrial automation processes. Resiliency can be achieved by implementing redundancy protocols such as Media Redundancy Protocol (MRP), Device Level Ring (DLR), Resilient Ethernet Protocol (REP), High-availability Seamless Redundancy (HSR), and Parallel Redundancy Protocol (PRP), which provide fast recovery from link or node failures, often within milliseconds.
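The duplicate-discard mechanism at the heart of PRP can be sketched as follows. Real PRP operates at layer 2 using a redundancy control trailer on each frame, so this is a simplified model of the idea only:

```python
# Sketch of PRP-style duplicate discard: every frame is sent over two
# independent LANs, and the receiver accepts the first copy of each
# sequence number and silently drops the second. If one LAN loses a
# frame, the copy from the other LAN still arrives: zero recovery time.

class PrpReceiver:
    def __init__(self):
        self.seen = set()
        self.delivered = []

    def receive(self, seq: int, payload: str):
        if seq in self.seen:
            return  # duplicate from the other LAN: discard
        self.seen.add(seq)
        self.delivered.append(payload)

rx = PrpReceiver()
# LAN A delivers frames 1 and 2; LAN B delivers 1, 2, and 3
# (frame 3 was lost on LAN A, but no retransmission is needed).
for lan, seq, payload in [("A", 1, "start"), ("B", 1, "start"),
                          ("A", 2, "run"), ("B", 2, "run"),
                          ("B", 3, "stop")]:
    rx.receive(seq, payload)
print(rx.delivered)  # ['start', 'run', 'stop']
```

This illustrates why PRP is described as seamless: unlike ring protocols that must reconverge after a failure, the redundant copy is already in flight, so the application never observes the loss.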
Bhargava said that a key issue is network availability, which can also be improved if front-line workers who may not have deep networking skills are given tools that help them diagnose issues and suggest corrective steps, so these workers can fix problems quickly without time-consuming, expensive escalation to networking experts.
Factory networks must be scalable and adaptable to change to support the increasing complexity and dynamic nature of modern manufacturing environments. This flexibility allows production lines to be reconfigured quickly, supports diverse use cases, and accommodates growth without disrupting operations. This need is addressed by management platforms that support plug-and-play addition of network devices and rapid reconfiguration.
Security is a key part of any network. Critical infrastructure such as manufacturing requires multilayer defense starting with deep visibility into all assets and traffic, dynamic segmentation to enforce access policies, and zero-trust network access.
“When you talk about the influence of AI, you need to consider two aspects. One, the AI technology used to manage factory connectivity itself to ensure its availability. And two, how the network enables the factory’s AI-driven automation,” Bhargava said.
“AI-powered management for factory networks enhances operational efficiency by leveraging AI to automate network operations, provide proactive monitoring, and deliver actionable insights. It maximizes uptime by quickly diagnosing issues and recommending remediation steps, reducing mean time to resolution (MTTR). It empowers first-line OT responders, even those with limited networking knowledge, to troubleshoot common problems, without escalating them to networking experts. This helps avoid expensive downtime,” he said.
“AI is expected to significantly contribute to factory connectivity innovations over the next 1-3 years, with nearly half of industry experts (48%) identifying AI as having the greatest impact on factory networking within the next five years,” Bhargava added. “Factory networks can enable AI by providing a robust, secure, and high-performance infrastructure that supports real-time data collection, processing, and communication. These networks deliver low latency and high bandwidth necessary for AI-driven applications such as machine vision, software-defined industrial automation, advanced robotics, digital twins, etc.”

“Three technological trends are currently shaping the further development of industrial networks: high-performance networking, next-generation wireless connectivity, and consistently security-integrated connectivity,” -- Dr. Julia Reker, Director of Network Technology, Industry Management and Automation, Phoenix Contact.
Holistic, integrated communication architectures that seamlessly connect OT and IT.
According to Dr. Julia Reker, Director of Network Technology, Industry Management and Automation for Phoenix Contact, “industrial connectivity is currently undergoing a phase of profound transformation. Production facilities are becoming increasingly networked, software-defined, and intelligent.”
“Modern connectivity must not only be powerful, but above all secure. That is why Phoenix Contact consistently develops its devices in accordance with IEC 62443-4-1—i.e., certified secure development processes—and meets the requirements of IEC 62443-4-2 for secure components with its industrial network technology product families,” Reker told Industrial Ethernet recently.
Reker said the trend is clearly moving toward holistic, integrated communication architectures that seamlessly connect OT and IT while meeting increasing regulatory requirements. Regulations such as the Cyber Resilience Act (CRA) and NIS 2.0 increase the requirements for software lifecycle, patch and vulnerability management, and the verifiability of secure devices.
Phoenix Contact is following a security-by-design strategy with resilient, remotely manageable, and interoperable systems. Signed firmware, secure boot chains, role-based user management, and automated update processes ensure that connectivity becomes an active value driver in production—scalable, robust, and prepared for AI-supported applications.
“Three technological trends are currently shaping the further development of industrial networks: high-performance networking, next-generation wireless connectivity, and consistently security-integrated connectivity,” Reker said. “Ethernet networks with TSN enable deterministic communication, while industrial WLAN solutions based on Wi-Fi 6/6E provide the necessary stability for mobile applications such as AGVs, cobots, and modular machines. At the same time, secure data communication is becoming increasingly important, as cyberattacks are now among the greatest risks to businesses and are becoming increasingly sophisticated.”
Reker said that Phoenix Contact is responding to this development with a comprehensive 360° industrial security concept that provides holistic protection for plants and systems against sabotage, data loss, and downtime. This is based on a consistent security-by-design strategy, in which products are designed in accordance with the IEC 62443 philosophy right from the development process. This supports the implementation of legal requirements such as the NIS 2 Directive and the Cyber Resilience Act (CRA), as both regulations require demonstrably robust, securely developed, and updatable products, as well as clear protective measures throughout the entire life cycle.
Phoenix Contact addresses these requirements with powerful industrial switches, secure firewalls, industrial WLAN systems, edge-capable controllers, and a broad security portfolio. The key to this is a holistic architectural approach that considers the network, security mechanisms, device management, and data management as an integrated overall system. The result is transparent data flows, low latencies, and highly reliable, resilient communication—even under harsh industrial conditions and increasing regulatory requirements.
“With the increasing digitalization of industrial plants, the complexity of modern network architectures is growing significantly. Today, operators must ensure that heterogeneous systems are reliably integrated, data streams are transmitted in real time, and production processes are kept highly available – all while facing growing cyber threats,” Reker said. “This is precisely where a key engineering pain point arises: networks must not only be powerful, but also designed to be secure from the ground up.”
The IEC 62443-3-3 standard defines clear security requirements at the system level, including segmentation, access control, network zones and conduits, monitoring, and measures to ensure integrity, availability, and confidentiality. In combination with IEC 62443-4-1 (Secure Development Lifecycle) and IEC 62443-4-2 (Device Security), this creates a holistic framework that obliges both device manufacturers and operators to implement protective measures in a technically sound manner. For engineers, this means that security mechanisms such as hardening, certificate and key management, logging, and secure update processes must now be an integral part of all network and system planning.
To relieve operators of this increasing complexity, Phoenix Contact supports its customers with network and security services: from network analyses and wireless site surveys to security assessments in accordance with IEC 62443, segmentation concepts, and vulnerability analyses. This provides operators with a robust, resilient, and IEC 62443-compliant infrastructure that can be operated transparently and securely over the long term.
“AI will become one of the key drivers for secure industrial connectivity in the coming years. While today's OT networks are still largely monitored and optimized manually, AI-supported processes will enable automated analysis, evaluation, and protection of communication structures in the future,” Reker said. “This will be particularly important as increasing cyber threats, more complex network architectures, and regulatory requirements such as CRA and NIS 2 demand significantly higher transparency and response speeds.”
She added that AI can provide targeted support for security mechanisms in line with the IEC 62443 series: models detect anomalies in OT traffic more quickly, classify risks according to 62443-3-3 criteria, and help to proactively safeguard security levels. At the same time, AI-supported diagnostic functions provide more precise insights into interference, RF profiles, and roaming behavior—a crucial factor for robust, resilient industrial Wi-Fi networks.
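As one minimal illustration of anomaly detection on OT traffic, the sketch below applies a z-score test against a learned baseline. The traffic values and the threshold are arbitrary example choices; production systems would use far richer models:

```python
from statistics import mean, stdev

# Baseline packet rates (packets/s) observed during normal operation.
baseline = [100, 98, 103, 101, 99, 102, 100, 97]
mu, sigma = mean(baseline), stdev(baseline)

def is_anomalous(rate: float, threshold: float = 3.0) -> bool:
    """Flag a traffic rate more than `threshold` standard deviations
    from the baseline mean (a classic z-score test)."""
    return abs(rate - mu) / sigma > threshold

print(is_anomalous(101))   # False: within normal variation
print(is_anomalous(450))   # True: e.g., a possible traffic storm
```

Even this crude statistical test captures the essential pattern: the model learns what "normal" looks like on a given network segment, then flags deviations fast enough to investigate before they become outages.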
“In the long term, industrial connectivity and security platforms could be further developed so that AI becomes an integral part of secure automation networks. Self-optimizing systems that adapt to RF environments, evaluate security events in a context-sensitive manner, or apply adaptive patch and certificate strategies are conceivable,” Reker said. “In the future, this will result in resilient, partially autonomous communication systems that adapt flexibly to production and threat situations.”
More focus on where to process data to address performance, storage and security needs.
“The state of factory connectivity is strong, and getting stronger. It’s defined by the ongoing need to use data to not only generate actionable insights, but also to enable entirely new operational possibilities,” said Joseph Biondo, Sr. Program Manager at Rockwell Automation.
“Technologies like AI are reshaping how connected systems and people operate. AI-powered tools can empower operators with more informed decision-making and help solve specific challenges in areas like quality, energy usage and cybersecurity,” Biondo said. “AI is also helping lead us toward a future of autonomous operations, where cost, efficiency, safety and resilience can be optimized by intelligent, self-learning systems.”
Biondo’s view is that manufacturers are also becoming more intentional about where they process data to address performance, storage and security needs. Cloud platforms enable enterprise-wide coordination and visibility, while edge computing keeps data close to its source when real-time processing is required. At the same time, emerging approaches such as software-defined automation are creating more flexible systems that give manufacturers the scalable and adaptable automation architectures they need to remain agile.
Biondo said that organizations need smarter operations to be agile and resilient as they face supply chain disruptions, cybersecurity risks, sustainability pressures, skills shortages and more.
Sustainability is a priority for manufacturers, but many struggle with a lack of real-time visibility, inconsistent data and difficulty scaling sustainability initiatives. Connected technologies are helping address these challenges. For example, solutions that integrate energy and production data can reveal where opportunities exist to optimize energy consumption. Technologies that automate and trace industry-specific processes can also help minimize waste and reduce downtime.
Cybersecurity is also more critical than ever. Many OT systems in use today were not originally designed with cybersecurity in mind, while cyber threats continue to become more sophisticated. As a result, more manufacturers are recognizing the need for proactive, OT-specific security strategies that address the unique complexities of industrial environments. Vendor-neutral, standards-aligned technologies are key enablers because they provide visibility into OT assets, helping organizations reduce risk exposure, accelerate remediation and strengthen governance.

"Initiatives like IIoT, Industry 4.0, and 5.0 were mainly driven by specific business needs, but AI is fundamentally different because it creates both internal and external pressure for adoption, while also delivering visible business value." -- Felipe Costa, senior networking and cybersecurity product manager, Moxa.
Technology is also helping manufacturers mitigate the impact of skills shortages. For example, agentic AI capabilities are being embedded into HMI software, allowing operators to interact with chatbots for a variety of needs. This can help them make faster decisions, quickly access information like SOPs, and troubleshoot machines more efficiently and accurately.
“OEMs and automation engineers can use emerging technologies to address many of their most pressing engineering challenges,” Biondo said.
From a recent survey by Rockwell Automation, he said that nearly 8 in 10 OEMs said they view AI or machine learning as critical to designing quality into equipment. With so much uncertainty – from supply chain to workforce constraints and geopolitical pressures – manufacturers are looking for ways to ensure consistent product quality. Because product quality is also closely tied to sustainability goals, AI is increasingly being used to help maintain and improve quality. For example, AI can automate inspection processes and detect subtle deviations that may impact product quality. These systems can also recommend corrective actions, allowing manufacturers to address issues earlier and reduce scrap, rework and downtime.
“AI is driving a large shift in manufacturing – from automation to autonomy. The autonomous factory of the future will be built on intelligent systems that anticipate needs and continuously improve their performance, delivering unprecedented efficiency and safer production environments,” Biondo said.
He added that AI-enabled automation systems will optimize scheduling, dynamically adjust processes and detect potential issues before they disrupt production. Autonomous mobile robots (AMRs) and independent cart technologies will operate in unison to enable end-to-end material movement. Maintenance systems will self-diagnose equipment health issues and recommend work orders to address them. And people will increasingly guide systems using natural language.
“As AI becomes more deeply integrated into core production functions, manufacturing operations will become increasingly self-optimized,” Biondo said. “This will help manufacturers reach new levels of productivity and sustainability, while allowing employees to focus on higher-value work and constant innovation.”
AI a key accelerator as AI initiatives require structured data and a robust infrastructure.
“Factory connectivity was not evolving at the speed many expected, but AI is now a key accelerator because AI initiatives require structured data and a robust infrastructure to transport this data reliably and in real time. In brownfield environments, this evolution is happening through protocol convergence using edge gateways and hybrid architectures (on-site and cloud), allowing previously isolated systems to share data,” said Felipe Costa, senior networking and cybersecurity product manager at Moxa.
“At the same time, higher-bandwidth 1G and 10G switching is moving closer to the edge, and data-intensive applications are becoming the new norm,” Costa said. “In greenfield projects, time-sensitive networking (TSN) is enabling deterministic and converged networks, while SPE and APL extend Ethernet to the sensor level, creating scalable and consistent data models.”
Costa said this transformation is not only about connectivity; it is directly impacting operational performance. Deterministic communication, higher bandwidth, and data normalization enable real-time analytics, closed-loop optimization, faster changeovers, predictive maintenance, and more flexible production. In other words, the network is evolving from passive infrastructure into a real-time data platform for smart manufacturing, improving OEE, reducing downtime, and increasing quality.
AI is also bringing back a core OT principle: resilience. Operations have always been driven by the triad of safety, reliability, and performance (SRP), and now the business layer is (re)discovering the importance of these elements. However, this new level of connectivity also highlights a persistent gap: many deployments still do not incorporate cybersecurity by design. As factories become more data-driven and interconnected, cyber resilience is a fundamental requirement for scaling AI and smart manufacturing initiatives safely.
“AI is the most disruptive technology impacting all industrial sectors in different ways. Since the introduction of modern electronics and computing, we have not seen such a broad ecosystem effect,” Costa said. “Initiatives like IIoT, Industry 4.0, and 5.0 were mainly driven by specific business needs, but AI is fundamentally different because it creates both internal and external pressure for adoption, while also delivering visible business value. As a data-hungry application, AI is accelerating the entire connectivity stack, demanding protocol convergence, higher bandwidth, and more edge and cloud computing capabilities.”
Costa said that, with this exponential increase in data traffic, industrial networks now require a much better understanding of how to control undesired effects, such as broadcast and multicast storms, latency variation, and packet loss, which are conditions that AI-driven applications usually cannot tolerate. As a result, there is a clear need for more powerful managed switches, intelligent traffic management, network segmentation, and resilient architectures that keep applications running even under non-ideal conditions. This is not only improving connectivity but directly increasing the performance and stability of automation and machine networks.
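To make "latency variation" concrete: one standard way to quantify it is the smoothed inter-arrival jitter estimator from RFC 3550 (defined for RTP media, but the same idea applies to any nominally periodic industrial traffic). The sketch below is illustrative only and is not drawn from any vendor's implementation:

```python
def interarrival_jitter(timestamps_ms):
    """Smoothed inter-arrival jitter (RFC 3550 style, gain 1/16).

    timestamps_ms: arrival times of a nominally periodic packet stream.
    Returns 0.0 for a perfectly periodic stream; the score grows as
    arrival intervals start to vary.
    """
    intervals = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    jitter = 0.0
    for prev, curr in zip(intervals, intervals[1:]):
        # Each new interval nudges the running estimate by 1/16 of the
        # difference from the previous interval.
        jitter += (abs(curr - prev) - jitter) / 16.0
    return jitter

# A perfectly periodic 10 ms stream shows zero jitter:
print(interarrival_jitter([0, 10, 20, 30, 40]))      # 0.0
# Irregular arrivals produce a positive score:
print(interarrival_jitter([0, 10, 23, 30, 44]) > 0)  # True
```

A monitoring system would alarm when this score drifts above what the control application can tolerate.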
Costa said that, if we set AI aside, digitalization was already moving in this direction, but at a slower pace. AI brought the business justification that many modernization projects were lacking. It is not a silver bullet solution, and not every AI project delivers immediate ROI, but the technologies have created an almost-universal push for accelerating upgrades and retrofits in legacy systems that have needed better connectivity for decades.
“This is the question of the moment. Some experts believe AI will remove humans from the loop, while others believe it will never

PACSystems Next-Generation Industrial PCs pair next-gen processors with rugged edge design to enable AI-driven maintenance, process optimization, quality inspection, decision support, and supply chain visibility. SOURCE:
happen,” Costa said. “I believe the reality will be somewhere in between. If history has proven anything, it is that when the cost of adoption decreases and the business benefits grow, technology moves forward.”
In industrial environments, he said this is always balanced by risk. In high-risk processes, we will continue to see human-in-the-loop architectures to validate decisions and avoid undesired or even catastrophic outcomes, while in low-risk and data-intensive layers, AI will progressively take over tasks that today require manual analysis.
“In the next 1-3 years, AI will not completely replace humans because the challenge is not only ethical; it is also technical. Moving data is not enough—data must be transformed into structured, contextualized information that AI models can consume,” Costa said. “However, this is exactly the driver behind investments in edge computing, intelligent traffic management, data models, and time-sensitive communications to address this gap.”
And yet, according to Costa, AI will strongly influence how connectivity is designed and will naturally take over tasks that do not depend heavily on human discernment. However, as networks become more deterministic, resilient, and context-aware to support real-time data pipelines and closed-loop optimization, AI will be able to optimize these procedures, replacing repetitive and non-intellectual tasks, and accelerating flexible production, predictive operations, and adaptive systems.
“The long-term impact on manufacturing is that connectivity, together with the human role, will evolve into an autonomous, self-optimizing data fabric,” Costa said. “With more connected assets, more bandwidth, and more distributed processing, cybersecurity and system integration become the main concerns. The industry will not be the same, and our role as experts is to enable this transformation in a secure and operationally reliable way, because the business will move forward regardless of the technical challenges.”
Enabling edge connectivity and AI acceleration, insights and efficiencies.
According to Alan Mathason, senior product manager for Emerson’s machine automation solutions business, “factories are becoming more connected than ever as organizations find themselves competing in an increasingly complex marketplace. Carefully designed hardware and software product sets are evolving to enable edge connectivity and AI acceleration, providing insights and efficiencies previously only dreamed of.”
He added, however, that users must carefully select hardware platforms to address scalability, expandability, and clear migration paths over time, as critical components like processors progress to their next generations. The most advanced automation suppliers are increasingly supporting companies in this goal. For example, Emerson’s Next Generation Industrial PCs platform is available with PACEdge – an edge-enabling software suite – as well as support for Windows 11 IoT.
“Many organizations are exploring or implementing edge software to manage critical competencies like Cyber Resilience Act (CRA) compliance and navigating the complex IT/OT interface. With edge software tools in place, industrial assets can take advantage of real-world data, as well as the information from the many seemingly unrelated assets surrounding an industrial process,” Mathason said. “That enhanced data can provide the kind of insight that previously would have required a senior plant manager with decades of experience at a particular site or niche application— a key benefit in an era of workforce shortages and lean teams.”
Mathason said that one of the primary challenges facing organizations today is that it has become increasingly difficult to entirely air gap systems. Operations teams are trying to figure out how to use data from potentially threatening sources or conduits (e.g., web access) to better advise an industrial process without leaving it vulnerable to attack. Legacy technologies and protocols

“Factories are becoming more connected than ever as organizations find themselves competing in an increasingly complex marketplace. Carefully designed hardware and software product sets are evolving to enable edge connectivity and AI acceleration, providing insights and efficiencies,” -- Alan Mathason, senior product manager, Emerson Machine Automation.
were often not designed with the cybersecurity features necessary to operate in a connected environment, but many organizations continue to rely on those systems and need a solution that helps them bridge the gap.
Modern automation solutions employ an application and architectural philosophy that exposes only advisory parameters instead of control capabilities. This philosophy, coupled with secure edge technologies, lends great power to solving challenges without forcing an organization to rip and replace its entire infrastructure.
“The next few years of AI evolution are likely to create a seismic shift in industry. Industrial AI is rapidly growing in power, and when it is coupled with effective edge technologies, it allows many advances in productivity,” Mathason said.
He said that, for example, layering edge concepts on high performance hardware can unlock automatic optical inspection that can quickly support a wide range of assembly methods at a fraction of the cost of dedicated machines. Further, modern edge solutions are typically architected to allow edge connectivity, unlike dedicated machines. Edge solutions share the data they observe, and can automatically learn and improve from datasets in adjacent production cells – or data from other geographic regions performing a similar function.
“Consider an example of a manufacturer with sites around the world that needs to build high-precision mechanical assemblies. In one location, operators may find assembly efficiencies or discover quality problems from a process or component supplier,” Mathason said. “Communicating that critical knowledge using traditional methods would rely on tight coordination and information-sharing practices at separate locations, which is often impractical with heavy workloads. AI systems that discover statistically significant events can share these observations with a high signal-to-noise ratio. Keying into these special events, systems may be architected to either highlight them for important review or, in many cases, solve global production cell problems automatically.”
Standard Ethernet enabling multiple traffic types for control, motion, safety, diagnostics, and IT.
“Across global manufacturing, the shift toward deterministic Ethernet based on Time-Sensitive Networking (TSN) is reshaping factory connectivity. TSN adds deterministic performance and converged network capability to standard Ethernet, enabling multiple traffic types—control, motion, safety, diagnostics, and IT—on a single network architecture,” said John Browett, General Manager, CC-Link Partner Association Europe.
“A key development has been the rise of open, high-bandwidth network technologies incorporating TSN, which address both the need for gigabit bandwidth and deterministic control across diverse devices. This helps reduce the costs involved in the construction of machines and systems that would previously have required separate networks for each of these functions,” Browett said.
Browett’s view is that the market is increasingly valuing networks that can support this converged model, especially when combined with gigabit bandwidth in order to handle the metaphorical “tidal wave” of data being generated by modern factory processes. They also simplify the integration of IT systems with those of the shop floor.
He said that two key trends and technologies stand out:
• TSN standardization: The IEEE 802.1 TSN Task Group and the IEC/IEEE 60802 profiles continue to refine how TSN can be applied to industrial automation. The results of this are expected in 2026. The TIACC activity will also deliver a standardized testing regime for TSN-capable devices, ensuring interoperability across multiple vendors, device types and functions.
• Converged real-time Ethernet architectures: Industry demand is rising for open networks capable of gigabit bandwidth with TSN

“Across global manufacturing, the shift toward deterministic Ethernet based on Time-Sensitive Networking (TSN) is reshaping factory connectivity. TSN adds deterministic performance and converged network capability to standard Ethernet, enabling multiple traffic types—control, motion, safety, diagnostics, and IT—on a single network architecture,” -- John Browett, General Manager, CC-Link Partner Association Europe.
convergence and determinism, supporting multi-axis motion, safety, control along with high bandwidth applications such as machine vision systems.
These trends reinforce the broader move toward open, TSN-based gigabit Ethernet ecosystems that can unify control-level real-time needs with data-intensive systems.
“Traditional industrial networks often required separating network traffic by function, leading to multiple, parallel networks. TSN addresses this by providing converged deterministic transport over standard Ethernet, avoiding the need for multiple separate networks,” Browett said. “TSN’s scheduling and synchronization mechanisms (e.g., 802.1Qbv, 802.1AS) allow mixed-priority traffic to share the same network without interference. Hence the need to engineer systems with multiple dedicated networks is starting to fade away.”
The result is technology that addresses the industry’s core requirement: enabling open, deterministic, converged, high-bandwidth Ethernet networks that eliminate historical trade-offs between speed, interoperability, and real-time performance.
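The 802.1Qbv scheduling mechanism Browett refers to works by cycling through a gate control list: each time slot in a repeating cycle opens the transmission gates for certain traffic classes and closes them for the rest. The sketch below is a simplified, hypothetical model of that idea; real gate control lists are configured per switch port in hardware with sub-microsecond precision, and the class names and slot durations here are invented for illustration:

```python
# Hypothetical gate control list: (slot duration in microseconds,
# traffic classes whose gates are open during that slot).
GCL = [
    (20, {"control", "motion"}),   # protected window for real-time traffic
    (80, {"best_effort", "it"}),   # remaining bandwidth for IT traffic
]
CYCLE_US = sum(duration for duration, _ in GCL)  # full cycle: 100 us

def open_classes(t_us):
    """Return the set of traffic classes allowed to transmit at time t_us.

    The schedule repeats every CYCLE_US microseconds, so we only need
    the offset of t_us within the current cycle.
    """
    offset = t_us % CYCLE_US
    elapsed = 0
    for duration, classes in GCL:
        if offset < elapsed + duration:
            return classes
        elapsed += duration
    return set()

print("control" in open_classes(5))      # True: inside the protected window
print("control" in open_classes(150))    # False: offset 50 us is IT time
```

Because the protected window is reserved exclusively, control and motion frames never queue behind bulk IT traffic, which is where the determinism comes from.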
“Within the next 1–3 years, AI could be expected to play a key role in the configuration and validation of TSN networks in order to reduce engineering effort and time
to market,” Browett said. “For systems in operation, AI could possibly also be employed in a diagnostic role to autonomously detect configuration and operation problems and address them in real-time, or just guide human technicians and engineers to solve issues faster than before.”
In the longer term, this may go one step further to allow the implementation of AI-controlled “autonomous” networks that can detect and address operational problems in real time, including automated responses to cybersecurity issues.
Browett said that, in summary, “AI will assist humans to increase the adoption of high bandwidth, deterministic converged open Ethernet networks by increasing performance, productivity and reliability across TSN-based gigabit networks.”
Single Pair Ethernet, 5G/Wi-Fi, data models such as PA-DIM and OPC UA, and AI.
Dr. Al Beydoun, ODVA President and Executive Director, said that “factory connectivity is currently shifting from a traditional hierarchical framework that breaks industrial control systems (ICS) into multiple levels vertically, as outlined in the Purdue model, to a more interconnected structure. This pervasive connectivity is driven by new technologies such as single-pair Ethernet (SPE), 5G and Wi-Fi, data models such as PA-DIM and OPC UA, and artificial intelligence (AI).”
Here is Beydoun’s view of the current landscape:
Single Pair Ethernet (SPE): Enables devices that were constrained by a historical lack of processing capability and small size to be connected directly to the Ethernet network in a cost-effective manner. Examples include RFID, temperature, and level sensors as well as contactors, push buttons, and motor starters. Two examples of SPE, Ethernet-APL for long reach and hazardous areas in the process industries and EtherNet/IP In-Cabinet, are enabling Ethernet connectivity where it previously wasn’t possible.
5G & Wi-Fi: Robust wireless connectivity allows devices such as vibration and temperature sensors to be connected directly to cloud applications for predictive maintenance. Additional applications such as enabling autonomous mobile robots (AMRs) and automated guided vehicles (AGVs) allow for enhanced production flexibility both on the line and through parts resupply.
Data Models: Structured data models include semantics and scaling, providing the meaning, context, and structured relationships that allow data to be used in plant-wide energy and output optimization efforts. Two examples of data models are PA-DIM for process automation and OPC UA

“Factory connectivity is currently shifting from a traditional hierarchical framework that breaks industrial control systems (ICS) into multiple levels vertically, as outlined in the Purdue model, to a more interconnected structure. This pervasive connectivity is driven by new technologies such as single pair ethernet (SPE), 5G and Wi-Fi, data models such as PA-DIM and OPC UA, and AI,” -- Dr. Al Beydoun, ODVA President and Executive Director.
for broad factory automation use.
AI: Controls engineers have been optimizing local processes and applications through custom algorithms since the advent of automation. What AI brings to the table is the ability to consolidate data points from thousands of processes and applications across a plant to be able to provide insights and recommendations on how to improve the macro level processes.
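The "semantics and scaling" that data models add can be pictured with a small, hypothetical tag model: a raw transmitter reading only becomes usable plant-wide once the model attaches a unit and a scaling range to it. The class and field names below are invented for illustration; they are not actual PA-DIM or OPC UA types:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SemanticTag:
    """Minimal stand-in for a data-model entry: a value plus its meaning."""
    name: str
    unit: str        # engineering unit the scaled value is expressed in
    raw_min: float   # raw signal range, e.g. a 4-20 mA current loop
    raw_max: float
    eu_min: float    # engineering-unit range the raw span maps onto
    eu_max: float

    def to_engineering(self, raw: float) -> float:
        """Linearly scale a raw reading into engineering units."""
        fraction = (raw - self.raw_min) / (self.raw_max - self.raw_min)
        return self.eu_min + fraction * (self.eu_max - self.eu_min)

# A 4-20 mA temperature transmitter spanning 0-100 degC:
reactor_temp = SemanticTag("reactor_temp", "degC", 4.0, 20.0, 0.0, 100.0)
print(reactor_temp.to_engineering(12.0))  # 50.0 (mid-scale)
```

The point of a standardized model is that an optimization application can consume `reactor_temp` without knowing anything about the loop wiring: the semantics travel with the value.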
“AI has been solving very specific and challenging problems in industrial automation for several years now via custom-built models,” Beydoun said. “The next couple of years will bring AI to the fingertips of many more plant workers via more cost-effective, out-of-the-box software solutions that will be tied into existing controller, ERP, and cloud systems. For example, agentic AI can help operators improve production decisions by offering recommendations to improve throughput and quality based on real-time data. AI can also assist with identifying and predicting device failures before they cause downtime, allowing for parts to be ordered and maintenance to be scheduled before an emergency arises.”
Beydoun said it’s important to note that AI is going to require humans to review and approve recommendations for the time being to ensure that the decisions are on target.
One of the more important tasks in the next couple of years will be making production data available to higher level systems through the addition of semantics and scaling. Data models are a key solution to making data more usable for AI in the near term.
“The hope for AI in the long term is that it will enable Industry 5.0 by providing a new level of data integration and anticipation. Factories will be able to proactively change raw material orders and production targets based on current demand to improve profitability,” Beydoun said. “Sustainability is another key area where AI will help optimize energy usage through waste reduction. AI will also continue to reduce repetitive tasks and free up workers to focus on higher-level design and strategic work.”
Fostering a continuously learning operating environment to adapt in real time to changing conditions, demands, and business goals.
Rambod Dargahi, senior product marketing manager at Seeq, said that “the software side of factory connectivity is evolving rapidly.”
First, the way data is organized and shared across the factory is becoming far more intelligent. Modern intelligent platforms are creating a single, unified layer where every sensor, process, and piece of equipment can contribute to and draw from a common,
real-time picture of the entire operation.
Second, the boundary between the operational technology (OT) and information technology (IT) worlds is dissolving, so a production manager's dashboard and a shop-floor controller are effectively speaking the same language.
Third, AI is playing a growing and transformative role. It doesn’t just analyze historical data, but continuously monitors live operations and surfaces and prioritizes the decisions that need to be made— including those not yet recognized. It recommends the actions for greatest impact, unleashing new possibilities for manufacturing organizations.
“Taken together, these trends are moving factory software from a collection of rigid, purpose-built tools toward a dynamic, intelligent, and continuously learning operating environment that can adapt in real time to changing conditions, customer demands, and business goals,” Dargahi said.
“Industrial AI, built on an advanced analytics foundation, is driving measurable performance gains across facilities by connecting data from multiple sources into a single, self-service environment for engineers and operations teams,” Dargahi said. “Now, with the rise of agentic industrial AI, a new chapter is opening where AI agents can plan and execute

"Trends are moving factory software from a collection of rigid, purpose-built tools toward a dynamic, intelligent, and continuously learning operating environment that can adapt in real time to changing conditions, customer demands, and business goals," -- Rambod Dargahi, senior product marketing manager, Seeq.
multi-step work, integrating more types of information—from live time-series events to unstructured reports, procedures, and past decisions.”
He said that this evolution comes to life in the recent release of Seeq Intelligence, an AI-driven decision intelligence layer on the Seeq platform that introduces advanced agentic AI capabilities to deepen operational understanding, answer complex questions, reveal hidden insights, and support—rather than replace—expert judgment.
“Within this layer, natural-language interaction lets experts ask questions the way they think, and reusable workflows standardize how analyses are run and shared. These capabilities turn fragmented data and hard-won expertise into clear, reusable analyses and prioritized recommendations, so experts spend less time hunting for data and more time applying their judgment to high-impact decisions,” Dargahi said. “By creating a comprehensive, connected view of manufacturing operations enriched with accumulated experience, these types of intelligence platforms provide continuously evolving systems of learning and improvement, helping teams tackle current and future challenges, and unlock breakthrough operational and business performance at enterprise scale.”
Dargahi said that these changes are meant to address a series of engineering challenges.
Data Overload: Trends, alarms, work orders, lab results, and shift notes compete for engineers’ and operators’ attention, more data than any human can reasonably absorb. Industrial AI and decision intelligence innovations contextualize time-series data, then layer in operational knowledge and subject-matter expertise so teams can see what’s happening, why it’s happening, and which actions matter most.
Inefficient Workflows: When issues arise, plant personnel historically needed to swivel between multiple disparate tools, export data to spreadsheets, rebuild plots and trendlines, and manually stitch together context from scattered information. This slows analyses and makes it hard to standardize best practices. Seeq brings these steps into a single, governed environment, using AI to turn one-off, manual efforts into repeatable, guided workflows that can be standardized and scaled across sites.
Knowledge Silos: Historically, a great deal of engineering knowledge lived in notebooks, emails, or the heads of a few veteran engineers at a plant, and even when it was captured, making it scalable across sites and teams remained difficult. Modern intelligence platforms like Seeq are designed to change that, capturing and organizing this expertise in a unified environment where operational knowledge and insights are readily accessible, reusable, and adaptable across the entire enterprise.
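One simple way to turn the data overload described above into a prioritized worklist is to score each new reading against its own recent history. A rolling z-score is the textbook version of that idea; the sketch below is generic and does not reflect how Seeq or any vendor actually implements prioritization:

```python
from statistics import mean, stdev

def anomaly_scores(values, window=5):
    """Score each point after a warm-up window against the preceding
    `window` points. Higher scores are more worth an engineer's attention.
    """
    scores = []
    for i in range(window, len(values)):
        hist = values[i - window:i]
        m, s = mean(hist), stdev(hist)
        if s == 0:
            # Flat history: any deviation at all is maximally surprising.
            scores.append(0.0 if values[i] == m else float("inf"))
        else:
            scores.append(abs(values[i] - m) / s)
    return scores

steady = [10.0] * 5
print(anomaly_scores(steady + [10.0]))  # [0.0]: nothing to flag
print(anomaly_scores(steady + [50.0]))  # [inf]: a step change jumps the queue
```

Ranking tags by this score, rather than by raw alarm count, is the essence of surfacing "which actions matter most."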
“Industrial AI is becoming the primary catalyst for the next wave of industrial transformation, shifting factory connectivity from simply moving data around to continuously identifying the most important decisions and opportunities,” Dargahi said. “But AI on its own cannot capture the full operational context or human judgment. Real, durable change comes when AI is fused with subject-matter expertise and institutional knowledge, and grounded in prior analyses, decisions, and actions across an enterprise.”
He said that AI is contributing to these innovations as a living decision intelligence layer that continuously learns from process data, systems, and people. It acts like an always - on analyst that partners with operations, engineering, and maintenance teams while keeping experts firmly in control of judgment and final actions. AI reduces the friction that has historically kept teams from moving beyond simple data consumption. It helps organizations advance from ad-hoc analysis to prioritized, higher-value decisions that drive measurable gains in efficiency, margins, and sustainable performance.
“These advancements will increasingly change how manufacturing teams work by turning operational knowledge into a living, shareable asset that distributed teams across sites and functions can tap into,”

“The current state of factory connectivity leverages advanced hardware like Industrial Ethernet switches, network management software, and industrial edge devices, alongside software for factory automation application development and cloud connectivity, to enable seamless IT/OT integration and real-time data exchange,” -- Raj Rajendra, portfolio sales specialist, Siemens Industry.
Dargahi concluded. “By offloading repetitive exploration of data and documents, AI enables faster, more confident human-in-the-loop decision-making at every level—from plant - floor troubleshooting to corporate strategic planning. It not only helps organizations resolve today’s issues more effectively, but it also empowers teams tackling related challenges to proactively collaborate and continuously improve future performance. It's human intelligence, amplified at scale.”
Enabling real-time decision-making, low-latency communication, localized data processing, and secure operations.
“The current state of factory connectivity leverages advanced hardware like Industrial Ethernet switches, network management software, and industrial edge devices, alongside software for factory automation application development and cloud connectivity, to enable seamless IT/OT integration and real-time data exchange,” said Raj Rajendra, portfolio sales specialist for Siemens Industry.
“Companies lead this evolution with technologies like Industrial 5G for
low-latency wireless communication, edge computing for localized data processing, and enhanced cybersecurity through network firewalls and security suites,” Rajendra said. “These innovations support predictive analytics, digital twins, and standardized communication via OPC UA, driving efficiency, scalability, and sustainability in smart manufacturing. The result is improved decision-making, optimized production, and reduced downtime, ensuring factories are future ready.”
Rajendra said that “factory connectivity is advancing through key trends like IT/OT convergence, industrial 5G, edge computing, cybersecurity, and digital twins. These trends enable real-time decision-making, low-latency communication, localized data processing, and secure operations.”
Companies drive this evolution with impactful solutions, such as network components for secure and high-performance communication, software for centralized network management and cybersecurity, and industrial edge for real-time analytics and predictive maintenance. Additionally, factory automation software simplifies automation system integration, while cloud connection provides cloud-based IoT analytics. Together, these technologies
optimize machine networks, enhance automation performance, and future-proof factories for smart manufacturing.
According to Rajendra, innovations in factory connectivity are addressing critical engineering challenges in modern manufacturing.
Key issues include system integration, real-time data processing, network reliability, cybersecurity, and scalability. Factory automation software and OPC UA simplify IT/OT integration, while industrial edge and industrial 5G provide low-latency, localized data processing. Network components ensure robust and reliable communication in industrial environments, and security suites enhance cybersecurity to protect against rising threats. For scalability and future-proofing, cloud connection and digital twins provide flexible, data-driven solutions to adapt to evolving production demands. These technologies collectively ensure efficient, secure, and adaptable operations in increasingly complex and digitalized manufacturing environments.
Impact of AI Technology
“AI is poised to revolutionize factory connectivity in the next 1-3 years by enabling smarter, more autonomous manufacturing,” Rajendra said.
“Key contributions include predictive maintenance, where AI analyzes machine data to prevent failures and reduce downtime, and real-time data insights for optimized decision-making.” He said that AI will also drive autonomous systems, enhance cybersecurity by detecting threats in real time, and enable seamless communication across machines and systems.
“The future impact of AI connectivity advancements includes increased efficiency, scalability to adapt to production demands, improved collaboration between systems, and enhanced sustainability through optimized energy use and reduced waste,” he added. “Solutions like industrial edge and AI-based predictive maintenance are already paving the way for these advancements, ensuring factories are more adaptive, secure, and sustainable.”
Most manufacturers are prioritizing greater data availability and accessibility.
Jason Pennington, director of digital solutions at Endress+Hauser, said that there are emerging trends within industry highlighted by TCP/IP-based communication networks, for example, PROFINET and EtherNet/IP.
“As advanced physical layer (APL) becomes more available and understood, we’re seeing smaller projects proliferate into areas of existing facilities aligned with migration of legacy networks, and even low-risk learning opportunities around non-critical operations,” Pennington said.
“Most manufacturers are prioritizing greater data availability and accessibility. This can pertain to the host systems in terms of simplicity, or even leveraging security and network schemes to make operational data more available in real-time or near real-time to stakeholders throughout the enterprise,” he said. “We’ve seen some planning architectures go from multiple network types down to a single or dual strategy to better fit economics, integration, and plant learning requirements.”
He cited a simple example, where they have seen EtherNet/IP replace a myriad of analog/discrete and fit-for-purpose/boutique networks, consolidating instruments, actuators, and other field devices into a singular network that reports up to a host system. This makes daily life much simpler for operators, maintenance personnel, managers, and systems integrators.
Pennington said that “AI is everywhere and, in many ways, it’s unavoidable. The

key areas to watch are how usable it is for personnel, along with the philosophy or guide rails we, as an industrial community, establish for working with AI technology.” Today, Endress+Hauser is leveraging several language models to write unique operation and efficiency playbooks, plus systematic failure mode and effect analysis documentation, to complement what the company can deliver in a uniquely contextual manner.
“We have also used unique monitoring-assist solutions from an innovative collaborator called AI-Ops to effectively add an extra set of eyes beyond sensors and control systems to analyze efficiencies, highlight anomalies, and create insights regarding unit operations,” he said.
But there is also a need to provide solutions for legacy systems.
Bruce Cloutier, the CEO/Founder of INTEG (jnior.com), said that you may have heard the quote: “The good thing about standards is that there are so many to choose from!” “That is very true today and it was equally as pertinent 50 years ago. That quote gets attributed to different sources. I side with it being something that Ken Olsen said, as I cut my teeth (so to speak) on a PDP-8 back in 1969 from his not-so-small company, Digital Equipment Corporation, better known as DEC,” Cloutier added.
Cloutier said that, no matter what you do, there is always somebody who thinks they can do it better. Through the years there has been an ongoing train of Protocol du Jour, often focused on specific industries (BACnet, for instance). It is a constant battle, with end users looking to have choices of competitive products at competitive prices, all easily dropped into their systems, and manufacturers all trying to control the user experience and the performance of the equipment they stand behind. Manufacturers are focused on developing treasured customer loyalty, or lock-in, but for users, open systems and universal connectivity are the Holy Grail.
“The fact is that new connectivity, as might be offered by OPC UA and MQTT, brings new opportunities to the industry. New ways, and perhaps better ways, to accomplish new goals,” Cloutier said. “But that does not address the billions already invested in legacy equipment. None of that should be relegated to the landfills. No factory should be forced to disrupt a functioning line in the name of connectivity.”
He said that the only real solution to these legacy equipment issues is an easy-to-use, reliable, and stable intermediary device that can make a legacy MODBUS device work seamlessly in the MQTT world. For reasons that should be obvious, we should not depend on Arduino or Raspberry Pi for protocol conversion in industrial environments, and even the typical PLC lacks the internal architecture and features to serve in that role generically.
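The intermediary Cloutier describes is, at its core, a mapping layer: poll Modbus registers, apply scaling, and republish the result as self-describing MQTT payloads. The sketch below shows only that translation step, with the polling and publishing transports omitted; the register addresses, topic names, and scale factors are all hypothetical:

```python
import json

# Hypothetical register map: holding-register address ->
# (MQTT topic suffix, scale factor, engineering unit)
REGISTER_MAP = {
    40001: ("line1/temperature", 0.1, "degC"),
    40002: ("line1/pressure", 0.01, "bar"),
}

def registers_to_mqtt(registers, base_topic="factory"):
    """Translate raw Modbus register values into (topic, JSON payload)
    pairs, ready to hand to any MQTT client's publish call.
    """
    messages = []
    for addr, raw in registers.items():
        suffix, scale, unit = REGISTER_MAP[addr]
        payload = json.dumps({
            "value": round(raw * scale, 6),  # scaled engineering value
            "unit": unit,
            "register": addr,                # provenance for debugging
        })
        messages.append((f"{base_topic}/{suffix}", payload))
    return messages

topic, payload = registers_to_mqtt({40001: 235})[0]
print(topic)                         # factory/line1/temperature
print(json.loads(payload)["value"])  # 23.5
```

Keeping the map in configuration rather than code is what lets one gateway front many different legacy devices without firmware changes.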
“As for AI, consider that manufacturers rely on precise, accurate, and highly repeatable processes. AI is at best a statistical process, and by virtue of that it will be wrong and inaccurate from time to time. The popular term is hallucination,” Cloutier said. “But when that happens, if AI is inappropriately deployed, the results can be devastating, painfully apparent, and costly. That won't be any hallucination.”
“To the extent that AI can be properly trained and deployed where its failures can be mitigated, it can be of great benefit,” he added. “Unfortunately, few understand what AI actually is, how it really works, and therefore know what they shouldn't trust it to do.”
Al Presher, Editor, Industrial Ethernet magazine
Complex software repositories can be set up using AAS technology and serve as a tool for managing the software lifecycle. The device and update management system uses the OPC UA DI standard to update device firmware via OPC UA. The resulting benefit is software that is always up to date.

A DIGITAL TWIN IS A VIRTUAL REPRESENTATION of an asset over its entire life cycle. This also includes non-physical assets such as software. There are a number of specific use cases for the digital twins of software. These include storing, managing, versioning and distributing software via a digital twin. Various use cases show how well the AAS and OPC UA technologies complement each other.
The digital twin has been one of the trending topics in automation technology for years. According to a definition from the publication “The role of the Industry 4.0 “administration shell” and the “digital twin” in the life cycle of a plant: navigation aid”, it refers to the administration shell of an asset (asset administration shell, AAS). An asset can be an individual component, a field device, an entire system or software. Use cases of the AAS that are currently being discussed the most include:
• the digital nameplate
• paperless documentation by storing digital documents in the AAS
• the provision of technical product features in the AAS and
• making the CO2 footprint available in the AAS
In addition, digital twins have the potential to advance end-to-end, integrated plant engineering in industrial manufacturing and thus increase the efficiency of production plant engineering in general.
The following use cases were described and examined in more detail in an earlier article:
• Management of software modules based on asset administration shells
• Obtaining and reusing engineering data that is located in the plant AAS
• Use of engineering-relevant data from the AAS of the components
• Storage of engineering data generated during PLC engineering in Asset Administration Shells so that it can be reused in other engineering tools.
This article builds on that work and focuses specifically on software management, i.e. saving and managing software with AAS and distributing it using OPC UA DI. To this end, the article briefly introduces the
AAS technology – in particular the “Software Nameplate” submodel – and the “Software Update Model” in the OPC UA DI standard. It shows how software can be stored, versioned and referenced in AAS software repositories. The authors explain how software can be retrieved from the repositories and loaded onto devices via the OPC UA DI standard. Finally, an assessment is made of the extent to which vendor-neutral, holistic software management is feasible with the AAS and OPC UA.
AAS information is available in submodels
AAS technology is supported and promoted

by the Industrial Digital Twin Association e.V. (IDTA) as a cross-company technology consortium. The general AAS metamodel has now been transferred to IEC standard 63278-1:2023. Asset Administration Shells differentiate between type and instance AAS based on a class-object relationship. A type AAS represents the type of an asset. An instance AAS represents a concrete existing asset; it is generated when the asset is created and may be derived from a type AAS.
The information actually contained in the Asset Administration Shells is available in the submodels. As of July 2025, the IDTA lists 98 submodels, both predefined (i.e. published) and currently in development. These cover the basic use cases mentioned above, for example. Based on the predefined submodel "Hierarchical Structures", the hierarchical structures of the AAS are implemented as a list of references to subordinate AAS. The Asset Administration Shells are hosted in AAS repositories on AAS servers. Access to the AAS of an AAS server is via its REST API. Security mechanisms – such as fine-grained rights management – are already integrated into
the AAS technology. On the production side, individual devices and components have a QR code that links to the AAS.
Software packages can be stored in the submodel
The aim of the Software Nameplate submodel is to provide software-related information in a standardized and interoperable way. The submodel consists of two aspects:
• the SoftwareNameplateType, which describes type-related properties such as product designation, version, manufacturer information or installation source.
• the SoftwareNameplateInstance, which includes instance-specific details such as serial number, installation path, architecture used and contact information.
Software packages can be stored directly in the type aspect of the submodel. Alternatively, the submodel allows the referencing of external sources of supply, for example a downstream software repository. Information on the download URL is recorded here.
Software update model determines standardized download interface
The OPC UA DI standard (Device Integration) is an extension of the OPC UA standard that was developed specifically for the uniform modeling and integration of devices in automation systems. It defines standardized information models and interfaces to represent and control devices from different manufacturers in an interoperable manner. The DI standard forms the basis for other standards aimed at modeling devices, such as “OPC UA for Profinet”, “OPC UA for IO-Link” or “OPC UA for PLCopen”. The DI specification explains three models that build on each other: the “(base) Device Model”, “Device Communication Model” and “Device Integration Host Model”. Two add-ins are also defined: the “Locking Model” and the “Software Update Model”.
The Software Update Model defines a standardized interface for loading software – for example firmware – onto a device via OPC UA. The download and installation of the software can be completely decoupled. For example, firmware updates can first be rolled


out to several devices and then installed on all devices at the same time. To ensure the function of a device if an update fails, a fallback version can be defined via the model.
Figure 2 shows the software update model in the address space of the OPC UA server of a Phoenix Contact controller.
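The decoupled rollout with fallback described above can be illustrated with a toy state model. This Python sketch is only illustrative: the class, method and attribute names are invented for this example and are not OPC UA DI node names.

```python
class Device:
    """Toy model of a device supporting decoupled software updates:
    packages are staged first and installed later, and a fallback
    version preserves function if an installation fails."""

    def __init__(self, version):
        self.current = version    # version currently running
        self.fallback = version   # last known-good version
        self.pending = None       # staged but not yet installed

    def stage(self, package_version):
        """Roll the package out to the device without installing it."""
        self.pending = package_version

    def install(self, ok=True):
        """Install the staged package; on failure, keep the working version."""
        if self.pending is None:
            return self.current
        if ok:
            self.fallback = self.current
            self.current = self.pending
        # on failure the staged package is discarded and the device
        # keeps running its current (fallback-protected) version
        self.pending = None
        return self.current

controller = Device("1.0")
controller.stage("1.1")
ok_version = controller.install(ok=True)       # update succeeds
controller.stage("1.2")
failed_version = controller.install(ok=False)  # install fails, version kept
```

Staging several devices first and calling install on all of them afterwards mirrors the fleet-wide rollout scenario described in the text.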
Software repository enables the detection of changes to software dependencies
Figure 3 illustrates how software repositories can be realized with AAS using the Hierarchical Structures and Software Nameplate submodels mentioned in the previous sections. The repository can be an AAS server or a list of the AAS of software modules.
In the first step, each software module is represented by an AAS that contains the Hierarchical Structures submodel. Each release of a software module has its own AAS, which is referenced in the AAS of the software module. The type aspect of the Software Nameplate submodel is defined in the AAS of the releases. Among other things, the software package of the release itself or its source of supply is stored in this submodel. Each release AAS also contains a Hierarchical Structures submodel with references to the AAS of all software releases on which this release is dependent.
Figure 4 shows the life cycle of software that is linked to the software repository of an AAS. The figure illustrates that there is even more
information that should be stored in the AAS of the software module.
This includes, for example, requirements and interface descriptions, documentation and test reports. The figure also explains that changes to software dependencies can also be determined from the software repository. Such a mechanism is useful in order to be able to react to any safety-critical updates in software dependencies.
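Because each release AAS references the releases it depends on, a repository client can walk those references in reverse to flag everything affected by a safety-critical patch to a dependency. A minimal Python sketch, assuming each release is identified by a simple string and its dependency references are held as a plain list (the release names are invented for illustration):

```python
from collections import defaultdict

def affected_releases(releases, patched):
    """Return every release that transitively depends on the patched
    release and may therefore need revalidation.

    releases: mapping of release id -> list of dependency release ids,
    mirroring the references in each release AAS's Hierarchical
    Structures submodel."""
    # Invert the dependency references: dependency -> releases that use it
    dependents = defaultdict(set)
    for release, deps in releases.items():
        for dep in deps:
            dependents[dep].add(release)

    affected, stack = set(), [patched]
    while stack:
        current = stack.pop()
        for release in dependents[current]:
            if release not in affected:
                affected.add(release)
                stack.append(release)
    return affected

# Hypothetical repository: PumpApp 2.0 depends on DriverLib 1.3,
# which in turn depends on CryptoLib 1.1
releases = {
    "PumpApp-2.0": ["DriverLib-1.3"],
    "DriverLib-1.3": ["CryptoLib-1.1"],
    "HmiApp-1.0": [],
}
flagged = affected_releases(releases, "CryptoLib-1.1")
```

A patch to CryptoLib-1.1 flags both DriverLib-1.3 and PumpApp-2.0 for review, which is exactly the reaction mechanism the repository's dependency references make possible.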
Device and Update Management System obtains new installation packages from the AAS
Figure 5 shows how a software package is retrieved from the software repository of an AAS and loaded onto a device via the software update model of OPC UA DI. The central element here is a piece of software referred to below as the “Device and Update Management System (DaUM)”.
This software accesses the OPC UA servers of subordinate devices – such as controllers – and reads the version status of their firmware or other software. The DaUM uses the AAS software repository via the standard REST interface of the AAS server and checks at regular intervals whether a new software version is available there. If so, the DaUM obtains the installation package from the AAS, loads it onto the device via the OPC UA DI software update model and runs the installation there.
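The version check at the heart of that polling loop reduces to a small comparison. This sketch assumes simple dotted version strings; the real DaUM would read the device version over OPC UA and query the repository version via the AAS REST API, neither of which is shown here:

```python
def parse_version(text):
    """Parse '1.10.2' into (1, 10, 2). Tuple comparison orders versions
    correctly, whereas string comparison would rank '1.9' above '1.10'."""
    return tuple(int(part) for part in text.split("."))

def needs_update(device_version, repository_version):
    """True if the repository advertises a newer release than the
    version currently reported by the device."""
    return parse_version(repository_version) > parse_version(device_version)
```

If `needs_update` returns True, the DaUM would fetch the package and drive the OPC UA DI software update model to stage and install it.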
Implementation of AAS software repository possible with current technologies
This article explains how complex software repositories can be set up using AAS technology and how they interact with AAS over the software lifecycle. The submodels used for this are already published today, so it is possible to implement AAS software repositories with the current state of the art. The article illustrates how a so-called device and update management system uses OPC UA DI to update device firmware via OPC UA. Such device and update management systems are already available on the market, for example the software of the same name from Phoenix Contact. The next logical step towards completely vendor-neutral software management would be to link AAS software repositories with the installation mechanisms of the OPC UA DI standard. The last unresolved hurdle for complete vendor neutrality would then be the format of the installation packages themselves. There is therefore a need for further specifications to define a general, manufacturer-neutral format for software packages.
Prof. Dr. Andreas Würger, Professor of Industrial Control Engineering, HAW Hamburg and Arno Martin Fast, Senior Specialist, PLCnext Technology, Phoenix Contact GmbH & Co. KG
AI does not create manufacturing insight. It consumes structured information and returns analysis. If the underlying data is inconsistent, undocumented or implicitly defined, the results will be unreliable regardless of how advanced the analytics appear to be.

AI IN MANUFACTURING FAILS FOR ONE SIMPLE reason: inconsistent data structure. Predictive maintenance, quality analytics and digital twins depend on standardized, versioned and semantically consistent data models. A Master Data Model defines the meaning, units, hierarchy and governance of manufacturing data so AI systems can consume it without guesswork.
Without this foundation, analytics become brittle, integrations multiply and plant-to-plant comparisons break down. This guide explains what a manufacturing data model is, how to architect it, how to govern it and how to expose it securely so AI systems can operate with accuracy and scale.
In the earliest manufacturing systems, data context was implicit. A designer/programmer built systems with implicit understanding of the formats, data types and other attributes of data passed from a manufacturing system to an application.
While standards exist, we rely too much on implicit and a priori context. Today, for
an AI model or any other application to properly consume and utilize data, it must have clear, consistent and standardized data models. This paper describes what those data models are, how to construct and locate them and how to provide access to them.
Standardized data models are critical to the success of digitalization and Smart Manufacturing.
1. Data models must be organized in response to the specific problems faced by the manufacturer.
2. No matter where data models are stored, applications must be provided with standard, IT-friendly access mechanisms.
3. Standardized and consistent metadata benefits the successful deployment of data models.
4. Integrating data models with the data product (a filled-in instance of the data model) enables AI systems to deliver the most value.
Smart Manufacturing – A data-driven approach to optimizing production processes and creating manufacturing processes that enhance efficiency, quality and flexibility.
Data Model – A collection of related attributes of a manufacturing system described by some technique such as a User Defined Type (UDT), a database schema, a set of variables structured in code or even a paper diagram.
Data Schema – The formal structure of a Data Model, defining how the data is organized, stored and related.
Data Model Instantiation – Assigning data model elements to specific memory locations in a production control system, such as a tag in a PLC, a register in a Modbus TCP device, an attribute value in an OPC UA server or the present-value property in a BACnet device.
Data Product – A Data Product is an instantiated data model with current data values for some or all the data elements in the data model.
Metadata – Descriptive data associated with a data item in a data model.
A data model in manufacturing is simply a set of related elements (variables, points, registers) in a production control system. This definition differs slightly from that of a data model in an enterprise system.
A data model in enterprise applications is typically a diagram of a software system and its data elements. It is most often used to visualize the various elements flowing in and out of a database.
A data model in a manufacturing system defines a set of elements in a production control system that are related to each other. A data model for pump maintenance, for example, may organize cycle counts, run times, energy usage, asset identification and location data for a set of pumps.
Manufacturing data models can be simple or very complex. A single temperature can be a data model if it meets the needs of a consumer of the data product (an instance of the data model containing actual data). A complex data model can have tens or hundreds of related elements organized hierarchically. Hierarchical data models are used to describe more sophisticated relationships among the elements.
Smart Manufacturing systems that deploy company-wide, standardized data models are simpler to maintain, easier to extend and reduce the cost of integration. Standard data models allow AI systems to easily consume data, understand its context and provide sound analytic judgements. Applications that rely on inconsistent or implicit data definitions are brittle, prone to failure and difficult, if not impossible, to maintain.
Smart Manufacturing systems cannot be successful without standardized data models. Without standardized data models, applications cannot properly consume data. At the most basic level, a value of “50” can be interpreted as 50 centigrade, 50 hertz or 50 psi.
Without a data model, an application must use other, a priori, mechanisms for data interpretation. Implicit definitions like that are undocumented, difficult to extend to additional applications and cause chaos when manufacturing data values are misinterpreted.
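As a minimal illustration of how explicit metadata removes that ambiguity, the following sketch pairs each raw value with the unit and source needed to interpret it. The element names and tag addresses are taken from the examples in this article and are purely illustrative:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DataElement:
    """One element of a data model: the raw value plus the metadata an
    application needs to interpret it without a priori knowledge."""
    name: str
    value: float
    unit: str
    source_tag: str  # where the value lives in the production control system

# The bare value 50 is ambiguous; the model makes each reading explicit.
pressure = DataElement("Section 2 Pressure", 50, "psi", "Register 40202")
frequency = DataElement("Line Frequency", 50, "Hz", "Tag Speed44")
```

Two identical raw values now carry different, unambiguous meanings, which is exactly what a consuming application or AI model needs.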
Data modeling begins with the unique needs of the organization – the problems that must be solved. Data models can be built to optimize uptime, reduce waste, enhance maintenance or address the organization’s specific problems and priorities.
Figure 1. A portion of a pump data model in JSON notation:
{
  "$schema": "https://json....example...schema.json",
  "PumpingStationRecord": {
    "@releaseID": "",
    "CreationDateTime": ""
  },
  "Pump": {
    "ID": "P101",
    "Location": "EastPlant",
    "InstallDate": "10/17/2017",
    "Asset": {
      "Vendor": "MXC",
      "Type": "Centrifugal",
      "InstallDate": "10/17/2017"
    },
    "Value": {
      "Speed": "-1",
      "Max": "205",
      "Min": "120",
      "CycleCount": "-1"
    }
  }
}
Begin with the decisions the operations or management team needs to make, and work backward to all the data values that are integral to those decisions. Once the requirements are known, identify the names, units, data types and semantics of each element of the data model. This is where Temp21 becomes “Oven 21 Current Temperature”, measured in Fahrenheit and stored as an integer.
Data models like this provide a template for a data collection device to translate the raw physical signal into the data needed by the model.
Metadata (sometimes known as properties) is static, descriptive information that provides a more complete understanding of a data element. Data elements in most production control systems are, by themselves, completely undecipherable (Register 40202) or mostly undecipherable (Tag Speed44). Without metadata, these data elements are often useless unless knowledge is communicated implicitly or via an external mechanism.
There are multiple types of metadata. Metadata can be:
• Descriptive, such as “Section 2 Pressure”.
• Structural, such as data type, max/min and array dimension.
• Administrative, such as last read time, display name and facility name.
Since metadata is static, it can be created when the data model is created or when the data model is instantiated with tags from the production control system. Any metadata known and available when the data model is created should be associated with it. Any other metadata known by the control system personnel should be assigned to the data model when it is instantiated.
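A minimal sketch of that two-stage association, assuming metadata is carried as plain key–value pairs: creation-time metadata is merged with whatever the controls team supplies at instantiation, and the model's own definition wins on any conflict so the master model stays authoritative. All field names below are illustrative:

```python
def instantiate(model_element, control_system_metadata):
    """Merge creation-time metadata (name, unit, type) with
    instantiation-time metadata from the controls team (tag address,
    deadband). The model definition takes precedence on conflicts."""
    merged = dict(control_system_metadata)
    merged.update(model_element)
    return merged

# Known when the data model is created:
creation = {"name": "Oven 21 Current Temperature", "unit": "F", "type": "int"}
# Known only when the model is bound to a control-system tag:
at_instantiation = {"tag": "Temp21", "deadband": 0.5}

element = instantiate(creation, at_instantiation)
```

The resulting element carries both layers of metadata, so a consuming application never has to guess what Temp21 means.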
Data models are simple hierarchical lists of data elements and static metadata. Figure 1 illustrates a portion of a data model for pumps written in JSON notation. The pump section includes identity, asset and value sections that are each further described with metadata that is known at the time of data model creation.
While JSON is one of the better ways to organize manufacturing data models, models are commonly implemented using many different mechanisms:
In programmable controllers, models are organized using User-Defined Types (UDTs). In databases, models are organized using schemas that define the contents of each table.
In an Excel spreadsheet, the columns are organized such that successive columns indicate hierarchy.
In text files, models are typically formed using JavaScript Object Notation (JSON).
Where Should Data Models and Metadata Be Stored in a Manufacturing System?
Data models must be located where every application consumer can access them. There are no data governance rules specifying how the models should be organized, but the file containing these master data models must be available to every application that wants to use them.
Keeping data models in a single place enhances the data architect’s ability to maintain them and provide versioning when the models are modified.
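A single, versioned home for models can be sketched in a few lines. A real deployment would back this with a database, SharePoint library or repository with access control; the class and method names here are invented for illustration:

```python
class ModelStore:
    """Minimal sketch of a central, versioned master data model store."""

    def __init__(self):
        self._models = {}  # model name -> {version tuple: model definition}

    def register(self, name, version, definition):
        """Add (or supersede) a version of a named model."""
        self._models.setdefault(name, {})[version] = definition

    def latest(self, name):
        """Return (version, definition) for the newest registered version."""
        versions = self._models[name]
        newest = max(versions)  # tuple comparison: (1, 1) > (1, 0)
        return newest, versions[newest]

store = ModelStore()
store.register("PumpModel", (1, 0), {"ID": "str"})
store.register("PumpModel", (1, 1), {"ID": "str", "Location": "str"})
newest, definition = store.latest("PumpModel")
```

Because every application asks the same store, a revised model becomes visible everywhere at once, which is the maintainability benefit described above.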
Create Company-wide Standard Models Appropriate to Your Manufacturing Operation. Use standard models tailored to your
industry and specific manufacturing operation. Make them widely available throughout your organization to promote interoperability throughout the plant and the enterprise. Storing models within applications or failing to use standard models results in reduced efficiency and chaos throughout an organization.
Use Standard Protocols and Open APIs. Standards are the key to reducing ambiguity, increasing maintainability and eliminating the chaos of bespoke systems. ISA 95, ISA 88, OpenAPI, ISA 101, PackML and other standards create common languages and models that connect different factory floor systems, reducing the chaos caused by non-standard information transfer.
Define Data Models That Focus on the Priorities of Your Manufacturing Operations. Data models have one ultimate application: to enhance the operation of your manufacturing production systems. Start with the end in mind by defining the problems to be solved, the data elements required to solve them and building those data elements into data models.
Define Clear Ownership and Governance. The meaning of your data must be clear and consistent throughout the organization. Using different data models is like everyone using a different OEE formula. It’s chaos. Designate a team or individual to manage data models, including change control, approval workflow and lifecycle management.
Secure Your Data Models
Because data models are central to much of your manufacturing operation, write-access to the data models must be strictly regulated, while read-access should be widely available.
Don’t Consider Data Models as Fixed. Data models, like manufacturing operations, are not fixed. New models will be added, old models revised and models that are no longer applicable will be removed. Add identity, description and versioning metadata to all your data models.
Model Both Transactional and Time Series Interfaces
While traditional systems only model transactional interfaces at L3 and L4 of the architecture (Figure 2), there is also value in exposing the data models for time-series data collection at L1 and L2. Many of the same recommendations apply to both time-series and transactional data models, but you can choose to use different master data models for each. You should include models for all PLC UDTs in the L1 and L2 model. AI systems must be able to recognize time-series data generated by these UDTs.
Provide Standard Ways to Access Your Data Models
Integration can be simplified when standard APIs are available to access data models, though consistent, standard APIs are difficult to achieve. Use tools like Swagger (for OpenAPI) to document synchronous APIs and enable standard access to the underlying data. AsyncAPI is a comparable tool for documenting asynchronous APIs in Event-Driven Architectures (EDA).
Create Metadata When the Data Model is Created and When it is Instantiated
Any metadata known when the data model is created should be incorporated into it. At instantiation, a standard set of metadata derived from the control systems team’s knowledge should be included with the model. Metadata such as deadband, enabled state, time/value/quality and unit of measure are immensely important to AI data models.
Enforce Enterprise-Wide Naming, Units, Time Conventions and Semantic Standards Across All Models.
Inconsistent standards and semantics undermine interoperability and lead to silent failures and hallucinations. Oven21Temp vs Oven_21_Temperature breaks AI models.
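One mitigation is a governed normalization step that maps variant spellings to a single canonical form before data reaches the AI pipeline. A Python sketch, where the abbreviation dictionary is illustrative and would in practice be maintained by the data governance team:

```python
import re

# Governed expansion dictionary (illustrative entries)
ABBREVIATIONS = {"temp": "temperature"}

def canonical_tag(name):
    """Normalize CamelCase / underscore tag spellings to one convention
    so 'Oven21Temp' and 'Oven_21_Temperature' resolve to the same element."""
    # Split into word and number tokens: 'Oven21Temp' -> Oven, 21, Temp
    tokens = re.findall(r"[A-Za-z][a-z]*|\d+", name)
    # Lowercase and expand governed abbreviations
    tokens = [ABBREVIATIONS.get(t.lower(), t.lower()) for t in tokens]
    return "_".join(tokens)
```

With this step in place, both spellings map to the same canonical name, so a model trained on one plant's tags does not silently fail on another's.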
Frequently Asked Questions
Why is data modeling important?
Data modeling is vital for organizations that wish to be data-driven. You cannot be data-driven unless you have thought through what data you need, how to model that data, how to organize and maintain those models and how various applications will consume and use that data.

What is the advantage of organized collections of data models (master data model concept) vs. data models that are embedded in a software application?
Master Data Models (MDM) are maintained in a single place (database, SharePoint file, Excel spreadsheet, etc.), separate from applications, to ensure consistency across all the related applications that use them. MDM-based models can be updated and versioned in only a single place and new versions are immediately available to all the applications that use them.
What is the difference between a schema and a data model?
A schema is the formal definition of a data model: it specifies how the data is organized and related. Applications receiving a data product can match the model name and version to the original schema to validate that the product is properly formatted.
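A hand-rolled structural check illustrates the idea; a production system would use a full JSON Schema validator, and the field names here are hypothetical:

```python
def conforms(instance, schema):
    """Check that a data-product instance matches its schema: every
    declared field is present and has the declared Python type.
    Nested dicts in the schema are checked recursively."""
    for field, expected in schema.items():
        if field not in instance:
            return False
        if isinstance(expected, dict):
            if not conforms(instance[field], expected):
                return False
        elif not isinstance(instance[field], expected):
            return False
    return True

pump_schema = {"ID": str, "Value": {"Speed": int, "CycleCount": int}}
good = {"ID": "P101", "Value": {"Speed": 120, "CycleCount": 4213}}
bad = {"ID": "P101", "Value": {"Speed": "fast", "CycleCount": 4213}}
```

Here `good` validates while `bad` is rejected because Speed carries a string where the schema declares an integer, exactly the kind of silent mismatch schemas exist to catch.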
How is data modeling related to the concept of a Unified Namespace (UNS)?
A UNS is a data exchange mechanism for manufacturing data. To be effective, the UNS must exchange instantiated data models.
How do master data models relate to PLC User-Defined Types (UDTs)?
Master data models are logical definitions of structure and meaning. PLC UDTs are physical implementations of those models inside control systems. The UDT should be built to conform to the master data model, not define it. If PLC structures become the de facto master, enterprise scalability breaks.
This reinforces architectural separation and governance.
What happens when different plants use different naming conventions or units?
Without semantic consistency, enterprise AI models require custom mapping per site. That destroys scalability, increases maintenance costs and inhibits plant-to-plant operational comparisons. A centralized master data model enforces: naming conventions, unit standards, time conventions and asset hierarchy rules. Company-wide standards matter.
How do data models support AI training and model accuracy?
AI models require structured, labeled, contextualized inputs. Data Models provide:
• Defined units
• Known ranges
• Quality indicators
• Asset context
• Version traceability
Without structured models, AI systems must infer context. Inference introduces error, drift and unreliable predictions.
The foundation of Smart Manufacturing in 2026 is not the AI model. It is the standardized manufacturing data model that defines
meaning, context, structure and access.
Manufacturers that build company-wide master data models, apply consistent metadata, expose models through standard APIs and secure them within a disciplined architecture create systems that are maintainable, extensible and AI-ready.
Those that rely on implicit definitions, scattered schemas and application-embedded models create brittle integrations that do not scale.
A brief implementation checklist to follow:
• Start with the decisions the business must make.
• Define structured, standardized data models to support those decisions.
• Attach complete metadata at creation and instantiation.
• Store models centrally with version control.
• Expose them through secure, IT-friendly interfaces.
• Integrate models directly with their instantiated data products.
When manufacturing data is intentionally modeled and architected properly, AI systems can finally operate with clarity rather than guesswork. Insight does not begin with algorithms. It begins with disciplined data modeling.
John S Rinaldi, Chief Strategist and Director of Creating WOW! for Real Time Automation (RTA) and David Schultz, Principal Consultant with Amárach StackWorks.
Learn More
INDUSTRIAL CONNECTIVITY IS THE FOUNDATION of modern OT–IT architectures and a long-term investment. Decisions made today often have an impact for many years – both technically and economically.
Especially in new projects, system expansions, or upcoming modernization initiatives, it is worth carefully evaluating the software licensing model. It affects planning reliability, budget stability, and strategic flexibility throughout the entire lifecycle of your machines and systems.
Subscription or perpetual license?
Both models can be appropriate. The key is selecting the model that aligns with your equipment lifecycle, budget planning, and digitalization strategy. Learn more about Softing’s licensing options and connectivity solutions.
Make the right decision for your industrial connectivity. Look for a licensing model that enables industrial connectivity with true freedom of choice.
Softing provides powerful data integration solutions – without imposing a specific licensing model.
Proven software for stable OT–IT integration in existing automation environments. Available as a perpetual license – ensuring long-term investment protection.
SDEX Suite
Scalable industrial connectivity platform for modern OT–IT architectures, deployable on Windows, in Docker environments, or as a hardware gateway. Available as a perpetual license or subscription.
Future-proof industrial connectivity also means making forward-looking economic decisions. Let Softing experts evaluate which licensing model best fits your strategy.
Softing software products such as OPC servers, OPC middleware and IT / OT integration solutions enable secure and reliable data integration – locally or into the cloud. This creates a foundation for optimizing processes and technical installations.
Article by Softing.
Learn More



Why safety professionals need a new class of logic solver, one that bridges the enormous gap between safety PLCs and single-loop devices.

THE LOGIC SOLVER IS THE DECISION ENGINE of every Safety Instrumented System (SIS). It evaluates process inputs, applies voting logic, and initiates the action that ultimately reduces risk. Despite its importance, logic solver selection has traditionally forced engineers into a binary choice: either a fully featured Safety PLC or a single-loop logic solver (alarm trip).
Both approaches are well established, but they were designed for fundamentally different problem sets. As a result, a large portion of modern SIS applications lands uncomfortably between them: too complex for single-loop devices, yet far too small to justify a Safety PLC.
This mismatch is what many practitioners recognize as the logic solver gap, and it has quietly cost industry time, money, and unnecessary complication for years. The gap is more than just an inconvenience; when neither traditional option fits cleanly, engineers are forced into compromises that become difficult to defend during hazard reviews, audits, or incident investigations.
Single-loop logic solvers, often referred to as alarm trips, have evolved significantly. Many now include onboard diagnostics, password protection, easy-to-use menus, and configurable logic settings. They are fast to commission and easy to maintain. But their limitation becomes apparent when more than one loop needs to be monitored, or when voting logic must be implemented without extensive relay inter-wiring. They simply weren’t designed for multiloop safety applications.
Safety PLCs, on the other hand, certainly can handle small safety applications, but they do so with substantial overhead. Programming often requires licensed software and specialized skills. Their configuration and validation may demand additional documentation, and lifecycle costs are frequently much higher than the initial purchase price.
For SIS applications in remote operations, temporary facilities, skid-based systems, wellheads, or burner management units, a Safety PLC is often technologically excessive,
financially prohibitive, or operationally inefficient.
In practice, many SIS designs involve only a handful of safety functions. Typical requirements include:
• One to three independent safety loops
• Moderate I/O counts (often 6–12 points total)
• Simple voting logic such as 1oo2 or 2oo3
• Localized deployment on skids, packages, or remote units
• Straightforward proof testing and validation
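Voting terms such as 1oo2 and 2oo3 read as "M out of N channels must demand a trip." The decision itself is a simple majority-style count, sketched here in Python purely to illustrate the logic; in a real SIS the voting is implemented and certified inside the logic solver:

```python
def vote(trips, m, n):
    """Generic MooN voting: initiate a trip when at least m of the n
    channel inputs (True = trip demand) agree.
    1oo2 is vote(channels, 1, 2); 2oo3 is vote(channels, 2, 3)."""
    if len(trips) != n:
        raise ValueError(f"expected {n} channel inputs, got {len(trips)}")
    return sum(trips) >= m

# 2oo3 pressure trip: two of three transmitters must agree before shutdown,
# so a single faulty channel neither trips the process nor blocks a real trip.
two_of_three = vote([True, True, False], 2, 3)
single_fault = vote([True, False, False], 2, 3)
```

The 2oo3 arrangement trips on genuine demand while tolerating one spurious channel, which is why it is a common balance between safety and availability.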
Multiloop logic solvers – the midrange class that serves these requirements – thrive in places where Safety PLCs feel like overkill: pump shutdowns, burner management, wellhead safety, small-scale overpressure protection, tank protection, clean-in-place systems, and localized trip systems. They also integrate exceptionally

well with existing BPCS or PLC infrastructures, especially in hybrid safety strategies where a larger safety platform handles core systems, while multiloop logic solvers are deployed at peripheral or isolated SIFs.
Across oil & gas, chemical, and energy industries, SIS architectures are becoming more modular and decentralized. Rather than routing all safety logic through a central platform, many facilities are deploying localized safety nodes dedicated to specific hazards or process units.
Multiloop logic solvers fit naturally into this approach. By localizing voting and trip decisions, they reduce wiring complexity, shorten commissioning time, and simplify proof testing. Validation becomes more straightforward
because fewer components and interconnections are involved. This directly supports IEC 61511 lifecycle expectations by reducing proof test complexity and minimizing the scope of revalidation when safety logic changes.
Modern multiloop logic solvers also address visibility and diagnostics. Read-only communication interfaces such as MODBUS, Ethernet diagnostics, or HART pass-through allow safety status and device health to be monitored without undermining safety independence.
This capability supports asset management, audit readiness, and maintenance planning while preserving the integrity of the safety function, which is particularly valuable in hybrid architectures that combine centralized safety platforms with localized protection layers.
Completing the Logic Solver Spectrum
Multiloop logic solvers do not replace Safety PLCs, nor do they diminish the value of single-loop devices. Instead, they complete the logic solver spectrum by providing a right-sized option for applications that have long been underserved.
For safety professionals seeking designs that are easier to validate, simpler to maintain, and better aligned with actual risk, the “missing middle” is no longer a niche solution; it is rapidly becoming a core architectural element of modern SIS design.
Article by Moore Industries.
Sennheiser achieves a significant boost to Overall Equipment Effectiveness (OEE) with an innovative digitalization strategy built on solutions from Bitmotec and Softing Industrial.

A HETEROGENEOUS MACHINE POOL, MANUAL reports and untapped potential: Sennheiser electronic SE & Co. KG achieves a significant boost to Overall Equipment Effectiveness (OEE) by implementing an innovative digitalization strategy. All made possible by the modular software platform from Bitmotec GmbH and the high-performance interface solutions from Softing Industrial Automation GmbH. A partnership that ensures production data transparency, process efficiency and real-world innovation.
Leaving no potential untapped – with structured data and seamless connectivity
For decades, the Sennheiser name has stood for best-in-class quality in the design and manufacture of premium audio products. In its own electronics production facilities, the company also rigorously pursues the goal of
identifying and realizing new approaches to improving efficiency, even in already highly optimized processes. One key objective was to optimize Overall Equipment Effectiveness (OEE) through end-to-end automated collection and analysis of production-relevant data.
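OEE itself is the product of three ratios (availability, performance, and quality). A generic calculation with illustrative numbers, not Sennheiser's figures:

```python
def oee(availability: float, performance: float, quality: float) -> float:
    # OEE = Availability x Performance x Quality, each expressed as a ratio 0..1
    return availability * performance * quality

# Example: 90% availability, 95% performance, 99% quality
value = oee(0.90, 0.95, 0.99)
print(f"OEE = {value:.1%}")  # OEE = 84.6%
```

The formula explains why automated data collection matters: a small measurement error in any one factor distorts the overall figure, so manual per-job logging puts a hard ceiling on how reliably OEE can be tracked.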
The key challenge was the centralized acquisition of performance data from a heterogeneous machine pool with multivendor, multi-generation controls. Until then, data had largely been logged manually per job – a cumbersome process that limited transparency and slowed optimization.
The project was implemented at Sennheiser with a joint team from Bitmotec and Softing Industrial. In the process, each partner contributed their specific expertise,
from conceptual design to integration and production applications.
Bitmotec: project leadership, software integration, and process knowledge
Project management was handled by Bitmotec, who assisted Sennheiser from the initial analysis to the finished solution. After completing a comprehensive survey of the machine pool and hosting a joint requirements workshop, Bitmotec devised a made-to-measure digitalization strategy that accounted for the technical features of the machine controls as well as the client’s organizational process flows.
Work packages completed by Bitmotec:
• Analysis and evaluation of existing control systems
• Development of an incremental implementation plan
• Consultation with machine manufacturers on potential interfaces
• Full integration of the software into Sennheiser’s IT systems, including VPN access, virtualization environment, and database and visualization configuration
• Incorporation of all systems into the modular BITMOTECOsystem OEE platform
Standardized and scalable machine connectivity
As an industrial data communications specialist, Softing Industrial provided the interface technology needed to ensure the reliable integration of machines using a variety of Siemens controller types into the data infrastructure.
This involved deploying Softing’s edgeConnector 840D, which is compatible with machines equipped with the 840D SL or 840D PL controller, with further integration facilitated by the NETLink S7-Compact. The container-based edgeConnector solution enables a virtualized connection via OPC UA and MQTT without any changes being required on the controller itself. A web interface is provided to ensure easy configuration and straightforward integration into existing network setups.
The connectors from Softing provided a stable and manufacturer-neutral foundation for end-to-end data collection in Sennheiser’s heterogeneous machine pool.
Results and outlook: from efficiency to innovation
By combining Bitmotec’s modular IIoT software platform with the high-performance, standardized connectors from Softing Industrial, Sennheiser was able to achieve its next important transition: from a semi-manual data collection process to a centralized and automated OEE reporting system compatible with any machine manufacturer and control type.
The production team now has access to a comprehensive, live data-driven machine pool overview, which provides real-time insights into the current operational status. Job-specific performance analyses for individual machines are also available, which can be used to obtain targeted efficiency assessments. In the event of unexpected downtime, root cause analyses are available to accelerate troubleshooting. The system is supplemented by automated notifications via Microsoft Teams that provide shopfloor staff with real-time updates on relevant events. The scope has been further extended by including a comprehensive asset management system that logs and manages all tools and equipment used.
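To illustrate what such automated collection replaces, the availability component of OEE can be derived directly from timestamped machine-state events. A toy sketch with a hypothetical event format (not Bitmotec's actual schema):

```python
def availability(events: list[tuple[float, str]], shift_end: float) -> float:
    # events: (timestamp_seconds, state) pairs sorted by time, where state is
    # "RUNNING" or "STOPPED"; the shift starts at the first event.
    run_time = 0.0
    for (t, state), (t_next, _) in zip(events, events[1:] + [(shift_end, "END")]):
        if state == "RUNNING":
            run_time += t_next - t
    total = shift_end - events[0][0]
    return run_time / total

log = [(0, "RUNNING"), (3000, "STOPPED"), (3600, "RUNNING")]
print(f"{availability(log, 7200):.1%}")  # 91.7%
```

Once state changes are captured automatically at the machine interface, this kind of calculation runs continuously instead of being reconstructed from manual job logs.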
Thanks to the open architecture and user-friendly visualization, Sennheiser is free to create new dashboards and extend the solution as needed – for example, to integrate the company’s internal ERP system.
Various optimizations for maintenance and servicing processes are already at the planning stage. These include tool wear and life monitoring as well as threshold monitoring that is intended to facilitate a predictive maintenance strategy. Automated maintenance data analysis is another aspect being considered, with the aim of optimizing individual maintenance intervals. Plans also include maintenance log digitalization to simplify record-keeping and improve transparency by providing user-friendly dashboards to enter the necessary data.
The Sennheiser project highlights the benefits that digitalization can bring to manufacturing by combining the practical, modular design of the Bitmotec platform with Softing Industrial’s expertise in integration and connectivity technology. Together, these partners help companies like Sennheiser to boost their production efficiency while also establishing a more independent and future-proof system – with real-world results every step of the way.
Application article by Softing Industrial.
Learn More
To scale Physical AI, the next leap is the development of a Multimodal Large Language Model (MLLM), an AI model capable of understanding and reasoning across multiple input types including text, images, video, audio, LiDAR, and more.

PICTURE A WAREHOUSE ROBOT WEAVING through aisles at top speed, or a massive shipping crane hoisting containers with millimeter precision. These aren’t pre-programmed machines; they’re AI systems making split-second decisions in the real world. Welcome to the era of Physical AI. Physical AI describes intelligent systems that can sense, interpret, and act in real environments. Think of self-driving cars navigating busy streets, robotic arms assembling machinery with precision, or smart grids adapting in real time to energy demands. At the heart of this transformation is the digital twin: a live, virtual replica of a physical object or system. Digital twins mirror the real world with incredible accuracy, allowing AI to test ideas, predict outcomes, and guide actions instantly. Yet behind this powerful pairing lies something just as critical: the network. Without fast, secure, and dependable connectivity, Physical AI simply can’t operate.
To scale Physical AI, the next leap is the development of a Multimodal Large Language Model (MLLM), an AI model capable of understanding and reasoning across multiple input types including text, images, video, audio, LiDAR, and more. When this kind of model is directly tied to physical environments and real-time sensor data, it becomes, in essence, “an LLM for Physical AI.”
Digital twins support these models in two ways: as simulation environments for testing and refinement, and as live references during real-time operations. Together, they give MLLMs the accurate, up-to-date context needed for smarter decisions. But none of this works without a robust, intelligent network – the backbone connecting assets, twins, and AI models instantly and securely.
For years, digital twins have been invaluable for design and simulation. Engineers could test a jet engine before manufacturing or model a factory to improve efficiency. But Physical AI changes the role of these twins.
Today, they are part of a continuous control loop, constantly updated from sensors on physical machines, making predictions, and feeding guidance back into the real world. This loop happens in milli- or even microseconds, meaning the infrastructure has to move huge volumes of data incredibly quickly.
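The sense-predict-act loop described above can be sketched abstractly. A deliberately toy illustration, with no real twin, robot, or vendor API assumed:

```python
def twin_predict(state: dict) -> str:
    # Toy "digital twin": flag a hazard if the robot would reach an obstacle
    # within half a second at its current speed
    if state["distance_to_obstacle_m"] < state["speed_m_s"] * 0.5:
        return "slow_down"
    return "proceed"

def control_loop(sensor_readings: list[dict]) -> list[str]:
    # Each cycle: sense -> update twin -> act. Real systems close this loop
    # in milli- or microseconds, which is why network latency dominates.
    return [twin_predict(reading) for reading in sensor_readings]

commands = control_loop([
    {"distance_to_obstacle_m": 5.0, "speed_m_s": 2.0},
    {"distance_to_obstacle_m": 0.4, "speed_m_s": 2.0},
])
print(commands)  # ['proceed', 'slow_down']
```

The point of the sketch is structural: every iteration depends on fresh sensor data arriving and a command leaving, so the loop is only as fast as the network carrying both.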
Consider a delivery robot in a warehouse. Its sensors, including cameras, LiDAR, and ultrasonic detectors, collect data continuously. The digital twin processes this information to anticipate hazards and plan routes. The robot receives instructions

"Industries
are
and
immediately, adjusts its movements, and carries on. Without reliable and ultra-fast connectivity, that chain breaks.
The demands on these networks go far beyond traditional connectivity. When a crane at a busy shipping port relies on its digital twin to coordinate the movement of multi-ton containers, even a delay of a few hundred microseconds could mean an accident. Physical AI thrives only when latency, the time between sensing and acting, is kept to a bare minimum.
For an MLLM to operate in these situations, ultra-low latency is essential. Every decision depends on instant streams of input from sensors and equally rapid delivery of output commands to physical systems.
Edge computing makes this possible by processing data close to where it’s created. Digital twins can live at the edge for lightning-fast responsiveness, or in the cloud for broader scalability. In either scenario, the network infrastructure must ensure seamless, end-to-end performance.
The network must also bridge the edge and cloud seamlessly to enable real-time decisions locally while supporting big-picture analytics, long-term data storage, and AI model training centrally. And because Physical AI operates in demanding real-world conditions, the infrastructure itself must be ruggedized to withstand dust, moisture, extreme temperatures, and constant vibrations.
High-fidelity digital twins aren’t just fast; they’re ravenous for data. A single autonomous vehicle can generate terabytes per hour from its cameras, radar, and LiDAR sensors. While much of that processing happens onboard, the most critical insights must still flow seamlessly to the cloud or edge. Any bottleneck, and the twin falls out of sync. The AI’s decisions? No longer trustworthy.
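The bandwidth implication is easy to sanity-check: one terabyte per hour works out to roughly 2.2 Gbit/s of sustained throughput per vehicle.

```python
def tb_per_hour_to_gbps(tb_per_hour: float) -> float:
    # 1 TB = 8e12 bits (decimal units); one hour = 3600 seconds
    return tb_per_hour * 8e12 / 3600 / 1e9

print(round(tb_per_hour_to_gbps(1.0), 2))  # 2.22 (Gbit/s sustained)
print(round(tb_per_hour_to_gbps(4.0), 2))  # 8.89
```

Even if only a fraction of that reaches the edge or cloud, a fleet of such machines quickly saturates links that were sized for traditional IT traffic.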
In Physical AI deployments, MLLMs rely on this nonstop stream of high-resolution data to perceive, reason, and act correctly. That means networks must not only deliver massive throughput, but maintain absolute precision and reliability in real time.
Physical AI often runs critical infrastructure, including manufacturing plants, transportation hubs, and medical robotics. In these environments, network downtime or a breach could have serious consequences.
If Physical AI is the brain, the network is the nervous system: carrying sensory data, enabling thought, and triggering physical action. Security acts as the immune system, guarding against threats.
For MLLMs in Physical AI systems, network security isn’t just a safeguard; it’s integral to function. Without trusted, uninterrupted data flows, the AI model can’t adapt, learn, or act safely in the real world. That’s why resilience must be built in from the start: redundant connections, advanced fault tolerance, encryption, authentication, intrusion detection, and network segmentation. By integrating security directly into the network infrastructure, organizations streamline management and maintain consistent protection across physical, cloud, and virtual environments. With security embedded, the system adapts quickly to evolving risks without sacrificing performance.
Industries like automotive and logistics are already proving what’s possible with Physical AI. Their experiences highlight the essentials for success: ultra-low latency, high bandwidth, reliability, and strong security.
Ultimately, the success of Physical AI depends on one thing: infrastructure built to match ambition. Networks must deliver speed, intelligence, and resilience from day one, not as an afterthought.
Industries that invest early in robust, secure connectivity will be the ones that turn Physical AI from concept into competitive advantage. The question isn’t whether MLLMs will reshape the physical world; it’s whether your network is ready to power them.
Samuel Pasquier, VP, Product Management, Cisco Industrial IoT Networking.
Learn More

NETWORK TRAFFIC MONITORING HAS EVOLVED from a nice-to-have administrative tool into an essential cybersecurity defense mechanism. As organizations face increasingly sophisticated threats—from AI-powered attacks to ransomware-as-a-service—real-time visibility into network activity is no longer optional. Network traffic monitoring encompasses the continuous analysis of data flows across network infrastructure, tracking bandwidth consumption, application performance, and security anomalies. Modern implementations leverage machine learning algorithms and behavioral analytics to detect threats that traditional signature-based systems miss entirely.
Today's network environments demand comprehensive monitoring strategies that address cloud-native architectures, IoT proliferation, and zero-trust security models:
1. Multi-Source Telemetry Collection
Modern networks generate telemetry from diverse sources: on-premises infrastructure, multi-cloud environments, edge devices, and IoT endpoints. Leading organizations deploy unified observability platforms that aggregate flow data (NetFlow, sFlow, IPFIX), packet captures, API logs, and cloud-native telemetry into centralized data lakes for correlation and analysis.
2. AI-Powered Device and Application Discovery
Advanced network monitoring platforms now utilize machine learning to automatically discover and classify devices, applications, and services. These systems baseline normal behavior patterns, identify shadow IT, detect unauthorized devices within milliseconds, and flag anomalous traffic patterns indicative of lateral movement or data exfiltration attempts.
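Behavioral baselining can be as simple as flagging deviations from historical statistics. A deliberately simplified sketch; production platforms use far richer models than a single z-score:

```python
from statistics import mean, stdev

def is_anomalous(history: list[float], current: float, threshold: float = 3.0) -> bool:
    # Flag traffic volumes more than `threshold` standard deviations
    # away from the historical baseline for this host or application.
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return current != mu
    return abs(current - mu) / sigma > threshold

baseline = [100.0, 110.0, 95.0, 105.0, 98.0]   # e.g. MB/min from one host
print(is_anomalous(baseline, 104.0))  # False: within normal variation
print(is_anomalous(baseline, 900.0))  # True: possible exfiltration spike
```

The same idea, applied per device, per application, and per destination, is what lets monitoring platforms surface lateral movement and data exfiltration without any attack signature.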
3. Intelligent Analysis and Automated Response
Contemporary network traffic analysis goes beyond bandwidth monitoring. AI-driven platforms detect zero-day exploits through behavioral anomalies, identify encrypted malware command-and-control traffic, predict capacity constraints before they impact operations, and trigger automated remediation workflows that quarantine compromised endpoints or block malicious IPs in real time.
4. Next-Generation Monitoring Platforms
Today's enterprise-grade network monitoring solutions feature cloud-native architectures with infinite scalability; AI/ML-powered threat detection with sub-second response times; integration with SIEM, SOAR, and XDR platforms for unified security operations; automated compliance reporting for frameworks including NIST CSF 2.0, CMMC 2.0, and NIS2; and predictive analytics that forecast network performance trends and capacity requirements.
The threat landscape has fundamentally shifted. Ransomware attacks now cost organizations an average of $4.91 million per incident, while the average data breach takes 194 days to identify. Without continuous network visibility, organizations operate blind. Critical drivers include:
Ransomware and Advanced Persistent Threats: Early detection of lateral movement and data staging activities can prevent encryption events entirely. Network traffic analysis identifies suspicious patterns—unusual port scanning, abnormal data transfers to external IPs, or encrypted traffic to known malicious domains—before attackers complete their kill chain.
Zero Trust Architecture Requirements: As organizations abandon perimeter-based security, network traffic monitoring becomes the enforcement mechanism for zero-trust principles. Continuous verification of device identity, application authorization, and data access patterns ensures that "never trust, always verify" isn't just a slogan.
Hybrid and Multi-Cloud Complexity: With workloads distributed across AWS, Azure, GCP, and on-premises infrastructure, unified traffic visibility prevents blind spots. Modern monitoring correlates activity across environments, ensuring consistent security posture regardless of where applications run.
Regulatory Compliance Mandates: Regulations including GDPR, HIPAA, PCI DSS 4.0, and emerging AI governance frameworks require documented network monitoring and incident response capabilities. Failure to demonstrate continuous monitoring during audits results in substantial fines.
Supply Chain Security: Recent attacks targeting third-party vendors and software supply chains highlight the need to monitor all network connections—especially those to external partners, cloud services, and SaaS applications.
Organizations implementing comprehensive network monitoring realize measurable operational and security advantages:
Proactive Threat Hunting: Security teams identify indicators of compromise (IOCs) before automated alerts trigger, dramatically reducing dwell time and limiting breach impact.
Performance Optimization: Real-time bandwidth analysis prevents application degradation, ensures QoS for critical services, and supports capacity planning with predictive analytics.
Cost Reduction: Early threat detection prevents ransomware payments, regulatory fines, and business disruption. Optimized bandwidth utilization reduces unnecessary circuit upgrades and cloud egress charges.
Mean Time to Resolution (MTTR) Reduction: Automated root cause analysis accelerates incident response from hours to minutes, minimizing downtime and revenue impact.
Shadow IT Discovery: Automatic detection of unauthorized applications, rogue access points, and unapproved cloud services prevents security gaps and compliance violations.
Insider Threat Detection: Behavioral analytics identify unusual data access patterns, after-hours activity, and attempts to exfiltrate sensitive information.
Network Segmentation Validation: Continuous monitoring verifies that microsegmentation policies function correctly, preventing lateral movement following initial compromise.
Digital Experience Monitoring: User-centric metrics ensure that application performance meets service level objectives (SLOs), improving productivity and customer satisfaction.
The convergence of operational technology (OT) and information technology (IT) networks— particularly in industrial, transportation, and critical infrastructure environments—demands hardened networking equipment designed for mission-critical deployments.
Organizations require industrial-grade Ethernet switches engineered for extreme environments, extended temperature ranges (-40°C to 75°C), and continuous operation under demanding conditions. These ruggedized solutions must support advanced monitoring protocols including SNMP v3, RMON, and sFlow while providing redundant power inputs, Layer 2/Layer 3 switching capabilities, and integration with network monitoring platforms.
For industries including industrial automation, intelligent transportation systems, physical security networks, power generation and utility operations, and smart city infrastructure, network reliability isn't just about uptime—it's about safety, regulatory compliance, and operational continuity.
Network traffic monitoring has transformed from a bandwidth management tool into the cornerstone of cybersecurity defense and operational excellence. Organizations that implement comprehensive, AI-driven network visibility position themselves to detect threats before they cause damage, optimize performance proactively, and meet increasingly stringent compliance requirements.
The question is no longer whether to monitor network traffic, but whether your monitoring capabilities match the sophistication of today's threat actors. In 2026, visibility isn't just an advantage—it's survival.
Since 2005, Antaira has engineered industrial Ethernet switching solutions designed to perform flawlessly in the harshest environments where failure is not an option. Antaira ruggedized switches feature extended temperature tolerance, military-grade metal enclosures, and support for advanced network monitoring protocols essential for maintaining visibility in mission-critical applications.
The Antaira Network Management Suite (a.NMS™) is designed to manage large deployments of Antaira switches efficiently. It handles essential tasks, such as firmware updates and configuration backups and restores, ensuring streamlined network operations. With a visual dashboard that displays SNMP traps and Syslog data, a.NMS™ also provides real-time monitoring and enhanced visibility of supported devices. Antaira NMS is built on the ISO FCAPS framework, providing a reliable and structured approach to network management.
Deployed across automation, transportation, security, and utilities sectors worldwide, Antaira switches provide the reliable foundation that network monitoring platforms depend on—because you can't monitor what you can't trust to stay operational. Build infrastructure that matches the importance of the data flowing through it.
Henry Martel, Field Application Engineer, Antaira Technologies.
Learn More
CC-Link IE TSN delivers precise, synchronised control for Hambi’s automated steel mesh handling system. This world-first system automates the cutting, handling, and stacking of heavy reinforcing steel mesh – a task that previously required up to six human operators.

AUTOMATION SPECIALIST HAMBI MASCHINENBAU, part of Terhoeven GmbH & Co. KG, has developed a world-first system that automates the cutting, handling, and stacking of heavy reinforcing steel mesh – a task that previously required up to six human operators.
By integrating Mitsubishi Electric’s drive and control technologies connected via CC-Link IE TSN, Hambi has achieved millimetre-level precision and seamless synchronisation across motion, safety, and vision systems in a single, unified network.
Tackling a demanding manual process
In the production of reinforcing steel mesh, long lengths of wire are welded into large mats, which must then be cut to size and
stacked for transport. This was a labour-intensive process involving multiple workers to lift, align, cut, and stack the heavy meshes. It was also considered a difficult task to automate, as the weight and flexibility of the mats mean that even small deviations in alignment can cause major issues.
However, Van Merksteijn International B.V., a leading steel processor, was determined to overcome these challenges. It reached out to Hambi to develop an automated solution that could detect and compensate for any alignment variations in real time.
The result was the ASA (Automatic Cutting System) – a six-metre-high, 40-metre-long machine that automates every stage of the process, from lifting the top mat in a stack to cutting and turning sections for compact stacking.
The system uses six grippers, each capable of independent three-axis movement. As the mesh bends under its own weight during lifting, the grippers must dynamically adjust their positions to maintain even tension and prevent deformation.
In total, 18 servo drives coordinate this movement, with additional drives handling transportation, turning, and stacking. Synchronisation between these drives, as well as with the image processing system and safety controls, is critical to ensure stability and precision.
That’s why Hambi decided to link every part of the system – including servo drives, safety PLCs, frequency inverters, and controllers – via CC-Link IE TSN. The high-speed, deterministic communication provided by the open Ethernet standard allowed the team to achieve millimetre-level precision when gripping and positioning the steel mesh, even as it naturally bends and shifts during lifting.
The technology’s gigabit bandwidth also allows all system components to share a single unified network.
If you’d like to learn more about the project, you can watch a full overview of Mitsubishi Electric’s work to deploy the world’s first fully automated line for reinforcing steel mesh. View Video on YouTube>

“Communication via CC-Link IE TSN is particularly important,” explains Marc Orgassa, Managing Director of Orgassa GmbH, Hambi’s long-term automation partner. “It allows us to ensure that the various system components and controllers are synchronised with the drives. This is an important prerequisite, as image processing naturally requires the exact position of the grippers.”
Following two years of development, the ASA system was commissioned at Van Merksteijn’s site in spring 2024. The solution achieves the precision, reliability, and productivity needed for large-scale reinforcing steel production.
“It’s inspiring to see machine builders like Hambi using CC-Link IE TSN to solve such complex motion control challenges,” says John Browett, General Manager of the CC-Link Partner Association – Europe.
“The ability to combine different tasks on the high-speed, open network demonstrates how this technology helps companies push automation performance further while keeping system design simple,” Browett added.
Application article by CC-Link Partner Association (CLPA).
Learn More
Industrial embedded PC from WEROCK offers a powerful new controller for continuous operation in production and automation.

WEROCK Technologies has expanded its portfolio of industrial computer systems by introducing the Rocksmart RSX1000, a fanless embedded box PC for use in industry, automation, mechanical engineering, and control cabinet construction.
When industrial PCs fail, it is often not just individual workstations that are affected. Production lines, intralogistics systems, driverless transport systems, test benches, or plant control systems can come to a standstill. The Rocksmart RSX1000 was designed for continuous operation in precisely such environments. The passive, fanless 24/7 design reduces dust-related failures and mechanical wear. Its design for vibration- and dust-prone environments further supports its use in production halls, machine environments, and vehicles.
Voltage fluctuations in industrial power grids pose another challenge. The embedded box PC is optionally available with a wide-range input of 9–60 V DC and a boot-on-power function. This makes the system suitable for use in machines, mobile applications, commercial vehicles, industrial trucks, and decentralized installations.
In terms of interfaces, the system offers “industrial I/O out of the box.” Among other things, dual LAN, multiple USB interfaces,
serial interfaces (RS232/422/485), and optional CAN and Wi-Fi 6 are available. Additional connections and extensions can be configured on a project-specific basis. This allows the embedded computer to be adapted to requirements in automation, the process industry, or building automation.
Equipped with 13th generation Intel® Core™ processors, up to 64 GB of RAM, and up to 2 TB of mass storage, the system is suitable for computationally intensive applications such as machine and plant control, edge computing, image processing, AI evaluations, SCADA and HMI systems, condition monitoring, and the connection of IT and OT networks. The integrated graphics support up to three displays and are designed for control stations, operator panels, and visualization systems in control cabinets or on machines.
The expansion of the embedded box PC portfolio will be presented on the first day of the embedded world 2026 trade fair. Trade visitors from industry, IT, and automation can find out about possible applications in mechanical engineering, plant engineering, and control cabinet integration on site.
Markus Nicoleit, Managing Director of WEROCK, explained the market launch of the
Rocksmart RSX1000: “With the Rocksmart RSX1000, we offer an embedded box PC that is designed for stable continuous operation in industrial environments. The fanless design, flexible power supply, and versatile interfaces enable use in machines, vehicles, and systems, as well as in industrial IT integration.”
Features of Rocksmart RSX1000
• Compact rugged industrial PC for machines, vehicles, etc.
• Fanless, passively cooled design for low-maintenance 24/7 operation and reduced risk of failure due to dust/wear
• Shock and vibration resistant for machine, vehicle, and plant environments
• Industrial housing for installations in harsh environments
• Wide input voltage range (optional 9 to 60 V DC) for automotive/mobile use and industrial power grids with fluctuations
• 13th generation Intel® Core™ Deca-Core processors (Raptor Lake) for demanding industrial real-time applications
WEROCK Technologies Learn More
Logic controller from Siemens responds to higher requirements for a compact controller, with double the function block count and more I/O capacity for more complex and versatile logic applications.

The LOGO! 9 logic controller enables simple implementation of small automation projects, from switching and control tasks in building services to mechanical and apparatus engineering. With LOGO! 9, Siemens marks a comprehensive generational change after 11 years and sets new standards in “Everyday Automation”. Significantly enhanced performance, combined with modern operating comfort, flexible engineering, and forward-looking security, is tailored to the requirements of modern automation projects. LOGO! 9 will be available starting April 1, 2026.
“The requirements for compact controllers have fundamentally changed. What could once be realized with simple logic functions now requires precise data processing, intuitive operating concepts, and seamless security. LOGO! 9 combines the computing power and security standards of larger systems with the familiar simplicity and cost-effectiveness of a compact logic module for the first time, futureproofing our customers’ applications,” said Annemarie Große Frie, Head of Automation System at Siemens Digital Industries.
More performance and application scope for more demanding applications
LOGO! 9 offers twice the function block count with up to 800 blocks, enabling significantly more complex logic applications so that in many cases no additional control devices are required. The extended expansion level with up to 64 digital inputs, 60 digital outputs, and 16 analog inputs and outputs each creates scope for larger and more demanding projects.
The new analog expansion module (AM4) with selectable analog resolution, along with new calculation functions and floating-point number support, enables precise control and measured value processing at the highest level directly in the controller. These performance enhancements mean greater application flexibility, fewer additional components, and therefore lower effort across the entire lifecycle – without unnecessarily complicating the architecture.
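To illustrate what floating-point support gains a compact controller, consider the linear scaling every analog measurement needs. The sketch below is plain Python, not LOGO! code (LOGO! is programmed graphically in LOGO! Soft Comfort); the raw range 0..27648 is an assumption borrowed from common Siemens analog representations, used here purely for illustration.

```python
def scale_analog(raw, raw_min=0, raw_max=27648, eng_min=0.0, eng_max=16.0):
    """Linearly scale a raw analog input value to engineering units.

    Integer-only logic blocks truncate the intermediate result;
    floating-point math preserves the fractional part of the
    measurement. Ranges here are illustrative assumptions.
    """
    span = raw_max - raw_min
    return eng_min + (raw - raw_min) * (eng_max - eng_min) / span

# A mid-scale raw value maps to the middle of a 0..16 bar sensor range
pressure_bar = scale_analog(13824)   # 8.0
```

With integer arithmetic, a raw value of 13825 would scale to the same whole number as 13824; floating-point keeps the distinction.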
Modern operating and display experience directly on the base module
The new touch color display (resolution 320 × 240 px) with the enlarged area directly on the base module displays more information at a glance, ensuring clear operation and fast diagnostics.
For even better visualization and maximum readability directly on the machine, the new 4.3-inch LOGO! text display (480 × 272 px) with two Ethernet ports and secure communication to the LOGO! base module is also available.
The software LOGO! Soft Comfort Version 9 is now available as a Plug & Play solution via USB stick and is fully compatible with older projects and generations. The integrated LOGO! Web Editor, including simulation, is a tool for all platforms and now supports macOS and Linux in addition to Windows. The new User Management (UMAC) with 4 roles ensures clear rights distribution – from administrator to operator – and significantly accelerates engineering processes.
Security and sustainability by design for the future
LOGO! 9 is prepared for the Siemens EcoTech label and thus for tomorrow’s sustainability requirements. Secure Boot and secure LOGO! communication protect against unwanted manipulation and unauthorized access. Firmware updates via LSC (LOGO! Soft Comfort) and factory reset with IP retention guarantee secure and efficient operation throughout the entire lifecycle. The combination of security and sustainability makes LOGO! 9 a future-proof solution.
Siemens Learn More
New updates from Beckhoff include TwinCAT 3 CoAgent, an AI assistant that revolutionizes the entire automation lifecycle, and multi-core, multi-tasking software support for multi-axis motion control.
TwinCAT 3 CoAgent AI assistant cuts down on engineering work in the warehouse environment and achieves a sustainable reduction in downtime.
The logistics industry is faced with the challenge of increasing throughput rates while simultaneously increasing system flexibility. To master this balancing act, Beckhoff is combining the advantages of PC-based control with artificial intelligence and decentralized automation. The aim is end-to-end networking – from interlinked production systems to order picking and dispatch.
The control cabinet-free MX-System, for example, offers huge potential for making savings during installation and commissioning, while the TwinCAT 3 CoAgent AI assistant cuts down on engineering work in the warehouse environment and achieves a sustainable reduction in downtime. This system cuts down on cable lengths, the space required, and installation and maintenance work all at once. Smart diagnostics via Bluetooth® ensure fast fault detection, reducing critical downtimes in busy distribution centers.
TwinCAT 3 CoAgent revolutionizes the entire automation lifecycle. Developers working in engineering benefit from language-based code suggestions for PLCs, HMIs, and I/O configurations, significantly reducing project planning time. During operation, the tool increases transparency through data analysis or AI-supported fault diagnostics, including automatically generated repair instructions.
A new software package from Beckhoff provides modular architecture with multi-core and multi-tasking support for multi-axis motion control. Multi-axis motion control has been an essential component of Beckhoff’s TwinCAT automation software platform for decades with successful deployment in demanding applications across industries. The newly available TwinCAT MC3 software represents the next generation of motion control, and is characterized by a consistent, modular architecture ideal for multi-core and multi-task support. TwinCAT MC3 makes it easier than ever to leverage a single industrial PC (IPC) to automate highly complex machines – even those with hundreds of axes that demand high-performance motion control. All the features of the previous TwinCAT NC2 motion control solutions are included in this new generation. Additionally, TwinCAT MC3 can
be operated in parallel with NC2, combining NC2 and MC3 axes in one system. As a result, new machine components can be implemented with TwinCAT MC3 without having to adapt existing components. TwinCAT MC3 delivers numerous new advantages for motion control applications thanks to its new modular architecture, including multi-core and multi-task support and no fixed restriction on the number of axes in a machine. Full machine simulation is also possible for program development: users can easily translate a real machine into a highly accurate simulation without major rework.
TwinCAT MC3 can be distributed across several CPU cores on the controller, allowing synchronized movement across all cores. Depending on their speed and function, axes can also be operated on the same CPU core at different cycle times. Each CPU core is fully utilized with optimal performance, since the fastest axis no longer sets the rate for all axes.
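The core-distribution idea can be shown abstractly: axes with the same cycle time share a task, and tasks are spread across cores so a single fast axis no longer dictates the update rate of slower ones. The Python sketch below is a generic illustration of that grouping, not the TwinCAT runtime's actual allocation algorithm; the axis names and cycle times are hypothetical.

```python
from collections import defaultdict
from itertools import cycle

def plan_tasks(axes, cores):
    """Group axes by cycle time into tasks, then assign tasks to CPU
    cores round-robin. Illustrative scheduling sketch only."""
    by_cycle = defaultdict(list)
    for name, cycle_us in axes:
        by_cycle[cycle_us].append(name)
    plan, core_iter = [], cycle(range(cores))
    for cycle_us in sorted(by_cycle):
        plan.append({"core": next(core_iter),
                     "cycle_us": cycle_us,
                     "axes": by_cycle[cycle_us]})
    return plan

# Hypothetical machine: one fast spindle, two feed axes, one conveyor
axes = [("spindle", 250), ("feed_x", 1000), ("feed_y", 1000), ("conveyor", 4000)]
plan = plan_tasks(axes, cores=2)
# The 250 µs spindle gets its own task; the 1 ms feed axes share a
# task on another core, so they no longer inherit the spindle's rate.
```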
Beckhoff Learn More
Empowering decisions at the speed and scale needed to accelerate and sustain advantage by applying advanced AI to contextualized real-time operational data, institutional knowledge and domain expertise.
Seeq has unveiled Seeq Intelligence, ushering in the era of intelligence-led operations. Seeq Intelligence creates comprehensive, AI-driven decision intelligence that infuses confidence, clarity, and velocity into every operational decision, unlocking breakthrough operational and business performance at enterprise scale.
Industrial organizations face growing complexity, increasing talent loss, and critical expertise that is locked in siloed systems and individual experience, making it more challenging to improve performance, create consistency, and build competitive advantage.
Seeq Intelligence empowers organizations to make decisions at the speed and scale needed to accelerate and sustain advantage. By applying advanced AI to contextualized real-time operational data, institutional knowledge, domain expertise, prior actions, and decision history, Seeq Intelligence creates a powerful engine for high velocity decisions that drive measurable gains in efficiency, margins, and sustainable performance.
Designed to amplify the creativity and intuition of subject matter experts and infuse that invaluable—and previously impossible to scale—expertise into an organization’s operational DNA, Seeq Intelligence becomes a driver of transformation. It surfaces unseen opportunities, elevates high-impact decisions, and guides actions that improve daily execution and strategic long-term outcomes.
By creating a comprehensive and connected view of manufacturing operations, enriched with accumulated experience, Seeq Intelligence becomes a continuously evolving system of learning and improvement, helping teams address current and future challenges, driving more confident decisions at every level of the enterprise.
Seeq Intelligence introduces advanced agentic AI capabilities to operational decision-making, including:
Agent Q, a premium natural-language, domain-aware AI analyst that delivers rapid, comprehensive decision intelligence, deepening operational understanding, answering complex questions, revealing hidden insights, and unlocking operational and business breakthroughs. It quickly assembles diverse, unstructured information and expertise – historical operational events, prior analyses, past actions, documents, and know-how – into coherent investigations, traceable
intelligence, and prioritized recommended actions.
Build Your Own Agent, which allows Seeq users to create custom AI agents that execute multi-step workflows on demand, or on schedules and triggers, by orchestrating data retrieval, analytics, and reporting steps to produce repeatable outputs such as reports, summaries, and automated actions.
Agent Extensibility, which enables secure agent-to-agent connections between Seeq AI agents and customer systems and information. This allows users not only to retrieve additional, highly relevant, and up-to-date context – such as recent data windows or work orders – but also to initiate workflows and automate actions across those systems. By providing richer context and enabling closed-loop automation directly within the Seeq interface, Agent Extensibility reduces context switching and supports faster, more comprehensive decision-making.
Document Access, which enables the extraction and synthesis of information from unstructured and semi-structured documents into actionable and contextualized intelligence. It searches, reads, contextualizes, and interprets documentation to support Q&A and produces summaries of procedures, reports, manuals, and past analyses.
“Seeq Intelligence represents a step change in how industrial companies create value,” said Mark Derbecker, Chief Product Officer at Seeq. “By synthesizing context, history, and irreplaceable domain expertise with patented advanced AI, we’re giving organizations a continuously learning system that sharpens decision making and accelerates operational transformation. It’s about helping customers compete — and win — in a world where speed, insight, and adaptability define future leaders.”
Seeq
Learn More
Major update adds full support for CAN XL, multiple symbol files per connection, Python scripting and flexible licensing.

SOURCE: HMS NETWORKS
PCAN-Explorer has long been known as a powerful and versatile Windows software tool for working with CAN and CAN FD networks. Its core functionality is built around simultaneous access to multiple CAN channels, which allows users to observe, analyze, and log communication in complex network environments. A central element is the symbolic representation of messages, which transforms raw CAN data into readable and meaningful signal information using symbol files. In everyday use, PCAN-Explorer enables both manual and periodic transmission of messages, making it ideal for testing, network troubleshooting, and system development.
With the release of PCAN-Explorer 7, PEAK delivers a major update that adds full support for CAN XL, multiple symbol files per connection, Python scripting, and flexible licensing including floating licenses. The next major evolution of PEAK's professional CAN communication software for analyzing, monitoring, and simulating CAN, CAN FD, and CAN XL networks contains many helpful features for engineers. PCAN-Explorer 7 marks a significant step forward, adding full CAN XL support for faster bit rates (up to 20 Mbit/s) and larger payloads (up to 2,048 bytes).
Users can now assign multiple description databases per connection, such as symbol files or CANdb files, which makes it easier to work with several databases in parallel without manual merging effort. The addition of Python scripting brings a modern automation environment to a platform heavily used in automotive testing applications.
Another major improvement is the introduction of a new licensing system based on CodeMeter. Customers can choose between single-user and floating licenses. In addition, USB license dongles are available to make single-user licenses portable. For users who want to test the software, trial licenses are available. These options make the Explorer software more flexible for both individuals and teams.
Another helpful functionality is the fine-grade trace playback, allowing users to step through trace files message by message, pause automatically on defined breakpoints, and create highly controlled test and replay scenarios. PCAN-Explorer 7 also introduces support for J1939 FD, enabling engineers who work with heavy-duty and off-highway applications to analyze and simulate next-generation protocol variants. Another practical enhancement is the highlighting of data changes in the receive list. This makes it easier to spot dynamic values, detect anomalies, and follow changing signals in real time.
The Product Manager for the PE7, Kristofer Koch, summarizes the goal of the new version as follows: “With PCAN-Explorer 7, we are anticipating the shift to CAN XL and empowering our users to be future-proof already today. Plus, we succeeded in delivering these advanced capabilities without changing the familiar interface and established workflows of version 6.”
“The new version is the essence of all previous editions combined with new capabilities,” adds Koch. “This means we have integrated profound user feedback over the years and added state-of-the-art technologies and functionalities to lift the Explorer to the next evolutionary stage.”
Users who have relied for many years on previous versions of the most famous CAN software on the market can still relate to its core functionalities, its user interface, and all well-established add-ins – plus now benefit from strong new features of the future.
Features of PCAN-Explorer 7
• Support for CAN XL
• Support for J1939 FD
• Multiple symbol files per connection
• Python scripting support
• Fine-grade trace playback with step control, break points, and playback areas
• Highlighting of data changes in the receive list
• Improved performance with 64-bit foundation and separate UI and communication engines
• Flexible license options: Single, Floating, Trial, Add-Ins with license dongle option
• Optional maintenance contracts for updates and support
• All proven PCAN-Explorer 6 core functions included
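How a symbol file turns raw CAN bytes into named signal values can be shown with a small, library-free sketch. The signal definitions below (start byte, length, scale, offset) are a hypothetical stand-in for a symbol file entry, not PEAK's actual file format, and the example frame is fabricated.

```python
def decode_signal(data, start, length, scale=1.0, offset=0.0, byteorder="big"):
    """Extract one byte-aligned signal from a CAN payload and apply
    linear scaling: physical = raw * scale + offset. Sketch only;
    real symbol files also handle bit-level packing and signedness."""
    raw = int.from_bytes(data[start:start + length], byteorder)
    return raw * scale + offset

# Hypothetical engine frame: bytes 0-1 = RPM (raw * 0.25),
# byte 2 = coolant temperature (raw - 40 degC, a common J1939-style offset)
payload = bytes([0x1F, 0x40, 0x6E, 0x00, 0x00, 0x00, 0x00, 0x00])
rpm = decode_signal(payload, start=0, length=2, scale=0.25)     # 2000.0
temp = decode_signal(payload, start=2, length=1, offset=-40.0)  # 70.0
```

This is the transformation the receive list performs continuously: instead of `1F 40 6E ...`, the user sees `RPM = 2000, Coolant = 70 °C`.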
HMS Networks AB
Visit Website
Purpose-built for OT environments, the solution has been tested across multiple industrial deployment scenarios and offers enhanced cybersecurity for Industrial 5G networks.
Siemens is now delivering a verified AI-driven cybersecurity solution for Industrial 5G with Palo Alto Networks that combines Siemens' private 5G infrastructure and continuous cybersecurity monitoring with Palo Alto Networks' AI-driven security technology. Purpose-built for OT environments, the solution has been tested across multiple industrial deployment scenarios. It offers enhanced cybersecurity for Industrial 5G networks without compromising on performance and is part of the Siemens Xcelerator portfolio.
At the recent Mobile World Congress 2026, Siemens announced a verified cybersecurity solution for industrial private 5G networks in collaboration with Palo Alto Networks. The solution combines Siemens' private 5G infrastructure with Palo Alto Networks' Next-Generation Firewall (NGFW), specifically optimized for AI and extensively tested to verify high availability, network resilience, and uninterrupted operations. It enables manufacturers to meet diverse industrial security requirements while maintaining the critical performance their increasingly AI-driven production demands.
“A pharmaceutical plant has different security requirements than an automotive assembly line,” said Michael Metzler, Vice President Horizontal Management Cybersecurity for Digital Industries at Siemens. “Siemens’ verified solution with Palo Alto Networks addresses these industry-specific needs through purpose-built architecture. Manufacturers get secure 5G connectivity tailored to their operations without performance trade-offs.”
“Palo Alto Networks and Siemens are not just connecting the factory floor, we are building the central nervous system for the future of industry – a future that is intelligent, autonomous, and secure by design,” said Dharminder Debisarun, Smart Industries Cybersecurity Executive at Palo Alto Networks.
Verified solution delivers industrial-grade security
Data-driven production systems require wireless connectivity for countless sensors and mobile assets, making private 5G essential infrastructure. At the same time, cyberattacks can cause costly downtimes or compromise worker safety. Additionally, regulations like NIS2 mandate defense-in-depth security architectures meeting IEC 62443 standards. Off-the-shelf IT security solutions often create performance bottlenecks or fail to address OT-specific threats in industrial environments.
The collaboration between Siemens and Palo Alto Networks addresses this gap. Palo Alto Networks has specifically optimized its NGFW technology for Siemens' Private 5G infrastructure through Siemens' extensive testing across multiple deployment scenarios. This verification process validates that the solution delivers industrial-grade security without compromising the low latency and high throughput required for real-time production systems – a critical distinction from generic IT security approaches.
The solution combines three elements for enhanced cybersecurity. Siemens has specifically tested and verified the solution for industrial environments in its Digital Connectivity Lab in Erlangen, Germany.
Siemens' private 5G infrastructure provides on-premises, deterministic wireless connectivity for mobile and moving assets, with built-in security features protecting the core network infrastructure. The solution ensures data sovereignty and low-latency communication independent of mobile network operators.
SINEC Security Monitor: Siemens software for passive, non-intrusive, continuous on-premises security monitoring during production. The system identifies communication anomalies, unauthorized devices, or potential threats without impacting production operations.
Palo Alto Networks Firewall delivers Layer 7 security and dedicated OT protocol analysis specifically optimized for industrial environments. Unlike generic IT security solutions, it provides deep packet inspection for OT protocols while maintaining the low latency required for real-time control applications, now also in wireless communications via private 5G networks. This includes protection against malware, intrusion attempts, and data exfiltration without the performance degradation typical of off-the-shelf security tools.
This verified architecture meets IEC 62443 requirements for industrial automation and control systems security while maintaining the performance characteristics essential for time-critical production applications. The solution is now available as part of the Siemens Xcelerator portfolio.
Siemens Learn More
Cards based on netX 90 system-on-chip and available in PCI Express and low-profile PCI Express form factor.
Hilscher has introduced the cifX PCIE90-RE and the cifX LPCIE90-RE, two new PC cards for secure industrial communication. The cards are based on the netX 90 system-on-chip (SoC) and are available in PCI Express and low-profile PCI Express form factors.
They target manufacturers of PC-based automation solutions who want to integrate their products into industrial networks securely, with high performance, and with long-term availability.
With these new cifX variants, Hilscher is expanding its proven PC card portfolio with two future-ready solutions for today’s and tomorrow’s requirements. The cards are suitable successors to established netX 100-based cifX PC cards with a PCI Express interface, such as the cifX 50E-RE. They are also ideal for new device generations where security, performance, and flexibility are key design criteria from the outset.
Secure and scalable communication
At the core of both cards is Hilscher's multiprotocol-capable netX 90 communication controller. It provides the technological foundation for unified, secure, and scalable industrial communication. The cifX PCIE90-RE and the cifX LPCIE90-RE support common Industrial Ethernet and IIoT protocols:
• PROFINET IO-Device
• EtherCAT SubDevice
• EtherNet/IP Adapter
• OpenModbus/TCP Server/Client
• CC-Link IE Field Basic Slave
• POWERLINK Controlled Node
• Sercos Slave
• OPC UA Server
• MQTT Publisher/Subscriber
Hilscher North America, Inc.
EC 122 EtherCAT interface module extends SIGMATEK's real-time Ethernet bus portfolio.
With the EC 122 interface module, SIGMATEK is extending its portfolio in the field of high-performance system connections and optimizing the integration of S-DIAS modules into EtherCAT networks.
The module allows simple and efficient integration of S-DIAS modules – both standard and safety I/Os – into an EtherCAT bus system. Equipped with one EtherCAT input and one EtherCAT output, the compact interface module not only handles the complete EtherCAT slave configuration, but also the data exchange between EtherCAT and the S-DIAS system bus.
The EC 122 comes with an integrated power supply for up to 32 modules and automatically detects connected S-DIAS I/O modules. Compared to the EC 121, the new interface module offers a significant performance increase.
With its substantially increased process data capacity, the EC 122 is ideal for demanding applications – including Safety over EtherCAT (FSoE). This allows EtherCAT SubDevices to be integrated into a safe application as FSoE slaves. With the Power Boost Module (PSB 001), an additional 32 modules can be addressed.
Machine and system manufacturers thus benefit from even greater flexibility and performance within the proven S-DIAS system.
Dimensions of the unit are 25 × 104 × 72 mm (W × H × D).
SIGMATEK
Learn More
LITE network adapters for powerful PROFINET and Modbus communication in a small form factor.
Emerson has announced the release of new, compact LITE network adapters for the PACSystems™ RSTi-EP remote I/O system. Bringing higher performance in half the space of traditional network adapters, the new LITE network adapters deliver remote, real-time diagnostics to help simplify machine design and maintenance.
Two LITE network adapters are available— one for use with PROFINET and another for Modbus. The network adapters are plug-andplay capable, making them easy to install and maintain in existing RSTi-EP products. They offer high availability and intuitive integration with existing systems.
Emerson's new LITE network adapter for the PACSystems™ RSTi-EP system delivers highperformance PROFINET or Modbus connectivity in a compact design, providing real-time remote diagnostics to simplify machine design and maintenance.
Slice I/O modules are typically deployed in industrial environments with limited space, high power requirements, and a need for ruggedized housing that can withstand harsh conditions, such as temperature extremes and high vibration. Slice I/O options offer modular, “build as you go” flexibility that can operate with any PROFINET PAC controller. Easy to configure, operate, and upgrade, Slice I/O helps process large data flows at high speed.
TURCK expands the TBEN-LL-4RMC series with variants featuring IO-Link and 24/48 V option.
The TBEN-LL-4RMC series from TURCK controls up to four CANopen-based motors directly in the field. In addition to Interroll motors, the robust IP67 module also supports drive rollers from MTA and MPC. The new variants feature IO-Link master ports for integrating smart sensor technology and enable connection via standard M12 L-pinning. This allows conveyor systems to be automated in a modular fashion without a control cabinet.
The TBEN-LL-4RMC modules combine motor control, digital I/Os, and IO-Link in a robust IP67 housing. Users benefit from multi-protocol Ethernet and programming-free ARGEE logic for decentralized control systems. The 24 or 48 V power supply increases flexibility and supports the trend toward energyefficient drives. Preprogrammed logic facilitates the implementation of zero pressure accumulation and other conveyor strategies.
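Zero pressure accumulation means each conveyor zone releases its item only when the zone downstream is clear, so items queue up without touching. The sketch below is a generic Python illustration of that zone-scan logic, not TURCK's ARGEE code; the four-zone line and the assumption that the discharge end always releases are both hypothetical.

```python
def zpa_step(zones):
    """One control cycle over a conveyor line.

    zones[i] is True if zone i holds an item; the last zone is the
    discharge end. Scanning from discharge upstream, an item advances
    only into an empty downstream zone, so items accumulate with zero
    pressure between them.
    """
    zones = list(zones)
    if zones and zones[-1]:
        zones[-1] = False  # assume downstream takeaway is always ready
    for i in range(len(zones) - 2, -1, -1):
        if zones[i] and not zones[i + 1]:
            zones[i], zones[i + 1] = False, True
    return zones

line = [True, True, False, True]   # items in zones 0, 1, and 3
line = zpa_step(line)
# -> [False, True, True, False]: zone 3 discharged, zones 0-1 each advanced
```

In a real module this per-zone decision is driven by the zone sensors and the preprogrammed conveyor logic mentioned above, cycling continuously.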
The module supports the trend toward 48 V drives in intralogistics, material flow, and production logistics. The technology allows for smaller cable cross-sections and longer cable lengths, which improves the efficiency and cost balance of the power supply.
The new IO-Link variants enable the integration of signal lights, pressure sensors, or other intelligent devices. TURCK offers the controllers in four variants: TBEN-LLH-4RMC, TBEN-LL-4RMC, TBEN-LLH-4RMC-2IOL, and TBEN-LL-4RMC-2IOL.


The best of both worlds ... the in-depth technical features our readers expect, but now also a daily blog with the latest product news and industry updates.
The Industrial Ethernet magazine has been rebranded Industrial Ethernet, but it is still the only publication worldwide dedicated to Industrial Ethernet automation and machine control networking, the IIoT, and Industry 4.0. The difference is a deepened focus on a daily blog to deliver more and deeper content (more product news, industry updates, and technology focus) to keep our readers fully informed ... while also delivering the Industrial Ethernet magazine they have come to expect.