











In an era when global OEMs monitor performance across continents through AI-enabled dashboards, it is easy to assume that manufacturing has fully transcended its pen-and-clipboard past. Yet across the broader supply chain, much of industrial accountability still echoes that earlier rhythm. Cycle times are recorded by hand. Line interruptions are logged at the end of a shift. For generations, production data has depended not on algorithms, but on people.
The technology has changed. The human element has not.
Today, artificial intelligence can surface patterns no analyst could detect alone. It can forecast downtime, optimize throughput, and illuminate inefficiencies buried in years of operational history. Its capability is not in question. What remains uncertain—and increasingly consequential—is the quality of the data it consumes.
Incomplete records, inconsistent inputs, and manipulated figures—whether born of omission, misunderstanding, or the temptation to “cook the books”—carry greater risk than ever before. In a data-driven environment, flawed information does not merely distort a report; it can misguide capital investment, production planning, and workforce strategy. Poor data governance is no longer a clerical oversight; it is an operational liability.
This raises two essential questions for industry. First, how do manufacturers protect and ensure the integrity of their data—from the shop floor to the enterprise system? Second, how can AI be deployed to its fullest potential once that foundation of trust is established?
To explore these questions, Automation Alley convened leaders from manufacturing, government, and academia for a focused roundtable on data and industrial intelligence. The discussion examined how to unlock value from siloed historical records, how to approach predictive and prescriptive analytics with discipline, and how to make prudent investments in new technologies that capture and contextualize operational data in real time.
The insights gathered in this playbook reflect both optimism and caution. There is immense opportunity in the convergence of data and AI—but only if industry treats information as a strategic asset, not a byproduct of production.
Automation Alley is proud to host these conversations and to help inform the future of manufacturing. The path forward will not be defined solely by smarter machines, but by smarter stewardship of the data that drives them.



Tom Kelly Executive Director & Chief Executive Officer Automation Alley

Attendee List:
Sponsors, Supporters and Roundtable Participants
Main Feature: From Data to Industrial Intelligence: How Manufacturers are Confronting the Next Operational Challenge
Expert Insights: Lenovo | NVIDIA From Data to Decisions: Lenovo and NVIDIA Building Intelligence at Industrial Scale
Expert Insights: Metrologic DCS
Total Design Responsibility in Manufacturing: Who Really Owns the GD&T?
Recommendations: Industry
Recommendations: Academia
Recommendations: Government
Key Takeaways: Main points from the Integr8 Roundtable discussion
Sources


SPONSORED BY: Lenovo + NVIDIA & Metrologic DCS
SUPPORTED BY: Michigan Economic Development Corporation, Michigan Manufacturing Technology Center and US Center for Advanced Manufacturing
Tom Kelly - Executive Director & Chief Executive Officer, Automation Alley
Steve Bannasch - Dimensional Engineering Strategist, Metrologic DCS
Sara Barton - Manager of Corporate and Business Development, University of Michigan - Flint
Amy Boston - Account Executive, Lenovo
Jim Brady - VP, Business Development and Partnerships, Viaduct
Mike Brooks - Industry Manager - Automotive, Phoenix Contact
Ron Crabtree - CEO, MetaOps Inc
Ebbin Daniel - Founder and CEO, Machine AI Solutions
Will English - Plant Manager, ARCH Cutting Tools
Tim Finerty - Partner, Wipfli
William Gard - VP, D&F Corporation
George Greenough - Technology & Operations Leader, G2 Create
Leland Gute - Senior Infrastructure Account Executive, Lenovo
Jason Hamp - AI Technologist - Manufacturing, Lenovo
Patrick Hillberg - Adjunct Professor - Industrial and Systems Engineering Dept., Oakland University
Tina Hurite - Vice President of Operations, Michigan Manufacturing Technology Center
Dennis Irwin - General Manager, ARCH Cutting Tools
Wesam Iwas - Managing Director, Ankercloud Inc.
Robert Joyce - President and Founder, Innovative Plastics and Molding dba, FibreTuff
Scott Kilberg - Account Executive, Jendamark
Amy Kitchen - Director of Strategic Growth, MetaOps Inc
Tom Krent - Career Pathways Coordinator, City of Troy
Gary Krus - VP of Business Development and Operations, HIROTEC AMERICA
Eric Lynch - Director of Business Development, build/create studios
Karolina Malyska - VP Product Strategy & Innovation, Metrologic DCS
Paul Marcus - CEO, DataOps
Julie Oldham - Business Growth Consultant, Michigan Small Business Development Center
Gaetan Pluton - Director R&D, Metrologic DCS
Mahesh Ramamurthy - Founder and CEO, SatoriXR
Ben Reese - Marketing Manager, Metrologic DCS
Tom Schneider - President, HA Industries
Hazaa Shahit - Senior Business Developer, Wipfli
George Stasiw - Senior Account Executive, Samsung
Daniel Vachoo - Director, Commercial Operations, Dechen
Apoorv Waghmare - Technical Sales Specialist, Prescient
Patrick Williams - Sales Manager, GRT-USA
Markus Windisch - CTO/Founder, Peerox GmbH
Benjamin Wixson - Group Manager, ASA, Inc



Manufacturing has become the most data-intensive industry in the global economy. Factories generate massive streams of information from machines, sensors, inspection systems, supply chains and workforce activity.
Industry estimates suggest manufacturing produces roughly 1,812 petabytes of data annually, more than the finance, telecom or retail sectors. Meanwhile, a single smart factory can generate more than 5 petabytes of operational data per week through connected equipment and industrial IoT systems.
And that volume is accelerating. Nearly half of manufacturing leaders report that the amount of data their organizations collect has doubled in just the past two years, underscoring how quickly the industrial landscape is shifting toward data-centric operations.
Yet the real question is not how much data factories generate. It is how effectively they turn that data into insight.
Automation Alley convened leaders from industry, academia and technology to explore a central question:
How can manufacturers transform raw operational data into industrial intelligence that improves performance, resilience and competitiveness?
The answer is not constrained by technology. The tools already exist. The friction lies in how data is structured, interpreted and acted upon inside the modern factory. What emerges is a set of systemic challenges and emerging opportunities that together define the next phase of industrial evolution.


For all the progress made under the banner of Industry 4.0, most manufacturing environments remain fundamentally fragmented. Data is abundant, but it is rarely unified.
Engineering data lives in CAD and PLM systems. Production data is generated through MES platforms and machine controllers. Quality data is captured in inspection systems. Supply chain data sits within ERP environments. Each system is optimized for its purpose, but not for interoperability.
The result is a patchwork of information that resists consolidation.
This fragmentation is not accidental. It is the byproduct of decades of incremental technology adoption, where systems were layered in response to specific needs rather than designed as part of a cohesive data architecture.
Even when integration is technically feasible, organizational realities intervene. Departments often operate with distinct priorities, metrics and ownership structures. Data becomes territorial.
At its core, this is also a structural misalignment of ownership. Tolerances, data structures and production realities often sit in different parts of the organization, disconnected from one another.
“GD&T (Geometric Dimensioning & Tolerancing) is a data structure,” said Steve Bannasch, director of business development at Metrologic DCS. “Tolerance is owned by one group and the data structure is owned by another. And when you look on the floor, nothing ever matches.”
The consequence is a lack of visibility across the full production lifecycle.
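Bannasch's framing is literal: a feature control frame is a data structure, and it can be stored and versioned like one. The sketch below is a minimal illustration in Python; the field names and example values are hypothetical, not drawn from any Metrologic system, but they show the kind of single shared record that the tolerance owner and the data-structure owner could both reference.

```python
# A minimal, hypothetical sketch of GD&T as a shared data structure:
# one feature control frame, owned in one place.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass(frozen=True)
class FeatureControlFrame:
    feature_id: str                 # the feature on the part model, e.g. a hole
    characteristic: str             # "position", "flatness", "profile", ...
    tolerance_mm: float             # size of the tolerance zone
    modifier: Optional[str] = None  # material condition, e.g. "MMC" or "LMC"
    datums: Tuple[str, ...] = ()    # datum reference frame, in precedence order

# One shared, versionable record instead of a tolerance living in CAD and a
# datum scheme living somewhere else on the floor:
hole_position = FeatureControlFrame(
    feature_id="HOLE-104",
    characteristic="position",
    tolerance_mm=0.25,
    modifier="MMC",
    datums=("A", "B", "C"),
)
print(hole_position)
```

When both groups read and write the same record, "nothing ever matches" stops being the default state of the floor.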



A defect discovered at final inspection may trace back to a design tolerance, a supplier inconsistency or a process variation on the shop floor. Without integrated data, identifying that root cause requires manual investigation, slowing response times and increasing costs.
Too often, quality teams are left to bridge the gap after the fact.
“When you try to do cross analysis, the quality group is the one trying to identify the issues,” Bannasch said. “Every quality group needs to be elevated to the top so they own the whole product cycle.”
Breaking down silos requires more than connecting systems. It demands a shift toward shared data models and governance structures that prioritize enterprise-level insight over departmental optimization.
Data initiatives often struggle to gain traction for a simple reason: their value is difficult to quantify in advance.
Traditional capital investments in manufacturing come with clear expectations. A new machine increases throughput. A tooling upgrade improves precision. The return can be modeled with reasonable confidence.
Data investments are different.
The value of improved visibility, predictive capability or faster decision-making is inherently probabilistic. It is realized over time, often in the form of avoided disruptions rather than immediate gains.
This creates hesitation. Organizations delay investment until a problem becomes visible and measurable.
“There needs to be prioritization of what will be the most important data to collect instead of trying to collect all the data,” said Tina Hurite, vice president of operations at Michigan Manufacturing Technology Center. “It is just too much.”
This instinct to wait for clarity often results in reactive decision-making.
By the time ROI becomes obvious, the cost has already been incurred in the form of downtime, scrap or lost throughput.
Yet the upside of acting earlier is well documented. Advanced analytics and predictive maintenance programs can reduce machine downtime by 30 to 50 percent, lower maintenance costs by as much as 10 to 40 percent and extend equipment life by 20 to 40 percent.
The challenge is that these gains are rarely realized until after implementation.
30-50% reduction in machine downtime | 10-40% lower maintenance costs | 20-40% longer equipment life
A more effective approach is incremental.
“Prioritize what the heavy hitters are, collect that data, then identify solutions that can help incrementally,” Hurite said. “After that, identify the next batch you want to focus on. It’s done one piece at a time.”
This shift reframes ROI not as a single event, but as a compounding capability.
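One way to put Hurite's advice into practice is a simple Pareto ranking of losses before any large-scale collection effort. The sketch below is illustrative Python with hypothetical downtime figures; it shows how the "heavy hitters" might be identified from even a crude end-of-shift log.

```python
# A minimal sketch, using hypothetical downtime data, of ranking loss
# categories so the first data-collection effort targets the heavy hitters.
from collections import Counter

# Hypothetical end-of-shift log: (cause, minutes lost)
downtime_log = [
    ("tool change", 42), ("material shortage", 95), ("unplanned jam", 130),
    ("changeover", 60), ("unplanned jam", 110), ("material shortage", 80),
]

losses = Counter()
for cause, minutes in downtime_log:
    losses[cause] += minutes

total = sum(losses.values())
cumulative = 0
print("Prioritized data-collection targets:")
for cause, minutes in losses.most_common():
    cumulative += minutes
    print(f"  {cause:20s} {minutes:4d} min  ({cumulative / total:5.1%} cumulative)")
```

The output points to the one or two causes worth instrumenting deeply first; the next batch comes later, one piece at a time.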
In many factories, data is generated in real time but decisions are not.
This gap between data creation and action represents one of the most significant constraints on industrial intelligence.
Production systems capture vast amounts of information continuously. Sensors monitor temperature, vibration and throughput. Machines log performance metrics.
Yet much of this data is processed after the fact.
Reports are compiled at the end of shifts. Performance is reviewed in weekly meetings. Insights are extracted retrospectively, when the opportunity to influence the outcome has already passed.

This delay reduces the practical value of data.
Closing the latency gap requires moving intelligence closer to the point of operation.
In one implementation, continuous monitoring of machine data combined with real-time dashboards increased overall equipment effectiveness by 11 percent, simply by enabling faster, more informed decisions.
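For context, overall equipment effectiveness is conventionally the product of availability, performance, and quality. The sketch below shows that standard calculation; the shift figures are hypothetical, not numbers from the implementation cited above.

```python
# Standard OEE calculation: availability x performance x quality.
# All inputs below are hypothetical example values.
def oee(planned_min, downtime_min, ideal_cycle_min, total_count, good_count):
    run_time = planned_min - downtime_min
    availability = run_time / planned_min                     # uptime share
    performance = (ideal_cycle_min * total_count) / run_time  # speed vs ideal
    quality = good_count / total_count                        # first-pass yield
    return availability * performance * quality

# A 480-minute shift with 45 minutes of downtime, a 1.0-minute ideal cycle,
# 400 parts produced and 388 good:
print(f"OEE: {oee(480, 45, 1.0, 400, 388):.1%}")  # ~80.8%
```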
But speed alone is not enough. Trust and usability must follow.
“It is about trust,” said Jason Hamp, enterprise AI technologist at Lenovo/NVIDIA. “You can’t just create knowledge. You need to pair the older people with shop knowledge and digital natives together.”
Without that alignment, even real-time insights risk going unused.
Data, by itself, does not create understanding.
Manufacturing environments are shaped by deep domain expertise. Engineers, operators, quality specialists and executives each interpret information through different frameworks.
This creates a translation problem between digital systems and physical processes.
Even the most advanced models must be grounded in real-world context.
“There needs to be humans involved in AI data decisions,” Hamp said. “It can be very insightful and draw conclusions, but you need someone who has the experience to know if what it says is true.”
At the same time, much of the most valuable knowledge in manufacturing is tacit. It resides in the experience of seasoned workers who understand how processes behave under varying conditions.

“How do we collect data and collect the tribal knowledge of the shop floor?” Bannasch asked. “Even when you collect data from machines, you still have this large missing part of what makes it all work together.”
Bridging this gap requires a convergence of perspectives.
Analytics must be grounded in operational reality, and expertise must be translated into structured, usable data.
The goal is not to replace human knowledge, but to scale it.


Precision has long been a defining characteristic of manufacturing.
In the context of data, however, consistency often delivers more value than absolute accuracy.
Highly accurate data that is inconsistent or incomplete is difficult to analyze. Consistent data, even with minor imperfections, enables comparability and trend analysis.
This shift is closely tied to how manufacturing systems operate in practice.
“When it comes to automation, it needs to be repeatable and in the same spot,” said Gary Krus, vice president of business development at HIROTEC.
Repeatability, not perfection, is what enables control.
Manufacturers are increasingly prioritizing standardized data collection and stable measurement environments, creating a more reliable foundation for analytics and decision-making.
Consistency creates continuity. Continuity enables insight.




As manufacturing becomes more connected and data-driven, cybersecurity risks increase exponentially. AI systems controlling critical production processes are potential targets for cyberattacks. Protecting intellectual property, maintaining data privacy, and ensuring operational security while keeping systems accessible presents a complex challenge for manufacturers implementing AI solutions.
One of the most tangible ways data is reshaping manufacturing is through serialization.
By assigning unique identifiers to individual parts and tracking them throughout the production process, manufacturers create a continuous digital thread.
This allows errors to be traced back to their source with far greater precision.
“If you want to create a quality product, you have to understand what you are collecting from end to end,” Krus said.
That end-to-end visibility is what transforms quality from a reactive function into a proactive one.
Serialization enables manufacturers to identify patterns, isolate variables and intervene earlier in the process. Over time, this reduces defects, lowers scrap rates and improves throughput.
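A minimal sketch of the mechanism, with hypothetical stations and event fields: each process step appends an event keyed to the part's serial number, so a failure at final inspection can be walked back through every upstream step.

```python
# A minimal, hypothetical sketch of the digital thread created by
# serialization: one ordered event history per serial number.
from collections import defaultdict

thread = defaultdict(list)  # serial number -> ordered list of process events

def record(serial, station, **context):
    thread[serial].append({"station": station, **context})

record("SN-0001", "stamping", die="D-7", coil_lot="LOT-22")
record("SN-0001", "welding", cell="W-3", current_amps=8200)
record("SN-0001", "final_inspection", result="FAIL", defect="weld porosity")

# Trace the failed part back through every upstream step:
for event in thread["SN-0001"]:
    print(event)
```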
It also reinforces a broader shift toward integrated thinking.
“We talked about data vault silos,” Krus said. “How do I collect data and interlink all the areas from design to production?”
That question sits at the center of modern manufacturing strategy.
Artificial intelligence is becoming a catalyst not just for efficiency, but for entirely new ways of working.
In production environments, predictive maintenance alone can reduce unexpected equipment failures by up to 70 percent while increasing asset availability by 5 to 15 percent.
More broadly, manufacturers deploying advanced analytics are seeing 30 to 50 percent reductions in downtime, 10 to 30 percent increases in throughput and 15 to 30 percent improvements in labor productivity.
But the path to those outcomes is rarely linear.
“If I look at AI, I’m going to take it small and look at specific areas, and try to grow it,” Bannasch said. “I do not want to look at it forcefully.”
This measured approach reflects a broader shift toward practical adoption.
“Start with the problem at hand,” Hamp said. “Start with the resources you have.”
That mindset is shaping how AI is deployed across manufacturing.
Rather than attempting large-scale transformation all at once, organizations are targeting specific use cases, building internal confidence and expanding over time.
The long-term potential, however, extends even further.
“I think the focus will eventually turn to small language models, bespoke to the industry they are meant for,” Hamp added. “That’s where you’ll start seeing these giant leaps.”
These systems, trained on domain-specific data and informed by real-world expertise, have the potential to bridge the gap between data and decision-making in entirely new ways.

Manufacturers deploying advanced analytics are seeing: 30-50% reduction in downtime, 10-30% increase in throughput and 15-30% improvement in labor productivity.
The manufacturing sector is not short on data. It’s in the process of learning how to use it.
The challenges are structural. Fragmented systems limit visibility. Uncertain ROI slows investment. Delayed decision-making reduces impact. Misaligned perspectives hinder adoption.
But these challenges are not permanent. They are transitional.
As manufacturers standardize data, close the latency gap, align expertise and rethink how value is measured, a different model begins to take shape.
“Get ready to adapt; fail fast, but keep your eye on the ball and keep pivoting,” Hamp said.
That may be the defining characteristic of this moment.
The shift underway is not about collecting more information. It is about creating systems that can interpret and act on it in meaningful ways.
Factories are becoming more connected, but connectivity alone is not the goal.
The objective is industrial intelligence.
And for manufacturers that can bridge the gap between data and action, the reward is not incremental improvement. It is a step change in how industrial performance is achieved.





Jason Hamp AI Technologist - Manufacturing Lenovo
The story of industrial intelligence is no longer about technology arriving on the factory floor. It is about what happens after it arrives—how decisions change, how work changes, and how performance compounds over time.
Across its global manufacturing footprint, Lenovo has lived this shift firsthand, moving from stabilizing operations amid volatility to running factories where data continuously drives action at scale. NVIDIA AI and accelerated computing make this operational, enabling real-time insight and response where decisions are made.
Several years ago, the pressures facing manufacturing were already visible: rising complexity, shrinking tolerance for quality escapes, labor constraints, and growing expectations around sustainability. Data was abundant, dashboards were plentiful, yet decisions still lagged events on the shop floor.


The realization came gradually. Intelligence could not remain an analytical layer sitting above operations. It had to become part of how factories run—informing schedules, guiding material movement, flagging risk early, and shaping daily decisions.
This marked a shift from experimentation to operating model change.
Lenovo’s transformation did not begin with a single use case. It began with a question: What would it take to make intelligence repeatable across every plant, not exceptional in just one?
Two manufacturing sites would come to exemplify the answer.
In Hefei, China—home to the world’s largest single PC manufacturing operation—Lenovo confronted extreme demand volatility alongside an unprecedented level of product customization. Rather than optimizing individual steps in isolation, the company embedded intelligence directly into how planning, quality, and supplier interactions operated together. Flexible production lines could be reconfigured in hours, AI compressed planning cycles from hours to seconds, and real-time quality and energy signals surfaced issues as they emerged, not after the fact. Over time, this shifted how work moved through the factory—transforming issue resolution from reactive firefighting into a continuous, data-driven flow of decisions.


In Monterrey, Mexico, Lenovo’s largest North American site confronted scale of a different kind—thousands of suppliers, tens of thousands of SKUs, and dozens of markets. Here, intelligence became the connective tissue across supply, production, and logistics, linking real-time demand signals, supplier capacity, and transportation flows. This allowed the factory to sense disruption early, understand its downstream impact, and respond decisively before delays cascaded across the network.
What mattered most was not the technology itself, but the consistency with which intelligence was applied across decisions.

When intelligence moved from insight to action, the results in these two factories became visible across multiple dimensions at once.
• Lead times reduced by up to 85%, realized across the Aug 2024–2025 Lighthouse rollout and impact-capture period as scheduling and execution shifted from batch planning to real-time responsiveness
• Overall productivity increased by up to 58% through better coordination of people, machines, and materials
• Quality-related losses reduced by 56% through earlier detection and prediction of defects
• Supplier quality issues reduced by 55% as upstream data was integrated into operational decisions
• Logistics costs reduced by approximately 42% as material movement became demand-driven, guided by real-time demand and disruption signals, rather than forecast-driven
• Carbon emissions reduced by around 30% through optimized throughput, asset utilization, and energy management
What is notable is not any single metric, but that these gains occurred together. Productivity improvements did not come at the expense of quality. Sustainability gains did not slow output. Intelligence aligned outcomes that were once treated as trade-offs.


The defining lesson from Lenovo’s lighthouse experience was repeatability. Once intelligence was treated as an operating capability—governed, standardized, and embedded—it could move from one site to another without reinvention.
This is where the concept of the AI-ready production network emerged.
An AI-ready production network is not about the number of models deployed. It is about the organization’s ability to:
• Act on data where it is created
• Apply consistent decision logic across sites
• Extend proven practices without rebuilding foundations
• Improve performance cumulatively over time
This is how intelligence stops being a project and becomes infrastructure.

Operational intelligence only works when insight arrives in time to matter. NVIDIA accelerated computing and AI make this possible—supporting real-time analysis, vision-based quality, and rapid response at the point of operation.
The result is not automation for its own sake, but confidence: confidence that decisions are based on current conditions, and confidence that intelligence can scale without fragility.
Lenovo has made industrial AI deployable through Lenovo Validated Designs—tested manufacturing architectures that integrate edge computing, accelerated AI, and operational workflows. By validating performance, scalability, and reliability upfront, Lenovo enables manufacturers to move from pilots to production without re-engineering foundational systems.


For manufacturing leaders, the implication is clear: industrial intelligence is no longer about tools, but about intent and discipline. The shift requires more than technology—it demands services, change management, and capability building that embed new decision logic into daily operations. Leaders treat intelligence as an operating capability, prioritize repeatability over novelty, and measure success by sustained outcomes. Competitive advantage comes not from moving first, but from intelligence that compounds with every cycle of execution.
The story unfolding across Lenovo’s lighthouse factories illustrates a broader truth for manufacturing. When intelligence is embedded into how work is done—supported by accelerated computing and designed for scale—it becomes a compounding asset.
Productivity improves. Quality stabilizes. Sustainability advances. And most importantly, the organization gains the ability to adapt continuously. That is the promise of industrial intelligence at scale.
For organizations exploring this path—or finding it difficult to translate ambition into execution—the next step need not be taken alone. Lenovo brings hard-won experience from real manufacturing environments and is open to sharing practical lessons, perspectives, and guidance to help leaders shape and advance their own modernization journey.



Lenovo is a US$69B global technology company, ranked #196 on the Fortune Global 500, serving customers in 180 markets worldwide. Building on a full‑stack portfolio spanning AI‑enabled devices, edge computing, data center infrastructure, and services, Lenovo helps manufacturers move from experimentation to real deployment—especially where low‑latency, secure Edge AI is required on the factory floor.


Steve Bannasch
Dimensional Engineering Strategist Metrologic DCS
Why the future of industrial intelligence depends on fixing a decades-old disconnect.
In today’s manufacturing landscape, where data drives decisions and automation accelerates production, an uncomfortable truth remains: many organizations still don’t actually know whether the data they’re collecting is the right data. Sensors, quality stations, SPC tools, and analytics platforms capture millions of data points, yet operations leaders continue to struggle with unresolved variation issues, recurring build problems, and quality surprises that seem to appear out of nowhere.
Why? Because in manufacturing, data is only as good as the specifications behind it—and nowhere is this more apparent than in the realm of Geometric Dimensioning & Tolerancing (GD&T).
Across industries, the question “Who owns the Design/GD&T?” has never been answered clearly. And that ambiguity is creating costly inefficiencies at every stage of the product lifecycle.


This article summarizes the core challenges and introduces the premise of our new whitepaper, Total Design Responsibility in Manufacturing: Who Owns the GD&T?, which explores a more unified, future-ready approach.
During product development, engineers are focused on performance, packaging, simulations, and meeting program timelines. As a result, GD&T, one of the most critical components of producibility, often becomes an afterthought, added quickly in early design phases or inherited from prior programs with minimal scrutiny.
This fragmented approach results in:
1. Mismatched Datum Strategies
• Product engineers may define tolerances.
• Manufacturing teams later define datums.
• Suppliers independently redefine both to pass PPAP (Production Part Approval Process).
• By the time the print is “final,” three separate interpretations exist.
2. Disconnected Tolerances
• The tolerances on the drawing often don’t reflect the realities of the fixtures, measurement systems, or assembly processes used downstream.
• Suppliers measure to pass PPAP.
• Plants measure to validate build.
• Quality teams measure to investigate failures.


These measurements are rarely aligned with one another, and they seldom trace back to a single authoritative GD&T source.
The result? A product that meets the print on paper… but not in the real world.
Modern plants rely heavily on data-driven decision-making: Six Sigma analysis, SPC (statistical process control) dashboards, AI-based anomaly detection, and more. But when the underlying GD&T is inconsistent, every data set built on top of it becomes unreliable.
This leads to a familiar cycle:
• Variation appears in production.
• Quality teams adjust fixtures or shims “just to keep the line running.”
• These fixes aren’t captured formally, becoming tribal knowledge.
• Engineers never receive this feedback.
• The next program repeats the same issues.
Plants end up treating symptoms instead of causes. Problems get pushed around, not solved.
And all of it stems from one root issue: misaligned specifications.
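The SPC arithmetic itself is simple, which is exactly why the specification problem matters more than the math. The sketch below computes simplified Shewhart-style control limits (mean plus or minus three standard deviations on hypothetical individual readings). If two groups measure the same feature against different datum schemes, limits like these are being computed on effectively different quantities, and the dashboard built on top is quietly invalid.

```python
# A simplified individuals-chart sketch: mean +/- 3 sigma control limits
# on hypothetical measurement readings.
import statistics

readings = [10.02, 9.98, 10.05, 10.01, 9.97, 10.03, 10.00, 9.99]

mean = statistics.fmean(readings)
sigma = statistics.stdev(readings)
ucl, lcl = mean + 3 * sigma, mean - 3 * sigma

print(f"center={mean:.3f}  UCL={ucl:.3f}  LCL={lcl:.3f}")
for x in readings:
    status = "OUT OF CONTROL" if not (lcl <= x <= ucl) else "in control"
    print(f"{x:7.3f}  {status}")
```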

For years, tribal knowledge and informal fixes have masked GD&T inconsistencies. But with AI accelerating within manufacturing ecosystems, that era is ending.
AI will ingest every accessible data point: design files, quality logs, SPC data, test results, and historical measurements, and it will process them 100,000× faster than any team of engineers. If the underlying specifications are inconsistent or incorrect, AI will not gently reveal the problem; it will magnify it at unprecedented speed.
As Bannasch himself puts it, “AI is going to come at you like a freight train out of control.”
If manufacturers don’t correct their GD&T ownership structure before AI becomes fully embedded, the gap between specifications and reality will widen, and fast.
Most companies today have some form of Dimensional Engineering (DE) group, but their role varies wildly:
• Sometimes they report to Product Engineering.
• Sometimes Manufacturing.
• Sometimes Quality.
• Sometimes they’re spread across multiple domains, each focused on only part of the vehicle or product.
This fractured structure prevents the very alignment DE was created to provide.

Our perspective is clear: Dimensional Engineering must be elevated to own 100% of GD&T specification, datum strategy, fixturing alignment, and measurement methodology across the entire product lifecycle.
What unified DE ownership enables:
• Consistent datum strategies from concept through launch.
• Aligned tolerances that match fixturing and measurement realities.
• Coordinated supplier guidance preventing “three versions” of GD&T.
• Faster issue resolution, since all variation ties back to a single authoritative model.
• True industrial intelligence, where data accurately reflects the product’s dimensional intent.
When DE becomes the central owner, product engineers can focus on performance and design. Manufacturing can focus on throughput. Quality can focus on validation. And data finally becomes trustworthy.
One of the biggest risks in today’s plants is that real-time fixes (shims, offsets, locator adjustments) are rarely captured in systems of record. If a toolmaker adjusts a locator by 0.5 mm to keep the line running, that change affects GD&T interpretation, measurement alignment, and downstream performance. But unless someone manually documents it, that information lives only in someone’s memory.

And when that person retires or moves on?
The organization loses critical dimensional intelligence.
As AI expands, this gap becomes dangerous. AI can only analyze what it can see. If the system lacks the real adjustments that make today’s products “work,” AI will misinterpret the system’s behavior entirely.
The solution again points to a centralized GD&T authority, one responsible not only for the specifications, but for ensuring every adjustment, every shim, every locator modification is captured, validated, and integrated.
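What that capture might look like in practice: the sketch below records a 0.5 mm locator move as a structured event rather than a memory. The fields are hypothetical, not a Metrologic schema; the point is only that the adjustment becomes queryable data.

```python
# A minimal, hypothetical sketch of a floor-adjustment record: every shim,
# offset, or locator change captured as structured data.
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class FloorAdjustment:
    fixture_id: str
    feature_id: str   # the GD&T feature the change affects
    change_mm: float  # signed offset, e.g. a 0.5 mm locator move
    reason: str
    made_by: str
    timestamp: str

adjustment = FloorAdjustment(
    fixture_id="FIX-12",
    feature_id="LOC-B2",
    change_mm=0.5,
    reason="recurring interference at station 4; keeps the line running",
    made_by="toolmaker, shift 2",
    timestamp=datetime.now(timezone.utc).isoformat(),
)
print(asdict(adjustment))  # in practice this would land in a system of record
```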


It’s no longer sufficient to ask, “Who owns the GD&T?”
The better question is, “Who owns the entire dimensional responsibility of the product from concept to production?”
The answer is a unified, elevated Dimensional Engineering function empowered to:
• Define datums
• Define tolerances
• Define fixturing
• Define measurement strategies
• Validate supplier measurement systems
• Ensure every plant adjustment is captured
• Integrate all dimensional data across the lifecycle
This structural shift is not optional, it is necessary for a future where industrial intelligence, automation, and AI play central roles.

Metrologic DCS delivers a unified, technology-agnostic quality software suite that connects all metrology data, across sensors, machines, and production stages, into a single, real-time quality loop. The platform combines universal 3D measurement execution, advanced dimensional and variation analysis, and quality process control to transform raw data into actionable insights.
The result is concrete and measurable: manufacturers can detect, understand, and correct quality issues earlier, reduce scrap and rework, shorten time to decision, and improve overall production performance.




Manufacturers are generating massive amounts of data, but maximizing its benefits requires effectively capturing, connecting and using that information. To create a competitive advantage, industry leaders are moving beyond basic data collection toward integrated industrial intelligence strategies that link operations directly to business outcomes.
The following recommendations are practical, evidence-based ways manufacturers can unlock value from data while managing cost, complexity, and long-term uncertainty:
• Focus on capturing contextual data, not just machine signals: High-value insights emerge when machine telemetry is combined with process parameters, material inputs, operator actions, and quality outcomes. Context-rich data enables root-cause analysis and improves the reliability of predictive models (a sketch of such a record follows this list).
• Break down data silos across systems and departments: Disconnected Manufacturing Execution Systems, Enterprise Resource Planning, quality, and maintenance systems limit visibility into how manufacturing decisions interact across the enterprise. An integrated data environment improves coordination and decision-making across operations, engineering, and supply chains.
• Tie analytics initiatives directly to business and manufacturing outcomes: Robust ROI comes from analytics projects linked to clear metrics such as downtime, scrap, throughput, or labor productivity. Use cases should be defined by the operational decisions they enable, not by the technology alone.
• Begin with predictive use cases before moving to prescriptive systems: Predictive maintenance and quality forecasting offer quicker and more measurable value than fully prescriptive or autonomous solutions. Early successes establish trust and internal capability for more advanced analytics.
• Use targeted pilots to evaluate ROI before scaling: Narrowly focused pilots tied to critical assets or high-impact production lines provide crucial data for broader deployment decisions, reducing risk and aiding in justifying larger investments.
• Evaluate long-term strategic value alongside short-term payback: Some benefits - such as improved resilience, faster decision-making, and AI readiness - accrue over time instead of immediately. Firms need to account for these compounding advantages when evaluating data investments.
• Invest in quality data, interoperability, and standards early: Low-quality data and incompatible formats compromise analytics and inhibit scalability. Aligning with recognized standards reduces technical barriers and supports long-term system interoperability.
• Build cross-functional ownership of data initiatives: Industrial intelligence succeeds when operations, engineering, quality, maintenance, and IT share responsibility for data strategy. Cross-functional practices ensure insights translate into action.
• Data strategy is part of a successful workforce strategy: Analytics tools only create value when workers understand them and have the knowledge and ability to use them effectively. Upskilling and reskilling programs help embed data-driven decision-making and reduce resistance to change. Proper onboarding of new employees is also crucial to a successful strategy.
• Plan for scalability: Data pipelines, governance models, and analytics platforms should be designed as long-term infrastructure. Scalable building designs and factory floor planning allow for easier adoption of advanced technology such as AI, digital twins, IoT and augmented/virtual reality.
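As noted in the first recommendation above, here is a sketch of what a context-rich record might look like. The fields and values are hypothetical; the point is that one record ties a machine signal to the process parameters, material lot, operator action, and quality outcome around it.

```python
# A minimal, hypothetical sketch of contextual data capture: telemetry
# joined with process, material, operator, and quality context.
context_record = {
    "machine_id": "CNC-07",
    "timestamp": "2025-03-14T08:31:07Z",
    "telemetry": {"spindle_load_pct": 74.2, "vibration_mm_s": 2.8},
    "process": {"feed_mm_min": 1200, "coolant_on": True},
    "material": {"lot": "AL-6061-0422", "supplier_id": "SUP-019"},
    "operator_action": "manual feed override -10%",
    "quality_outcome": {"inspection": "pass", "cpk_sample": 1.41},
}

# Joined this way, a telemetry spike can be explained by the context
# around it instead of investigated from scratch.
print(context_record["telemetry"], "->", context_record["quality_outcome"])
```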





The academic sector is essential for equipping the future workforce to shape how data and industrial intelligence are developed, trusted, and implemented in manufacturing.
It is crucial for learning institutions to connect their research, curriculum, and partnerships to the actual needs of manufacturing, preparing both students and the industry for lasting technological evolution.
The following are ways to accomplish that aim:
• Prioritize research that reflects real data environments: Academic research often relies on clean, well-structured datasets, while manufacturers contend with noisy, incomplete, and inconsistent data. Harmonizing classroom research projects with industrial plant conditions improves the transferability of data analytics, AI models, and decision-support tools.
• Emphasize data context and systems thinking in curricula: Beyond statistics and machine learning, students need to learn how manufacturing data is generated, integrated, and acted upon across manufacturing execution systems, enterprise resource planning, quality, and maintenance systems. Teaching systems-level thinking prepares the future workforce to design analytics that work in the manufacturing industry.
• Integrate applied analytics, AI, and industrial use cases into coursework: Hands-on exposure to quality analytics, predictive maintenance, and process optimization helps reduce the gap between theory and practice. Applied learning environments, including digital twins and simulation-based labs, allow students to experiment and gain familiarity with real-world scenarios.
• Foster stronger industry-academia links on data access: Promote joint research efforts that grant students and faculty entry to anonymized industry data, leading to quicker innovation and increased relevance. By partnering, manufacturers can delve into advanced analytics, cutting down on risk and expenditure.
• Realign research incentives to favor manufacturability and scalability: Academic achievements are frequently evaluated based on novelty rather than practical application. Promoting research into scalability, interoperability, cybersecurity, and workforce adoption will enhance the probability of innovations transitioning from academic research to practical industrial application.
• Encourage standards development and data interoperability research: Academia can contribute to open standards, reference architectures, and data models that reduce fragmentation across manufacturing systems. Aligning research with standards helps secure long-term compatibility and adoption.
• Break down academic silos between engineering, data science, and operations: Industrial intelligence sits at the intersection of multiple disciplines. It’s crucial that programs blend mechanical engineering, industrial engineering, computer science, and business to better reflect how data-driven decisions are made in manufacturing.

• Emphasize data literacy among future industrial workers: While not every graduate will build AI models, each one should be able to interpret data, question assumptions, and act on insights. This broad preparation will equip the workforce to adopt analytics tools effectively.
• Reduce innovation risks through test beds: Innovation risks can be mitigated by utilizing university-led test beds and living labs, where novel data architectures, analytics tools, and AI applications are developed in controlled settings. This facilitates quicker adoption in industry and mitigates risk.
• Measure success with multiple metrics: Evaluating academia’s role in building data analytics and industrial intelligence skills requires looking beyond academic output to include metrics like workforce adoption, preparedness, and gains in productivity and resilience.




Data-driven manufacturing necessitates a solid infrastructure capable of providing strong security, promoting workforce development, and refining data analytics.
Government initiatives are instrumental in achieving these objectives. By promoting research and coordinating public programs, they establish policies and standards that reduce challenges for businesses, from small to large, ultimately improving national and international competitiveness.
The government can provide the following strategies and initiatives:
• Expand and sustain investments in manufacturing data infrastructure and analytics research: Federal programs that fund data infrastructure, analytics tools, and interoperability research help lower barriers and improve the adoption of industrial intelligence technologies. The Advanced Manufacturing Data Infrastructure and Analytics program aids in creating the infrastructure, measurement science, and tools necessary for broader industry data utilization.
• Provide funding and incentives to accelerate smart manufacturing adoption: The Department of Energy offers the State Manufacturing Leadership Program, which is designed to help fund manufacturers, universities, and state partners to deploy smart manufacturing technologies, advanced computing resources, and training support. The program makes data-driven tools more affordable and accessible, especially for small-to-medium-sized firms.

• Encourage inclusive public-private ventures that boost innovation: The Manufacturing USA network, through its collaborations with the National Institute of Standards and Technology, CESMII (the Smart Manufacturing Institute), and state extension services, fosters partnerships across different sectors. This brings together industry, academia, and public laboratories to collectively create applied solutions, share successful strategies, and grow the adoption of advanced analytics technologies.
• Support development and adoption of data standards and interoperability frameworks: Federally coordinated standards and data frameworks reduce fragmentation across data systems and aid in the sharing of trusted information across supply chains. National Institute of Standards and Technology programs focus on smart manufacturing and data analytics to help establish measurement science, protocols, and best practices.
• Advance secure and ethical data governance practices across public and private sectors: Federal data policy frameworks like the Federal Data Strategy establish principles for responsible, secure, and effective data use, including management, privacy protection, and interoperability. These structures guide agencies and industry leaders in developing practices that can also inform manufacturing data governance.
• Facilitate open access to non-sensitive public data: The government can support transparency and innovation across sectors by providing contextual public datasets that the manufacturing industry can integrate with internal data to establish objectives, set future goals and create models. The datasets could include economic indicators, supply chain performance, and energy usage statistics.


• Coordinate national strategy on digital transformation and industrial competitiveness: The government plays a critical role in setting national smart manufacturing strategies and economic policies that align investments, address barriers such as workforce gaps, and direct resources toward scalable industrial intelligence solutions. Organisation for Economic Co-operation and Development research emphasizes the value of coordinated data and innovation policies in shaping competitive industrial ecosystems nationally and globally.
• Expand workforce development tied to data literacy and analytics adoption: Collaboration between government, academia and technical schools allows educational and training programs to focus on data fluency, digital skills, and analytical literacy, which reduces adoption barriers and strengthens resilience in a data-intensive industrial economy.
• Leverage federal research and test bed facilities for shared research, development and demonstration projects: Expanding test beds at government labs and through public-private initiatives will allow manufacturers to trial data platforms, analytics tools, and interoperability solutions in controlled environments before full-scale deployment.
• Institutionalize evaluation metrics for public government programs: Establish clear, evidence-based metrics to assess how public programs impact manufacturing data adoption, productivity, and competitiveness. Using evaluation methods like those of the Organisation for Economic Co-operation and Development helps ensure public investments deliver improvements in the manufacturing industry.

1. Manufacturing is now a data-dominated industry
Factories generate more data than any other sector, but volume alone is not a competitive advantage. The differentiator is how effectively that data is translated into usable insight.
2. Data silos remain the industry’s biggest structural barrier
Despite advances in Industry 4.0, most manufacturing data is still fragmented across systems and departments. This lack of integration limits visibility and slows root-cause analysis.
3. The ROI of data is difficult to prove—until it’s too late
Unlike traditional capital investments, data initiatives deliver value over time and often through avoided disruptions. This uncertainty leads many organizations to delay adoption until costs have already been incurred.
4. Real-time data is underutilized due to decision latency
While factories generate data continuously, decisions are often made retrospectively. Closing the gap between data creation and action is critical to unlocking operational gains.

5. Human expertise remains essential to interpreting data
AI and analytics can surface insights, but they require domain knowledge to validate and apply them. Bridging the gap between digital systems and shop floor experience is key to success.
6. Consistency in data matters more than perfect accuracy
Standardized, repeatable data collection enables meaningful analysis and trend identification. Inconsistent data, even if precise, limits its usefulness.
7. Industrial intelligence is achieved incrementally, not all at once
Manufacturers are finding success by starting with targeted use cases and scaling over time. The shift to data-driven operations is a gradual transformation, not a single leap.
Retrocausal, Manufacturing Industry Trends https://retrocausal.ai/blog/manufacturing-industry-trends
Manufacturing Leadership Council, 70 Percent of Manufacturers Still Enter Data Manually https://manufacturingleadershipcouncil.com/seventy-percent-of-manufacturers-still-enter-data-manually-37135/
McKinsey, Establishing the Right Analytics-based Maintenance Strategy https://www.mckinsey.com/capabilities/operations/our-insights/establishing-the-right-analytics-based-maintenance-strategy
McKinsey, Transforming Advanced Manufacturing through Industry 4.0 https://www.mckinsey.com/capabilities/operations/our-insights/transforming-advanced-manufacturing-through-industry-4-0
Artesis, AI-predictive Maintenance Real Data Shows 73 Percent Drop in Equipment Failures https://artesis.com/ai-predictive-maintenance-real-data-shows-73-drop-in-equipmentfailures/
Com4, Predictive Maintenance: How to Use IoT to Reduce Downtime and Costs https://www.com4.no/en/blog/predictive-maintenance-how-to-use-iot-to-reduce-downtime-and-costs
McKinsey, Capturing the True Value of Industry 4.0 https://www.mckinsey.com/capabilities/operations/our-insights/capturing-the-true-value-of-industry-four-point-zero
Manufacturing USA https://www.manufacturingusa.com/
National Academies http://nationalacademies.org
NIST, Communications Technology Laboratory https://www.nist.gov/ctl
NIST, Smart Manufacturing http://nist.gov/programs-projects/smart-manufacturing
DMREF https://dmref.org/pages/what-we-do
NIST, Manufacturing Extension Partnership https://www.nist.gov/mep
Digital Manufacturing & Cybersecurity Institute (MxD) https://www.mxdusa.org/
OECD, Science, Technology and Innovation http://oecd.org/innovation/
NIST, Industrial Standards http://nist.gov/el/industrial-standards
ISO https://www.iso.org/standards.html
NSF https://www.nsf.gov/
OECD https://www.oecd.org/en.html
U.S. Department of Labor https://www.dol.gov/
OECD, Directorate for Science, Technology and Innovation http://oecd.org/sti/
U.S. Department of Energy, Advanced Materials and Manufacturing Technologies Office https://www.energy.gov/eere/amo
NIST, Advanced Manufacturing Data Infrastructure and Analytics Program https://nist.gov/programs-projects/advanced-manufacturing-data-infrastructure-and-analytics-program
U.S. Department of Energy, U.S. Department of Energy Announces Nearly $13 Million to Incentivize Smart Manufacturing https://www.energy.gov/mesc/articles/us-department-energy-announces-nearly-13-million-incentivize-smart-manufacturing
Manufacturing.gov, Smart Manufacturing https://www.manufacturing.gov/topic/smart-manufacturing
NIST, Smart Manufacturing https://www.nist.gov/smart-manufacturing
Federal Data Strategy, 2020 Action Plan https://strategy.data.gov/action-plan
OECD, Quantifying Industrial Strategies https://www.oecd.org/industry/industrial-policy-and-strategies/quantifying-industrial-strategies/
OECD, Digital Transformation and Policy Issues https://www.oecd.org/en/topics/policy-issues/digital-transformation.html
NIST, Smart Connected Systems Division https://www.nist.gov/ctl/smart-connected-systems-division
OECD, Productivity and Digitalisation https://www.oecd.org/industry/productivity-digitalisation/
NIST, NIST Publications https://www.nist.gov/publications
U.S. Department of Energy, Advanced Manufacturing Office https://www.energy.gov/eere/amo/advanced-manufacturing
NIST, Hollings Manufacturing Extension Partnership (MEP) https://www.nist.gov/mep
National Academies, National Academies of Sciences, Engineering, and Medicine https://www.nationalacademies.org
OECD, Digital Economy Outlook https://www.oecd.org/digital/
World Economic Forum, Shaping the Future of Advanced Manufacturing and Value Chains https://www.weforum.org
NIST, Industrial Standards and Optimization Group https://www.nist.gov/el/industrial-standards
OECD, Skills Beyond School https://www.oecd.org/education/skills-beyond-school/
U.S. Department of Labor, Employment and Training Administration https://www.dol.gov
Manufacturing USA, Securing the Future of U.S. Manufacturing https://www.manufacturingusa.com
Automation Alley is a nonprofit technology business association and Digital Transformation Insight Center focused on driving the growth and success of businesses in Michigan and beyond through innovation and automation. With a global outlook and a regional focus, we foster a vibrant community of manufacturing and technology innovators, entrepreneurs, and business leaders through opportunities for collaboration and learning. Our programs and services help businesses develop the skills and expertise needed to effectively jumpstart or accelerate digital transformation. By bringing together industry, academia, and government, we aim to create a dynamic ecosystem that drives innovation and growth across Michigan.
At Automation Alley, our mission is to help businesses thrive in the rapidly changing digital economy. We equip them with the knowledge, insights, and tools to develop a software-first mindset that leverages the power of automation, AI, and other cognitive technologies. We believe that by working together, we can build a stronger, more innovative, and more competitive economy for the future.
Wealth, prosperity and equality through technology.

Publication Credits
Editorial: Nicole Kampe, Dennis Burck and Joseph Gray
Graphic Design: Laura Gearhart
Photography: Corey Sims

























Made possible in part through ongoing support from the
