Welcome to the latest edition of the Future Digital Twin magazine, produced at a moment when the energy industry is no longer asking whether digital transformation will happen, but how quickly organisations can adapt their structures, culture and operational models to keep pace. Across every conversation at this year’s gathering, one theme has emerged repeatedly. Technology alone is not the story. The real transformation lies in how companies rethink decisions, responsibilities and the relationship between people and machines.
This edition reflects that shift in perspective. The opening feature explores the oil and gas operating model of the future, examining how rising energy demand, investor pressure and accelerating technological change are forcing organisations to redesign how work gets done. It sets the tone for a magazine that is less about technology hype and more about operational reality. From there, insights inspired by thinking from Boston Consulting Group examine what AI at scale looks like inside upstream operations, and why the next phase will involve structural change rather than isolated pilots.
Several features focus on the growing convergence between digital infrastructure and physical energy systems. A discussion featuring leaders from Microsoft, Databricks and Augury examines how AI is becoming embedded in the energy system itself, rewriting the digital nervous system that underpins production, reliability and efficiency. Elsewhere, we look at why the oilfield is moving beyond experimentation, as agentic and generative AI begin to influence control rooms, maintenance strategies and remote operations in ways that demand new levels of accountability.
Data remains at the centre of this evolution, but with a crucial shift in emphasis. Rather than treating data as a project, organisations are beginning to treat governance, contextualisation and trust as the real operating model. Digital twins sit at the intersection of this change, offering not simply visualisation but a new way to connect engineering knowledge, operational workflows and AI-driven decision-making. The question is no longer how much data exists, but whether it can be understood and acted upon at scale.
This issue also brings powerful voices from across the industry. In conversation with Sheikh Nawaf S Al-Sabah of Kuwait Petroleum Corporation, we explore how resilience and long-term stewardship remain central to energy strategy even as the landscape shifts. Csaba Zsoter of MOL Group reflects on balancing fuel security with transition pressures, while contributors from SLB examine how drilling autonomy is becoming a repeatable operational discipline rather than a future aspiration.
Throughout these pages, a clear message emerges. The future of digital twins, AI and automation in energy will not be defined by isolated technologies or pilot programmes. It will be defined by organisations willing to rethink how they operate, how they collaborate, and how they prepare their people for a system where intelligence is distributed across both machines and humans.
As you explore this edition, I hope the ideas, interviews and analysis challenge assumptions and provoke new conversations. The energy system is being rewired in real time, and the organisations that thrive will be those prepared to evolve with clarity, discipline and purpose.
Mark Venables
Editor
Energy demand is rising, capital is impatient, and technology is moving faster than most organisations can absorb. The next decade will reward oil and gas companies that redesign how decisions are made, how work is executed, and how accountability is enforced, not those that simply add more digital tools to the same old structure.
AI has spent years hovering at the edge of upstream oil and gas, promising efficiency while largely remaining confined to pilot projects and isolated tools. New thinking emerging from Boston Consulting Group suggests the next phase will be less about experimentation and more about fundamentally redesigning how production decisions are made.
As AI moves from experiment to infrastructure, the question facing the energy industry is no longer whether digitalisation will reshape operations, but how quickly organisations can adapt their operating models to harness it. Insights from a recent panel discussion suggest that the sector is entering a period where AI is becoming inseparable from physical energy systems, not simply layered on top of them.
Agentic and generative AI are now being pushed from dashboards into control rooms, maintenance regimes and remote operations centres, where the consequences of error are physical and expensive. The winners will be those that treat AI as an operating discipline, not a procurement exercise, and that design for safety, accountability and interoperability from day one.
When data stops being a project and becomes the operating model
As energy companies accelerate digital transformation, the focus is shifting away from collecting more data and towards creating context, governance and trust across the organisation. The next phase of digital evolution will not be defined by how much data is available, but by how effectively it can be unified, understood and used to drive operational decisions across increasingly complex assets.
Holding the centre as the energy world shifts
In conversation with Sheikh Nawaf S Al-Sabah, Deputy Chairman and Chief Executive Officer of Kuwait Petroleum Corporation, the future of oil, international partnerships and energy security is framed not through disruption, but through resilience, discipline and long-term stewardship.
Balancing fuels and transition in a changing energy landscape
Europe’s downstream sector is navigating a decade defined by uncertainty, competing technologies and shifting regulation. For Csaba Zsoter, Senior Vice President, Fuels at MOL Group, the challenge is not choosing one pathway over another, but keeping an integrated system secure, competitive and ready to scale whichever transition route proves viable.
Reshaping the future of work in energy
Real-world applications and use cases have emerged beyond the buzz and hype of AI. Natural language processing (NLP), large language models (LLMs), hybrid machine learning (ML), and generative and agentic AI all represent transformational technologies.
When machines learn to think alongside us in oil and gas
The oil and gas sector has never been short of complexity. It operates in some of the harshest environments on earth, under pressures both physical and financial, and with little margin for error. Against that backdrop, the arrival of digital twins infused with artificial intelligence feels less like a new gadget and more like a survival tool.
Drilling autonomy is becoming an operational discipline
James Cahalane, Digital Product Manager at SLB, and Abeer Musbah, Drilling Operations Deployment Manager at SLB, argue that autonomy in drilling is moving from a technology ambition to a repeatable operating model. Their focus is on how DrillOps automation and DrillOps advisory combine AI-driven planning with disciplined deployment to raise performance without expanding operational risk.
Twin Adoption Is Widespread. Fidelity Is Not.
Digital twins are now embedded across production assets. AI-driven optimisation, real-time dashboards, and predictive analytics dominate the conversation.
Final Word: The end of digital theatre in oil and gas
Digital twins and AI are no longer experimental tools or innovation showcases. They are becoming the operating system of modern energy production, forcing organisations to rethink how decisions are made, how accountability is structured and what leadership looks like in a data-driven industry.
The oil and gas operating model of the future
Energy demand is rising, capital is impatient, and technology is moving faster than most organisations can absorb. The next decade will reward oil and gas companies that redesign how decisions are made, how work is executed, and how accountability is enforced, not those that simply add more digital tools to the same old structure.
Oil and gas has always been good at surviving shocks, but survival is no longer the benchmark. The sector is being asked to deliver three things at once: reliable supply in a volatile world, stronger and more predictable returns for investors, and operational decarbonisation that does not erode competitiveness. Those demands do not sit comfortably inside the operating models that many companies built for a different era.
A decade ago, much of the industry was organised around scarcity. The world felt short of resources, projects were technically ambitious, and the dominant risk was failing to execute at scale. Organisations grew complex because complexity was treated as control. Central functions expanded to manage standards, assure safety, supervise technical integrity, and govern major capital decisions. It worked, until it did not.
Today the constraints have shifted. The problem is less about whether expertise exists somewhere in the organisation, and more about whether that expertise can be mobilised quickly enough to matter. Layers that once felt like assurance can now look like drag. The time it takes to approve, align, and execute has become a competitive variable, not an administrative detail.
The next decade will not be defined by one shock. It will be defined by persistent, overlapping change. The operating model of the future is therefore not a single design; it is an organisational capability for consistent adaptation. That is a different ambition from periodic reorganisations, cost reduction programmes, or a new digital platform rolled out in phases. It requires a deeper shift in how the company thinks about work, authority, and performance.
Energy demand remains the hard reality underneath every strategic narrative. Renewables will continue to grow, but oil and gas is still expected to remain material in the mix for decades. The transition is not only a decarbonisation story, it is also an affordability and resilience story, particularly as new sources of demand emerge and as geopolitics reintroduce supply risk. A company that plans its operating model as if hydrocarbons are disappearing on a near-term timetable is likely to misallocate capital and capability. A company that plans as if nothing is changing will be outcompeted by those that industrialise improvement.
Investor expectations add a second constraint. The sector has rebuilt credibility by improving shareholder returns and maintaining capital discipline, but that discipline also reduces room for waste. Investors are not paying for organisational complexity. They are paying for predictable cash flow, controlled risk, and a credible plan to operate efficiently through cycles.
Then there is the technology wave. AI is not simply another digital tool, it changes what a role can do, how decisions are made, and what scale means inside an organisation. Generative AI accelerates analysis and interpretation. Agentic AI points towards orchestration of multi-step workflows, execution of decisions within guardrails, and continuous optimisation with limited human input. Combined with sensors, robotics, digital twins, and edge computing, the result is not a better dashboard. It is a different way of operating, if the organisation is rewired to exploit it.
The operating model of the future therefore sits at the intersection of demand reality, investor discipline, and technological acceleration. Four shifts will shape it.
AI becomes organisational infrastructure
Most companies still treat AI as a programme. A group is formed, use cases are prioritised, pilots are run, and a capability is declared. That model will not survive the next decade, because the value of AI is not in isolated point solutions. The value is in the redesign of workflows that sit across functions, assets, and decision layers.
In upstream, this means moving beyond analytics that advise, towards systems that continuously interpret subsurface, well, and production signals, then propose actions inside defined constraints. In downstream, it means linking real-time process data, maintenance history, and market signals into operational decisions that can be executed quickly and safely. In both cases the prize is not insight, it is throughput, reliability, and reduced non-productive time delivered consistently.
The organisational implication is uncomfortable. AI compresses work. Tasks that used to require multiple specialists and multiple handoffs can be executed by fewer people with stronger oversight, supported by models and agents that handle the repetitive cognitive load. That pushes organisations towards flatter structures, broader role mandates, and a stronger emphasis on judgement rather than process navigation.
The most difficult change will be accountability. When AI suggests a drilling parameter change, flags a maintenance intervention, or identifies a production constraint, who owns the decision and the outcome? Governance cannot remain a compliance checklist. It needs to define where AI is allowed to decide, where humans must approve, and which individual is accountable when something goes wrong. Without that clarity, AI will either be blocked by caution or used in ways that create hidden risk.
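One way to picture this kind of governance is as an explicit, machine-readable decision boundary. The sketch below is a minimal, hypothetical illustration, not drawn from any operator's actual framework: an AI-proposed change is either executed automatically inside an agreed envelope or routed to a named human approver. All names and limits are invented for demonstration.

```python
# Hypothetical guardrails for one decision type: a choke setpoint change.
# In practice these limits would come from engineering and safety reviews.
GUARDRAILS = {
    "max_step_pct": 5.0,  # AI may apply changes up to this size on its own
    "accountable_owner": "production_engineer_on_shift",  # named accountability
}

def route_decision(proposed_step_pct: float) -> str:
    """Decide who executes an AI-proposed change under the guardrail policy."""
    if abs(proposed_step_pct) <= GUARDRAILS["max_step_pct"]:
        return "auto_execute"          # inside the agreed envelope
    return "human_approval_required"   # outside it, the named owner must sign off

# A 3% change stays within the envelope; a 12% change is escalated.
small_change = route_decision(3.0)
large_change = route_decision(12.0)
```

The point of the sketch is not the trivial comparison but the design choice it encodes: the boundary between machine and human authority is written down, versioned and auditable, rather than left implicit in individual behaviour.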
Talent strategy also shifts. Traditional career paths assume that expertise is built through repetition of tasks that gradually increase in complexity. If AI absorbs a portion of that repetition, companies must deliberately redesign how early-career engineers build intuition. The future organisation will place greater value on integrated capability: managing AI-driven workflows, interpreting outputs, understanding limitations, and applying operational judgement under pressure.
AI will reward companies that treat it as operating infrastructure rather than a series of experiments. The hype cycle will continue, but the winners will be those that move from pilots to scaled execution without losing control.
Performance is tailored to asset role
Many oil and gas companies still manage portfolios with a one-size-fits-all operating approach. Standards are centralised, processes are harmonised, and the organisation assumes that consistency is the route to control. Consistency remains valuable, but it is not the same as optimisation. An asset designed to generate cash late in life does not need the same operating model as a technically complex growth project. Treating them the same can destroy value in both directions.
The operating model of the future segments assets by their role in the portfolio, then designs distinct ways of working around those roles. Cash-generating assets require integrated accountability at the asset level, lean central oversight, and ruthless prioritisation of activity. Work that does not protect safety, compliance, production, or cost should struggle to justify itself.
Maintenance becomes a performance lever rather than an article of faith. Predictive programmes must prove value, not simply exist. Corrective work must be scrutinised for necessity and timing, not performed by default. Capital discipline becomes operational discipline, prioritising interventions that protect reliability and safety rather than chasing optional improvement.
Growth-oriented assets demand something different. They may still require strong central functions for decision support, technical assurance, and scaling capability. The central function of the future, however, cannot be a gatekeeper. It must be a differentiated engine that brings advanced tools, specialist expertise, and AI-enabled workflow improvements that accelerate delivery and reduce risk. Scale for its own sake is not enough. Centralisation must translate into speed and quality, not delay.
The future model does not return to the asset-centric organisations of the past. Portfolio optimisation still matters. The shift is towards differentiated operating models within one enterprise, supported by central functions that adjust their posture based on the asset’s purpose. That requires transparency in performance tracking, an ownership mindset, and risk-based decision-making that is explicit rather than implied.
Exploration offers a preview of what this can look like. It has increasingly moved towards being managed as a global business with commercial targets, treating licences as assets to be optimised, traded or exited. That is not a minor structural change. It is a shift in identity, from function to value engine.
Operatorship becomes modular
Integration has historically been the default belief. Acquisitions were folded into the same operating model, functions were shared, and career paths were standardised. That logic is weakening because portfolios are fragmenting. Risk and return profiles diverge sharply across asset types, and the market is sceptical of the assumption that one organisation can optimise everything under one governance structure.
This is pushing companies to rethink operatorship and ownership. Some assets may be better served by structures with greater operational independence, such as independent joint ventures that can raise financing and operate with a sharper mandate. Some new energy businesses may be ring-fenced to attract different capital and move faster without being constrained by legacy processes and incentives. Some mature assets may be placed in satellite models where performance discipline is enforced locally and central functions become strategic overseers rather than daily managers.
This shift raises a hard question: what is the minimum viable core that a company must retain to remain distinctive? If operatorship becomes modular, the parent company must excel at asset management, portfolio design, and performance governance. That means creative financing, active portfolio management, and an ability to define clear decision rights across partnerships without drifting into ambiguity.
Partnership flexibility is not a retreat from competence. It is an acknowledgement that different businesses require different operating standards, different cultures, and different talent strategies. The companies that master modular operatorship will be able to scale faster and adapt more quickly, while those that insist on full integration for every asset will carry structural weight that erodes returns.
Resilience is designed for a fragmented world
Globalisation is no longer a stable assumption. Tariffs, trade controls, localisation mandates, and geopolitical shocks are altering cost structures and constraining supply chains. The operating model of the future must treat resilience as a design principle, not a crisis response.
This changes how companies think about talent and capability. A mature basin needs steady operators and proven processes. An early-phase development needs builders and problem-solvers. A portfolio that spans both cannot rely on fixed organisational structures tied to geography. It needs workforce planning by skills and mobility, and it needs systems that connect local teams to global expertise without creating dependency.
Remote working tools and digital collaboration platforms are part of this, but the deeper change is the rise of global capability centres that function as operational extensions rather than back-office support. The next generation of these centres will deliver high-value technical work: subsurface modelling, engineering design, remote operations support, maintenance optimisation, and real-time performance monitoring. They will become engines of execution at pace, enabling follow-the-sun operating rhythms and faster learning cycles across assets.
The future competitive advantage will be less about where a company operates and more about how effectively it mobilises capability across its footprint. Local empowerment will matter, not as a slogan, but because localisation requirements and social licence increasingly demand it. Global integration will matter, because expertise is scarce and must be deployed where it is needed most.
AI at scale and the quiet rewiring of upstream operations
AI has spent years hovering at the edge of upstream oil and gas, promising efficiency while largely remaining confined to pilot projects and isolated tools. New thinking emerging from recent industry analysis suggests the next phase will be less about experimentation and more about fundamentally redesigning how production decisions are made, where value is found, and how existing assets are pushed closer to their technical limits.
The oil and gas sector has always been technologically ambitious. Subsea systems, seismic imaging, directional drilling and complex process control all emerged from a culture that understands engineering complexity better than most industries. Yet the digital era has produced an uncomfortable paradox. Significant investment has flowed into data platforms and analytics over the past decade, but many operators still struggle to translate digital ambition into measurable production gains. The gap between digital promise and operational reality remains stubbornly wide.
What is now changing is the context. Global demand for reliable energy continues, capital discipline remains strict, and many operators face mounting pressure to extract more value from mature assets without committing billions to new developments. In this environment, the strategic question is no longer whether AI has a role in upstream operations, but whether production optimisation itself can remain competitive without it.
The argument emerging from recent industry analysis by Boston Consulting Group in its report AI@Scale in Upstream Oil & Gas Production Optimization is that the biggest prize is not in maintenance dashboards or isolated automation tasks, but in the core flow of hydrocarbons from reservoir to export. That shift in focus is subtle but significant. It places AI not on the periphery of operations but at the centre of production strategy.
Why existing assets matter more than ever
The economics of upstream have evolved. New projects remain capital intensive and politically exposed, while investors increasingly reward predictable cash flows and operational efficiency. Maximising output from existing infrastructure offers a radically different profile, with shorter time to value and lower emissions intensity compared with new developments.
This focus has sharpened interest in production optimisation, a discipline that has existed for decades but has often been overshadowed by drilling campaigns and major project developments. Operational teams have historically relied on physics-based models and human experience to balance flow rates, pressure management and reservoir behaviour. These methods remain essential, but they are limited by uncertainty, data gaps and the sheer complexity of multiphase flow.
A production engineer may oversee dozens or even hundreds of wells, each generating streams of pressure, temperature and flow data. Interpreting this in real time demands constant judgement calls. Small inefficiencies compound into large losses, and conservative decision-making frequently leaves recoverable barrels underground.
The emerging view is that this conservatism is not a failure of expertise but a logical response to uncertainty. When confidence in data is limited, operators naturally favour caution. AI begins to change that equation by improving confidence in the inputs themselves.
The data shift from model-driven to data-driven thinking
One of the most striking themes in the discussion around AI scaling is the relationship between traditional modelling approaches and newer data-driven methods. Legacy production modelling relies heavily on first-principles physics, using simplified versions of complex equations to estimate flow behaviour. These models have served the industry well, but they depend on assumptions and approximations that introduce uncertainty.
Data-driven AI approaches reverse the logic. Rather than starting with equations, they begin with operational data, learning patterns from vast historical datasets and applying physics guidance only where necessary. The result is not a replacement for legacy modelling but a complementary lens, one that can identify relationships invisible to conventional simulation tools.
In practical terms, this means better estimates of key variables such as individual well flow rates, which underpin almost every operational decision. Improved visibility at the well level cascades through reservoir modelling, production planning and intervention decisions. For an industry that measures value in marginal percentage gains, even small improvements in estimation accuracy can translate into substantial financial impact.
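The data-driven logic described here can be sketched in miniature: instead of solving flow equations directly, a model learns the relationship between sensor readings and flow rate from historical samples. The toy "virtual flow meter" below fits a linear model to entirely synthetic data; the variable names, coefficients and noise level are hypothetical, and a real deployment would use far richer inputs and physics-guided machine learning.

```python
import random

random.seed(42)

def true_flow(dp, choke):
    # Hidden generating process the model must learn from data alone;
    # the Gaussian term mimics sensor noise.
    return 120.0 * dp + 8.0 * choke + random.gauss(0, 5)

# Synthetic history: (pressure drop in bar, choke opening in %) -> flow.
samples = [(random.uniform(5, 50), random.uniform(20, 100)) for _ in range(200)]
flows = [true_flow(dp, choke) for dp, choke in samples]

# Fit flow ≈ a*dp + b*choke + c by least squares, using plain gradient
# descent so the sketch needs no external libraries.
a = b = c = 0.0
lr = 1e-5
for _ in range(5000):
    ga = gb = gc = 0.0
    for (dp, choke), y in zip(samples, flows):
        err = (a * dp + b * choke + c) - y
        ga += err * dp
        gb += err * choke
        gc += err
    n = len(samples)
    a -= lr * ga / n
    b -= lr * gb / n
    c -= lr * gc / n

# The learned coefficients approximate the hidden relationship, giving a
# purely data-driven flow estimate for a new (dp, choke) reading.
estimate = a * 30.0 + b * 50.0 + c
```

The sketch illustrates the reversal of logic the text describes: no flow equation is ever written down, yet the fitted model recovers a usable estimator from observations alone, and its accuracy improves as the historical dataset grows.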
Beyond tools and toward operational behaviour
Perhaps the most revealing insight is that technology alone is not the limiting factor. The challenge lies in how organisations make decisions. Digital transformation repeatedly shows that application development and infrastructure matter less than organisational change.
This resonates strongly with upstream realities. Production optimisation is inherently cross-disciplinary, involving reservoir engineers, process specialists, production teams and operations staff. Historically, these functions have operated in silos, with limited integration in day-to-day workflows. AI-generated insights are only valuable if decision-making structures can absorb them quickly and confidently.
The transition to AI-enabled operations therefore requires a cultural shift. Engineers must trust automated recommendations without relinquishing accountability. Managers need to prioritise optimisation activity rather than allowing firefighting and short-term targets to consume attention. Organisations must redesign workflows so that data-driven insights translate into action rather than remaining trapped on dashboards. This shift from tool adoption to behavioural change may ultimately define which operators succeed in scaling AI.
Where value emerges first
Several production optimisation use cases illustrate the breadth of opportunity. Improved well flow estimation forms a foundational capability, supporting everything from reservoir history matching to operational planning. AI-assisted drawdown optimisation helps balance production uplift against risks such as sand production. Flow instability management and slugging detection allow operators to push systems closer to technical boundaries without compromising safety.
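To make the instability-detection idea above tangible, the sketch below applies a simple rolling-variability check to a synthetic flow signal. The window length, threshold and signal shape are all hypothetical and chosen only to illustrate the principle; production slugging detection uses far more sophisticated models than a standard-deviation test.

```python
import math
import statistics

# Synthetic flow signal: 60 steady samples, then 60 samples of large
# slug-like oscillation. Units and amplitudes are purely illustrative.
signal = [100.0 + 0.5 * math.sin(i) for i in range(60)]
signal += [100.0 + 25.0 * math.sin(i * 1.3) for i in range(60)]

def unstable_windows(series, window=20, threshold=5.0):
    """Return start indices of windows whose standard deviation exceeds threshold."""
    flagged = []
    for start in range(len(series) - window + 1):
        if statistics.stdev(series[start:start + window]) > threshold:
            flagged.append(start)
    return flagged

flags = unstable_windows(signal)
# Steady windows fall well below the threshold; windows that overlap the
# oscillating section exceed it and are flagged for operator attention.
```

Even this crude detector separates the steady regime from the unstable one, which is the underlying point: quantifying variability gives operators an objective basis for pushing systems closer to their technical boundaries without relying on conservative rules of thumb.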
These examples point to a broader trend. AI is most powerful when applied to complex systems where multiple variables interact dynamically and where historical data provides a rich learning ground. Upstream production systems fit this description perfectly.
A recurring message is that value creation often comes not from headline-grabbing automation but from incremental optimisation across numerous small decisions. Each adjustment to choke settings, lift parameters or injection strategies may appear minor in isolation, yet collectively they can deliver significant uplifts in production and recovery. Production gains in the low double digits may sound modest compared with disruptive narratives surrounding AI, but in upstream terms they represent transformative economic impact.
The scale question
Scaling remains the defining challenge. Many operators have already experimented with AI pilots, yet relatively few have achieved enterprise-wide adoption. A structured pathway is emerging, beginning with focused pilots on selected assets before expanding functionality and geographical reach. This phased approach mirrors successful digital transformations in other sectors but carries particular importance in oil and gas, where operational risk and safety considerations demand careful validation.
What emerges is a picture of AI scaling not as a sudden leap but as gradual operational learning. Early deployments build trust, refine workflows and demonstrate value. Over time, these lessons support broader adoption across portfolios.
The concept of cross-asset learning is especially interesting. AI systems trained on data from multiple fields can recognise patterns that individual asset teams may never see. This collective intelligence creates a new source of competitive advantage, turning operational data into a shared resource rather than isolated datasets.
Human expertise in an AI-driven system
Despite the emphasis on automation, human judgement remains essential. AI can identify opportunities and suggest actions, but accountability and strategic direction still rest with engineers and operators. This balance between machine speed and human oversight reflects the industry’s cautious culture and safety priorities.
Rather than replacing expertise, AI changes how expertise is applied. Engineers spend less time gathering and validating data, and more time interpreting insights and making value-based decisions. The role evolves from manual analysis toward strategic optimisation.
This evolution may also help address a growing talent challenge. As experienced professionals retire and recruitment pressures increase, AI-enabled workflows could amplify the effectiveness of smaller teams, allowing organisations to maintain performance despite workforce constraints.
The broader implication is that AI adoption forces a rethink of upstream operating models. Traditional structures built around periodic analysis and manual decision cycles may struggle to exploit real-time insights. Future operating models will likely emphasise continuous monitoring, rapid feedback loops and closer integration between disciplines.
The result is an organisation that learns and adapts continuously rather than one that reacts periodically. This mindset aligns with wider industry trends toward agility and resilience, particularly as geopolitical and market volatility increase.
It also reframes the conversation about digital transformation. The goal is not to digitise existing processes, but to redesign them around new capabilities.
The decade ahead
The oil and gas industry has experienced multiple waves of technological change, from seismic breakthroughs to shale development. AI-driven production optimisation may prove quieter but equally significant. Its impact will not be measured by dramatic new infrastructure but by incremental improvements sustained across thousands of wells and assets.
The opportunity is no longer theoretical. Tools capable of delivering measurable value already exist. The real challenge lies in organisational willingness to embrace a different way of working.
As energy demand continues to grow and capital remains disciplined, operators will increasingly compete on operational excellence rather than scale alone. Those able to combine engineering rigour with AI-enabled decision-making may extract more value from existing assets while reducing environmental intensity and operational risk.
The future of upstream operations may therefore be less about discovering new resources and more about understanding existing ones with greater precision. In that future, AI becomes not an optional add-on but an invisible layer embedded within everyday decisions.
The shift is already underway. The question for operators is not whether the technology will mature, but whether their organisations can evolve quickly enough to capture the value it promises.
The digital nervous system of energy is being rewritten
As AI moves from experiment to infrastructure, the question facing the energy industry is no longer whether digitalisation will reshape operations, but how quickly organisations can adapt their operating models to harness it. Insights from a recent panel discussion suggest that the sector is entering a period where AI is becoming inseparable from physical energy systems, not simply layered on top of them.
Energy has always evolved through engineering breakthroughs. From offshore platforms to LNG value chains, progress has traditionally been defined by material capability and industrial scale. Digital technology initially appeared as a supporting actor, helping companies monitor equipment and optimise operations at the margins. Artificial intelligence has changed that trajectory. It introduces a possibility that goes beyond efficiency gains and into the reconfiguration of how decisions are made across the entire energy system.
The promise is difficult to ignore. AI can forecast demand, optimise drilling trajectories, identify maintenance issues before they become failures and balance increasingly complex grids. At the same time, the rise of AI infrastructure itself is creating an unprecedented surge in electricity demand. Data centres, high performance chips and cloud platforms are becoming major energy consumers, creating an unusual paradox where digital technology is simultaneously a tool for decarbonisation and a growing source of load.
This tension framed the discussion among the three panellists at the recent Baker Hughes Annual Meeting. Their views reveal an industry beginning to understand that digitalisation is no longer a peripheral initiative. It is becoming the operating system for energy itself.
The data centre paradox
Darryl Willis, Corporate Vice President for Energy and Resources Industry at Microsoft, described the challenge directly. Data centres currently represent a relatively small share of global electricity demand, but growth trajectories suggest a rapid increase over the coming decade. Managing that demand while enabling digital innovation requires a delicate balance.
“AI is as much a part of the solution as it is part of the problem,” Willis explained. “We are using it every day inside our data centres to optimise operations, from cooling and power storage to water utilisation. The opportunity is to make every part of the system more efficient while bringing new capacity online responsibly.”
His comments highlight a broader truth about AI in energy. The technology’s value cannot be considered in isolation. AI driven infrastructure demands energy, but it also enables far more sophisticated optimisation than previous generations of digital tools. Willis pointed to improvements that move well beyond incremental gains, describing processes that once took weeks being reduced to minutes through AI enabled analysis.
The implications stretch beyond the hyperscale environment. As utilities confront increasingly volatile grids shaped by renewables, electric vehicles and distributed generation, AI offers a way to convert vast datasets into actionable insights. The industry has long been data rich but insight poor. The difference now lies in the ability to interpret complexity at speed.
Moving beyond pilot projects
For years, oil and gas executives have spoken about the potential of AI, yet many initiatives remained trapped in pilot phases. Julien Debard, Director of Energy and Utilities at Databricks, argued that this phase should be coming to an end. “We need to stop talking about proof of concept,” he said. “AI has been with us for a long time. Some customers are already doing this at scale. The challenge is proving competency quickly and moving to production.”
Debard outlined where AI is already delivering measurable value across the energy sector. In drilling operations, integrating subsurface data with real time measurements allows AI systems to optimise trajectories and reduce inefficiencies that would otherwise remain hidden. In production operations, engineers increasingly begin their day with AI generated recommendations highlighting which wells require attention, saving hours of manual analysis.
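The morning triage Debard describes can be pictured as a simple ranking problem. The sketch below is purely illustrative, not any vendor's implementation: the well names, rates and scoring rule are invented assumptions, and a production system would draw on far richer signals than a single rate deviation.

```python
# Illustrative sketch of a daily well-triage ranking (all data hypothetical).
# Wells with the largest relative production shortfall, weighted by how long
# the shortfall has persisted, rise to the top of the engineer's morning list.

def triage_wells(wells):
    """Rank wells by a simple attention score: relative shortfall
    against expected rate, multiplied by days the deviation has persisted."""
    def score(w):
        shortfall = max(0.0, (w["expected_rate"] - w["actual_rate"]) / w["expected_rate"])
        return shortfall * w["days_deviating"]
    return sorted(wells, key=score, reverse=True)

wells = [
    {"name": "W-101", "expected_rate": 1200, "actual_rate": 1150, "days_deviating": 1},
    {"name": "W-102", "expected_rate": 800,  "actual_rate": 560,  "days_deviating": 4},
    {"name": "W-103", "expected_rate": 950,  "actual_rate": 900,  "days_deviating": 2},
]

for w in triage_wells(wells):
    print(w["name"])   # W-102 first: largest sustained shortfall
```

The point of even a toy ranking like this is the redistribution of effort the article describes: the engineer starts the day with a prioritised list rather than assembling one by hand.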
The change may appear modest, but it marks a significant shift in how expertise is applied. Engineers move away from repetitive data gathering and towards higher value decisions. This redistribution of effort is one of AI’s least discussed but most powerful effects.
Debard also highlighted predictive maintenance, where AI combines operational data with maintenance manuals, inventory records and workforce certification information to deliver actionable recommendations. The result is not simply earlier fault detection but an integrated operational response that shortens downtime and improves asset reliability. These examples demonstrate that AI’s maturity is no longer defined by algorithms alone. It depends on data integration and organisational readiness.
The reality at the machine level
Saar Yoskovitz, Co-Founder and Executive Chairman of Augury, brought a grounded perspective from the factory floor and the operating asset. His focus on rotating machinery and industrial reliability illustrates how AI’s impact often begins with fundamentals rather than grand strategy. “The maths works,” he said. “A significant portion of industrial energy use comes from rotating equipment, and running those machines better can reduce energy consumption dramatically.”
Predictive maintenance has become one of the first large scale applications for oil and gas operators because it delivers clear financial returns. Reduced unplanned downtime, improved throughput and lower maintenance costs create a compelling business case that resonates with operational leaders and financial executives alike.
Yet Yoskovitz also offered a cautionary perspective. Technology alone does not guarantee results. Many organisations struggle to act on AI insights, either because workflows remain unchanged or because frontline teams are reluctant to trust automated recommendations.
“It is often about people more than technology,” he noted. “You can have the best sensor and the best algorithm, but if no one acts on the insight, nothing changes.”
This cultural dimension echoes a recurring theme across digital transformation programmes. AI systems may generate recommendations, but operational impact depends on human behaviour and organisational willingness to adapt.
The human in the loop
The concept of human oversight emerged repeatedly throughout the discussion. In energy, where decisions carry safety and environmental consequences, fully autonomous systems remain unlikely in the near term.
Debard emphasised that governance must be embedded into systems from the start. AI outputs require validation, feedback and transparency, particularly in high consequence environments. Field engineers and operators will continue to play a central role, not as passive recipients of digital instructions but as collaborators shaping how AI learns and improves.
This perspective challenges the notion that automation inevitably reduces human involvement. Instead, AI may elevate human responsibilities by focusing attention on judgement, risk evaluation and strategic decision making.
Willis reinforced the idea that security and trust are foundational requirements. As energy systems become increasingly digital, cyber risk becomes inseparable from operational risk. The expansion of digital surface area demands continuous vigilance and collaboration between technology providers and energy operators.
From incremental gains to system transformation
One of the most striking aspects of the panel was the shared belief that AI enables improvements measured in double digits rather than single digit percentages. Willis described how AI driven analysis has produced substantial reductions in operational timelines, transforming activities previously constrained by manual workflows.
This shift from incremental optimisation to systemic improvement reflects a broader trend. AI systems can process relationships across datasets that are impossible for human teams to analyse comprehensively. The result is not simply faster decision making but a reimagining of what performance boundaries look like.
For oil and gas operators, this could mean more efficient drilling campaigns, optimised reservoir management and reduced emissions intensity. For utilities, it could support real time balancing of increasingly complex energy mixes. For industrial operators, it could unlock energy savings without major capital investment.
However, transformation requires interoperability. Yoskovitz and Debard both stressed the importance of open architectures and partnerships. No single organisation possesses all the necessary components. Cloud providers, data platforms, equipment manufacturers and operational teams must work together to build integrated ecosystems.
The cultural challenge
Despite optimism about technology, the panellists repeatedly returned to organisational dynamics. Yoskovitz described the challenge of aligning IT teams with operational technology specialists, characterising digital transformation as a cultural endeavour as much as a technical one.
Experienced operators often carry decades of practical knowledge and may be sceptical of algorithmic recommendations. Building trust requires transparency, collaboration and evidence of value. Early successes in predictive maintenance have helped because they produce tangible outcomes quickly, creating momentum for broader adoption.
The emergence of robotics and autonomous systems may further accelerate change by closing operational loops. Yet even as automation expands, human oversight remains essential, particularly in safety critical environments.
A new model of partnership
The panel discussion also highlighted an evolving relationship between technology companies and the energy sector. Ten years ago, hyperscale technology firms were largely absent from energy conferences. Today they sit alongside traditional industrial players, reflecting the convergence of digital infrastructure and energy systems.
Willis emphasised that partnership is central to progress. No organisation can solve the energy transformation alone. Collaboration across ecosystems is essential to balance energy demand, improve efficiency and maintain affordability.
Debard extended this idea, arguing that AI solutions should be built like reusable building blocks rather than isolated projects. Data models, workflows and governance structures developed for one use case should accelerate others, allowing organisations to scale without starting from scratch each time.
The path forward
The conversation ultimately points towards a future where digitalisation is not a separate strategy but an intrinsic part of energy operations. AI is becoming the connective tissue between physical assets, operational data and strategic decision making.
The challenge is clear. Companies must invest not only in technology but in organisational change, workforce development and governance frameworks that build trust in AI driven insights. Those that move fastest will likely capture substantial efficiency gains and operational resilience.
As the energy system becomes more complex, the role of digitalisation will only grow. AI’s ability to process vast datasets, identify patterns and support decisions positions it as a critical enabler of both reliability and sustainability. Yet the technology’s success will depend on how well industry leaders integrate it into the human and organisational structures that define operational reality.
The energy sector has always adapted through innovation. The difference now is that innovation is no longer confined to hardware and infrastructure. It resides increasingly in data flows, algorithms and the ability of organisations to learn continuously.
The rewiring of the energy system has already begun. The question for operators is whether they view AI as a tool to optimise existing practices or as the foundation for a fundamentally new model of energy operations. Those who choose the latter may define the next decade of performance.
The oilfield is done with pilots
Agentic and generative AI are now being pushed from dashboards into control rooms, maintenance regimes and remote operations centres, where the consequences of error are physical and expensive. The winners will be those that treat AI as an operating discipline, not a procurement exercise, and that design for safety, accountability and interoperability from day one.
Oil and gas does not need another wave of AI excitement. The sector needs systems that hold up in the messy middle of operations, where sensor noise is normal, procedures are revised under pressure, and a single ambiguous recommendation can ripple into safety risk, non-productive time and reputational damage. That is why the conversation is shifting away from model performance and towards industrial behaviour, namely how AI systems perceive, decide, escalate and learn, without undermining the controls that keep people safe and assets stable.
Generative AI arrived with an immediate promise: faster analysis, quicker reporting, a shortcut through the paperwork that sits between intent and action. Agentic AI arrives with a more uncomfortable claim: that the system can execute multi-step workflows, coordinate decisions across tools, and close loops that previously demanded human attention. In most industries, that claim is provocative. In the oilfield, it is existential, because the cost of getting it wrong is not simply a wrong answer; it is a wrong decision embedded inside an operating process.
The practical question is therefore not whether agentic systems can be deployed, but where they should be deployed, under what constraints, and with what evidence. The strongest early use cases are not the ones that appear most futuristic. They are the ones that reduce friction in critical business operations without creating new failure modes, and that can be audited, repeated and defended when the regulator, the board, or the incident review asks a simple question: why did the system do that?
Responsibility before capability
Responsible AI in oil and gas is often discussed as an ethical add-on. In practice, it is a reliability discipline, because the industry already has mature traditions for safety cases, barrier management, management of change, and operational risk. Agentic systems must be designed to fit inside that reality, not sit beside it as a clever assistant.
That begins with clear boundaries. What decisions can the system recommend, what decisions can it execute, and what decisions must always be confirmed by a human with named accountability? A responsible AI strategy is therefore not a slide deck about principles; it is an operating design that defines authority, escalation, fallback modes, and the tolerances within which autonomous behaviour is permitted.
The next requirement is observability. If an agent is orchestrating a workflow that touches maintenance planning, spares availability, and a shutdown decision, the organisation must be able to see the chain of reasoning, the data sources accessed, and the confidence level at each step. Black box decisioning is not simply undesirable; it is incompatible with how high consequence industries defend actions after the fact.
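One way to picture that requirement is a decision trace that records, for every step, what was consulted, what was concluded and with what confidence. The sketch below is a minimal assumption of what such a record might look like; the workflow, source names and confidence figures are invented for illustration.

```python
# Illustrative sketch: an auditable decision trace for an agentic workflow.
# Every step records what was consulted, what was concluded, and with what
# confidence, so the chain of reasoning can be reconstructed after the fact.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class TraceStep:
    action: str
    sources: list          # data sources consulted at this step
    conclusion: str
    confidence: float      # 0.0 to 1.0
    timestamp: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

@dataclass
class DecisionTrace:
    workflow: str
    steps: list = field(default_factory=list)

    def record(self, action, sources, conclusion, confidence):
        self.steps.append(TraceStep(action, sources, conclusion, confidence))

    def weakest_link(self):
        """The lowest-confidence step is where human review should focus."""
        return min(self.steps, key=lambda s: s.confidence)

trace = DecisionTrace("compressor shutdown recommendation")
trace.record("read vibration history", ["historian:C-201"], "trend rising", 0.92)
trace.record("check spares availability", ["erp:inventory"], "bearing in stock", 0.99)
trace.record("estimate failure window", ["vibration model v3"], "7-10 days", 0.61)

print(trace.weakest_link().action)   # the step a reviewer should scrutinise first
```

The design choice worth noting is that the trace is written as the agent works, not reconstructed afterwards, which is what makes it usable in an incident review.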
The final requirement is resilience under degraded conditions. Oilfield data is rarely clean, networks drop, sensors drift, and manuals do not match the reality on site. A responsible strategy therefore assumes partial information and designs for safe behaviour when the system cannot be sure. In other words, the default state should be conservative, with autonomy increasing only as evidence accumulates and operational trust is earned.
Autonomy meets the field
The most credible vision for autonomous platforms in oil and gas is not a model in the cloud giving instructions to people in the field. It is an integrated stack in which AI, robotics and IoT form a practical mechanism for sensing, interpreting and acting, particularly where humans are exposed to risk or where assets operate beyond convenient reach.
This is where the agentic discussion becomes tangible. A platform that integrates robotics and IoT can turn inspection into a continuous capability rather than a periodic event. Autonomous drones, crawlers, and fixed sensors can capture reality, detect anomalies, and feed a decision loop that creates work orders, recommends interventions, and schedules resources. The agent does not replace the technician. It reduces the amount of time that technician spends discovering problems and increases the amount of time spent fixing them with the right information at hand.
However, autonomy in the field fails when it is framed as automation alone. The oilfield is full of automated systems that behave perfectly until the conditions change. What makes agentic AI different is its ability to adapt workflows to conditions, but that flexibility must be constrained by safety logic and operating policy. If the system can adapt without guardrails, it will eventually do something surprising, and surprises are where incidents begin.
A more useful framing is that the oilfield needs supervised autonomy. Agents can triage alarms, identify patterns in rotating equipment vibration, compare today’s operating envelope with historical baselines, and propose actions that are ranked by risk and value. Humans remain in the loop, not because the AI is immature, but because the organisation needs accountable authority, and because the system’s job is to compress complexity into decisions that humans can defend.
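Supervised autonomy of this kind can be sketched as a routing rule: the agent ranks proposed actions by value against risk, but anything above a risk threshold is escalated to a named human rather than executed. The actions, numbers and threshold below are illustrative assumptions, not a prescribed policy.

```python
# Sketch of supervised autonomy: proposals are ranked by value-to-risk
# ratio, and anything above a risk threshold is routed to a human for
# confirmation instead of being executed. All figures are hypothetical.

RISK_THRESHOLD = 0.3   # above this, a named human must confirm

def rank_and_route(proposals):
    """Return (auto_queue, escalation_queue), each ordered by value/risk."""
    ranked = sorted(proposals, key=lambda p: p["value"] / (p["risk"] + 1e-9),
                    reverse=True)
    auto = [p for p in ranked if p["risk"] <= RISK_THRESHOLD]
    escalate = [p for p in ranked if p["risk"] > RISK_THRESHOLD]
    return auto, escalate

proposals = [
    {"action": "reschedule lube-oil sample",  "value": 2.0, "risk": 0.05},
    {"action": "trip standby pump to test",   "value": 5.0, "risk": 0.6},
    {"action": "raise inspection work order", "value": 3.0, "risk": 0.1},
]

auto, escalate = rank_and_route(proposals)
print([p["action"] for p in auto])       # low-risk actions the agent may queue
print([p["action"] for p in escalate])   # higher-risk actions needing sign-off
```

The threshold is deliberately the visible, auditable part of the design: it is the dial the organisation turns as evidence accumulates and trust is earned.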
Safety and compliance at scale
Oil and gas has long struggled with the gap between policy and practice. Procedures exist, but the real-world pressures of time, production targets, weather windows and logistics can erode compliance. AI will not fix that gap by producing more documentation. It will only help if it connects people, process and collaboration in ways that reduce the friction of doing the right thing.
This is where generative systems can have outsized impact when deployed with discipline. A well-designed AI layer can turn compliance from a periodic audit exercise into a continuous operating capability. It can surface the relevant procedure at the moment of need, compare an action plan against required steps, flag deviations, and record the evidence trail automatically. The benefit is not simply speed; it is consistency and defensibility, particularly when operations are distributed across contractors, joint ventures and multiple sites.
Yet compliance is also where hype can become dangerous. A model that summarises a regulation incorrectly, or that hallucinates an inspection interval, introduces risk at precisely the point where the organisation believes it is being protected. That is why high value compliance use cases almost always require retrieval-based grounding, strict source control, and explicit confidence cues. The system must know what it knows, and it must show what it used.
Open collaboration matters here because safety and compliance are socio-technical. Many failures in compliance are not due to ignorance, but due to misalignment between teams, or uncertainty about who owns a decision. Agentic systems that can coordinate workflows across functions can reduce that ambiguity, but only if the organisation agrees the rules of engagement. Without shared operating agreements, the AI becomes another participant in the confusion rather than the mechanism that resolves it.
LLMs and LCMs together
A common mistake is to treat large language models as the centre of industrial AI. In the oilfield, language is only one slice of the problem. The other slice is physics, geometry and behaviour, namely what the asset is, what it is doing, and what it could do under different conditions. This is where large context models and digital twin simulation become essential, because they provide a representation of the system that is grounded in engineering reality rather than text.
The practical implication is that the most capable industrial AI will not be purely generative. It will be hybrid. It will use language models to interpret intent, extract meaning from manuals and logs, and orchestrate workflows across enterprise systems. It will use simulation and digital twins to test decisions, validate operating envelopes, and predict outcomes under conditions that cannot be safely trialled in the field.
Reality capture also becomes part of this stack. If the model is making recommendations about an offshore module, a pipeline corridor, or an LNG facility, it needs to understand what is physically present, not merely what the documentation claims. Laser scans, photogrammetry and sensor driven context can close the gap between design intent and actual condition, and that gap is one of the most persistent sources of operational risk.
This is the deeper point behind the LLM versus LCM debate. Language models are powerful for interface and orchestration, but they cannot be allowed to invent physical truth. The system should therefore separate conversation from calculation. Let the language layer translate human intent into actions, but let the physics layer validate what is possible, safe and optimal. When these layers are integrated properly, AI stops being a chat tool and becomes an operating instrument.
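The separation of conversation from calculation can be made concrete with a minimal sketch. Here a stand-in for the language layer turns free text into a structured request, and a physics layer checks it against an operating envelope before anything is actioned. The envelope figures, parameter name and the hard-coded intent are assumptions for illustration; a real system would call an actual model and a validated engineering simulation.

```python
# Sketch of separating conversation from calculation: a mocked language
# layer produces a structured request, and a physics layer validates it
# against the operating envelope. Numbers and names are illustrative.

OPERATING_ENVELOPE = {"discharge_pressure_bar": (10.0, 45.0)}

def parse_intent(utterance):
    """Stand-in for the language layer: in a real system an LLM would map
    free text to this structured form. Hard-coded here for illustration."""
    return {"parameter": "discharge_pressure_bar", "target": 52.0}

def validate(request):
    """Physics layer: the language model never decides what is possible."""
    lo, hi = OPERATING_ENVELOPE[request["parameter"]]
    if not lo <= request["target"] <= hi:
        return {"approved": False, "reason": f"target outside envelope [{lo}, {hi}]"}
    return {"approved": True, "reason": "within envelope"}

result = validate(parse_intent("raise discharge pressure to 52 bar"))
print(result)   # the request exceeds the validated envelope, so it is refused
```

The design choice is the one the paragraph above argues for: the language layer can propose anything, but only the physics layer can approve.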
Remote operations that trust
Connected remote operations have been a promise for years, yet adoption has often stalled in the offshore and onshore reality of bandwidth constraints, legacy control systems, and organisational resistance. Agentic and generative AI can accelerate remote operations, but only if they address the real barriers, which are rarely about algorithms.
The first barrier is trust in data and decision loops. Remote operations teams can only act if they believe what they are seeing reflects the field. That requires contextualised data, consistent tagging, and reliable integration between OT systems and enterprise platforms. AI can help detect anomalies and data quality issues, but it cannot compensate for missing instrumentation or inconsistent operating practices.
The second barrier is authority. Offshore teams often retain control because they carry the risk in the moment. Remote centres can advise, but the decision remains local. Agentic systems can reshape this relationship by providing a shared view of the evidence and by standardising how decisions are reached. However, that only works if the operating model is redesigned, because technology cannot resolve a governance dispute.
The third barrier is cyber and resilience. Remote operations expand the attack surface and increase dependence on connectivity. AI systems can improve security monitoring and anomaly detection, but they also introduce new risks, especially when agents can trigger workflows across systems. Organisations therefore need a security model that treats AI as a privileged component, with strict access controls, segmentation, and audit trails. If the agent can do things, it must be governed like an operator, not like an analytics tool.
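Governing the agent like an operator can be sketched as an explicit grant list plus an audit trail: every action the agent attempts is checked against what it has been granted, and logged whether or not it was allowed. The agent name, action names and targets below are invented for illustration.

```python
# Sketch: treating the agent as a privileged operator. Actions are checked
# against explicit grants and logged, rather than trusted implicitly.
# Role names, actions and tags are hypothetical.

AUDIT_LOG = []

GRANTS = {
    "maintenance-agent": {"create_work_order", "read_historian"},
}

def attempt(agent, action, target):
    """Check the grant, log the attempt either way, and only then act."""
    allowed = action in GRANTS.get(agent, set())
    AUDIT_LOG.append({"agent": agent, "action": action,
                      "target": target, "allowed": allowed})
    if not allowed:
        raise PermissionError(f"{agent} is not granted {action}")
    return f"{action} on {target} executed"

attempt("maintenance-agent", "create_work_order", "P-305")
try:
    attempt("maintenance-agent", "open_valve", "XV-112")   # never granted
except PermissionError:
    pass

print(len(AUDIT_LOG))   # both attempts appear on the trail, allowed or not
```

Logging before enforcement matters: the denied attempt is evidence too, which is exactly what segmentation and audit trails are meant to provide.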
The prize is significant. Remote operations supported by agentic systems can reduce travel and exposure, shorten response times, and increase consistency across dispersed assets. They can also help address the workforce challenge by allowing scarce expertise to be deployed across multiple sites without physically relocating. But that prize only materialises when the system is designed around operational reality, not around a technology demonstration.
Open platforms and software agnostic design
Oil and gas has a long history of technology lock-in. Proprietary systems, bespoke integrations, and vendor specific data models have created environments where innovation is slow because every new capability requires expensive rewiring. If agentic AI is to move beyond pilots, it must be deployed on open data platforms that enable a system of systems approach.
An open platform is not a marketing claim. It is an architectural commitment to interoperability, with common APIs, standardised semantics, and governance that allows multiple tools to coexist without fragmenting the organisation. This is critical because no single vendor will own the entire AI stack. Operators will use a mix of cloud, edge, OT platforms, simulation tools, robotics systems, and specialist applications. The value lies in orchestration, not in uniformity.
Becoming software agnostic does not mean being vendor indifferent. It means preserving choice by keeping data and context portable. If the organisation can move workloads, swap models, and integrate new capabilities without rewriting the enterprise, it can adopt innovation at the pace the industry now demands.
Contextualisation sits at the centre of this. Data that is not contextualised cannot be shared safely across systems. Digital twins, knowledge graphs, and common information models provide the connective tissue that allows agents to reason across the enterprise without misunderstanding what a tag, an event, or a piece of equipment represents.
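What contextualisation means in practice can be shown with a minimal information model: a lookup that lets an agent resolve what a raw tag actually represents before reasoning about it. The tag names and asset hierarchy below are invented; real deployments would use a knowledge graph or standard information model rather than a dictionary.

```python
# Sketch of contextualisation: a minimal information model that resolves
# what a raw tag represents. Tags and hierarchy are hypothetical.

MODEL = {
    "21-PT-1045": {"measures": "pressure", "unit": "bar",
                   "equipment": "C-201", "system": "export compression"},
    "21-TT-1046": {"measures": "temperature", "unit": "degC",
                   "equipment": "C-201", "system": "export compression"},
}

def context_of(tag):
    """Without this lookup, '21-PT-1045' is just a string; with it, the
    agent knows which equipment and system a reading belongs to."""
    info = MODEL.get(tag)
    if info is None:
        return None   # unknown tags must not be guessed at
    return f"{info['measures']} ({info['unit']}) on {info['equipment']}, {info['system']}"

print(context_of("21-PT-1045"))
print(context_of("99-XX-0000"))   # unknown: the safe answer is None, not a guess
```

The refusal to guess on an unknown tag is the whole point: shared semantics only create trust if the system admits what it does not know.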
This is where hype meets discipline. Agentic AI will not succeed in oil and gas as a layer pasted on top of fragmented systems. It will succeed when it is treated as a consequence driven operating capability built on open platforms, explicit governance, and representations of reality that can be trusted.
The shift that matters
The oilfield does not need AI that sounds intelligent. It needs AI that behaves responsibly under pressure, that knows when to escalate, and that can be integrated into workflows without creating new brittleness. The most important transition is therefore not from pilots to production, but from novelty to operating confidence.
A responsible AI strategy anchored in autonomy boundaries, observability and resilience is the foundation. The integration of robotics, IoT and digital twins turns AI into a field capable system rather than a reporting layer. A hybrid approach that combines language orchestration with physics based validation reduces the risk of invented truth. Open platforms and software agnostic design preserve flexibility and prevent the next decade of digital transformation being defined by lock-in.
Agentic and generative AI will absolutely reshape oil and gas operations, but not because the models are impressive. The transformation will come from organisations that redesign operating models, align governance with autonomy, and build systems that can be trusted when the stakes are highest. When that happens, AI stops being hype and becomes infrastructure, and the oilfield finally gets what it has needed all along: better decisions, faster, with fewer surprises.
When data stops being a project and becomes the operating model
As energy companies accelerate digital transformation, the focus is shifting away from collecting more data and towards creating context, governance and trust across the organisation. The next phase of digital evolution will not be defined by how much data is available, but by how effectively it can be unified, understood and used to drive operational decisions across increasingly complex assets.
Oil and gas has never suffered from a lack of data. Sensors, control systems and engineering workflows generate vast quantities of information every second across upstream operations, pipelines, refineries and LNG facilities. Yet for many operators, the value of that data remains constrained by fragmentation. Information sits in isolated systems, duplicated across departments, interpreted differently depending on discipline, and often inaccessible to those who need it most. The industry’s digital journey has, for much of the last decade, been characterised by attempts to centralise these datasets into data lakes and warehouses, but consolidation alone has not delivered the operational clarity many organisations expected.
The emerging consensus is that data itself is not the destination. Context is. Without shared meaning, consistent governance and clear workflows, large data repositories become little more than archives. What is changing now is a recognition that data architecture must evolve into an operational framework that supports collaboration, decision making and automation. The convergence of digital twins, AI and more advanced governance models is forcing companies to rethink not only how data is stored, but how it flows through organisations and influences real world action.
This shift is particularly visible in asset intensive operations such as LNG regasification terminals, where operational reliability, safety and efficiency depend on integrating engineering knowledge with live operational signals. As AI systems begin to interpret and act on these streams, the need for clean, contextualised and trusted data becomes fundamental rather than optional.
Building a shared data language
The ambition to create a common data model has become one of the most important goals in digital transformation programmes. Historically, different functions within an oil and gas organisation have developed their own definitions, standards and workflows. Maintenance teams interpret asset performance differently from operations teams. Engineering data may be structured around design intent, while operational systems focus on real time performance.
A common data model seeks to bridge these perspectives by creating a shared framework through which data can be interpreted consistently across the organisation. The benefit is not simply technical efficiency. It enables teams to speak the same language when evaluating asset performance, planning interventions or assessing risk. Data hygiene and governance improve organically when everyone works from a consistent foundation.
The challenge lies in achieving this without imposing rigid structures that stifle innovation. Many early attempts at data standardisation struggled because they tried to force alignment from the top down. The more successful approaches now emerging tend to evolve organically, guided by business outcomes rather than strict technical mandates. As workflows become connected through digital platforms, data quality improves because inconsistent inputs quickly reveal themselves through operational friction.
This evolution also changes the role of data governance. Rather than acting purely as a compliance function, governance becomes an enabler of collaboration. It defines how data should move, who owns it, and how trust is maintained as information flows between systems and teams.
From central repositories to operational workflows
Data lakes and warehouses were initially seen as endpoints. Once data was centralised, value would automatically follow. Reality has shown that centralisation alone is not enough. The real transformation occurs when data architecture is connected directly to workflows.
Remote execution provides a clear example. As operators seek to reduce on site personnel and improve consistency across assets, digital platforms are increasingly used to assess asset integrity and coordinate intervention activities from central operations centres. This requires data to be standardised not only in structure but in meaning. Engineers reviewing the condition of a compressor or valve from a remote location need confidence that what they see reflects the actual state of the asset.
Standardised workflow management allows organisations to apply consistent methodologies across geographically dispersed operations. Instead of relying on local practices that vary by site, digital platforms enable a repeatable approach to inspection, analysis and decision making. The result is improved transparency and faster response times, particularly in environments where operational complexity continues to increase.
Remote execution also introduces a cultural shift. Decision making becomes less dependent on physical proximity and more reliant on shared digital representations of assets. This transition requires trust in data quality and governance frameworks, reinforcing the importance of contextualisation.
The rise of agentic AI frameworks
Generative AI has introduced new possibilities for knowledge discovery and decision support in industrial environments. Yet concerns around hallucinations and reliability have limited adoption in critical operations. The industry’s response is increasingly focused on agentic frameworks that combine generative capabilities with structured retrieval systems.
Retrieval Augmented Generation, often referred to as RAG, is gaining attention as a way to ground AI outputs in verified operational data. Rather than relying solely on probabilistic language models, these systems retrieve relevant information from trusted datasets and use it to inform responses. The result is greater accuracy and transparency, which are essential in high consequence environments like oil and gas.
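The retrieval step that grounds such a system can be sketched in a few lines. The toy below scores trusted documents against a query by keyword overlap; a production RAG pipeline would use vector embeddings and a proper retriever, and the document identifiers here are invented for illustration.

```python
# Minimal sketch of the retrieval step in a RAG pipeline: candidate
# passages from trusted documents are scored against the query, and the
# generative step is then constrained to cite what was retrieved.
# Scoring is a toy keyword overlap; real systems use embeddings.

def retrieve(query, documents, k=2):
    q_terms = set(query.lower().split())
    scored = []
    for doc in documents:
        overlap = len(q_terms & set(doc["text"].lower().split()))
        scored.append((overlap, doc))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for overlap, doc in scored[:k] if overlap > 0]

documents = [
    {"id": "OM-12 rev4", "text": "compressor C-201 inspection interval is 6 months"},
    {"id": "HSE-03",     "text": "permit to work required for confined space entry"},
    {"id": "OM-07 rev2", "text": "pump P-305 seal flush plan 32 specification"},
]

hits = retrieve("what is the inspection interval for compressor C-201", documents)
# The generative step would now be grounded in, and cite, hits[0]["id"].
print(hits[0]["id"])
```

The transparency the article calls for comes from the citation: because the answer must name its retrieved source, an engineer can check the claim against the controlled document rather than trusting the model.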
Agentic frameworks go a step further by allowing AI systems to coordinate multiple tasks, interpret data from different sources and suggest actions within defined boundaries. The potential is significant. Engineers could query complex asset histories, receive contextual insights and generate workflows without manually searching through multiple systems.
However, implementation demands discipline. AI systems are only as effective as the data environments that support them. Poor governance or fragmented data structures increase the risk of unreliable outputs. Organisations therefore face a dual challenge of advancing AI capability while strengthening foundational data practices.
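The retrieval-grounding pattern is simple enough to sketch. The toy below uses hypothetical documents and a term-overlap retriever standing in for a real embedding search, but it shows the essential move: fetch verified records first, then constrain the generative model to answer from that evidence with citations.

```python
import re
from dataclasses import dataclass

@dataclass
class Document:
    source: str   # provenance tag, e.g. an inspection report ID (hypothetical)
    text: str

# Hypothetical corpus of verified operational records.
CORPUS = [
    Document("inspection/2023-11", "Compressor C-101 vibration trending upward, bearing wear suspected"),
    Document("maintenance/2023-12", "Compressor C-101 bearing replaced, vibration returned to baseline"),
    Document("inspection/2024-02", "Valve V-20 leak test completed, no internal leakage detected"),
]

def tokens(s: str) -> set[str]:
    return set(re.findall(r"[a-z0-9-]+", s.lower()))

def retrieve(query: str, corpus: list[Document], top_k: int = 2) -> list[Document]:
    """Rank documents by term overlap with the query. A production system
    would use vector embeddings, but the principle is the same: select
    trusted evidence before any text is generated."""
    q = tokens(query)
    ranked = sorted(corpus, key=lambda d: len(q & tokens(d.text)), reverse=True)
    return ranked[:top_k]

def build_prompt(query: str, evidence: list[Document]) -> str:
    """Constrain the language model to the retrieved records and require
    a citation for every claim, which is what makes the output auditable."""
    body = "\n".join(f"[{d.source}] {d.text}" for d in evidence)
    return ("Answer using only the evidence below, citing sources.\n"
            f"Evidence:\n{body}\n\nQuestion: {query}")

docs = retrieve("What is the condition history of compressor C-101?", CORPUS)
prompt = build_prompt("What is the condition history of compressor C-101?", docs)
```

Because the prompt carries source tags, an engineer can trace any statement in the generated answer back to a specific record, which is precisely the transparency that high consequence environments demand.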
Training the business, not just the technology
One of the most underestimated aspects of digital transformation is the maturity journey required for business users. Many digital initiatives assume that new tools will drive adoption automatically.
Experience suggests otherwise. Technology can only deliver value when users understand how to integrate it into existing decision-making processes.
Traversing the maturity model requires investment in skills development and cultural change. Engineers and operators need to understand not only how to use digital tools but how data driven workflows alter traditional responsibilities. This often involves moving from reactive decision making towards proactive analysis supported by AI insights.
Training strategies that succeed tend to focus on practical outcomes rather than abstract digital concepts. Users need to see how new workflows improve safety, reduce inefficiencies or simplify complex tasks. As confidence grows, adoption expands organically across the organisation.
Importantly, maturity is not linear. Different parts of an organisation may progress at different rates depending on asset complexity, operational culture and leadership engagement. Recognising these variations allows companies to tailor training approaches rather than enforcing uniform adoption.
Digital twins and LNG operations
LNG regasification terminals offer a compelling example of how digital twins can translate data contextualisation into operational value. These facilities operate within tightly controlled conditions where reliability, safety and energy efficiency must be balanced continuously.
A digital twin creates a dynamic representation of the physical asset, integrating design data, operational signals and historical performance into a single environment. Engineers can simulate scenarios, monitor asset behaviour and predict performance outcomes without disrupting live operations. When combined with AI driven analytics, the twin becomes a decision support system rather than merely a visual model.
The real value emerges when digital twins are connected to workflows. Maintenance planning can be informed by predictive insights. Process adjustments can be evaluated virtually before implementation. Operational anomalies can be investigated using a combination of live data and historical context.
For LNG operations, where equipment reliability has direct implications for supply security and emissions intensity, this integrated approach allows operators to optimise performance while reducing risk. Digital twins also support remote collaboration, enabling expertise to be deployed across multiple facilities without requiring physical presence.
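As an illustration of that workflow connection, the sketch below imagines a regasification pump twin (the asset, names and limits are all hypothetical): it merges a static design limit from engineering data with a rolling window of live readings, so a statistical drift warning can fire well before the hard design limit is reached.

```python
from collections import deque
from statistics import mean, stdev

class RegasTwin:
    """Minimal sketch of a digital-twin decision aid for a hypothetical
    LNG regasification pump. It combines a design envelope (engineering
    data) with recent live signals (operational data) in one place."""

    def __init__(self, design_max_kw: float, window: int = 50):
        self.design_max_kw = design_max_kw   # hard limit from design data
        self.history = deque(maxlen=window)  # rolling window of live readings

    def ingest(self, power_kw: float) -> None:
        self.history.append(power_kw)

    def assess(self) -> str:
        if len(self.history) < 10:
            return "insufficient data"
        avg, sd = mean(self.history), stdev(self.history)
        latest = self.history[-1]
        if latest > self.design_max_kw:
            return "alarm: design limit exceeded"
        if latest > avg + 3 * sd:
            return "warning: statistical drift from recent behaviour"
        return "normal"
```

The design choice worth noting is that the drift warning comes from the asset's own recent history, not from a fixed setpoint, which is what lets the same logic travel across facilities with different baselines.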
Governance as a strategic differentiator
As digitalisation accelerates, governance is evolving into a competitive advantage. Organisations that establish clear frameworks for data ownership, quality and access can move faster with AI adoption and workflow automation. Those that struggle with fragmented governance risk being overwhelmed by complexity.
Effective governance does not mean restricting innovation. Instead, it provides guardrails that allow experimentation to occur safely. Teams can explore new AI applications or workflow models knowing that data integrity and security remain protected.
This is particularly important as regulatory expectations around digital operations continue to grow. Transparent data governance supports compliance while enabling operational agility, creating a foundation for long term resilience.
The convergence of data contextualisation, AI frameworks and digital twins suggests a broader transformation underway. The operating model of the future is likely to be defined less by physical location and more by how information flows through the organisation.
Remote execution, standardised workflows and AI supported decision making will increasingly blur traditional boundaries between disciplines. Engineers, operators and data specialists will collaborate within shared digital environments rather than isolated systems. The focus will shift from managing individual technologies to orchestrating integrated ecosystems.
For oil and gas companies, this transition represents both opportunity and challenge. The technology exists to improve efficiency, reduce emissions and enhance reliability. The limiting factor is often organisational readiness and the ability to align data strategy with operational reality.
A system built on context
The industry’s digital evolution is moving beyond infrastructure. Data lakes and warehouses remain essential components, but they are no longer the end goal. The real objective is creating context that allows data to inform decisions quickly, consistently and confidently.
Digital twins and AI frameworks offer powerful tools, but they depend on strong governance and shared data models to succeed. As LNG terminals and other complex assets become increasingly digitised, the companies that thrive will be those that treat data not as a technology project but as a core part of their operating philosophy.
The energy sector has always been defined by engineering ingenuity. The next phase of progress will be shaped by how effectively organisations connect that engineering expertise with trusted digital systems. When data becomes contextualised, workflows become standardised and AI becomes accountable, digital transformation stops being an aspiration and begins to deliver tangible operational value.
In that sense, the future of oil and gas may not be defined by who has the most data, but by who understands it best.
Holding the centre as the energy world shifts
In conversation with Sheikh Nawaf S Al-Sabah, Deputy Chairman and Chief Executive Officer of Kuwait Petroleum Corporation, the future of oil, international partnerships and energy security is framed not through disruption, but through resilience, discipline and long-term stewardship.
The modern energy debate is often dominated by urgency. Transition narratives, geopolitical volatility, technological disruption and shifting investor expectations create the impression of an industry in permanent acceleration. Yet when Sheikh Nawaf S Al-Sabah reflects on the direction of Kuwait’s oil sector, the conversation moves at a different rhythm. His perspective is grounded not in reaction but in structure, and in the belief that resilience is built through consistent institutional evolution rather than dramatic reinvention. The emphasis throughout the discussion is on continuity, capability and the role of national oil companies as stabilising forces in a changing global system.
Asked how Kuwait’s oil industry has changed since he took over just over three years ago, his answer begins internally rather than externally. “When I took this position,” he explains, “the mandate was very clear. We needed to move the sector back to a level of international leadership, not only in innovation but in the way we operate and how we contribute to the wider economy.” That process, he says, started with culture and governance rather than infrastructure. Policies were introduced to ensure promotions and leadership opportunities were based on capability and experience rather than outside influence, a step he describes as essential to restoring organisational confidence. “You have to clear up the system first,” he says. “Only then can you expect people to innovate or think differently.”
The importance he places on organisational discipline reveals much about his broader leadership philosophy. Strategy, in his view, is not solely about assets or market positioning but about creating
an environment where people can perform consistently at a high level. This internal focus formed the foundation for a broader repositioning of Kuwait’s energy business, one that has increasingly looked outward again after years of relative consolidation. “We are re-internationalising,” he says, emphasising that Kuwait’s global footprint is not new but evolving. The country’s long-standing operations across Europe and international upstream ventures provide a base from which new partnerships can be built with greater confidence and clarity.
Rebuilding through trust and partnership
Partnerships, he argues, are defined less by transaction and more by alignment. “There are three things that matter,” he says. “First is trust. Without trust there is nothing. Second is
a common vision of where we want to be. Third is a commonality of approach that allows us to reach those goals together.” He points to Kuwait’s international upstream arm and its downstream operations in Europe as examples of how collaboration with international oil companies has matured over time. These relationships, developed across decades, are now being reshaped to address a more complex era of resource development. “The era of easy oil is over,” he says. “The challenges today require new models of cooperation, and we are evolving to meet that reality.”
This openness to partnership sits alongside a recognition that geopolitical uncertainty remains a constant feature of the region. Yet Sheikh Nawaf speaks about risk with calm pragmatism rather than alarm. “We live in a rough neighbourhood,” he says, reflecting on decades of regional tension.
What has changed, he argues, is the way markets respond, and the role national oil companies play in reducing volatility through preparedness. In earlier decades, geopolitical incidents often produced sharp price movements as traders anticipated supply disruptions. Today, the presence of spare capacity and diversified refining infrastructure provides greater confidence that shocks can be managed.
“At KPC we have production capacity that can be brought to market quickly if needed,” he explains, but he is careful to emphasise that resilience is not just about crude volumes. The commissioning of the Al Zour refinery, one of the largest in the region, illustrates how downstream strategy can support global stability. Built primarily to produce middle distillates, the refinery became a significant supplier during Europe’s recent energy
crisis following the disruption of traditional supply routes. “We were able to provide material volumes that helped address shortages,” he says. “This is how you manage geopolitical risk. You cannot control events, but you can control your preparedness.”
Energy security beyond self-sufficiency
The conversation naturally turns to energy security, a term frequently used in policy discussions but one that Sheikh Nawaf challenges directly when framed as self-sufficiency. “Complete energy independence is a meaningless concept,” he says, arguing that the energy system is inherently interdependent. Crude qualities differ, refineries are designed for specific feedstocks, and global trade flows remain essential regardless of national production levels. Even countries that have
become major exporters continue importing certain grades to match their infrastructure. “Globalisation may be debated politically,” he says, “but within energy it is fundamental. Interdependence is how the system works.”
This perspective reflects Kuwait’s historical identity as a trading nation long before hydrocarbons transformed its economy. He references the country’s heritage as merchants and shipbuilders, suggesting that international exchange remains central to its economic philosophy. This historical mindset underpins the decision to invest heavily in additional production capacity even when immediate market demand may not require it. “Having spare capacity adds reassurance,” he says. “It gives stability to the international community and confidence to the market.” The investment is therefore both strategic and symbolic, reinforcing Kuwait’s role as a reliable supplier within the global system.
When the focus shifts to natural gas, his assessment remains pragmatic. Kuwait, he explains, is rich in oil but constrained in gas resources, meaning that domestic priorities dominate strategy. Gas supports power generation and petrochemical development, and while exploration continues, Kuwait is unlikely to become a major exporter. “The more gas we find, the more uses
we have for it domestically,” he says, highlighting the importance of energy security at home even as international ambitions continue elsewhere.
Staying focused on core strengths
Perhaps the most striking part of the conversation emerges when discussing the long-term identity of Kuwait Petroleum Corporation. In an era where many energy companies are redefining themselves under broader sustainability or diversification narratives, Sheikh Nawaf remains clear about the company’s core purpose. “The name on the door is Kuwait Petroleum Corporation,” he says. “Petroleum is what we know, what our people are trained for and what our infrastructure is designed around.” This is not presented as resistance to progress but as a recognition of comparative advantage. Efficiency, sustainability and innovation remain priorities, but the underlying business remains rooted in hydrocarbons that continue to offer low production costs and competitive carbon intensity.
His confidence in oil’s long-term relevance is measured rather than ideological. “Decades from now, oil will still be part of the energy mix,” he says, noting that global demand growth and affordability considerations will sustain its role even as alternative technologies expand. The emphasis returns repeatedly to
balance and realism rather than binary narratives of transition versus status quo. For Kuwait, the goal is not to oppose change but to navigate it from a position of strength and clarity.
As the interview draws to a close, the conversation returns once again to people and culture. The organisation’s strength, he argues, comes from generations of expertise and a shared sense of mission. “We have a strong culture and strong talent,” he says. “Our responsibility is to ensure energy security, not just for ourselves but for the broader economy that depends on stable supply.” Leadership, in this context, becomes an exercise in stewardship rather than disruption, balancing long-term investment with operational discipline.
A long view of leadership
Asked what Kuwait Petroleum Corporation might represent fifty years from now, his answer reflects the consistency of the conversation as a whole. “We will continue doing what we do best,” he says. “Technology will evolve, markets will evolve, but there will still be a need for reliable energy. Our role is to provide that responsibly.” The statement captures a worldview centred on continuity rather than dramatic reinvention, suggesting that in a rapidly changing energy landscape, stability itself may prove to be the most valuable form of innovation.
The final impression left by Sheikh Nawaf is one of deliberate confidence. While many conversations around the future of energy focus on uncertainty and disruption, his perspective is grounded in preparation, partnerships and institutional resilience. “We know our strengths,” he says. “Our responsibility is to keep delivering, consistently and reliably.” In an industry often pulled between competing narratives, that sense of steady purpose may be one of the most powerful strategies of all.
When machines learn to think alongside us in oil and gas
The oil and gas sector has never been short of complexity. It operates in some of the harshest environments on earth, under pressures both physical and financial, and with little margin for error. Against that backdrop, the arrival of digital twins infused with artificial intelligence feels less like a new gadget and more like a survival tool.
Digital twins started life as digital blueprints, functional replicas of physical assets that could model stress, temperature, or performance. They were handy for training teams or testing ideas, but limited in scope. Over the last few years, though, something has shifted. When artificial intelligence is layered onto these models, the twin begins to do more than copy reality. It anticipates. It suggests. It adapts.
For operators, that shift is not a technical curiosity but a business necessity. A subsea pump that fails unexpectedly can shut down production, trigger a safety incident, or undermine months of planning. A refinery struggling with fluctuating energy costs must balance output with emissions limits while staying competitive. In both cases, the idea of a system that can think ahead, rather than just report back, is a powerful proposition.
The real-time problem solver
AI-enhanced twins are fed by relentless streams of data: vibration sensors, inspection logs, satellite readings, and weather forecasts. Where human teams would once pore over spreadsheets for weeks, the system now ingests and interprets the lot in near real time. Machine learning finds the weak signals in the noise. Generative AI then builds scenarios, testing what might happen next and what levers could be pulled in response.
The result is less of a passive dashboard and more of a decision-making partner. Instead of simply showing a pressure reading, it can explain how that
reading might evolve over the next 48 hours, whether it risks breaching safe thresholds, and what action would minimise both cost and emissions.
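That kind of look-ahead can be illustrated with even the simplest model. The sketch below (hypothetical hourly pressure readings and threshold) fits a linear trend and reports when, if ever, the extrapolation crosses the safe limit within 48 hours. A real twin would use far richer physics-based and learned models, but the shape of the answer, a time to breach rather than a raw number, is the point.

```python
import statistics

def forecast_breach(readings: list[float], threshold: float, horizon_h: int = 48) -> str:
    """Fit a least-squares linear trend to hourly readings (at least two
    required) and extrapolate forward to find the first projected breach
    of the threshold within the horizon."""
    n = len(readings)
    xs = range(n)
    x_mean, y_mean = (n - 1) / 2, statistics.fmean(readings)
    slope = (
        sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, readings))
        / sum((x - x_mean) ** 2 for x in xs)
    )
    # Project hour by hour beyond the last observation.
    for h in range(1, horizon_h + 1):
        projected = y_mean + slope * ((n - 1) - x_mean + h)
        if projected > threshold:
            return f"projected breach in ~{h} h"
    return "no breach projected within horizon"
```

Fed a day of readings rising at 0.5 units per hour against a limit of 120, the function reports a breach roughly 18 hours out, which is the kind of early, actionable signal the text describes.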
The most crucial point here is not replacing people but amplifying them. The machine does the heavy lifting on pattern recognition and forecasting. The human decides how to apply that knowledge in the messy, political, and highly contextual world of oil and gas operations.
Learning in uncertain conditions
Traditional automation has long underpinned the industry. Programmable logic controllers and deterministic systems do what they are told, and do it reliably. But they lack flexibility. They struggle when conditions fall outside predefined rules, which in oil and gas is a frequent reality. Reservoir behaviour, equipment wear, or erratic weather rarely obey tidy models.
Deep reinforcement learning offers an alternative. Within the safe confines of a digital twin, algorithms can play out thousands of scenarios, learn through trial and error, and refine their strategies. Over time, they become better at balancing competing objectives: maximising recovery, protecting equipment, and staying within environmental limits.
In a live operation, that means more robust decision support. The AI might not know precisely what will happen tomorrow, but it has experienced enough simulated tomorrows to offer guidance grounded in probability and learning, not just rules. The operator remains in charge, but with a richer map of options.
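The trial-and-error dynamic is easy to demonstrate at toy scale. The sketch below is a made-up five-state "wear" simulator, nothing like a real drilling model: aggressive drilling pays more per step but pushes wear towards failure, and tabular Q-learning over many simulated episodes produces a policy that presses hard at low wear and backs off near the limit.

```python
import random

# Toy simulator: state is a wear level 0..4; action 0 drills gently,
# action 1 drills aggressively. Aggressive drilling earns more reward
# but raises wear; pushing past wear 4 triggers a failure penalty.
def step(state: int, action: int) -> tuple[int, float]:
    if action == 1:
        new_state, reward = state + 1, 3.0
    else:
        new_state, reward = max(state - 1, 0), 1.0
    if new_state > 4:
        return 0, -20.0   # equipment failure: reset with a large penalty
    return new_state, reward

random.seed(0)
q = [[0.0, 0.0] for _ in range(5)]   # Q-table: 5 states x 2 actions
alpha, gamma, eps = 0.1, 0.9, 0.1

state = 0
for _ in range(20000):               # thousands of simulated "tomorrows"
    # Epsilon-greedy exploration: mostly exploit, occasionally try the other action.
    if random.random() < eps:
        action = random.randrange(2)
    else:
        action = max((0, 1), key=lambda a: q[state][a])
    nxt, reward = step(state, action)
    # Standard Q-learning update towards reward plus discounted future value.
    q[state][action] += alpha * (reward + gamma * max(q[nxt]) - q[state][action])
    state = nxt

# Greedy policy after training: aggressive at low wear, cautious at the limit.
policy = [max((0, 1), key=lambda a: q[s][a]) for s in range(5)]
```

The learned behaviour, maximising recovery while avoiding the failure boundary, mirrors in miniature the balancing of competing objectives described above; the operator's real problem differs only in scale and stakes.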
People at the centre of change
The technology story cannot be told without the human one. A twin powered by AI demands new skills, new trust, and new working cultures. A drilling engineer today must be as comfortable with algorithmic outputs as with pressure curves. A data scientist must learn why a marginal gain in throughput might not be worth the safety risk.
Reskilling programmes are already moving in that direction, but culture is harder to shift than capability. Trust is fragile. If teams see AI as a black box making opaque decisions, adoption stalls. If they can interrogate why a recommendation was made, what data was used, and what assumptions were fed in, then the machine becomes a colleague rather than a threat.
This is where explainability matters most. Operators do not need a lecture in machine learning theory, but they do need to see enough transparency to feel confident in the decisions they are approving. Without that cultural buy-in, the promise of AI twins risks staying theoretical.
Guardrails for responsibility
Oil and gas carry risks that few industries can match. An error can damage ecosystems, endanger lives, and carry reputational costs that last for decades. That reality makes responsible AI more than a slogan. It is a precondition.
Data must be representative and regularly audited to avoid bias. Models must be tested and retested, not just at launch but as they evolve. Transparent governance is needed so that responsibility never drifts between machine and human. These are not optional extras; they are the foundations of trust in a high-risk sector.
Machine learning sharpens prediction. Generative AI goes further, giving digital twins the ability to create and test new scenarios. That opens space for greater autonomy. A twin can take historical project records, combine them with real-time sensor data, and simulate the likely outcomes of maintenance or process changes before they are implemented.
The obvious benefit is speed. What once required weeks of manual study can be condensed into hours of machine-generated analysis. More importantly, safety improves. Teams can test interventions virtually without putting equipment or people at risk.
Genuine autonomy, where assets operate independently without human oversight, remains some way off. But generative AI moves the needle. It allows lower-risk,
repetitive tasks to be automated, while preserving human judgment for the most consequential calls. Autonomy becomes a spectrum, not a switch.
The shape of things to come
Looking ahead, the convergence of AI and digital twins points towards a model of autonomous integrity management. In such a world, assets would monitor their own condition, predict their own needs, and adjust their own operations. The idea is no longer futuristic. Early steps are already visible.
The impact is hard to overstate. Unplanned downtime shrinks. Emissions management becomes embedded in daily practice. Human teams step away from repetitive firefighting and focus instead on strategy, creativity, and resilience.
That vision is not without risk. Overreliance on algorithms, poor governance, or cultural resistance could stall progress. But the opportunity is transformative. If approached deliberately, anchored in transparency, responsibility, and trust, the AI-powered digital twin could become more than just a tool. It could reshape the industry’s relationship with its assets, its workforce, and its future.
Drilling autonomy is becoming an operational discipline
James Cahalane, Digital Product Manager at SLB, and Abeer Musbah, Drilling Operations Deployment Manager at SLB, argue that autonomy in drilling is moving from a technology ambition to a repeatable operating model. Their focus is on how DrillOps automation and DrillOps advisory combine AI-driven planning with disciplined deployment to raise performance without expanding operational risk.
Drilling has always been a contest between ambition and consequence. The well plan may be engineered in an office, but its outcome is decided in a shifting environment where friction, formation response, toolface control, hydraulics, vibration and human judgement collide at speed. When margins tighten and well designs become more complex, the industry tends to reach for familiar levers. Increase horsepower, add specialists, reduce uncertainty with conservatism, then accept the invisible losses that come with that caution.
Digital transformation has altered the available levers, but it has not simplified the problem. It has simply moved the constraint. The question is no longer whether data exists, but whether it can be converted into decisions quickly enough to matter, reliably enough to trust, and consistently enough to be repeatable across rigs and crews. Autonomy, in this context, is less about a futuristic rig with no people on it, and more about a rig where the system can perceive changing conditions, make decisions within defined constraints, and execute multiple workflows without waiting for a human to stitch together the next step.
SLB’s DrillOps automation and DrillOps advisory sit in that space between automation as a tool and autonomy as an operating capability. The claim is not that drilling becomes effortless, but that the most common sources of performance variation (human bias, inconsistent execution and delayed response to change) can be systematically reduced when the workflow is orchestrated by an AI planner that can continuously evaluate options and act at the pace of the well.
There is a reason this matters now. The industry is being asked to
deliver more performance with less tolerance for failure, while dealing with the limits of experienced personnel availability and the reality that the next gain often sits inside the details. Operational time lost in connections, tripping, weight-to-weight, and the micro-delays that accumulate into days is increasingly where competitiveness is won or lost. Autonomy is a route to attack that layer of loss without trading safety for speed.
Automation is not autonomy
The language of drilling technology often blurs an important distinction. Automation is frequently treated as the end goal, yet it is better understood as the entry point. Task automation can take a repetitive action and execute it reliably, but it typically remains dependent on a human to manage the surrounding context, interpret changes, and decide when to transition from one action to the next. It is useful, but it is linear.
Autonomy introduces an additional capability: perception and decision-making within limits, applied continuously. It is the difference between a tool that executes a predefined step and a system that can manage the sequence, adjust to conditions, and handle multiple workflows as the environment evolves. In drilling terms, that means operating within a boundary that is not static, because the well is not static. Downhole behaviour changes, surface conditions change, data quality changes, and the acceptable operating envelope shifts with them.
SLB describes this progression through six levels, starting with fully manual operations, moving through assistance and task automation, then into conditional autonomy where the system makes decisions within constraints while executing a workflow. The higher levels extend to high autonomy, where complex tasks are executed reliably across varied scenarios and humans step in mainly for exceptions, and eventually full autonomy under all conditions.
What matters for operators is not the label, but the practical impact. Conditional and high autonomy imply that a system can operate closer to technical limits because it can continuously recalculate those limits rather than relying on conservative assumptions set days or weeks earlier. That is where performance is released. It is also where risk must be managed rigorously, because getting closer to limits only makes sense if the process of doing so is controlled.
The AI planner and the domain engines
The core idea behind DrillOps automation and DrillOps advisory is orchestration. Rather than relying on one monolithic model, the concept is built around an intelligent AI planner that coordinates a suite of specialised domain engines. Each engine focuses on a specific aspect of drilling behaviour and operational optimisation, producing insights related to performance and risk avoidance. The planner then prioritises those insights and translates them into decisions and actions.
This matters because drilling is not a single problem. It is a system of coupled problems that must be managed simultaneously. Rate of penetration cannot be maximised without considering vibration and toolface control. Hole cleaning cannot be improved without considering hydraulics and equivalent circulating density. Tripping speed is not simply a function of intent; it depends on consistent execution and the ability to respond quickly to anomalies.
A planner that can manage priorities across engines, adjust constraints, and allow parallel execution where appropriate changes the operational rhythm. It reduces the dependence on sequential human decision-making, where each decision waits for the previous one
to be closed out, and where the cognitive load on the driller and the team increases precisely when conditions become more complex.
The outcome, if it works as intended, is a system that recalculates drilling boundaries in real time. It does not remove safety barriers, but it aims to replace static conservatism with dynamic control. That difference is subtle on paper and decisive on a rig. Static roadmaps are intentionally conservative because they must cover many scenarios. They carry historical bias and large margins to protect against unknowns. Dynamic roadmaps, enabled by real-time data and rig-based engineering capability, can move those margins closer to the true operating envelope because the unknowns are being continuously re-evaluated.
The practical effect is not a single dramatic leap, but a sustained reduction in performance variability. Variability is often the hidden cost in drilling operations. Two rigs with the same plan can deliver very different outcomes because decision-making, execution discipline and response time differ. Autonomy aims to standardise performance, not by enforcing a rigid template, but by ensuring that decisions are made consistently and at the right tempo.
Deployment is an operating model
Technology rarely fails in drilling because it cannot compute. It fails because it cannot be adopted reliably. The rig is a harsh environment for change, and even the most capable digital system will struggle if it is deployed as an add-on rather than integrated into how the operation is run.
SLB frames successful deployment of DrillOps around two pillars: process and mindset. The process begins with installation and commissioning to integrate the solution with rig control systems. It then moves through a hypercare phase, where experts provide close support to stabilise performance and resolve early issues. Once steady state is reached, the model shifts to managed services: continuous monitoring, updates and ongoing expert guidance.
That sequence matters because trust is built by predictability. A drilling team will not trust a system that behaves inconsistently, and they will not accept disruption during critical operations. Hypercare creates the conditions for early success, and early success is often the only credible argument for change on a rig.
The second pillar, mindset, is the harder one because it is cultural. Data-driven execution sounds obvious until it meets the reality of long experience, local heuristics and the human instinct to intervene. Autonomy requires teams to link automated decisions to measurable KPIs, to align the system with strategic goals such as standardisation and reduction of invisible loss, and to start with targeted workflows where the value can be proven without overwhelming the crew.
Trust is not created by insisting that the AI is correct. It is created when the team can see why a recommendation or action makes sense, when it can be reviewed, and when the outcome is demonstrably better. Continuous improvement becomes part of the model, with post-job reviews feeding iterative updates. That approach treats autonomy as a capability that must
be learned and refined, rather than a product that is installed and forgotten.
This is also where autonomy becomes less about removing the human and more about changing the human role. The driller and the team move from manual execution to oversight, exception handling and operational judgement. That is not a smaller role. It is a different one, and it demands a shift in how performance is defined and how accountability is managed.
What performance looks like when bias is removed
The strongest argument for autonomy is performance that is sustained across time and across rigs, rather than isolated best runs that depend on a specific crew or a specific set of conditions. SLB points to an operator experience where deployment of DrillOps solutions across a rig fleet produced material changes. Drilling duration fell by around 40%, from roughly 40 days to about 25 days. Rate of penetration doubled. Weight-to-weight time fell by 50%. Tripping speed increased by 25%.
Numbers like these invite scepticism for good reason. Drilling performance is influenced by many variables, from geology and bit design to rig condition and programme design. The more useful way to interpret the outcome is to focus on the mechanism described. The operator used autonomous drilling capability to reduce bias and to identify safe boundaries more accurately, which allowed operations to move closer to the true technical limits without destabilising the process.
That is a distinct claim. It suggests that a significant portion of lost performance sits inside conservative assumptions and inconsistent execution, rather than inside hardware constraints. If a system can continuously update boundaries based on real-time conditions, it can release performance that was previously left on the table to protect against uncertainty.
It also reframes what a good drilling system looks like. A traditional roadmap is built to avoid surprises by staying well inside limits. An autonomous system aims to avoid surprises by perceiving change earlier and responding faster. Both approaches are risk management. One manages risk through margin. The other manages risk through control.
Autonomy as the next operating baseline
The upstream sector has a habit of treating digital as a project rather than an operating model. That approach often produces pilots that look impressive in isolation but fail to scale because the organisation never changes how it works. Autonomy in drilling will follow the same pattern unless it is deployed with discipline, measured in operational terms, and integrated into the daily reality of the rig.
DrillOps automation and DrillOps advisory are positioned as part of that shift, combining an AI planner and domain engines with a deployment methodology designed to stabilise performance and build trust. The ambition is not to make drilling autonomous in theory, but to make it autonomous in practice, where safety, consistency and measurable gains matter more than slogans.
The industry is heading into a period where efficiency improvements will increasingly come from removing variability, tightening execution, and reducing the small losses that accumulate into major cost. Autonomy is one of the few levers that can operate at that level of granularity, at the pace of the well, without requiring a fundamental rebuild of physical infrastructure.
The next chapter in drilling innovation will not be defined by who claims the highest level of autonomy on a slide. It will be defined by which operators can turn autonomy into a repeatable discipline, embedded into operations, trusted by crews, and measured in outcomes that hold up across a fleet.
Balancing fuels and transition in a changing energy landscape
Europe’s downstream sector is navigating a decade defined by uncertainty, competing technologies and shifting regulation. For Csaba Zsótér, Senior Vice President, Fuels at MOL Group, the challenge is not choosing one pathway over another, but keeping an integrated system secure, competitive and ready to scale whichever transition route proves viable.
The energy transition is often described as a straight line, moving from hydrocarbons to alternatives in a neat sequence defined by policy and technology. The reality inside major refining and downstream organisations is considerably more complex. Decisions made today must balance decades-long infrastructure cycles, unpredictable regulatory signals and the immediate expectation that fuel supply remains reliable, affordable and secure.
For MOL Group, a company operating across Central and Eastern Europe with a network of refineries and petrochemical assets, this balancing act sits at the heart of its downstream strategy. The company’s 2030 vision has repeatedly emphasised transformation, yet it also recognises that mobility, industrial production and chemical demand continue to rely heavily on traditional refining capability.
“We are helping people move from A to B, we are also a major petrochemical producer, and we are continuing that in this decade,” says Csaba Zsótér, Senior Vice President, Fuels at MOL Group. “We are investing into refining infrastructure, but at the same time making our operations more sustainable in every possible way.”
That duality defines the approach. On one side, MOL continues to invest in core refining assets, including major upgrades such as a new delayed coker unit in Croatia and ongoing regional expansion. On the other, the company has made significant investments in petrochemicals and emerging technologies designed to reduce emissions and broaden feedstock options.
The strategy reflects a practical view of the energy transition rather than an ideological one. “We believe there will be competing technologies,” Zsótér explains. “We do not know which one is going to win in the coming ten years. It should be determined by technological and financial competition, not only by regulation.”
Navigating uncertainty without standing still
Few sectors feel the impact of regulation as directly as downstream fuels. European policies continue to evolve rapidly, sometimes creating conflicting signals for companies planning investments measured in decades rather than years. For operators like MOL, the challenge is building a resilient strategy despite uncertainty.
“What you can do in such an uncertain environment is to invest into each technology a bit,” Zsótér says. “We are in biofuels, coprocessing, sustainable aviation fuels, EV chargers, solar panels, green hydrogen and biogas. We start everything on a scale that is meaningful for us and for the region, but not so big that it dominates everything.”
This approach allows MOL to build operational experience without overcommitting to any single pathway. If one technology accelerates faster than expected, the capability already exists to scale up.
It is a pragmatic response to what Zsótér sees as a slow-moving transition in the real economy. “By 2030 not much difference will be visible,” he says. “Internal combustion engines will still dominate. Even if electric vehicles are more visible, it can take ten to fifteen years or more for the whole fleet to change.”
Freight, aviation and marine transport further complicate the picture. While experimentation continues, commercially viable low-carbon alternatives at scale remain limited, reinforcing the role of liquid fuels in the medium term. “The fuel transition is here, and it is not going to turn back,” Zsótér adds. “But we have to fuel that transition responsibly.”
Keeping refining competitive while getting greener
Margins in downstream operations remain tight, forcing companies to pursue efficiency while simultaneously investing in decarbonisation measures. MOL’s strategy focuses on improving the sustainability of existing operations rather than abandoning them. “There is a future in refining,” Zsótér says. “There is a future in industrial production and petrochemicals in Europe, but operations have to become cleaner and more sustainable.”
One example is co-processing technology, where renewable feedstocks are introduced into existing refining systems without fundamental redesign. MOL was an early adopter, launching its first co-processing operation in Hungary in 2020 and expanding the technology across its refineries.
“Without changing the refinery technology fundamentally, by adding green feedstock we can make our operations more sustainable,” Zsótér explains.
Feedstock diversification is another key focus.
Access to sustainable and waste-based feedstocks is becoming increasingly competitive as demand rises across industries. MOL’s waste management concession in Hungary provides long-term access to material streams that can support renewable fuels and circular chemical production.
“Demand for waste-based feedstocks creates supply as well,” he says. “More waste that was previously thrown away will be recycled for energy and chemicals.” This reflects a broader shift in downstream thinking, where waste and alternative inputs become strategic assets rather than peripheral opportunities.
Digitalisation in legacy-heavy environments
Digital technologies have become central to improving efficiency and emissions performance, yet implementing them in decades-old industrial systems is rarely straightforward. “It is medium difficult,” Zsótér says with a smile. “We have assets that are sixty years old. We have people who have been working here for thirty or forty years. But we are open to new technologies.”
Digital tools are increasingly being used across refining and commercial operations, from demand forecasting and blending optimisation to data management platforms. MOL has moved away from traditional siloed systems towards more integrated data architectures designed to support analytics and automation.
Artificial intelligence is beginning to play a role as well, although cautiously. “We have dedicated digital and AI teams,” Zsótér says. “But we must be careful. We are operating enormous refining systems with dangerous goods. Reliance on data and machines must be safe and secure.”
This cautious adoption reflects the realities of industrial environments where safety and reliability take precedence over speed of change. Development of new digital tools combines internal expertise with external collaboration. MOL works with technology partners and universities while maintaining strong internal development teams. The blended approach allows the company to access innovation while ensuring solutions remain relevant to operational realities.
Sustainability beyond compliance
Reducing emissions across scope 1 and scope 2 categories remains a major focus for the company. Projects targeting energy efficiency and cleaner utilities are underway across refining sites, supported by investments in renewable electricity and green hydrogen.
MOL’s green hydrogen facility, the largest in the region, represents one example of early-stage investment designed to build operational experience while supporting longer-term goals. Similar thinking underpins investments in biogas production and advanced biofuels. “We want to decrease our CO2 footprint,” Zsótér says. “Good is never good enough. We always want to do more.”
Carbon capture and storage remain under evaluation, though no commercially viable projects have yet been identified. “We are investigating it, but we have not seen the numbers add up yet,” he admits.
Customer demand also plays a role in shaping sustainability decisions. In petrochemicals, demand for recycled materials and product carbon accounting is increasing even before regulation fully mandates it. In fuels, demand for greener solutions exists but remains strongly influenced by policy requirements and blending mandates. “People appreciate green solutions,” Zsótér notes. “Even in Central Europe, where you might not expect it, people said they would be ready to pay a small premium.”
The value of integration
When asked whether fuels or petrochemicals represent the more important growth area, Zsótér rejects the premise entirely. “It is like asking which one of your children is more important,” he says. “It is an integrated model. One would not work without the other.”
This integration provides resilience across market cycles. When petrochemical margins are under pressure, refining operations can provide support, and vice versa. The strategy relies on maintaining flexibility rather than prioritising one segment permanently. Supply chain resilience has become another priority following disruptions during the pandemic and ongoing geopolitical tensions. MOL’s response centres on diversification across crude sources and feedstocks to reduce vulnerability to single points of failure. “Diversity of supply sources is what gives resilience,” Zsótér says. “That is true for fuel, crude oil and natural gas.”
For MOL, energy security remains more than a policy concept. Operating across multiple countries, the company sees stable fuel supply as a foundational responsibility. “We come to work every morning to make sure people can get to work, ambulances can operate and society functions,” Zsótér says. “Securing fuel supply to the region is our first mission.”
This perspective frames the transition as an evolution rather than a replacement. New technologies must integrate into a system that continues to deliver reliability every day. The company’s regional growth strategy reflects this emphasis on security and scale. Expansion across neighbouring markets and investments in refining capacity are seen as ways to strengthen the resilience of the broader energy system.
People and culture in transition
Technology and infrastructure dominate headlines, yet Zsótér repeatedly returns to people as the decisive factor shaping the future of downstream operations. “We have a very strong culture,” he says. “In our industrial sites, people grow together with the units. There are friendships, marriages, even generations of families working together.”
At the same time, the sector faces challenges attracting new talent. Engineering and technical disciplines have seen declining interest across Europe, creating skills shortages that affect the entire industry. To address this, MOL has deepened its engagement with education, partnering with universities and vocational schools and even launching its own technical training centre.
“We are showing young people the beauty of the industry,” Zsótér says. “If they want to join us after school or university, we are already there to support them.” Strong culture also helps with digital transformation, reducing resistance to change even in long-established operations. “We have challenging moments. But we have loyal people and a shared understanding of what we are trying to achieve.”
Regulation, realism and optimism
Asked directly about regulation, Zsótér remains diplomatic but clear about priorities. Predictability is essential when planning major investments. “We are here to comply with regulations,” he says. “But regulation should support viable technologies and the competitiveness of European industry.”
The complexity of operating across multiple European markets adds another layer of challenge, particularly when directives are implemented differently at national level. MOL sells fuel in twelve markets, nine of which are in the EU, each with its own interpretation of biofuel requirements. “It requires very deep knowledge of national specialties,” Zsótér says. “Sometimes it is complicated, but it also allows countries to adapt regulation to their own structures.”
Looking ahead, Zsótér remains cautiously optimistic. The company’s growth trajectory over the past two decades, combined with its assets and workforce, provides confidence despite uncertainty. “What gives optimism is our people, our culture, and the growth strategy we have had for many years,” he says. “How we keep supply security, make it sustainable, and reward our shareholders, that keeps us working harder every day.”
The future of fuels and petrochemicals in Europe may be uncertain, but for MOL the path forward is defined by flexibility rather than fixed predictions. By investing across multiple technologies, modernising existing assets and maintaining a strong regional focus, the company aims to remain relevant regardless of which transition pathway ultimately dominates. In a sector often framed by binary debates, that pragmatic approach may prove to be its greatest strength.
Reshaping the future of work in energy
Real-world applications and use cases have emerged beyond the buzz and hype of AI. Natural language processing (NLP), large language models (LLMs), hybrid machine learning (ML), and generative and agentic AI all represent transformational technologies.
The possibilities for our digital future within the energy industry are endless:
• Faster, more reliable data processing and verification, with more sophisticated data-sharing infrastructures
• Connected data sources as a foundation for accelerated innovation and collaboration
• Increased autonomy and automated services in operations and maintenance
• Integrated physics-based and data-driven digital twin models that, together with AI, augment human decision-making
• Improved asset performance management and reliability for plants from upstream to downstream
• Bi-directional data flows between systems and equipment, with generative AI enhancing interaction to increase situational and operational awareness
• Supply chain transparency and traceability across the energy value chain
• Predictive analytics for improved energy efficiency
Considering the potential unlocked by an AI-driven digital transformation strategy, both immediate and long-term advantages become clear, ranging from faster data processing and improved information flow to reduced risk, enhanced communication, and greater automation. But the real challenge lies in moving beyond the hype to thoughtfully integrate these advanced technologies into daily operations in a way that aligns with industry needs. The key question is how AI will become embedded in routine business processes, supporting everything from workflow automation and data analysis to decision-making and customer engagement, ultimately reshaping how organizations operate and deliver value.
Moving beyond data and dashboards
What matters most is not just the data you have but how you use that data. Data standards are an important part of our digital future, enabling companies to collaborate and co-innovate through system interfaces that integrate in the back end. Many progressive companies have put in place a solid data infrastructure and added select applications like a digital twin on top, making data more contextualised and accessible through simplified dashboards that make data easy to find, filter and apply.
However, data and dashboards are not enough of a springboard for AI to have the measurable value or ROI that companies expect. Instead, we need to start with a value-focused approach that zooms in on the specific use cases and services where AI can have the most influence, through a digital operating model that builds on digital twin technology backed by physics-based and data-driven models. The successful implementation of an AI-infused digital strategy needs to be driven by desired business outcomes.
Driving transformation with a value-focused approach
Effectively leveraging AI to move beyond “business as usual” in the energy sector requires deep familiarity with the industry’s evolving landscape. As a technology provider with both domain and technical expertise, we see the greatest potential for AI-driven value creation in several key areas:
• Safety
• Operations and Maintenance
• Performance Monitoring
• Supply Chain Management
• Design and Engineering
• Emissions Management
Once specific services within the potential high-impact areas are identified—typically those that are frequent and generate consistent, repeatable data patterns—they can serve as ideal candidates for AI-driven automation. These patterns enable AI and related technologies to support the development of value-focused applications that extract and process information, generate insights, and provide actionable recommendations. Many of these actions can be executed autonomously, ultimately contributing to greater energy efficiency.
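As a rough illustration of that screening step, candidate services could be ranked by how frequently they run and how repeatable their data patterns are. The service names and scores below are hypothetical assumptions, not figures from any operator:

```python
# Hypothetical screening of services for AI-driven automation:
# frequent services with consistent, repeatable data patterns rank
# highest. All names and numbers are illustrative assumptions.

services = [
    {"name": "flare monitoring", "runs_per_month": 900, "pattern_consistency": 0.92},
    {"name": "turnaround planning", "runs_per_month": 2, "pattern_consistency": 0.40},
    {"name": "pump vibration checks", "runs_per_month": 400, "pattern_consistency": 0.85},
]

def automation_score(svc):
    # Simple heuristic: reward both frequency and repeatability.
    return svc["runs_per_month"] * svc["pattern_consistency"]

ranked = sorted(services, key=automation_score, reverse=True)
print([s["name"] for s in ranked])
```

Under this toy heuristic the high-frequency, high-consistency monitoring tasks rise to the top, while infrequent, bespoke work ranks last, which matches the selection logic described above.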
Beyond the hype, it’s essential to keep people at the core of operations, supported by technology that can seamlessly handle diverse data types and sources. In this model, technology serves as an enabler—delivering the right amount of information to the right person at the right time. This enhances decision-making speed while reducing risk. Depending on the use case, the resulting benefits can scale significantly, from reduced emissions to earlier interventions in predictive maintenance scenarios.
A glimpse into the AI-driven future of energy operations
Consider a methane emissions management workflow.
An emissions reduction team oversees 10 assets for a major exploration and production company, continuously monitoring each site’s carbon footprint. Their central tool: a cloud-based, dynamic digital twin that provides access to a configurable emissions management cockpit. This cockpit goes far beyond static dashboards—it highlights the highest energy consumers in real time, flags critical incidents requiring immediate attention, and suggests recommended actions based on live data streams enriched with historical and synthetic datasets.
At one facility, the cockpit detects that a main gas turbine is consuming significantly more energy than expected, pushing up the site’s overall emissions profile. The team initiates an investigation by querying the digital twin through an integrated AI-powered chat interface, retrieving targeted insights in seconds.
With the relevant data in hand, they virtually navigate to various system components—such as flare stacks and vents—to pinpoint the issue. Behind the scenes, complex data processing and integration are handled seamlessly, allowing the team to focus on decision-making. They consult a simulator view within the twin to compare real-time and modeled values, receiving prescriptive guidance on where and how to intervene. Armed with this clarity, the team can act quickly, supported by both actionable instructions and transparent rationale.
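The comparison at the heart of this workflow, measured versus modeled consumption, can be sketched in a few lines. The asset names, figures and tolerance below are illustrative assumptions, not part of any vendor's product or API:

```python
# Illustrative sketch: flag assets whose measured energy use drifts
# beyond a fractional tolerance of the twin's modeled value.
# All names and figures are hypothetical.

def flag_excess_consumers(assets, tolerance=0.10):
    """Return (name, deviation) for assets whose measured consumption
    exceeds the modeled value by more than `tolerance`."""
    flagged = []
    for asset in assets:
        deviation = (asset["measured_mw"] - asset["modelled_mw"]) / asset["modelled_mw"]
        if deviation > tolerance:
            flagged.append((asset["name"], round(deviation, 3)))
    return flagged

sites = [
    {"name": "gas_turbine_A", "modelled_mw": 24.0, "measured_mw": 29.5},
    {"name": "compressor_B", "modelled_mw": 11.0, "measured_mw": 11.4},
]

print(flag_excess_consumers(sites))  # the turbine, ~23% over model, is flagged
```

A production cockpit would of course draw both values from live streams and simulation rather than literals, but the deviation-against-model test is the same.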
Transforming the future of work in energy
Data becomes insight. Insight drives action. Action delivers outcomes. And outcomes evolve into prescriptive tasks—clear, traceable, and executable—enabling teams to move with confidence and transparency. This cycle supports a wide range of use cases, each aligned with business goals and value-driven outcomes. Forward-thinking energy companies are already embracing this shift. It’s not just better than business as usual—it’s the foundation for a smarter, more sustainable future.
For more information visit https://kongsbergdigital.com/
Separation Modelling Rigour: The Key to Unlocking Hidden Production Capacity
Digital Twin Adoption Is Widespread. Fidelity Is Not.
Digital twins are now embedded across production assets. AI-driven optimisation, real-time dashboards, and predictive analytics dominate the conversation. Yet one critical question remains:
Does the model reflect physical reality with sufficient depth?
In many facilities, production is not limited by compression or equipment nameplate capacity. It is limited by separation performance. And separator behaviour is frequently simplified inside digital twins. That modelling gap can quietly suppress significant revenue.
The Hidden Constraint: Separation Physics
Real separator performance depends on droplet size distributions, internal geometry, entrainment, coalescence, and separation efficiency. These directly influence hydrocarbon dew point compliance, gas export specifications, and allowable throughput.
Yet many digital twin environments treat separators as idealised flash stages. The mass balance may converge; the physics may not. While it is typical to model other process operations with full rigour, including heat exchangers, pipelines, valves, pumps, compressors and controls, simulator default separator models either neglect carry-over entirely or require the user to specify arbitrary linear factors.
The consequence is subtle but material: operators may accept conservative production ceilings without identifying the true constraint.
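The gap can be made concrete with a toy comparison. An ideal flash stage reports zero liquid carry-over by construction, while even a crude linear entrainment factor, the kind of arbitrary approximation criticised above, predicts carry-over that grows with gas velocity. All numbers here are assumptions; rigorous tools such as MySep work from droplet size distributions and internals geometry rather than a single factor:

```python
# Toy contrast: an ideal flash separator versus a crude linear
# entrainment model. Both are simplifications; the point is that the
# ideal model hides carry-over entirely. All figures are assumed.

def carryover_ideal(liquid_rate_kgph):
    # Ideal flash stage: all liquid reports to the liquid outlet,
    # so predicted carry-over into the gas stream is always zero.
    return 0.0

def carryover_entrained(liquid_rate_kgph, gas_velocity_mps, k=0.002):
    # Crude linear factor: entrained liquid grows with gas velocity.
    return liquid_rate_kgph * k * gas_velocity_mps

liquid = 50_000.0  # kg/h of liquid entering the separator (assumed)
for v in (2.0, 4.0, 8.0):  # gas velocities in m/s (assumed)
    print(f"{v} m/s: ideal={carryover_ideal(liquid)} kg/h, "
          f"entrained={carryover_entrained(liquid, v)} kg/h")
```

Even this crude model shows hundreds of kilograms per hour of mist reaching the gas outlet at higher throughput, exactly the kind of constraint an ideal flash stage renders invisible.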
Turning Modelling Rigour into Revenue
MySep embeds rigorous separator physics into commercial simulation platforms used to construct digital twins, such as Aspen HYSYS®, AVEVA™ PRO/II™, AVEVA™ DYNSIM®, Honeywell UniSim® Design, KBC Petro-SIM®, Kongsberg K-Spice® and SLB Symmetry.
Instead of approximating equilibrium separation, the model predicts real droplet behaviour and carry-over risk, exposing bottlenecks invisible to simplified representations.
This enables operators to identify hidden separation limits, quantify throughput sensitivity, evaluate retrofit scenarios virtually, and optimise production within specification.
Rigour becomes decision clarity.
Case Study: US$325 Million Through Fidelity
On a producing FPSO, oil production was capped at ~74,000 BOPD and gas at ~79 MMSCFD because the export gas system was limited by its hydrocarbon dewpoint specification. From an engineering perspective the processing plant appeared mechanically sound, yet its capacity was capped well below its design potential. Conventional simulation models that represented separators as ideal devices showed no root cause.
A process digital twin integrating MySep’s rigorous separator models revealed liquid carry-over from the first-stage separator as the true constraint. For the first time, the model could replicate how small variations in gas and liquid flow affected entrainment and downstream dewpoint. Using the MySep Studio software, retrofit internals upgrades were evaluated for each separator vessel, the overall operational envelope was mapped, and the revamp configurations were simulated in the digital twin. The upgraded internals improved droplet capture efficiency throughout, significantly reducing the mist carried into the export gas stream even during high-throughput operation.
In practice, this meant the facility could safely raise production rates without exceeding the dewpoint limit. The improvement unlocked an increase in production to ~82,000 BOPD and ~88 MMSCFD, delivering an estimated US$325 million annual uplift without major capital expansion or export specification breach. The value came from modelling fidelity.
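The scale of that uplift can be sanity-checked with simple arithmetic on the quoted rate increases. The prices in this sketch are assumptions, not figures from the case study, so it will not reproduce the US$325 million number exactly; it only shows the order of magnitude is plausible:

```python
# Back-of-envelope check on the quoted uplift. Oil and gas prices
# below are assumptions for illustration; the actual US$325M figure
# depends on realised prices, uptime and contract terms not given here.

oil_uplift_bopd = 82_000 - 74_000    # barrels of oil per day gained
gas_uplift_mmscfd = 88 - 79          # MMSCF of gas per day gained

oil_price = 80.0   # US$/bbl (assumed)
gas_price = 4.0    # US$/MSCF (assumed)

annual_oil = oil_uplift_bopd * 365 * oil_price
annual_gas = gas_uplift_mmscfd * 1_000 * 365 * gas_price  # MMSCF -> MSCF

total_musd = (annual_oil + annual_gas) / 1e6
print(f"~US${total_musd:.0f}M per year at assumed prices")
```

At these assumed prices the arithmetic lands in the hundreds of millions of dollars per year, the same order as the reported figure, which is the relevant check: nearly all of the value comes from the incremental oil.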
AI Needs Engineering Truth
AI enhances optimisation. But AI operating on simplified physics cannot uncover physics-driven constraints.
Digital twins are only as good as the models they contain. By ensuring that separation — often the hidden bottleneck — is represented with sufficient physical depth, operators can uncover significant efficiency and revenue potential while maintaining compliance and integrity.
When digital twins are grounded in rigorous engineering models, AI becomes materially more powerful — because its optimisation operates on physical credibility.
Modelling rigour is not refinement. It is competitive advantage. Digital twin adoption is no longer the differentiator. Model fidelity is. The future is not AI alone. It is AI built on engineering rigour.
The end of digital theatre in oil and gas
Digital twins and AI are no longer experimental tools or innovation showcases. They are becoming the operating system of modern energy production, forcing organisations to rethink how decisions are made, how accountability is structured and what leadership looks like in a data-driven industry. Mark Venables explores how this shift is redefining operational reality across the energy sector.
The oil and gas industry has spent the past decade talking about digital transformation. It has built innovation labs, launched pilot projects, hired data scientists and filled conference stages with ambition. Yet the uncomfortable truth is that much of the sector still operates with the same organisational logic it had twenty years ago. Data sits in silos, decision making remains hierarchical, and digital tools are often layered on top of existing processes rather than used to fundamentally reshape them. The time for experimentation is over. The age of digital twins and AI demands something more difficult: structural change.
This is the defining tension facing the industry today. On one side there is an undeniable appetite for digitalisation. Executives recognise the potential of AI to optimise production, reduce emissions, improve reliability and enhance safety. Digital twins are increasingly embedded across assets, from offshore platforms to LNG terminals, promising real-time insight and predictive capability that would have seemed impossible only a few years ago. On the other side sits organisational inertia. Many companies continue to treat digitalisation as a technology procurement exercise rather than an operational redesign. That mindset will fail.
Digital twins are not dashboards. AI is not a reporting tool. Together they represent a new operational layer that changes how decisions are made. The implications are profound. When machines can contextualise data, simulate scenarios and recommend actions in real time, the traditional workflow built around slow reporting cycles and manual validation begins to break down. The industry is moving from a world of retrospective analysis to one of continuous decision making, and many organisations are not yet ready for that shift.
The biggest barrier is not technology maturity but trust. Engineers and operators rightly demand reliability, especially in environments where the cost of failure is measured not only in dollars but in safety and environmental risk. The solution, however, is not to keep AI at arm’s length. It is to embed it responsibly within clear governance frameworks and to recognise that the human in the loop model is not a compromise but a strength. AI should augment judgement, not replace it. The organisations that thrive will be those that design systems where expertise and algorithms reinforce each other.
Another challenge is data itself. For years, companies have invested heavily in data lakes and integration programmes, yet many still struggle to create shared context across disciplines. A digital twin only becomes valuable when it connects engineering data, operational workflows and business objectives into a coherent model of reality. Without that context, AI remains blind. The industry must stop measuring digital progress by the volume of data collected and instead focus on how effectively that data drives decisions.
There is also a broader strategic issue. AI infrastructure is becoming an energy consumer in its own right, creating a paradox where digital technologies designed to improve efficiency also increase demand for power and resources. The energy sector finds itself at the centre of this equation, tasked with both enabling the digital economy and decarbonising it. This requires a new level of collaboration between traditional operators, technology companies and infrastructure providers. No single player can solve it alone.
The organisations that will lead the next decade are already showing a different mindset. They are moving beyond pilots and embedding AI into core workflows. They are treating digital twins as operational environments rather than visualisation tools. They are training engineers to work alongside data scientists and giving front line teams ownership of digital outcomes. Most importantly, they are accepting that digital transformation is not a project with an end date but a continuous capability.
This is where the industry must be honest with itself. The old model, where digital initiatives sit on the periphery of operations, is no longer viable. Competitive advantage will come from organisations that redesign their operating models around intelligence, automation and adaptability. That requires courage from leadership, investment in people and a willingness to challenge decades of established practice. The message from this event, and from the conversations captured throughout this publication, is clear. Digitalisation in oil and gas is no longer about proving that the technology works. The technology already works. The question now is whether organisations are prepared to change fast enough to use it effectively.
The final word is this. The industry does not need more pilots, more dashboards or more buzzwords. It needs conviction. Digital twins and AI are not incremental upgrades to existing systems. They are the foundations of a new operational discipline. Those who embrace that reality will shape the future of energy. Those who hesitate risk becoming observers of a transformation happening around them rather than leaders of it.
The window to choose is closing