
State of digitalization in the oil and gas industry

Digitalization in the oil and gas sector still has a long way to go. Given the benefits to safety, efficiency, emissions reduction and compliance, it is only a matter of time before the industry needs to go completely digital.

BARORUCHI MISHRA, Group CEO, NET Enterprise Group 

At the highest level, digitalization addresses just two issues that all oil and gas operations (and indeed any operation) face today—poor efficiency in the building and operational phases, and the safety exposure of facilities and their personnel. The rest of the benefits or use cases of digitalization are essentially derivatives of these two overarching goals.

It needs to be understood that the build phase entails the full project lifecycle of an upstream O&G business—identify, assess, select, define and execute. The operational phase relates to the operations of both offshore oil and gas platforms and rigs. I would venture to add greenhouse gas (GHG) reduction to the safety category: it operates at a global level and relates to the safety of the planet.

SAFETY FIRST 

A lack of data-driven decision-making, compounded by confirmation and optimism biases, has led to major disasters in the oil and gas sector. The Macondo disaster in the U.S. Gulf of Mexico in 2010 is a textbook case of a crew's inability to use available data logically. While there were many root causes, some of them related to the prevalent safety culture in the organization at the time—influenced by bp's focus on cost-cutting1—the immediate cause was the crew choosing to ignore the available data when making critical, on-the-spot decisions.

Fig. 1. Making sense of real-time data to prevent accidents.

Could this have been prevented by digitalization in the true sense? Using a data platform that could ingest real-time, multidisciplinary data from various sources and contextualize them, applying the right algorithms/A.I. through various application layers to analyze the data and prompt appropriate actions to the crew, would certainly have provided a higher probability of preventing the accident, Fig. 1. A short analysis of the Macondo incident shows how the lack of proper data analysis led to inappropriate decision-making:

  • The negative pressure test, which is a confirmation test of well integrity, produced data that were not in line with the expected trend for a successful test, possibly indicating a hydrocarbon leak. Normally this is a red flag, yet the crew dismissed the data and attributed the anomaly to a test error. Serious consideration of these data would have led them to conclude that, if nothing was done, a blowout could be the consequence.
  • Data related to anomalies in mud returns and flowrates, which the crew was receiving from real-time monitoring, did not follow the expected trends. This did not make the team sit up and take notice, because the data were not contextualized and integrated with the fact that a cement bond test had not been carried out, despite indications that the cement may not have set properly.
  • Delinquencies in the maintenance of the BOP—mechanical issues with its shear rams, due to hydraulic leaks and low actuator pressure that could reduce shearing force, etc.—could have been integrated with real-time data from the negative pressure tests, mud returns and flowrates. This would have highlighted to the crew the impending dangers of wellbore instability and/or formation influx.
  • As is mostly the case, the crew probably relied on alarm systems that, in many cases, are normalized and not tuned properly to pick up transient/dynamic changes.

A digital twin (DT) of the well would have integrated and analyzed the fragmented computer systems that deal separately with maintenance routines of drilling equipment like BOPs; real-time operational data (mud returns, flowrates); casing design; drilling philosophy; cementing philosophy; reservoir properties (pore pressures, real-time well logs); etc. Further, had analogues been available, it would have compared them against the results of its analysis, flagged impending consequences, and raised key actions for the crew to take immediately to prevent the disaster. The crew could have holistically understood the situation and acted on the recommendations from the DT.
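To make the idea of contextualization concrete, here is a deliberately minimal sketch in Python. Every signal name, threshold and rule below is hypothetical and chosen for illustration only; a production digital twin would rely on validated physics models, far richer data and rigorous alarm management.

```python
from dataclasses import dataclass

@dataclass
class WellSnapshot:
    """Hypothetical real-time and contextual signals for one well."""
    npt_pressure_psi: float         # pressure seen during the negative pressure test
    npt_expected_psi: float         # expected pressure for a successful test
    mud_return_bbl_hr: float        # measured mud returns
    mud_expected_bbl_hr: float      # expected returns from the drilling program
    cement_bond_test_done: bool     # from the operations log
    bop_maintenance_overdue: bool   # from the maintenance system

def integrity_alerts(w: WellSnapshot) -> list[str]:
    """Combine signals that siloed systems would otherwise report separately."""
    alerts = []
    # Sustained pressure well above expectation suggests influx (illustrative margin).
    if w.npt_pressure_psi > w.npt_expected_psi + 100:
        alerts.append("Negative pressure test anomaly: possible influx")
    # Returns meaningfully above expectation can indicate a kick (illustrative margin).
    if w.mud_return_bbl_hr > 1.1 * w.mud_expected_bbl_hr:
        alerts.append("Mud returns above expectation: possible kick")
    # Context escalates, rather than replaces, the raw alarms.
    if alerts and not w.cement_bond_test_done:
        alerts.append("ESCALATE: anomalies on a well with an unverified cement job")
    if alerts and w.bop_maintenance_overdue:
        alerts.append("ESCALATE: last barrier (BOP) reliability in doubt")
    return alerts
```

The point is not the thresholds but the escalation logic: anomalies that each system would log in isolation become, once combined, an unambiguous instruction to stop and verify well integrity.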

EFFICIENCIES 

Digitalization, incorporating the use of A.I., can bring about significant efficiencies, as we detail in this section. 

Digitalization speeds the maturation of O&G reserves. Besides safety, A.I. is playing a key role in reducing the time to maturation of oil and gas discoveries. Companies like Shell have set targets to improve their pre-FID schedules (the assess, select and define phases) by relying heavily on A.I.

One key enabler of this schedule improvement is the ability of A.I. to analyze development concepts from analogous fields in the company's database. That database includes over a thousand past developments, and A.I. picks the one closest to the new field. The chosen concept is then analyzed further to fill gaps arising from features specific to the field in question.
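As a toy illustration of such analogue selection, the sketch below scores similarity over a handful of normalized field attributes and returns the nearest historical development. The attribute names, scales and data are invented; a real system would draw on many more attributes from a curated corporate database.

```python
import math

# Invented historical developments, each described by a few attributes.
HISTORICAL_FIELDS = [
    {"name": "Field A", "water_depth_m": 80,  "reserves_mmboe": 120, "gor": 300},
    {"name": "Field B", "water_depth_m": 950, "reserves_mmboe": 300, "gor": 400},
    {"name": "Field C", "water_depth_m": 120, "reserves_mmboe": 95,  "gor": 1100},
]

# Scales used to normalize each attribute before comparison (illustrative).
SCALES = {"water_depth_m": 1000.0, "reserves_mmboe": 500.0, "gor": 2000.0}

def distance(a: dict, b: dict) -> float:
    """Normalized Euclidean distance over the shared attributes."""
    return math.sqrt(sum(((a[k] - b[k]) / s) ** 2 for k, s in SCALES.items()))

def closest_analogue(new_field: dict) -> dict:
    """Pick the historical development most similar to the new field."""
    return min(HISTORICAL_FIELDS, key=lambda f: distance(f, new_field))

new_field = {"water_depth_m": 100, "reserves_mmboe": 110, "gor": 1000}
print(closest_analogue(new_field)["name"])  # -> Field C
```

Gap-filling then amounts to flagging the attributes in which the new field departs most from its analogue and reworking only those parts of the concept.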

Component-level cost benchmarks for these concepts are readily available from past developments, as all the relevant SAP data are integrated on a single data platform. By proper application of Capital Cost Escalation tools (themselves the output of extensive data analysis, based on economic outlooks for the commodities market), costs in money-of-the-day terms are ascertained. When using internal benchmarks, care is taken that the costs of projects that were train wrecks are not normalized; external benchmarks are used if the internal benchmarks are outliers. A.I. helps with these choices.

It is fair to say that the use of these digital tools helps with speed, besides creating a robust information pack for the FID decision. Post-FID, when the project enters the execution phase, digitalization becomes more difficult, because the key activity of execution—fabrication—has not adopted automation as completely as the manufacturing sector has for, say, cars and equipment. The challenges to automating the fabrication of platforms and rigs can be solved by a modular, assembly-line approach, which lends itself to the use of robotics and, hence, a higher level of automation.

The cyclical nature of oil and gas projects prevents the fabrication yards from investing in automation; this needs to change. Innovation should help bring down the cost of automation. 

Operational phase efficiency improvements. Achieving higher efficiencies essentially means doing more with less by removing wastage of resources (time, capital and human resources). A 2019 study by RMI concluded that of the roughly 606 exajoules (EJ) produced that year from fossil fuels, only 227 EJ were consumed by end-users. On offshore and onshore oil and gas facilities, unplanned shutdowns lead to excessive flaring, CO2 emissions and methane releases, due to high-rate inventory blowdowns.

Fig. 2. Digital twins are being deployed increasingly to enable a variety of functions, as well as improve efficiencies.

Digital twins improve efficiency. In the conventional ways of working, humans act as the invisible integration layer for all data from production, engineering, maintenance, wells, reservoirs, pipeline networks, export, commercial (gas/oil sales contractual volumes and specs), vendor data, ongoing construction data, etc. Teams know where to find the relevant data, try to collate them and make inferences that lead to decision-making, often tempered by their own experiences. This process has shortcomings: data siloed by resistance across department boundaries, and the inherent inability of humans to ingest large data sets and/or use real-time data for decision-making. DTs help improve this situation.

Foundation-level DTs are being deployed increasingly in oil and gas operations, Fig. 2. Many operators typically use them to enable a variety of functions, including document search; root cause analysis; management of change; 3-D visualization; simple design changes for brownfield modifications; situational awareness; remote access; inventory management; risk-based inspection and maintenance; failure prediction; hydrocarbon accounting; real-time updates of KPI dashboards; etc.

This is achieved by standardizing, contextualizing and integrating data from information technology (IT) systems (ERP, databases such as SAP and Maximo, emails, cybersecurity, intranet, servers and cloud systems, historical well tests); operational technology (OT) systems (SCADA, DCS, PLCs, field instruments, safety systems, etc.); and engineering databases (3-D models, intelligent P&IDs and laser scans, application integration frameworks, etc.).

This type of data standardization, aggregation and interpretation has allowed better planning and scheduling through Integrated Asset Planning (IAP), and margin improvements through optimized procurement behaviors, price-sensitive product choices, etc. It also has enabled interoperability, which means a single A.I. model trained for real-time optimization on one facility can be used on another facility, if the technology is the same.
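A simplified illustration of what contextualization means in practice: the sketch below maps raw tag names from two facilities' historians onto a common asset model, so that the same downstream logic (or A.I. model) can consume data from either. All tag names and mappings are invented.

```python
# Invented raw tags as they might arrive from two facilities' OT historians.
FACILITY_A_MAPPING = {
    "21-PT-1003.PV": ("separator_1", "inlet_pressure_bar"),
    "21-TT-1004.PV": ("separator_1", "inlet_temperature_c"),
}
FACILITY_B_MAPPING = {
    "SEP01_PRES": ("separator_1", "inlet_pressure_bar"),
    "SEP01_TEMP": ("separator_1", "inlet_temperature_c"),
}

def contextualize(raw: dict, mapping: dict) -> dict:
    """Translate site-specific tags into a shared (asset, property) schema."""
    model = {}
    for tag, value in raw.items():
        if tag in mapping:
            asset, prop = mapping[tag]
            model.setdefault(asset, {})[prop] = value
    return model

def separator_pressure_ok(model: dict) -> bool:
    """One piece of logic, reusable across facilities (illustrative limit)."""
    return model["separator_1"]["inlet_pressure_bar"] < 25.0

print(separator_pressure_ok(contextualize({"21-PT-1003.PV": 18.2}, FACILITY_A_MAPPING)))  # True
print(separator_pressure_ok(contextualize({"SEP01_PRES": 27.5}, FACILITY_B_MAPPING)))     # False
```

Once every facility's data land in the same schema, reusing a model elsewhere becomes a mapping exercise rather than a new integration project.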

Marrying thermodynamic behavioral attributes to the physical attributes of a static 3-D model leads to the creation of dynamic, transformational digital twins. This essentially means:

  • Integrating the static 3-D models with physics-based thermodynamic simulation models, like dynamic HYSYS, GAP and PROSPER, to optimize operating envelopes, pipeline networks, well inflow performance, etc., improving efficiency and ultimate recovery. 
  • This also allows real-time optimization, through continuous updating of dynamic models and history matching of reservoir models against real-time data (a minimal sketch follows this list). 
  • Small language models (SLMs) or large language models (LLMs) will also allow DTs to adapt.
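The update loop of a dynamic DT can be caricatured in a few lines: measure, recalibrate the model parameter that best explains the measurement, then hand the refreshed model to the optimizer. The one-parameter well model below is purely illustrative; real twins recalibrate full simulation and reservoir models.

```python
def predicted_rate(pi: float, drawdown_bar: float) -> float:
    """Toy inflow model: rate = productivity index * drawdown."""
    return pi * drawdown_bar

def recalibrate(pi: float, drawdown: float, measured_rate: float,
                gain: float = 0.3) -> float:
    """Nudge the productivity index toward what the latest data imply."""
    implied_pi = measured_rate / drawdown
    return pi + gain * (implied_pi - pi)

pi = 10.0  # initial productivity index (m3/d per bar), illustrative
# (drawdown, measured rate) pairs standing in for a real-time feed.
observations = [(5.0, 46.0), (5.0, 44.5), (6.0, 52.8)]

for drawdown, measured in observations:
    error = measured - predicted_rate(pi, drawdown)
    pi = recalibrate(pi, drawdown, measured)
    print(f"error={error:+.1f} m3/d, updated PI={pi:.2f}")
# The continuously updated model is what the optimizer then queries.
```

History matching of a reservoir model follows the same pattern, only with far more parameters and a proper inversion algorithm in place of the simple nudge.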

In summary, dynamic DTs can truly convert data into end-to-end operational insights and keep them current. And in all this, the integrator, i.e., the data platform, is the key. A versatile data platform has characteristics that can be summed up as "Any to Any to Any," i.e.:

  • It is agnostic to the type and source of the data it ingests. 
  • It can store standardized data on on-premise servers or in the cloud. 
  • It can provide outputs in any format to any user (a minimal sketch of this shape follows).
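In software terms, "any to any to any" amounts to pluggable connectors around a common internal representation. A minimal sketch of that shape, with all interfaces and names invented for illustration:

```python
from typing import Iterable, Protocol

class SourceConnector(Protocol):
    """Any source: historian, ERP extract, CSV file, document store..."""
    def read(self) -> Iterable[dict]: ...

class SinkConnector(Protocol):
    """Any consumer: dashboard, ML feature store, report, API..."""
    def write(self, records: Iterable[dict]) -> None: ...

class DataPlatform:
    """Standardizes whatever comes in; serves whoever asks."""
    def __init__(self) -> None:
        self._records: list[dict] = []

    def ingest(self, source: SourceConnector) -> None:
        for raw in source.read():
            self._records.append(self._standardize(raw))

    def serve(self, sink: SinkConnector) -> None:
        sink.write(self._records)

    @staticmethod
    def _standardize(raw: dict) -> dict:
        # Placeholder for unit conversion, tag mapping, timestamping, etc.
        return {key.lower(): value for key, value in raw.items()}
```

New sources or consumers then require only a new connector, not a change to the platform core; that is what makes the architecture agnostic.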

The standardization offered by data platforms guarantees interoperability. Quite a few data platforms—dDriven Solutions' "UNLSH," Cognite's "Cognite Data Fusion," Kongsberg's "Kognifai," etc.—are being used by multiple operators in the industry.

Agentic A.I. It is often said, “wisdom lies in knowing what to do, skill lies in knowing how to do it, but the real virtue lies in actually doing it.” If we interpret this in digitalization terms, data standardization, contextualization, integration and analysis can be equated to wisdom, AI/ML can be thought of as skills, but the actual doing is largely left to humans, at least in the oil and gas sector. Automation with the use of robotics has become a large part of the “doing” in the manufacturing sector but not so much in oil and gas. And so, we are missing the “real virtue!” 

And yet, for quite a few processes, A.I. can become an “agent” for completing the physical task. However, for it to do so, the operations and maintenance teams on the oil and gas platforms or on the drilling rigs will need to enact AI-enabling behaviors. This can be explained with a simple example. 

Fig. 3. A.I. can analyze turbine performance and determine whether a turbine wash is required.

Let us take the example of a typical oil and gas facility. One of the maintenance activities on a gas turbine in the facility is a "turbine wash." This could be a run-hours-triggered activity or a condition/performance-triggered activity, as per the maintenance schedule being followed. It is typically done to remove compressor fouling, caused by deposits of dust, oil, salts, etc., on the blades of the compressor section of the gas turbine, and to restore efficiency.

In some cases where digitalization has been implemented on oil and gas facilities, A.I. analyzes the gas turbine's performance, Fig. 3. Once A.I. has established that a turbine wash would improve the performance of the turbine, it generates a prompt for the maintenance personnel to plan an on-line wash.

However, A.I. can also create an agent to carry out the wash by itself, if all components of the wash system—demineralized water, injection system, wash chemical—are kept connected in the right sequence, and all valves and pumps in this setup can be operated without a man-machine interface. This requires AI-enabling behavior from the maintenance teams: the turbine-wash system should always be kept online; no manual valves in the wash setup should be left closed; and the ingredients (water, chemicals, etc.) should not fall below the required levels in either quantity or quality terms.

The agentic A.I. can then operate the machine controls: start up the standby turbine and begin load-sharing with it, reduce the speed of the given machine to 70-90% of full-load speed, and carry out the turbine wash. Once done, the original status can be restored. Again, the maintenance teams must ensure that the valve line-ups for start-up of the standby machine are all permissive (i.e., no manual valves are closed) for remote start-up, using only PLC/DCS controls.
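A skeleton of such an agent's decision-and-action logic is sketched below against a simulated control interface. Every function name and threshold is hypothetical; a real agent would act through the DCS/PLC layer, behind the permissive and safety logic described above.

```python
class SimulatedTurbineControls:
    """Stand-in for a real DCS/PLC interface; all behavior is simulated."""
    def __init__(self) -> None:
        self.efficiency_delta = -0.04   # 4% below clean baseline, i.e. fouled
        self.wash_lined_up = True       # the "AI-enabling behavior" in question
        self.log: list[str] = []

    def efficiency_vs_baseline(self, unit: str) -> float:
        return self.efficiency_delta

    def wash_system_lined_up(self, unit: str) -> bool:
        return self.wash_lined_up

    def start_and_load_share(self, standby: str) -> None:
        self.log.append(f"start {standby} and load-share")

    def set_speed_fraction(self, unit: str, fraction: float) -> None:
        self.log.append(f"{unit} speed -> {fraction:.0%} of full load")

    def run_online_wash(self, unit: str) -> None:
        self.log.append(f"on-line wash on {unit}")
        self.efficiency_delta = -0.005  # fouling largely removed

    def restore_normal_ops(self, unit: str, standby: str) -> None:
        self.log.append("restore original status")

def turbine_wash_agent(ctl, unit: str, standby: str) -> str:
    # 1. Decide: has fouling degraded efficiency enough to justify a wash?
    #    (The 3% trigger is illustrative, not an industry figure.)
    if ctl.efficiency_vs_baseline(unit) > -0.03:
        return "no action: efficiency within band"
    # 2. Verify the AI-enabling behaviors: system connected, manual valves
    #    open, water/chemical inventory adequate. Otherwise, prompt a human.
    if not ctl.wash_system_lined_up(unit):
        return "prompt maintenance: wash system not lined up for remote operation"
    # 3. Act: bring in the standby machine, slow this one, wash, restore.
    ctl.start_and_load_share(standby)
    ctl.set_speed_fraction(unit, 0.80)  # within the 70-90% window cited above
    ctl.run_online_wash(unit)
    ctl.restore_normal_ops(unit, standby)
    return "wash completed: original status restored"

ctl = SimulatedTurbineControls()
print(turbine_wash_agent(ctl, unit="GT-A", standby="GT-B"))
```

The fallback in step 2 is the crux: the agent only acts when the humans have kept the physical system in an actionable state; otherwise, it degrades gracefully to a prompt.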

In this example, the A.I. agent has completed a task that could add 3-4% to efficiency. A list of maintenance activities of this type can be created for an offshore platform, and A.I. agents can be created to perform each of them. The complexity of the activities that A.I. agents can handle will grow over time. For example, prevention/correction of fouling in a dehydration tower or an amine contactor in an upstream facility is a complex task with many variables; SLMs/LLMs can help A.I. adapt and fine-tune the agents for such complex tasks.

Building trust in data and A.I. All too often, users have complained about A.I. hallucination, defined as a reply from A.I. that sounds convincing but is clearly wrong! A lot needs to be done to ensure the reliability and fidelity of data and A.I. outputs.

For upstream operations, if it goes wrong, it can go wholly wrong! Some areas that will, therefore, need focus are: 

  • Data governance: Use of industry regulations and standards for data handling. 
  • Cybersecurity: Cyber threats should be firewalled. 
  • Robust checks and balances, provided by domain experts.
  • Robust test protocols and validation processes before field applications. 
  • AI training and capability building. 
  • Emphasis on a clear digitalization roadmap for oil and gas companies.
  • Creation of a resource-loaded schedule to achieve this roadmap. 

Finally, one needs to be cognizant of the fact that, for all the hype, digitalization in the oil and gas industry still has a long way to go. By one estimate, there are about 12,000 to 15,000 offshore platforms globally, but only one-tenth of these might have any meaningful digital/A.I. capability! Given the benefits to safety, efficiency, emissions reduction and compliance, it is only a matter of time before the industry needs to go completely digital.

It goes without saying that the human aspects of this transition will need to be managed well. Otherwise, the “yes-but” approach to digitalization will continue to be a hurdle! 

REFERENCE 

  1. Hopkins, A., Disastrous Decisions: The Human and Organisational Causes of the Gulf of Mexico Blowout.

 BARORUCHI MISHRA serves as the Partner and Group CEO of NET Enterprise Group, which comprises companies offering EPC, EPCM, PMC, front-end engineering (Concept Select and FEED), and advanced digitalization services (Industry 4.0 solutions). The group provides specialized consulting and delivery across global oil & gas and energy transition projects. Mr. Mishra is a seasoned techno-commercial leader with over 32 years of experience in the oil & gas and New Energies sectors. He joined NET in September 2023. Previously, he held a leadership role at Shell as project director for Growth and Transition Projects in India. Mr. Mishra began his Shell career in 2013 as vice president of Engineering, overseeing engineering support for projects and assets across India, the Middle East, Southeast Asia, Australia, and China. His international experience includes roles as General Manager of Shell’s LNG Program Office (Netherlands) and G.M. of Project Performance Assurance and Benchmarking at Shell Projects & Technology. Before Shell, Mr. Mishra worked with BG Group as technical director for BG Asia Pacific (Bangkok) and as director, Projects & Operations, for BG Exploration & Production India (Mumbai). His energy career began with ONGC and later included stints at Enron and Cairn Energy.  

 
