Digital oilfield

Digital Transformation: Quest for Operational Efficiency

In recent times, we have been inundated with articles concerning the oil and gas industry’s race toward digitalization and automation. The pillars of such a transformation are the rapidly evolving Industrial Internet of Things (IIoT), secure cloud computing, data analytics, artificial intelligence (AI), and machine learning.

In addition, advances in sensor technology have enabled important breakthroughs in near-continuous, real-time measurement, transforming our ability to derive pertinent information from processes that change over time. Sensors that continuously report current downhole conditions in producing wells and surface facilities have become powerful tools for managing oilfields.

The skillful combination of these rapidly emerging digital and permanent sensor technologies is moving the industry toward advances in automation to optimize asset performance across the full life cycle of a reservoir with minimum human intervention.

Seeing the Unseen

We strive to see what is beneath the surface. Our ability to construct a digital twin of the subsurface depends entirely on the data we measure. Oilfield data fall into two categories: data at rest (archived) and data in motion (live streaming data). The huge volumes of data gathered over time pose a formidable challenge to automation, yet the answers to E&P problems may lie hidden within them.

This quandary brings to mind a concept I read about in the MIT Sloan Management Review in the early 1990s called metaknowledge: an appreciation of what we know and what we do not know. Normally, we define knowledge as all the facts we have accumulated over time. Metaknowledge is a measure of our awareness of the nature, scope, limits, and uncertainties associated with our primary knowledge, and it is often more important than the primary knowledge itself. Metaphorically, the authors, J. Edward Russo and Paul J.H. Schoemaker, described the difference by stating that knowing when to see a doctor (metaknowledge) is more important than how much we know about medicine (primary knowledge). To image the subsurface, we must extract and interpret pertinent information (metaknowledge) buried inside the massive amount of archived data (primary knowledge). Doing so requires Big Data solutions and complex mathematical algorithms that can, at times, mimic the reasoning processes of the human brain through AI.

The integration of 3D seismic data with geology and petrophysics is at the core of defining a static reservoir model that can predict reservoir performance. The static reservoir model provides a 3D database for storing reservoir properties, referred to as data at rest, which is updated periodically as new data are acquired (well logs and time-lapse 3D seismic).
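
As a concrete picture of data at rest, a static model can be thought of as a set of 3D property arrays defined on the reservoir grid. The Python sketch below is purely illustrative; the grid dimensions, property names, and update routine are assumptions for exposition, not any particular simulator's data model:

```python
import numpy as np

# Hypothetical static reservoir model: "data at rest" held as
# 3D property arrays on an nx-by-ny-by-nz simulation grid.
nx, ny, nz = 100, 100, 20
static_model = {
    "porosity":     np.full((nx, ny, nz), 0.18),  # fraction
    "permeability": np.full((nx, ny, nz), 50.0),  # millidarcies
    "net_to_gross": np.full((nx, ny, nz), 0.85),  # fraction
}

def update_property(model: dict, name: str, cells, values) -> None:
    """Periodic update of data at rest as new well logs or time-lapse
    3D seismic are interpreted: overwrite the named property in the
    affected cells only."""
    model[name][cells] = values
```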

Streaming live data measured by permanent sensors are referred to as data in motion. Data in motion include pressure and rate measured at the wellhead or bottomhole and temperature from distributed temperature sensing (DTS) fiber-optic cables at various measurement points (nodes) in the wellbore, on the seabed, and in surface facilities.
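
One way to picture data in motion is as a stream of timestamped readings tagged with the node that produced them. The record layout below is a minimal sketch; the field names are assumptions, not an industry schema:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class SensorReading:
    """One data-in-motion sample from a permanent sensor."""
    node_id: str         # measurement point: wellbore, seabed, or surface facility
    quantity: str        # e.g., "pressure", "rate", "dts_temperature"
    value: float         # value in the engineering units of the quantity
    depth_m: float       # measured depth of the node (a DTS cable reports many)
    timestamp: datetime  # acquisition time, UTC

def is_stale(reading: SensorReading, max_age_s: float = 60.0) -> bool:
    """Flag readings that arrive too late to be useful for live monitoring."""
    age = datetime.now(timezone.utc) - reading.timestamp
    return age.total_seconds() > max_age_s
```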

A View of Oilfield Automation

The oilfield is made up of three constituents: the reservoir, the overburden, and the surface facilities. Data are acquired from these constituents and archived continuously, and the assimilated information is used to construct a model of the oilfield that evolves in real time. If the model variables are known, the response may be predicted by solving the conservation equations governing the system; that is, a deterministic method is available to compute the response of the system to a known perturbation. Drilling a well into a hydrocarbon-bearing formation and continuously producing oil may be considered, on a geological timescale, a perturbation. This is termed the forward problem. The inverse problem, on the other hand, refers to the determination of the plausible physical properties and processes of the system, given the observed response (measured data) to a perturbation. In reservoir engineering, the inverse problem is called history matching.
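
To make the forward/inverse distinction concrete, the toy Python sketch below poses both problems for a single well: the forward problem predicts pressure decline from a known permeability-like parameter, and the inverse problem (history matching) recovers that parameter from noisy observations. The log-time drawdown relation and all the numbers are assumptions chosen for illustration, not the physics of any real reservoir:

```python
import numpy as np
from scipy.optimize import least_squares

def forward_model(k: float, t: np.ndarray) -> np.ndarray:
    """Forward problem: predict bottomhole pressure (psi) at times t (days)
    for a permeability-like parameter k, via a toy log-time drawdown law."""
    return 5000.0 - (200.0 / k) * np.log(t + 1.0)

# Observed response to the "perturbation" of producing the well
# (synthetic measurements standing in for real sensor data).
rng = np.random.default_rng(0)
t_obs = np.linspace(1.0, 365.0, 40)
k_true = 2.5
p_obs = forward_model(k_true, t_obs) + rng.normal(0.0, 5.0, t_obs.size)

# Inverse problem (history matching): infer k from the observed pressures
# by minimizing the misfit between model predictions and measurements.
fit = least_squares(lambda k: forward_model(k[0], t_obs) - p_obs,
                    x0=[1.0], bounds=(0.1, 10.0))
print(f"history-matched k = {fit.x[0]:.2f} (true value {k_true})")
```

In a real field the unknowns number in the millions of grid-cell properties rather than one scalar, which is why the high-speed simulators and AI assistance discussed here matter.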

A prerequisite to oilfield automation is the ability to seamlessly connect real-time streaming data from sensors to simulation and analytics. Central to such a mission is a high-speed oilfield simulator capable of mathematically modeling the oilfield as one entity on economical cloud and edge computing systems. Combined with built-in AI, the simulator will enable automation of both optimization and history matching. Periodically performing history matching, on a timescale of practical relevance, is a leitmotif of oilfield automation: the model of the oilfield evolves in real time, gaining predictive capability with each history-matched update.
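
Read as a control loop, the process described here cycles through ingestion, matching, and forecasting. The outline below is hypothetical; names such as ingest_stream and match_history are placeholders for the capabilities described in the text, not an actual simulator API:

```python
import time

REMATCH_INTERVAL_S = 24 * 3600  # illustrative cadence: re-match daily

def automation_loop(model, sensors, simulator):
    """Hypothetical closed loop: data in motion -> history match ->
    updated model -> forecast -> recommended actions."""
    while True:
        readings = sensors.ingest_stream()       # data in motion
        model.append_observations(readings)      # archive into data at rest
        model = simulator.match_history(model)   # inverse problem
        forecast = simulator.predict(model)      # forward problem
        simulator.recommend_actions(forecast)    # AI-assisted optimization
        time.sleep(REMATCH_INTERVAL_S)
```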

The high-speed simulator, coupled with a resourceful and efficient IIoT system capable of reading directly from sensory devices, can process and analyze large volumes of data at rest and in motion, resulting in a system capable of true oilfield automation. The IIoT in this context consists of six horizontal layers of core services: field services (permanent and episodic sensing); data acquisition and analytics; IIoT and industry-standard communication protocols; security and messaging; applications (the high-speed oilfield simulator, history-matching tools, and AI); and visualization anywhere, anytime. These services are combined vertically to deliver solutions that span the entire oilfield, as sketched below. The resulting self-learning system can diagnose problems, anticipate events, and, where possible, provide remedies. Such a system will immensely enhance the operational efficiency of an oilfield comprising hundreds of wells while optimizing asset performance across the field's full life cycle with minimal human intervention.
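
The six horizontal layers can also be written out as a stack through which a vertical, field-wide solution threads its data. The layer names below come from the text; the composition function is a hypothetical illustration of what "vertically combined" means:

```python
# The six horizontal IIoT layers named above, bottom to top.
IIOT_LAYERS = [
    "field services (permanent and episodic sensing)",
    "data acquisition and analytics",
    "IIoT and industry-standard communication protocols",
    "security and messaging",
    "applications (high-speed simulator, history matching, AI)",
    "visualization anywhere, anytime",
]

def vertical_solution(payload, handlers):
    """Hypothetical vertical combination: a field-wide solution passes
    its data through every horizontal layer in turn, each handler
    transforming or enriching the payload."""
    assert len(handlers) == len(IIOT_LAYERS)
    for handler in handlers:
        payload = handler(payload)
    return payload
```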

Our industry is methodically moving toward adopting Industry 4.0 (fourth industrial revolution) solutions. For example, scientists at Emerson Automation Solutions have developed a high-speed oilfield simulator for recovery of unconventional resources. Physics is applied circumspectly (referred to as situational physics) in mathematically modeling ultra-low-permeability formations. Situational physics is a set of assumptions relevant to the flow mechanism, the rock and fluid interactions, and the parameters that significantly affect the long-term production performance of these resources. Forward predictions from such a high-speed simulator, used in concert with AI-assisted autonomous history matching, yield superior performance.

Integration of geophysical and reservoir engineering data, construction of static and dynamic oilfield models, and quantification of uncertainty, especially that arising from seismic time-to-depth conversion, will remain at the core of our processes. Industry 4.0 solutions are not a substitute for such tried and tested methods; rather, they exploit them in a digital environment, reducing cycle time and the risk attributable to human error. AI will continue to play a central role in the digital transformation of our industry, but it must be helped to evolve toward complete autonomy under the watchful eye of humans, to ensure that the laws of physics are obeyed.

Michael Thambynayagam, SPE, is chief executive officer of MaxEUR, a partly owned division of Emerson Automation Solutions. His career spans more than 40 years in the oil and gas industry, including positions as managing director of Schlumberger Gould Research in Cambridge, England, and general manager of the Abingdon Technology Center in Oxford, England. He has been granted a number of patents in technologies related to chemical and petroleum engineering, has published extensively in the scientific literature, and is best known for his work on the mathematics of diffusion. A compilation of his work, The Diffusion Handbook: Applied Solutions for Engineers, was published in 2011 and received a 2011 Association of American Publishers PROSE Award for excellence in publishing in the physical sciences, mathematics, and engineering.

Thambynayagam is an SPE Distinguished Member. He received a PhD degree in chemical engineering from the University of Manchester, England, and was elected Fellow of the Institution of Chemical Engineers in 1984.