Well testing and surveillance have always been, and continue to be, the foundations of reservoir management. Fundamental data such as pressure, rate, and temperature, along with fluid samples, are collected during a well test and used to investigate the subsurface. With advances in modern technology such as smart wells, distributed pressure/temperature sensing, real-time measurements, and extended-reach drilling, we face conditions of increasing complexity and unprecedented volumes of data.
In 2017, several key partnership announcements between oil and gas operators and major information-technology companies made headlines. The oil and gas industry is undergoing a digital transformation driven by data science; terms such as data cloud/lake, machine learning, Internet of Things, high-performance computing, automation, and model management are the new buzzwords. Applied correctly, analytics can provide valuable insights, especially in cases such as unconventional reservoirs with significant numbers of wells. Many of the industry’s leading experts agree that, for the transformation to succeed, the quality of data matters as much as its quantity.
Experience teaches us that the subsurface is always more complex than we expect, feedback is not instantaneous, and issues are difficult to mitigate at the reservoir scale. Hence, essential and information-rich data such as exploration/production-well tests, proper fluid samples, and sufficient, periodic surveillance should still be used effectively as indicators to peel away layers of uncertainty in the complex subsurface. Thus, we conclude that reservoir-engineering fundamentals must still be applied and data must still be quality checked, especially when collecting and applying analytics to a massive amount of information. The age-old “garbage in, garbage out” mantra will continue to apply in the coming era of data science.
The papers selected for this issue cover advances and opportunities in well testing. They also apply reservoir fundamentals and sound engineering judgment to data sets, from conventional and unconventional assets, that offer quality as well as quantity.
This Month's Technical Papers
Recommended Additional Reading
IPTC 18924 Current State and Future Trends of Wireline-Formation-Testing Downhole Fluid Analysis for Improved Reservoir-Fluid Evaluation by S.R. Ramaswami, Shell International Exploration and Production, et al.
SPE 187348 New Variable Compliance Method for Estimating In-Situ Stress and Leakoff From DFIT Data by HanYi Wang, The University of Texas at Austin, et al.
SPE 185795 Step-Rate Test as a Way To Understand Well Performance in Fractured Carbonates by A. Shchipanov, IRIS, et al.
Heejae Lee, SPE, Senior Engineer, ExxonMobil Production Company
01 February 2018
Integrated Deepwater MPD Control System Increases Accuracy, Ease of Use
This paper presents lessons learned regarding design, testing, and installation of a completely integrated managed-pressure-drilling (MPD) control system on a deepwater drilling rig.
MPD Monitoring Technique Allows Rapid Redesign, Flexibility
This work focuses on the development of specific methodologies to support managed-pressure-drilling (MPD) operations, implemented in real-time diagnostic software.
Alliances Drive Upstream Digital Deployment
Partnerships with big tech, tech startups, and innovative service companies—and the merging of their data, cloud, and software applications—are proving essential for operators in the scaling phase of digital deployment. Equinor, Microsoft, and Halliburton are among those joining forces.