Machine Learning Optimizes Duvernay Shale-Well Performance
This paper describes how machine learning, using multiple linear regression and a neural network, was applied to optimize completion and well designs in the Duvernay shale. The methodology revealed solutions that could save more than $1 million per well and improve well performance by more than 50%. The work flow rigorously analyzes the relationships among a large number of well-completion variables, predicts results, and performs optimizations for ideal outcomes. The work flow is not Duvernay-specific and can be applied to other basins and formations.
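To make the "predict, then optimize" portion of that work flow concrete, the following is a minimal sketch in which a regression model fit to historical wells is used to score a grid of candidate completion designs. The predictor names, value ranges, and synthetic training data are hypothetical placeholders, not values from the paper.

```python
# Minimal sketch of a "predict, then optimize" step: fit a regression
# model to historical wells (synthetic stand-in data here), then score a
# grid of candidate completion designs with it. All names, ranges, and
# data are hypothetical placeholders.
import itertools

import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

predictors = ["lateral_length_m", "proppant_tonnage", "stage_count"]

# Synthetic historical wells standing in for a real training set.
rng = np.random.default_rng(0)
X_hist = pd.DataFrame(
    rng.uniform([1500, 2000, 20], [3000, 8000, 60], size=(200, 3)),
    columns=predictors)
y_hist = (0.1 * X_hist["lateral_length_m"]
          + 0.05 * X_hist["proppant_tonnage"]
          + 2.0 * X_hist["stage_count"]
          + rng.normal(0, 50, 200))

model = LinearRegression().fit(X_hist, y_hist)

# Enumerate candidate completion designs over a coarse grid and rank
# them by predicted performance.
grid = itertools.product([2000, 2500, 3000],   # lateral length, m
                         [3000, 5000, 7000],   # proppant tonnage, t
                         [30, 45, 60])         # stage count
candidates = pd.DataFrame(list(grid), columns=predictors)
candidates["predicted_performance"] = model.predict(candidates)
print(candidates.sort_values("predicted_performance",
                             ascending=False).head())
```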
Introduction
A fundamental problem for machine learning in many industries is that a response variable is controlled not by one but by a number of predictor variables. Inferring the relationship between the response variable and the predictor variables is of key importance. Interactions between predictor variables and noise in the data complicate matters further. The problem can be addressed with multiple linear regression or a neural network, both of which use all of the predictor variables together. However, care must be taken to obtain a model that is truly predictive and not merely the result of overfitting the data.
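As an illustration, the sketch below fits both model types, multiple linear regression and a small neural network, to the same predictor variables and uses a held-out test set to check that the models generalize rather than overfit. The input file and column names are hypothetical placeholders, and scikit-learn is used for convenience; it is not necessarily the tooling used by the authors.

```python
# Minimal sketch: fit multiple linear regression and a small neural
# network on the same predictor variables, then compare train and test
# scores to check for overfitting. File and column names are
# hypothetical placeholders.
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

wells = pd.read_csv("duvernay_wells.csv")  # hypothetical data set
predictors = ["lateral_length_m", "proppant_tonnage", "stage_count",
              "fluid_volume_m3", "cluster_spacing_m"]
response = "cum_production_12m"

X_train, X_test, y_train, y_test = train_test_split(
    wells[predictors], wells[response], test_size=0.3, random_state=0)

scaler = StandardScaler().fit(X_train)

# Multiple linear regression: all predictor variables enter together.
mlr = LinearRegression().fit(scaler.transform(X_train), y_train)

# Small neural network; regularization (alpha) plus the held-out score
# help guard against a model that merely memorizes the training wells.
nn = MLPRegressor(hidden_layer_sizes=(16, 8), alpha=0.01,
                  max_iter=5000, random_state=0)
nn.fit(scaler.transform(X_train), y_train)

for name, model in [("MLR", mlr), ("Neural net", nn)]:
    train_r2 = r2_score(y_train, model.predict(scaler.transform(X_train)))
    test_r2 = r2_score(y_test, model.predict(scaler.transform(X_test)))
    print(f"{name}: train R2 = {train_r2:.2f}, test R2 = {test_r2:.2f}")
```

A large gap between the training and test scores is the practical signal that a model is overfitting rather than capturing a predictive relationship.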
In unconventional oil and gas reservoirs, well performance generally is characterized either at the well level, through detailed technical work such as rate-transient analysis and microseismic interpretation, or at the field level, by statistical methods that yield ranges of production performance. Refining this statistical interpretation usually involves normalizing for only one or two key parameters, such as lateral length or proppant tonnage. Additionally, wells often are grouped, or excluded entirely from the population, for reasons such as substandard completion design, which introduces bias and reduces the sample size. As a result, the approach is limited by the few key variables identified and by the bias introduced in selecting the well population.
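For context, the conventional single-parameter normalization described above might look like the following sketch, which scales cumulative production by lateral length alone; the file and column names are hypothetical.

```python
# Minimal sketch of single-parameter normalization: scale cumulative
# production by lateral length alone, leaving the influence of all other
# completion variables unexplained. File and column names are
# hypothetical placeholders.
import pandas as pd

wells = pd.read_csv("duvernay_wells.csv")  # hypothetical data set

wells["prod_per_m_lateral"] = (
    wells["cum_production_12m"] / wells["lateral_length_m"])

print(wells["prod_per_m_lateral"].describe())
```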
Neural networks have been applied successfully in the past to optimize completions, but the data sets involved were limited. Recently, the use of machine learning has grown substantially as more variables have been integrated into the analysis, reducing reservoir uncertainty.