Reservoir-simulation-model inputs are numerous, and uncertainty is pervasive before, during, and after development. On top of that, there is always pressure to deliver quality results as quickly as possible. This gives rise to a simple question, one that has yet to find a simple answer: How refined is refined enough, and how coarse is too coarse? At the risk of oversimplification, we seem to face a classic dichotomy. It is exacerbated by the pull of advances in high-performance computing, which permit ever-greater model refinement, and, simultaneously, by the push of (possibly stochastic) sampling of uncertainty, which encourages simplified (or surrogate) models that can be run hundreds, even thousands, of times. The question is one of striking the right balance between two apparently contradictory approaches to simulation. The adage "horses for courses" is probably appropriate, but not particularly helpful in itself. Does one maintain two distinct models at roughly commensurate scales, or can a single all-purpose model, built on a multiscale grid, be both fast and accurate?
Dimensional scale represents just one aspect of the term "multiscale," which I had mistakenly taken to mean only the juxtaposition of geometrical scales within a single model (such as the coupling of a simulation grid and the wellbore). The large ratio between the domain size and the resolution of the geological data is usually managed by upscaling. So-called multiscale methods, however, represent a newer avenue of research, one that may bridge the aforementioned push and pull of refinement arising from the needs of different decision makers. Multiscale methods, the subject of ongoing study over the past decade, knit together geometrical quantities (dimensional scale) with tailored computational schemes (numerical scale). This multifaceted multiscale concept may offer a means of constructing an accurate coarser-scale model, one that honors the attributes of the fine-scale heterogeneous geological data from both numerical and spatial standpoints. This class of methods computes local basis functions for the solution variables and uses them to construct a smaller (coarse) system, from which an approximate solution is recovered on the original simulation grid.
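To make the basis-function idea concrete, here is a minimal, hypothetical 1D sketch (not the formulation of any specific SPE paper): for a pressure equation -(k(x) p'(x))' = 0 with heterogeneous permeability k, each coarse node gets a basis function computed from local fine-scale flow problems; a Galerkin coarse system is then assembled and its solution is prolonged back to the fine grid. All grid sizes and the random permeability field below are illustrative assumptions.

```python
import numpy as np

# Illustrative 1D multiscale sketch: -(k(x) p'(x))' = 0 on [0, 1],
# p(0) = 1, p(1) = 0, with heterogeneous fine-scale permeability k.
rng = np.random.default_rng(0)
n_fine, m = 64, 8                    # fine cells; fine cells per coarse block (assumed)
k = rng.uniform(0.1, 10.0, n_fine)   # heterogeneous fine-scale permeability (synthetic)
h = 1.0 / n_fine

# Fine-scale stiffness matrix (linear finite elements).
A = np.zeros((n_fine + 1, n_fine + 1))
for i in range(n_fine):
    A[i:i + 2, i:i + 2] += (k[i] / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])

# One local basis function per coarse node: solve (k phi')' = 0 on the
# adjacent coarse blocks, so phi' is proportional to 1/k cell by cell.
coarse = np.arange(0, n_fine + 1, m)          # coarse node indices on the fine grid
P = np.zeros((n_fine + 1, len(coarse)))       # prolongation (basis functions as columns)
for j, c in enumerate(coarse):
    P[c, j] = 1.0
    if j > 0:                                  # rising limb on the left block
        lo = coarse[j - 1]
        w = np.cumsum(1.0 / k[lo:c])
        P[lo + 1:c, j] = w[:-1] / w[-1]
    if j < len(coarse) - 1:                    # falling limb on the right block
        hi = coarse[j + 1]
        w = np.cumsum(1.0 / k[c:hi])
        P[c + 1:hi, j] = 1.0 - w[:-1] / w[-1]

# Galerkin coarse system, Dirichlet BCs imposed by row replacement.
Ac = P.T @ A @ P
b = np.zeros(len(coarse))
Ac[0, :] = 0.0;  Ac[0, 0] = 1.0;   b[0] = 1.0
Ac[-1, :] = 0.0; Ac[-1, -1] = 1.0; b[-1] = 0.0
p = P @ np.linalg.solve(Ac, b)     # coarse solve, prolonged to the fine grid

# Fine-scale reference solution for comparison.
Af = A.copy(); bf = np.zeros(n_fine + 1)
Af[0, :] = 0.0;  Af[0, 0] = 1.0;   bf[0] = 1.0
Af[-1, :] = 0.0; Af[-1, -1] = 1.0; bf[-1] = 0.0
p_ref = np.linalg.solve(Af, bf)
```

The coarse system here is 9x9 instead of 65x65, yet in this 1D, source-free setting the local basis functions capture the fine-scale flux exactly, so the prolonged solution matches the fine reference; in 2D/3D the match is approximate, which is precisely the accuracy-versus-speed trade the column describes.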
While it is too early to say whether this broader notion of multiscale (numerical and geometrical) will provide a single, unifying model for engineers, it is possible that this, or some other such method, may strike that elusive balance between refinement (accuracy) and surrogacy (speed). For those interested in reading up on this topic, the peer-reviewed papers SPE 119183 and SPE 163649 provide more detail and clarify the status of some ongoing research.
This Month's Technical Papers
Recommended Additional Reading
SPE 169063 Application of Multiple-Mixing-Cell Method To Improve Speed and Robustness of Compositional Simulation by Mohsen Rezaveisi, The University of Texas at Austin, et al.
SPE 177634 Multiscale Geomechanics: How Much Model Complexity Is Enough? by Gerco Hoedeman, Baker Hughes
SPE 174905 Experimental Design or Monte Carlo Simulation? Strategies for Building Robust Surrogate Models by Jared Schuetter, Battelle Memorial Institute, et al.
SPE 169357 Reduced-Order Modeling in Reservoir Simulation Using the Bilinear Approximation Techniques by Mohammadreza Ghasemi, Texas A&M University, et al.
William Bailey, Schlumberger
01 July 2016