“Big data,” once only the concern of database geeks or a marketing technique among retailers, is now part of our mainstream consciousness and vocabulary. Big data is having a profound impact on the upstream E&P industry as well.
What is “big data”? According to a McKinsey Global Institute report (Manyika et al. 2011), “big data refers to datasets whose size is beyond the ability of typical database software tools to capture, store, manage, and analyze.” I like this definition because it does not require a dataset of any specific size; it allows the threshold to rise as technology advances. It also lets the definition vary across industries, since different sectors have different standards for database and analytics software tools.
To get a feel for just how big data has become, consider recent rates of data growth. In 2012, disk drives worldwide added 7 exabytes (10¹⁸ bytes) of new data. Global data volumes are projected to increase 40% per year, while global information technology (IT) spending will grow only 5% per year. Thirty billion pieces of content are shared on Facebook each month. Or consider that Asia now leads the world in the generation and storage of personal location data due simply to the number of mobile phones—an estimated 800 million in China alone.
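The gap between those two projected rates compounds quickly. A minimal sketch of the arithmetic, using only the 40% and 5% figures cited above (the 10-year horizon is my own illustrative choice, not from the report):

```python
# Compound-growth comparison using the projected rates cited above:
# global data volume growing 40%/year vs. IT spending growing 5%/year.

def compound_growth(rate, years):
    """Total growth factor after `years` at an annual `rate` (0.40 = 40%)."""
    return (1 + rate) ** years

years = 10
data_factor = compound_growth(0.40, years)   # ~28.9x in a decade
spend_factor = compound_growth(0.05, years)  # ~1.6x in a decade

print(f"Data volume grows {data_factor:.1f}x in {years} years")
print(f"IT spending grows {spend_factor:.1f}x in {years} years")
print(f"Data outpaces spending by {data_factor / spend_factor:.1f}x")
```

In other words, if those projections hold, within a decade the industry would be managing roughly 18 times more data per dollar of IT spending than it does today.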
How mainstream has big data become? Consider these examples of widespread usage:
- High-speed supercomputers statistically analyze massive amounts of data, enabling scientific breakthroughs such as the gene sequencing of individual organisms and entire ecosystems.
- Coaxing the Higgs boson into a nanosecond of existence required the global distribution and analysis of roughly 200 petabytes of data.
- Or, more familiar to most of us, retail organizations cleverly obtain our behaviors, preferences, and product perceptions by monitoring our Facebook and Twitter information.
The combination of massive amounts of data and sophisticated analytics is profoundly affecting a spectrum of industries, and oil and gas is no exception.
Big Data in the Oil Field
The amount of data that our industry acquires, communicates, stores, and analyzes is growing exponentially. Land seismic acquisition, for example, is among the most data-intensive activities in our business: channel count has roughly doubled every 3.5 years since 1970. Combined with advances in acquisition technology, which now measures vector rather than scalar information, the data acquired and stored in a typical survey has grown from gigabytes to petabytes. Furthermore, the growing popularity of permanently installed geophones for monitoring fluid fronts, passive monitoring of carbon capture and sequestration sites, and observing microseismic events during hydraulic fracturing has driven an extraordinary and daunting growth in seismic data of all forms.
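The cumulative effect of that doubling rate is easy to understate. A quick sketch, taking 2014 (the date of this column) as the endpoint; the doubling period is the figure quoted above:

```python
# Cumulative growth implied by "channel count has roughly doubled
# every 3.5 years since 1970".

def doublings(start_year, end_year, doubling_period=3.5):
    """Number of doublings between two years at the given doubling period."""
    return (end_year - start_year) / doubling_period

def growth_factor(start_year, end_year, doubling_period=3.5):
    """Total multiplicative growth over the interval."""
    return 2 ** doublings(start_year, end_year, doubling_period)

n = doublings(1970, 2014)
factor = growth_factor(1970, 2014)
print(f"{n:.1f} doublings -> roughly {factor:,.0f}x more channels than in 1970")
```

That is a growth factor on the order of several thousand, which is consistent with surveys moving from gigabytes to petabytes.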
As a consequence of this data explosion, technology continues to advance. The industry has developed significant improvements in survey design, data compression techniques, auto-picking algorithms, and intelligent storage schemes. As an aside, it is worth noting how much value is created by new technologies that store and analyze what was once considered “noise,” revealing even more detail within complex reservoirs.
On the production side of our business, a well-known example of big data and associated analytics is the advent of the “smart field,” or “digital oil field.” Here, the carefully defined combination of IT and data acquisition with intelligent production engineering analytics (often including artificial intelligence) has generated a deluge of data. Examples range from water/oil ratios along intelligent completions to measurements of methane emissions during hydraulic fracture flowbacks.
To appreciate the sometimes overwhelming amounts of data involved, consider that for each well in a digital oil field we now measure and store (often using permanently installed fiber) flow rates for each phase, various pressures and temperatures at the wellhead, electrical submersible pump parameters, environmental data, and power usage. As a result, smart engineers now apply sophisticated algorithms and powerful software tools to enhance decision making, reduce cycle time, and optimize productivity, return on investment, and net present value, thus reducing the need for skilled on-site personnel.
Big data combined with smart people and smart software is proving to be very powerful.
Both the rate and the volume of data generation and usage in our industry will continue to increase over time.
In thinking about recent trends, it is apparent that the value of taking many more measurements is increasing rapidly. Take the drilling domain, for example. From high-telemetry logging-while-drilling and measurement-while-drilling tools to the many parameters required for automated or assisted drilling on the rig floor, the amount and the rate of drilling data being acquired, stored, transmitted, and interpreted are increasing dramatically every year. And as regulators and the public ask us to monitor our operations more closely, big data will only get bigger.
In each case, however, growth in data alone is not what creates value. While generating insights from more data is important, the hallmark of big data is “actionability,” achieved through data analytics and old-fashioned smart petroleum engineering. The next wave of technology innovation in our industry, and the next level of understanding of our reservoirs, will depend on our ability to integrate diverse data, measurements, and domains. This implies, of course, that our young professionals will require additional skillsets to succeed.
SPE Activity in Big Data
With the growth in E&P applications of big data, SPE has recognized that members need to collaborate on these new issues and to learn how to leverage the many opportunities they provide. The first SPE Intelligent Energy conference was held in 2001, and since 2008 we have held seven conferences, 12 workshops, and one forum dealing with intelligent energy and digital oil fields. A quick search in OnePetro shows more than 6,500 papers on the subject. As part of the new SPE strategy to create lifelong learning and fast-track young professionals, the topic will become more prevalent in all our meetings, whether as the main topic or as the catalyst behind the next big discovery.
Each month, I post my JPT column topic on the SPE LinkedIn group for comment and conversation. I invite you all to join in this discussion and look forward to hearing your viewpoints.
Manyika, J., Chui, M., Brown, B., et al. 2011. Big data: The next frontier for innovation, competition, and productivity. McKinsey Global Institute.
Jeff Spath, 2014 SPE President
01 January 2014