Tapping the Value From Big Data Analytics
A recent GE/Accenture report found that 81% of senior executives consider big data analytics one of the top three corporate priorities for the oil and gas industry through 2018. A striking finding was the respondents’ sense of urgency about implementing data analytics solutions, driven primarily by market conditions that are pushing companies to find new ways to become more efficient in exploration and production.
In the quest to become more efficient, operators have many new initiatives under way. Driven primarily by central excellence teams, these initiatives aim to leverage both low- and high-resolution data (high-resolution meaning data collected in seconds or minutes) to make better decisions quickly, rather than evaluating trends over a 12- or 24-hour period or after the fact using old reporting methods. Decision makers are convinced that if airlines and consumer Internet players such as Amazon and Expedia can leverage big data to drive efficiency and growth, the same can and should apply to the oil and gas industry.
Actionable Insights and Lower Costs
If their assumptions are correct, mining data for hidden insights can help enterprise users make smarter decisions and reduce operational costs. However, as the industry pays closer attention to these initiatives, it is confronting some harsh realities, including the fact that big data is uncharted territory for both information technology (IT) and the business side of a company. Improvement efforts conducted with the wrong approaches may actually lead to worse, not better, decision making.
Further complicating the issue, most IT organizations are more familiar with process automation projects, where business needs are known and stable. In contrast, data needs are context-dependent, dynamic, and sometimes unarticulated or even unknown. Solving this challenge requires anthropological skills that are in short supply in today’s IT world, and traditional requirements gathering fails when assessing data needs because those needs are fast-changing and diverse. Additionally, today’s machine data (especially historical data) lacks the accuracy, precision, completeness, and consistency required for real-time analytics. As a practical matter, fewer than 50% of today’s enterprise users find information from corporate sources to be in a usable format, a problem that will only worsen as the number of information sources, uses, and users continues to increase.
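The quality dimensions named above (completeness, validity, timeliness of machine data) can be made concrete with a simple profiling check. The following is a minimal sketch, not any historian vendor’s API; the field names, sample values, and thresholds are all illustrative assumptions.

```python
from datetime import datetime, timedelta

# Hypothetical sample of high-resolution sensor readings; field names are
# illustrative, not taken from any specific data historian.
readings = [
    {"ts": datetime(2016, 1, 1, 0, 0, 0), "rpm": 120.0},
    {"ts": datetime(2016, 1, 1, 0, 0, 10), "rpm": None},    # missing value
    {"ts": datetime(2016, 1, 1, 0, 1, 30), "rpm": 118.5},   # large time gap
    {"ts": datetime(2016, 1, 1, 0, 1, 40), "rpm": -5.0},    # out of range
]

def quality_report(rows, field, lo, hi, max_gap=timedelta(seconds=10)):
    """Return simple completeness, validity, and timeliness counts."""
    missing = sum(1 for r in rows if r[field] is None)
    out_of_range = sum(1 for r in rows
                       if r[field] is not None and not lo <= r[field] <= hi)
    gaps = sum(1 for a, b in zip(rows, rows[1:])
               if b["ts"] - a["ts"] > max_gap)
    return {"rows": len(rows), "missing": missing,
            "out_of_range": out_of_range, "timestamp_gaps": gaps}

print(quality_report(readings, "rpm", lo=0.0, hi=300.0))
```

Even a report this crude makes the gap between raw historian data and analytics-ready data visible before any platform purchase is made.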
IT also lacks a sufficiently deep understanding of how, when, and why information will be used by specific user segments, while enterprise users do not fully trust data that comes from other people, other functions, or the organization’s current tools. Meanwhile, with cost dominating every decision in today’s market, excellence centers are running traditional trials across multiple platforms without establishing that the piloted solutions are scalable and repeatable. Can management afford to wait years for these initiatives to show value when capital efficiency, faster adoption and implementation, lower total cost of ownership, and shorter development cycles are key to survival? Clearly not. It is now imperative to view a technology platform as a combination of technology, delivery, and price model that supports rapid enterprise user adoption, rather than as the traditional one-dimensional “technology only” initiative.
Realistically, the time is now for data analytics champions within oil and gas companies to adopt radical thinking while applying lessons learned and avoiding the faulty actions of the past. The following approaches can help management ensure that data initiatives move their company toward the intended results quickly.
Workflow-Based Analytics
Workflow-based analytics are targeted toward answering the question: “How do we make an enterprise user’s work life better, the way consumer products do in people’s personal lives?” This involves understanding users’ daily pain points, segmenting their information usage patterns, and gauging their stance toward technology adoption (e.g., visualization and expectations for the delivery of business insight). It differs from the traditional approach to customer intimacy, i.e., gathering user requirements in an RFP and ensuring that platform providers can satisfy them. Instead, the focus must be on mapping “data to decision” loops for enterprise users and significantly reducing their latency to deliver accurate, actionable insights.
For successful adoption of this approach, consider the following:
- Decision-based questions—Identify the universe of decisions that enterprise users are required to make daily.
- Data architecture—Enable flexible, on-the-fly analysis capabilities through state-of-the-art architecture organized around key daily decision questions.
- Contextualized information access—Provide enterprise users with access to information organized to address their top daily business questions.
- Data quality transparency—Provide transparency into cleaning, filtering, and assembling all data sources to help the enterprise user gain trust in the data that will be used for decision making.
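The last point, transparency into cleaning and filtering, can be sketched as a pipeline that records every step it applies, so the enterprise user can see exactly how a data set was assembled before trusting it. This is an illustrative toy, assuming a simple list of raw sensor values; the function name and wording of the audit messages are invented for the example.

```python
def clean_with_audit(values, lo, hi):
    """Filter raw values, recording each cleaning step for transparency."""
    audit = []
    step1 = [v for v in values if v is not None]
    audit.append(f"removed {len(values) - len(step1)} missing value(s)")
    step2 = [v for v in step1 if lo <= v <= hi]
    audit.append(f"removed {len(step1) - len(step2)} value(s) outside [{lo}, {hi}]")
    return step2, audit

raw = [98.0, None, 101.5, 4500.0, 99.2]
clean, audit = clean_with_audit(raw, lo=0.0, hi=1000.0)
print(clean)           # the cleaned series used for decision making
for line in audit:
    print(line)        # the user sees exactly what was removed and why
```

The design choice is that the audit trail travels with the cleaned data, rather than the cleaning happening invisibly inside a platform, which is precisely what builds the trust discussed above.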
Optimization-Based Analytics
In contrast with workflow-based analytics, optimization-based analytics are targeted toward answering whether reservoirs and downhole tools can be optimized to preempt failures so that timely action can be taken beforehand. Note the greater emphasis on downhole tool and reservoir optimization rather than on creating customer intimacy. Though a promising dimension, this approach takes longer for organizations to realize value and advance to enterprise user adoption, primarily because it is driven by a few optimization experts from the central team with a heavy engineering focus rather than actual enterprise user involvement. It garners attention from IT/central teams and traditional data platform providers because it justifies the huge upfront cost of buying these platforms, but it often ends up as an IT project focused on the technology-only dimension.
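The core idea behind preempting failures can be illustrated with a rolling statistical alarm: flag a reading that drifts several standard deviations from its recent baseline. This is a toy sketch under stated assumptions, not any operator’s or vendor’s algorithm; the window size, threshold, and sample vibration series are arbitrary.

```python
from collections import deque
from statistics import mean, stdev

def drift_alarms(series, window=5, k=3.0):
    """Flag indices where a reading deviates > k sigma from the rolling baseline."""
    recent = deque(maxlen=window)   # rolling window of prior readings
    alarms = []
    for i, x in enumerate(series):
        if len(recent) == window:
            mu, sigma = mean(recent), stdev(recent)
            if sigma > 0 and abs(x - mu) > k * sigma:
                alarms.append(i)
        recent.append(x)
    return alarms

# Steady vibration readings, then a sudden excursion a maintenance team
# would want to catch before the downhole tool fails.
vibration = [1.0, 1.1, 0.9, 1.0, 1.05, 1.0, 9.0, 1.0]
print(drift_alarms(vibration))
```

Production systems layer physics-based models and expert review on top of such signals; the point here is only that the “preempt, don’t react” question reduces to watching deviations from an expected baseline.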
Timelines to realize value from this approach are longer than with the previous approach for several reasons:
- Mapping, configuring, and assembling data from disparate sources requires considerable heavy lifting, and the work involves primarily the central group team while actual operational enterprise users remain disengaged.
- Because the approach focuses on very complex problems, the volumes and types of disparate data required to create optimization algorithms are cumbersome, and legacy data lakes are fraught with poor-quality data.
- A solution designed for one region or geography usually cannot be scaled or repeated easily in others, owing to the complexity of reservoirs and formations and the inconsistent standardization of downhole tools.
- The complexity of the models requires a team of experts to vet the results 24/7, a huge upfront investment, not to mention the change management and introduction of new processes that are never easy to implement and gain adoption for in enterprises.
Underscoring what management faces, an enterprise user survey revealed high dissatisfaction within the user community with current IT. Users say the solutions being piloted barely meet their needs, are complex to use, and require extensive heavy lifting, i.e., business experts from vendor teams to extract value from them. One senior executive at an oil and gas enterprise said: “If you give a Lamborghini to a 12-year-old, will he have a clue how to get high performance?” He expected a negative response.
For entirely too many years, oil and gas companies have possessed a virtual gold mine, acquired simply by conducting daily operations yet typically underutilized, undervalued, or not leveraged at all. That vital commodity is data, and its value is now being viewed in a new, “bankable” perspective through the power of big data analytics.
No matter which approach oil and gas management takes, the crux boils down to: “How do we apply a big data platform quickly to generate value, and enable users to find and analyze information for better decisions and insights, at a reasonable investment?” Management throughout the industry has a unique opportunity to realize quick wins and value from data platforms by focusing initially on workflow-based analytics. The workflow approach generates business value and the deeper customer intimacy that is key to adoption of any enterprise platform, whereas the optimization approach delivers business value only over time.
Amit Mehta is CEO of Houston-based Moblize (www.moblize.com). He holds a master’s degree in manufacturing and business management and a BS in mechanical engineering from Cambridge University.
Amit Mehta, Moblize
29 November 2016