Data & Analytics

Moving to the Edge Is Crucial for Oil and Gas Companies To Make Better Use of Data

The oil and gas industry already lives on the edge when it comes to the remote and often inhospitable locations in which it operates, but now it is also moving its computing to the edge to gain valuable business insights that can increase operational efficiency and profitability.

Downtime is anathema for any process industry: costly and disruptive everywhere, but particularly expensive for oil and gas companies. According to a Massachusetts Institute of Technology (MIT) Sloan study, a single day of downtime at a liquefied natural gas (LNG) facility can cost $25 million, and a typical midsize LNG facility goes down about five times a year.

It is well documented that oil and gas facilities, both upstream and downstream, generate vast volumes of data. A report from Cisco estimates that a typical oil platform generates up to 2 TB of data every day, creating enormous challenges for communications, storage, and analysis. One solution would be to collect less data, but, with the continued proliferation of sensors, there is no sign of the flow of data shrinking; quite the opposite.

“While technologies such as cloud computing and hybrid storage have been touted as solutions, these still rely on data being transmitted, and, with many offshore facilities working on satellite communications at a speed of around 2 Mbps, that is still not practical,” said Jane Ren, chief executive officer and founder of Atomiton. “The obvious solution would be to deal with that data on site, as close as possible to where it is generated, not just handled but analyzed and used to deliver actionable business information. That is why edge computing is rapidly becoming a crucial tool in the industrial Internet of Things (IIoT) toolbox.”
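
Ren's point about satellite bandwidth is easy to verify with back-of-the-envelope arithmetic. The short Python sketch below uses only the figures already quoted in this article; the variable names and the calculation itself are illustrative:

```python
# Back-of-the-envelope check using the figures quoted above:
# ~2 TB of data per platform per day, a ~2 Mbps satellite link.

DATA_PER_DAY_TB = 2.0      # Cisco's estimate for a typical platform
LINK_MBPS = 2.0            # offshore satellite speed cited by Ren

bits_per_day = DATA_PER_DAY_TB * 1e12 * 8        # terabytes -> bits
transfer_seconds = bits_per_day / (LINK_MBPS * 1e6)

print(f"{transfer_seconds / 86400:.0f} days to transmit one day's data")
# -> ~93 days: the backlog grows far faster than the link can drain it
```

In other words, each day of operation would create roughly three months' worth of transmission backlog, which is why processing the data where it is produced is the only workable option.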

At this year’s Mobile World Congress, Jonathan Carpenter, Petrofac’s head of strategy, spoke about his company’s struggle for uptime in its North Sea operations. He quoted average uptime in the North Sea as 73% and compared this with the aviation sector, which achieves 99.9%. He asserted that this low figure had traditionally been acceptable to the oil and gas industry because of high oil prices. His answer is what he calls Petrolytics, which uses predictive analytics on data collected and processed by edge devices.
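
Petrofac has not published the internals of Petrolytics, but the basic shape of edge-side predictive analytics can be sketched simply: watch each sensor stream locally and flag readings that drift sharply from recent history. The rolling z-score approach, window size, and threshold below are illustrative assumptions, not Petrofac's method:

```python
import numpy as np

def rolling_zscore_alerts(readings, window=60, threshold=3.0):
    """Flag readings that deviate sharply from their recent history.

    A toy stand-in for edge-side predictive analytics; production
    systems use far richer models and process data as a stream.
    """
    readings = np.asarray(readings, dtype=float)
    alerts = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mu, sigma = recent.mean(), recent.std()
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            alerts.append(i)   # candidate fault worth escalating
    return alerts
```

Running such checks on the platform itself means that only the alerts, not the raw stream, need to cross the satellite link.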

As Ren explains, edge computing is not a new phenomenon, but the maturation of several key technologies has made it much more viable in the current climate. “Firstly, there is the diminishing cost of computing power and sensors, which reduces the cost of most IIoT applications,” she said. “Then there is the increasing amount of data, both from within the process and externally, such as weather or commodity/energy pricing. And there is the smaller footprint of computing devices such as microcontroller units (MCUs) and single-board computers or 'system on a chip.'

“But possibly the most significant driver is the growth of advanced machine learning and analytics capabilities, which makes computing on the edge a very valuable process. MCUs are tiny, very low-cost computational devices, often found at the heart of IoT edge devices. With 15 billion MCUs shipped a year, these chips are everywhere. Their low energy consumption means they can run for months on coin-cell batteries and require no heatsinks, and their simplicity helps to reduce the overall cost of the system.”
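
A concrete way to picture what such an edge device does: sample locally at high frequency, then transmit only compact summaries. The sketch below is a hypothetical illustration of that pattern; the function name and the 1 Hz/one-minute figures are assumptions, and real MCU firmware would typically be written in C rather than Python:

```python
import statistics
import time

def summarize_batch(samples):
    """Reduce a batch of raw sensor samples to a compact summary.

    Sending one summary per minute instead of every 1 Hz reading
    cuts the transmitted volume for this channel by roughly 60x.
    """
    return {
        "t": time.time(),
        "n": len(samples),
        "min": min(samples),
        "max": max(samples),
        "mean": statistics.fmean(samples),
    }

# Hypothetical pressure readings collected over one sampling interval
batch = [101.2, 101.4, 100.9, 101.1]
print(summarize_batch(batch))
```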
