Challenges and Solutions Surface at SPE Annual Meeting

The frontiers of deepwater and pre-salt reservoirs garnered much attention at the 2013 SPE Annual Technical Conference and Exhibition (ATCE) held in New Orleans in October.

More than 12,000 people attended SPE’s Annual Technical Conference and Exhibition this year in New Orleans.

The conference brought together 12,028 oil and gas professionals from around the world.

“This year’s ATCE ranked as the highest attended since 1999,” said Egbert Imomoh, 2013 SPE President. “The event capped a year of tremendous success for SPE with a growing global membership, increased technical content, and strong financial performance.”

The conference received 1,773 technical paper proposals this year, compared with slightly fewer than 1,500 a year ago. The 2013 program of 52 sessions included more than 300 technical papers and 67 ePoster presentations. With panel sessions, there were more than 400 presentations in total.

Opening Session

Discussion at the opening general session revolved around safety, technology, and the cost of deepwater oil and gas projects. A government official and four top industry executives made up the panel for the session titled “Deepwater Exploration and Development—Challenging the Limits.”

Lars Herbst, Gulf of Mexico (GOM) regional director for the United States Bureau of Safety and Environmental Enforcement (BSEE), described GOM deepwater drilling rig activity as very robust, having recovered from the post-Macondo drilling moratorium. Seventeen new deepwater rigs are expected to enter the GOM under long-term contracts by the first quarter of 2015.

“So, to begin, we are talking about opportunity for drilling. There is plenty of opportunity,” he said.

In GOM deepwater development, 10 projects have been sanctioned and 10 discoveries are in various stages of appraisal, Herbst said. The deepwater activity is “going to require not only new technology but increasingly qualified employees,” Herbst said. “We often talk about competency, but it is going to be capacity as well.”

The BSEE official discussed deepwater challenges and risks. “There is a difference,” Herbst said. “A risk is something that has the capability of causing harm or injury, and a challenge is something that hopefully motivates us to move the status quo.” An item can move from one category to the other, he noted.

The challenges Herbst cited are high-pressure/high-temperature technology, improving blowout preventers (BOPs), enhanced oil recovery, and dual-gradient drilling, as well as workforce issues.

For risks, Herbst noted recent failures in small components, such as shackles and bolts. While no large incident occurred, “a small component can cause a major failure,” he said. There have also been numerous dynamic-positioning failures in the past several years. And in well control since Macondo, there have been shallow-water incidents, some of which “translate to deep water,” he said.

John Gremp, chairman and chief executive officer of FMC Technologies, put the industry’s deepwater needs in perspective.

“Why is deep water so important? Because our industry is challenged to deliver 27 million new or incremental BOPD by 2020 to respond to increasing demand and declines in production. Two-thirds of that 27 million BOPD will come from offshore, and one-third of that, or 10 million BOPD, will come from deep water.”

Morrison (Moe) Plaisance, vice president of contracts and marketing at Diamond Offshore Drilling, gave a short history of offshore rig development from the 1950s to the present. The Mr. Charlie rig, built in New Orleans in 1953, had 36 beds and no heliport; workers went to the rig in boats. The rig, which cost USD 1 million to build, worked in a maximum water depth of 40 ft.

Plaisance compared it with a sixth-generation, harsh-environment semisubmersible being built today. The 180-bed rig costs USD 750 million to build and will work in 10,000 ft of water. A sixth-generation drillship with 220 beds costs USD 650 million to build and will work in 12,000 ft of water. Everything scales up with size on the new rigs, including factors such as mud storage capacity and BOP size.

“Ladies and gentlemen, I have been a roughneck for 43 years,” Plaisance said. “We can’t keep building bigger hammers. We are going to have to somehow slim these things down to get our job done.”

Richard Ward, president of completions at Baker Hughes, said that annual spending for deepwater projects in the GOM will grow to almost USD 20 billion per year by the end of the decade.

New Technologies

The costs and risks of exploration and production (E&P) operations are higher than ever and so is the need to put good technologies to use to continue meeting the world’s demand for hydrocarbons. A dinner program held by the SPE Research and Development Technical Session outlined some of the obstacles facing technological advancement in the oil and gas industry.

“We have more challenges than we did 10 or 15 years ago,” said Mario Ruscev, vice president and chief technology officer at Baker Hughes.

At SPE’s annual meeting, 2013 SPE President Egbert Imomoh, left, passed the gavel to Jeff Spath, right, who will be SPE’s president for 2014. 

He said the oil and gas industry must improve recovery rates from horizontal shale wells and extend the possibilities of ultradeepwater exploration. While impressed by the rate of technological advancement in North America over the past decade in regard to unconventional shale wells, Ruscev said more solutions are needed. Improved simulation and modeling could help oil companies better realize the potential of tight rock formations where the vast majority of the available hydrocarbons never reach the surface. The current lack of understanding about the inner workings of the source rock is “totally unsatisfying” to sustain long-term production growth, Ruscev said.

“The thing that strikes me,” he said of the shale boom in North America, “we drill, we complete, we frac, we produce, and then we produce a very small fraction of what we drill.”

Ruscev warned that focusing too much attention on the role of improving operational efficiency could overshadow the need to improve completion and stimulation technology.

“Some people just believe that if you reduce costs, in the end, you win,” he said, adding, “Even if you cut 30% out of (the well cost), it is still too expensive.”

The extremely high costs associated with deepwater exploration should reshape the way companies plan for the life of a subsea field. Either subsea well intervention costs must be reduced, Ruscev said, or engineers must design wells that will produce for decades without requiring intervention. To achieve this, devices placed downhole at the time of completion must be able to remain dormant for years before being activated to aid production. And after 20 years of production, the reservoir will have changed; the well will require sophisticated sensors to detect these changes and mechanical devices that respond by opening or closing valves for injection.

“The pipeline is absolutely chockablock full of great ideas. There is no shortage of new ideas in the oil field,” said Jim Sledzik, president of Energy Ventures USA, a Norway-based venture capital firm that invests in emerging oil and gas technology firms. Sledzik said the challenge for firms like his is to select which ideas best address the needs of oil and gas companies and can be efficiently developed.

The SPE Research and Development Technical Section discussed monitoring of declining wells and extreme wells at a topical luncheon. 

In the spirit of addressing this challenge, SPE’s Research and Development Technical Section is launching a pilot competition at ATCE 2014 to attract science and engineering professionals outside the industry to address the new technology needs of oil and gas.

“We have to accept that petroleum engineers don’t have all the good ideas,” said David Curry, technology fellow at Baker Hughes and SPE R&D committee chairperson. With that in mind, the global R&D competition will seek to “stimulate application of basic sciences to the E&P industry’s technical challenges” and “attract new ideas and new talent to the E&P industry.”

The competition seeks project submissions from professionals in high-level academia. The top three proposals, selected on the basis of the extent to which they bring new ideas from basic science and engineering, will receive cash awards ranging from USD 10,000 to USD 30,000.


Communication Hurdles

To meet the challenges of production, effective communication must be established between reservoir engineers and facilities engineers. The challenges in achieving this communication were addressed during a dinner and panel session, “Determining the Basis of Design for PFC Projects—Why Facilities Engineers Need to Think.”

The panel, moderated by Paul Jones, subsea manager at Chevron, presented observations and solutions for dealing with a reality faced by facilities engineers during project development.

“We need to learn to deal with uncertainty and probability, because that is all the reservoir engineers will give us,” Jones said.

While reservoir engineers focus on the uncertainties of reservoirs, facilities engineers are tasked with designing projects to produce first oil on time with the information provided by reservoir engineers. The devil is in the details, and from the facilities engineer’s perspective, the devilish details are held by the reservoir. The range of uncertainties and probabilities fails to provide the specific parameters that facilities engineers prefer for designing a project from its earliest stages through its life cycle. Ineffective communication between the disciplines often compounds the difficulties.

The panelists included Luigi Saputelli, president of Frontender; David Aron, managing director at Petroleum Development Consultants; Maurice Mullaly, project manager at WorleyParsons; and Mark Hollaar, manager of systems design at SBM Offshore Houston.

Change during project development is inherent. Saputelli said each phase of development is planned with the consideration of different scenarios. Each scenario is compared by the value it creates and the risks it represents. The contingencies are becoming more complex with the growth of more difficult projects, such as in deeper water, remote locations, and harsh environments. Decision making and gated processes, asset management, reservoir characterization, drilling and completions, production optimization, surface facilities, improved and enhanced oil recovery, and new technologies are evaluated—and form the basis for the changing needs in the design of the project.

Saputelli said that, although comprehensive gathering of information is important, “there must also be a balance between the investment in capturing information and the amount of information.”

Mullaly described the disconnect between the disciplines: “What facilities engineers would like is certainty. We want to know more than what you can tell us. What we need is understanding. A basis of design starts off as an idea and gets progressively more detailed as we move through the development process. We need critical information as soon as possible.”

He added that the link between the “what and why” is often missing in the communication between disciplines. What needs to be done is stated, but why the change is needed is not explained. A simple description of the reason for the change goes a long way in making the adaptation relevant to another discipline.

Aron suggested educational processes may offer solutions to improve communication between the disciplines. “In the UK, the oil and gas industry is the main employer of chemical engineers, yet many undergraduate chemical engineering courses ignore the discipline’s relevance to the oil and gas industry.” Wider educational integration across the engineering disciplines could improve understanding and communication based on increased familiarity and knowledge. Cross-training of a company’s new personnel would be beneficial, along with enhancing communication between disciplines.


Helping Machines Understand

Communication between engineers is not the only type of communication that sometimes presents challenges in the industry. As computers play an increasingly large role, getting them to understand the data can be difficult. Currie Boyle, distinguished engineer with IBM, gave a keynote talk about how information technology (IT) systems are being developed that can sift through data from a huge variety of sources and data types and extract meaningful content in a way similar to how the human brain operates.

He presented at the SPE Petroleum Data-Driven Analytics Technical Section annual dinner.

“Your industry has a lot more patience than the IT industry,” Boyle said, “which has an attention span of about an hour and a half.”

In asking, “Why do you care about natural language understanding?” Boyle pointed to all the unstructured information the oil and gas industry has in text form. “What if you had a system that could interpret and understand that content, automatically summarize it, and even synthesize it? How hard can natural language be?”

Attendees watch a presentation at this year’s ATCE. Including panel sessions, the 2013 conference was host to more than 400 presentations. 

“Data is an interesting thing,” Boyle said. “People say it is the answer to all the world’s problems.” However, he said, the data needs to be relevant and in an appropriate context.

Vague concepts, such as answers to “what” and “how” questions, are hard to understand. “If you want to understand a dialog with an unspecified question,” he said, “you have to understand the industry well enough to carry on that dialog to zero in on what you want.”

Boyle said that a deep natural language process would look at all kinds of references and would remember the discussion the next day, the way a human would, and would take the conversation forward. “It is a relationship with a computer that you build,” he said.

Digitization

The trend toward digitization of technology and data is not unique to the oil and gas industry, and oil and gas can learn from other industries’ experiences. Archie Deskus, vice president and chief information officer at Baker Hughes, drew comparisons between digitization in the aerospace and E&P industries in a keynote address at the SPE Digital Energy Technical Section dinner.

“There is a new generation coming in,” Deskus said, “new technology coming in. Within the last decade, connectivity has emerged and has had a huge impact. In China and India, for example, a whole population moved right to mobile phones. And 34% of the people in the world are connected to the Internet.”

Since the 1960s, technology cycles have typically lasted about 10 years. But, she pointed out, the introduction and proliferation of tablets took only 3 years.

Now, the global drivers for digitization, she said, tend to be “speed and smart.” Speed is acceleration in innovation, technology, adoption, and knowledge consumption. And customers, products, and services tend to be smarter, incorporating the use of sensors and innovative software.

With aerospace, integrating the supply chain “linked all the supply chain partners, with multitier planning and scheduling. Management and planning had to change,” Deskus said.

The next step was design collaboration, she said.

“The industry dramatically reduced design times. Modeling improved with enormous historical data. The whole development life cycle was shortened from 10 to 20 years down to 2 to 3 years.”

E&P’s cohesive supply chain is similar to other industries, Deskus said. Cross-disciplinary product collaboration, digital standards, and interoperability are vital issues, in spite of there being “a lot more unknowns in this industry,” she said.

An attendee pointed out, “The natural resource development cycle—it’s not ever going to get on the information technology cycle time.”

Deskus agreed. “I don’t think you’ll ever get to full deterministic data in E&P.”

“But follow the data,” she said. “When you start doing that, then the data starts telling you important things.”

Drilling Automation

One of the effects of the trend toward digitization is drilling automation. An update put on by the SPE Drilling Systems Automation Technical Section (DSATS) showed what is being done to create computerized controls able to adjust drilling operations on the basis of downhole data without human intervention.

A second-line jazz band plays on the exhibition floor at the 2013 ATCE in New Orleans. More than 500 exhibitors had booths at this year’s show.

“We are doing a lot this year,” said Hege Kverneland, chief technical officer for National Oilwell Varco. “We have several tests planned on land rigs with automated controls based on downhole data.”

Those systems are looking more like the end point envisioned when DSATS was created, but many improved pieces are still needed: communication systems able to deliver meaningful data economically and reliably, and changes in how drillers are hired and paid so that those who deliver greater efficiency and well quality are rewarded.

The effort presents both technical and organizational challenges. On the organizational side, a group has been created with support from SPE, the International Association of Drilling Contractors, and the Association for Unmanned Vehicle Systems International, the trade group representing makers and users of robotic vehicles. Its aim is to create a detailed, systematic plan to transform drilling by applying methods used to create pilotless planes and flexible robotic manufacturing systems.

“The industry needs a strategy because we are so fragmented,” said John deWardt, a consultant and namesake of his firm, who was one of three people who started the project to create a drilling systems automation roadmap. He saw it as a way to make the case that a systematic process can manage the risks of a sweeping change in how things are done.

“If you want step changes, a roadmap can identify what needs to be done to address risks. It will enable management and investors to see value beyond the risks,” deWardt said.

While deWardt would like to have the plan completed by the end of next year, it is a large, complex project that has been taken on by a relatively small group of volunteers.

Even when there is a roadmap, not everyone will be headed to the same destination. There is a divide between those working toward nearly autonomous drilling operations—the one-button approach—and those such as ExxonMobil, whose drilling technology budget is aimed at highlighting significant information and giving drillers the data they need to apply their training.

ExxonMobil would like systems that alert an operator to potentially telling changes in drilling conditions and advise how to adjust the controls to increase efficiency, said Paul Pastusek, drilling mechanics advisor for ExxonMobil Development.

And he said past technology changes in the oil business have been the product of many independent initiatives, with service companies often consolidating multiple changes into systems. He said he does not see it following a roadmap this time. “We live in a messy industry,” Pastusek said.

Monitoring Extreme Wells

Operators are asking the makers of downhole monitoring and control systems to deliver solutions that will help improve production in declining fields and enable the operation of some of the most challenging wells ever drilled. The topic was discussed by a panel of systems experts at an SPE Research and Development Technical Section luncheon. Among the pressing issues covered was the need to improve existing downhole sensors and fully develop the next generation of systems to withstand crushing pressures and ultrahigh temperatures.

“I think we have run out of superlatives for the next level, but, in fact, we are there,” said Tad Bostick, vice president of reservoir monitoring at Weatherford. “We are working at 500°F and being asked to build sensors that will survive 30,000 psi and even 35,000 psi at those temperatures.”

Some companies, he said, are now demanding permanent downhole sensors that can last for the entire life of the well, up to 40 years. The vast majority of sensors installed in wells today are simple temperature gauges, followed by flow gauges. But, Bostick said, as temperatures and pressures increase, optical sensors made of glass must be used instead of electronic sensors, which do not function as well under extreme conditions. Optical sensors also give operators a continuous string of data transmitted from throughout the well.

Optical sensor technology also comes with a high level of reliability. Bostick said that, out of the 17 million ft of optical fiber with 5 million sensor points that Weatherford has installed in the past 5 years, 97% of those installed systems are working and producing data today.

To take the sensor technology to the next level, Bostick said Weatherford has expanded its testing programs and introduced “what if” tests proposed by operators working in extreme downhole environments.

“We are taking sensors from 20,000 psi to 0 psi in 1 second, and we are taking groups of sensors through this whole program instead of just one,” he said.

Well Positioning

The risks that inaccurate position calculations can pose to wellbore placement, and best practices to avoid those risks, were discussed in the topical luncheon “Rolling the Dice on Your Well Position.”

Steve Mullin, manager of North America business development at Gyrodata, spoke on ensuring that the wellbore is “in the right place at the right time so that your production is going to be what you expect.”

“Why survey at all?” Mullin asked. He then gave a list of reasons: to ensure that a safe wellbore is drilled to its target; to ensure hitting the target; to avoid hitting another well; to provide accurate positioning to geologists and geophysicists; to provide good reserve estimates; to report data to regulators; to have the best positioning data if the well does not perform to expectation; to ensure efficient operation of equipment such as pumps placed in the wellbore; and to ensure the accurate placement of a relief well, in the rare case that it is needed.

“I am afraid that, all too often, people become focused on cost savings … saving that USD 10,000 by skipping on a gyro [gyroscopic survey] or not updating MWD [measurement-while-drilling] surveys as often as they should,” Mullin said.

The consequence could mean shutting in a neighboring producing well while drilling takes place nearby because accurate positions are not known. Or it could mean drilling a USD 1 million sidetrack because of a missed target. Missing the target and drilling a dry well could cost USD 10 million; a collision blowout could run into billions of dollars, with many other damaging consequences.

Inaccurate drilling of a reservoir involving multiple wells could cut recoverable reserve values far in excess of any survey cost savings, Mullin said.

“Downhole accuracy starts with the rig location,” Mullin said. “If the rig is not where you think it is, there is little chance that the end of that hole will be where you expect it to be.” Systems need to be in place to check all corrections that have been made to the data, and “a well-designed survey program that includes redundant data” will detect gross errors and correct them, often just as they occur, he said.

JPT Editors Robin Beckwith, Trent Jacobs, Joel Parshall, Stephen Rassenfoss, and Adam Wilson, and Oil and Gas Facilities Senior Editor Pam Boschee contributed to this report.