BP supercomputing center to aid safer exploration drilling, enable better well placement

Posted on 30 October 2013

By Linda Hsieh, managing editor

James Dupree (center), BP COO for reservoir development and technology, inaugurates the new Center for High-Performance Computing in Houston at a ribbon-cutting ceremony on 22 October. The facility began research service on 9 October.

BP has opened a new Center for High-Performance Computing (CHPC) that the company says houses the largest commercial supercomputer for research in America. “It’s going to take seismic exploration to the next level,” James Dupree, BP COO for reservoir development and technology, said during a launch event in Houston on 22 October. The center will be a hub for processing and managing large amounts of geophysical data from BP’s global operations. “It’s going to help our teams be much more efficient about how we drill exploration wells so that we can see what we’re drilling. It’s going to make it safer,” he said. Well placement will also be improved to maximize recovery.

The CHPC will be able to process information at up to 2.2 petaflops, which equals more than 2,000 trillion calculations per second. The system also has a total memory of 1,000 terabytes and disc space of 23.5 petabytes, equivalent to approximately 40,000 average laptops. “Imaging jobs that would’ve taken four months to complete just 12 years ago can now be done in a matter of days,” Jackie Mutschler, head of upstream technology for BP, said. “Having that kind of power at our fingertips has allowed us to expand our big data research and development beyond seismic imaging.”
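
As a rough check, those figures hang together. The sketch below uses only the numbers quoted in the article; the implied per-laptop storage is backed out of the 40,000-laptop comparison and is an assumption, not an independently sourced figure.

```python
# Sanity check of the CHPC specs quoted above, using only the article's numbers.

PEAK_PETAFLOPS = 2.2   # peak compute
DISC_PB = 23.5         # disc space, petabytes
LAPTOPS = 40_000       # "approximately 40,000 average laptops"

# 1 petaflop = 1,000 trillion floating-point operations per second
print(f"{PEAK_PETAFLOPS * 1_000:,.0f} trillion calculations per second")  # 2,200

# Implied storage of one 2013-era "average laptop" (assumption backed out
# of the comparison): convert petabytes to gigabytes, then divide.
per_laptop_gb = DISC_PB * 1e6 / LAPTOPS
print(f"~{per_laptop_gb:,.0f} GB per laptop")  # ~588 GB
```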

By being able to analyze extremely large volumes of data, the center will be able to, for example, calculate petrophysical rock properties and model fluid flow directly from high-resolution micro-CT 3D rock images, she said. “As you might expect, the more ways in which we use the CHPC, the more horsepower we want.”
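
For a sense of what one such calculation looks like, here is a minimal sketch of estimating porosity from a segmented 3D micro-CT image. The synthetic volume and the global threshold are illustrative stand-ins, not BP's data or workflow.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a micro-CT scan: a 256^3 grid of X-ray attenuation values.
# (A real scan would be loaded from the imaging system, not simulated.)
volume = rng.normal(loc=100.0, scale=25.0, size=(256, 256, 256))

# Segment pore space from rock matrix with a simple global threshold;
# production digital-rock workflows use far more careful segmentation.
PORE_THRESHOLD = 70.0  # illustrative value
pore_mask = volume < PORE_THRESHOLD

# Porosity = pore voxels / total voxels
porosity = pore_mask.mean()
print(f"Estimated porosity: {porosity:.1%}")
```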

Networking and storage have been separated from the computing side at the new BP CHPC. Keith Gray, HPC manager for BP, said this allows for more redundancy in power and cooling and a more typical computer room temperature.

BP’s first high-performance computing (HPC) facility, also in Houston, is limited by power, cooling and floor loading capabilities, HPC manager Keith Gray said. The new CHPC currently has approximately 2.5 megawatts of peak power draw, and the building can scale to almost 7.5 megawatts, he explained. Once the computers from the first center are moved into the new CHPC, there will be close to 6,000 computers, each with 16-20 processors, in the facility. “In total there are over 100,000 cores, CPUs, that are operating in parallel doing work for us,” he added.
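
Those machine and core counts are broadly consistent, as a quick check using only the article's figures shows:

```python
# Quick arithmetic check on the quoted machine and core counts.
machines = 6_000
cores_low, cores_high = 16, 20  # processors per machine, as quoted

print(f"{machines * cores_low:,} to {machines * cores_high:,} cores")
# -> 96,000 to 120,000 cores, in line with "over 100,000 cores"
```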

HPC has come a long way to get to where it is today. BP has been using supercomputers to process seismic images for decades, but “what we considered super back then and what we consider super today are very different,” Ms Mutschler said. She explained that the VAX 11/780 was considered state of the art back in the 1970s to early ‘80s. “It could perform about 1 million floating point operations per second and had a few megabytes of memory. Imaging a single two-dimensional seismic line using the simplest method could take at least 15 minutes, and employing complex algorithms similar to the ones we use today wasn’t even remotely practical.”

By the mid-’80s, the Cray had emerged as the next generation of HPC but was prohibitively expensive and limited by memory and input/output capability. Because machine time was a scarce resource, Ms Mutschler continued, processing seismic data to support operational decisions was prioritized over research applications.

When parallel computers emerged, they helped BP begin making use of 3D seismic data. “These could do 10 billion operations per second, a 10,000-factor improvement over the VAX of just a decade earlier, but we still couldn’t do the same things in 3D that we could do in 2D. We needed still more power,” she said.
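
The generational leaps described here and below can be lined up from the article's own figures. The approximate dates are assumptions inferred from the narrative; the rates are as quoted.

```python
# Speedup timeline built from the figures quoted in the article.
# Dates marked "~" are inferred from the narrative, not stated exactly.
FLOPS = {
    "VAX 11/780 (~1980)": 1e6,          # ~1 million operations/sec
    "parallel machines (~1990)": 1e10,  # 10 billion operations/sec
    "BP HPC center (2001)": 1e12,       # 1 teraflop
    "CHPC (2013)": 2.2e15,              # 2.2 petaflops
}

baseline = FLOPS["VAX 11/780 (~1980)"]
for name, rate in FLOPS.items():
    print(f"{name}: {rate / baseline:,.0f}x the VAX")
# The parallel-machine line reproduces the quoted "10,000-factor improvement".
```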

John Etgen, BP distinguished advisor for seismic imaging, said advances in HPC have enabled increasingly better subsurface “vision” and development of fields like Mad Dog.

The HPC center soon began to take shape, and by 2001 BP had achieved teraflop capability, or 1 trillion calculations per second. “It was around this time that we were approached by NASA to benchmark their capabilities. I think they expected they would dwarf us. As it turned out, our Houston center had more memory – just memory, not disc, not file system space, just memory – than all of the data space they had, memory and disc combined,” she continued. “They didn’t ask us to fill out the rest of the survey.”

As sophisticated as the current CHPC is, BP recognizes that it must continue to invest in computing power if it wants to efficiently develop deep and complex fields like the Gulf of Mexico’s Mad Dog and beyond. John Etgen, BP distinguished advisor for seismic imaging, said he expects the need for ever-greater HPC to continue indefinitely. “The demand is always there to improve (the images) better, allow us to operate more efficiently, allow us to find new resources. We’re always pushing the limits of what we can do.”
