February 2016
Columns

What's new in exploration

Industry’s supercomputing needs increase
Satinder Chopra / Contributing Editor

The search for oil and gas (O&G) has moved into more complex frontier areas, where geological risks are greater. The need to maximize production in a low-price environment requires more advanced geoscience technology applications.

The increased speed and capacity of modern supercomputers helps us in three ways: 1) to compute more accurate solutions to problems that were previously intractable; 2) to compute solutions in near-real time, thereby impacting business decisions while a well is being drilled; and 3) to compute solutions for multiple scenarios, rather than just the single best reservoir scenario, to better quantify the risk of a given prediction. While such endeavors require greater computing power, they also generate enormous quantities of data. Processing these larger data volumes pushes the limits of hardware speed, system memory and input/output operations per second. Consequently, the O&G industry and meteorology have been two of the leading consumers of supercomputing resources.

The supercomputers of the 1970s and 1980s were large, $20-million single machines. Today, supercomputing in the petroleum industry most commonly consists of the same hardware that sits on your desktop, repackaged into large clusters of thousands of processors. Such "high-performance computing" (HPC) achieves its speed by solving many problems in parallel, as the sketch below illustrates.
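As a rough, single-node illustration of this principle (a minimal sketch, not production code), each seismic shot gather can be processed independently and the partial results combined; real installations distribute this work across thousands of nodes with MPI or GPUs rather than one machine's worker pool. The shot count, grid size and `migrate_shot` kernel below are hypothetical stand-ins.

```python
# Toy illustration of cluster-style parallelism on a single node: each
# seismic shot gather is migrated independently ("embarrassingly parallel"),
# and the partial images are stacked at the end.
from multiprocessing import Pool

import numpy as np


def migrate_shot(shot_id: int) -> np.ndarray:
    """Hypothetical per-shot kernel; a real code would read the shot gather
    from disk and run an actual migration algorithm on it."""
    rng = np.random.default_rng(shot_id)
    return rng.standard_normal((200, 300))        # stand-in partial image


if __name__ == "__main__":
    shot_ids = range(1000)                        # one task per shot gather
    with Pool(processes=8) as pool:               # eight workers on this node
        partial_images = pool.map(migrate_shot, shot_ids)
    final_image = np.sum(partial_images, axis=0)  # stack the partial images
    print(final_image.shape)
```

Here are some of the more important compute-intensive petroleum applications.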

1. Efforts that, until recently, were too computationally intensive.

a. Seismic imaging, or “migration,” places seismic measurements made at the surface in their correct subsurface locations. It is one of the final steps in seismic data processing, and it produces a more accurate subsurface image. Simpler migration methods have limitations in the presence of complex, steeply dipping reflectors. The wave equation-based, reverse-time migration (RTM) technique was introduced in the late 1970s, but it was too computationally intensive for 3D surveys until recently. Thanks to HPC, RTM is now run routinely by many processing shops, and it is the method of choice for imaging below salt.
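To give a flavor of what RTM does, here is a heavily simplified, single-shot sketch in two dimensions: forward-propagate a source wavefield with a finite-difference acoustic solver, back-propagate the recorded traces in reverse time, and cross-correlate the two wavefields (the zero-lag imaging condition). Everything here is assumed for illustration: constant-density acoustics, a toy velocity model, no absorbing boundaries and no direct-wave removal, so artifacts are expected; production RTM codes are vastly more elaborate.

```python
import numpy as np

# Toy 2D acoustic RTM for one shot on a small, made-up grid.
nz, nx, nt = 100, 160, 600
dz = dx = 10.0                 # grid spacing, m
dt = 0.001                     # time step, s
vel = np.full((nz, nx), 2000.0)
vel[40:, :] = 2500.0           # one flat reflector at 400 m depth

t = np.arange(nt) * dt
f0 = 15.0                      # Ricker wavelet peak frequency, Hz
tau = t - 1.0 / f0
wavelet = (1 - 2 * (np.pi * f0 * tau) ** 2) * np.exp(-(np.pi * f0 * tau) ** 2)


def laplacian(p):
    lap = np.zeros_like(p)
    lap[1:-1, 1:-1] = ((p[2:, 1:-1] - 2 * p[1:-1, 1:-1] + p[:-2, 1:-1]) / dz ** 2
                       + (p[1:-1, 2:] - 2 * p[1:-1, 1:-1] + p[1:-1, :-2]) / dx ** 2)
    return lap


def propagate(injection, inject_at, record_at=None):
    """Second-order finite-difference loop; injects a source (or data) time
    series at inject_at and optionally records traces at record_at."""
    prev, cur = np.zeros((nz, nx)), np.zeros((nz, nx))
    snapshots, records = [], []
    for it in range(nt):
        nxt = 2 * cur - prev + (vel * dt) ** 2 * laplacian(cur)
        nxt[inject_at] += injection[it]
        prev, cur = cur, nxt
        snapshots.append(cur.copy())
        if record_at is not None:
            records.append(cur[record_at].copy())
    return snapshots, np.array(records)


src = (2, nx // 2)               # source near the surface
receivers = (2, slice(None))     # receivers along the surface

# 1) forward modeling of the "observed" data for this toy example
src_snaps, data = propagate(wavelet, src, receivers)
# 2) back-propagate the recorded data in reverse time
rcv_snaps, _ = propagate(data[::-1], receivers)
# 3) zero-lag cross-correlation imaging condition
image = np.zeros((nz, nx))
for it in range(nt):
    image += src_snaps[it] * rcv_snaps[nt - 1 - it]
print(image.shape)
```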

b. Seismic modeling appears in many guises across seismic data acquisition, processing, interpretation and reservoir characterization. The most familiar is synthetic seismogram generation from well log data, sketched below. Many O&G companies are collaborating to improve seismic modeling and imaging, wherein full 3D elastic models can be generated. Such sophisticated modeling exercises are practical only on HPC machines.
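For the simplest case, a synthetic seismogram from log data, a minimal sketch using the standard convolutional model is shown below; the sonic and density "logs," the sample interval and the 25-Hz Ricker wavelet are all assumptions for illustration.

```python
import numpy as np

# Convolutional-model synthetic seismogram from hypothetical well logs.
dt = 0.002                                     # sample interval, s
n = 500
velocity = np.full(n, 2200.0)                  # m/s, stand-in sonic log
velocity[200:] = 2800.0
density = np.full(n, 2.30)                     # g/cc, stand-in density log
density[200:] = 2.45

impedance = velocity * density                 # acoustic impedance
reflectivity = np.zeros(n)
reflectivity[1:] = (impedance[1:] - impedance[:-1]) / (impedance[1:] + impedance[:-1])

# Ricker wavelet with an assumed 25-Hz peak frequency.
t = np.arange(-0.064, 0.064, dt)
f0 = 25.0
wavelet = (1 - 2 * (np.pi * f0 * t) ** 2) * np.exp(-(np.pi * f0 * t) ** 2)

synthetic = np.convolve(reflectivity, wavelet, mode="same")
print(synthetic[195:210])                      # response around the log boundary
```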

c. Full waveform inversion (FWI) is modeling in a loop: the seismic response is forward-modeled, and the velocities are then perturbed until the modeled data approximate the data measured at the earth’s surface. The synthetic data are generated using the acquisition geometry of the field data. The forward modeling can account for elastic wavefield propagation, anisotropy, attenuation and multiples, and the minimization is carried out with non-linear optimization techniques. The resulting high-resolution velocity models serve as accurate input for depth imaging, which, in turn, provides better-defined subsurface images that are amenable to more meaningful interpretation.
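A toy, one-dimensional illustration of the "modeling in a loop" idea follows: two layer velocities are recovered by minimizing the least-squares misfit between a modeled trace and an "observed" trace with a generic non-linear optimizer. The forward model here is just a reflection spike convolved with a Ricker wavelet, not an elastic wave-equation engine, and every number is an assumption; it only conveys the structure of the workflow.

```python
import numpy as np
from scipy.optimize import minimize

# Toy "modeling in a loop": recover two layer velocities by minimizing the
# least-squares misfit between modeled and "observed" traces.
dt, n, f0, depth = 0.002, 400, 25.0, 400.0     # depth of the interface, m
tw = np.arange(-0.064, 0.064, dt)
wavelet = (1 - 2 * (np.pi * f0 * tw) ** 2) * np.exp(-(np.pi * f0 * tw) ** 2)


def forward(vels):
    v0, v1 = vels
    t_refl = 2.0 * depth / v0                  # two-way time to the interface
    r = (v1 - v0) / (v1 + v0)                  # reflection coefficient
    trace = np.zeros(n)
    i = int(t_refl / dt)
    frac = t_refl / dt - i
    trace[i] += r * (1.0 - frac)               # fractional placement keeps the
    trace[i + 1] += r * frac                   # misfit smooth in velocity
    return np.convolve(trace, wavelet, mode="same")


true_vels = np.array([2200.0, 2800.0])
observed = forward(true_vels)                  # stand-in for field data


def misfit(vels):
    residual = forward(vels) - observed
    return 0.5 * np.sum(residual ** 2)


# Start close enough to avoid cycle skipping, a well-known FWI pitfall.
start = np.array([2150.0, 2600.0])
result = minimize(misfit, start, method="Nelder-Mead")
print(result.x, true_vels)                     # recovered vs. true velocities
```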

2. Applications that need to run in near-real time.

Real-time applications, such as weather prediction and missile defense, have long been major users of supercomputing systems; predicting where a hurricane may hit or a missile may strike, after the fact, has little practical value. Very fast, real-time computing on a seismic acquisition ship was first used almost 20 years ago, allowing the acquisition company to go back and reshoot seismic lines that had problems. More recently, real-time transmission and high-speed processing of microseismic events generated during hydraulic fracturing allow the driller to recognize problems, estimate stimulated volumes, and adjust the completion process on-site.
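One building block of such real-time monitoring is event detection on the incoming data stream. The sketch below applies a classic short-term-average/long-term-average (STA/LTA) trigger to a synthetic microseismic trace; the sampling rate, window lengths and threshold are assumptions, and real systems use causal windows plus far more sophisticated detection and location workflows.

```python
import numpy as np

# STA/LTA event trigger on a synthetic continuous microseismic trace.
fs = 2000                                     # sampling rate, Hz (assumed)
rng = np.random.default_rng(0)
trace = rng.standard_normal(20 * fs) * 0.1    # 20 s of background noise
trace[15000:15200] += rng.standard_normal(200) * 2.0  # injected "event"

sta_len = int(0.05 * fs)                      # 50-ms short-term window
lta_len = int(1.0 * fs)                       # 1-s long-term window
energy = trace ** 2
# Centered moving averages for brevity; real triggers use trailing windows.
sta = np.convolve(energy, np.ones(sta_len) / sta_len, mode="same")
lta = np.convolve(energy, np.ones(lta_len) / lta_len, mode="same")
ratio = sta / (lta + 1e-12)

threshold = 5.0                               # trigger level (assumed)
triggers = np.where(ratio > threshold)[0]
if triggers.size:
    print(f"event detected at t = {triggers[0] / fs:.2f} s")
```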

3. Applications that are run multiple times to generate statistically valid estimates.

Stochastic reservoir simulation: Despite the best reservoir characterization exercises, there is always some inherent uncertainty when it comes to reservoir performance. Accurate reservoir models help operators to make informed operational decisions. The uncertainty can be quantified by simulating a number of scenarios based on parameter variation; the resulting models are then compared, and/or a mean of the realizations is derived, to identify the prediction with the lowest uncertainty. High-end machines have long been used for this purpose; today, these are HPC clusters.
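The Monte Carlo idea behind such scenario generation can be sketched in a few lines: sample uncertain parameters, run a model per realization, and summarize the spread. Here a trivial volumetric calculation stands in for the reservoir simulator, and all distributions are assumed; in practice each realization is a full simulation run, which is precisely why HPC is needed.

```python
import numpy as np

# Stochastic scenario generation: sample uncertain reservoir parameters,
# evaluate a stand-in model for each realization, and summarize the spread.
rng = np.random.default_rng(42)
n_real = 10_000

area = rng.normal(4.0e6, 0.5e6, n_real)        # drainage area, m^2 (assumed)
thickness = rng.normal(20.0, 3.0, n_real)       # net pay, m
porosity = rng.normal(0.18, 0.02, n_real)
sw = rng.normal(0.35, 0.05, n_real)             # water saturation
recovery = rng.uniform(0.25, 0.40, n_real)      # recovery factor

# Stand-in "simulation": recoverable oil volume per realization (m^3).
oip = area * thickness * porosity * (1.0 - sw)
recoverable = oip * recovery

# O&G convention: P90 is the conservative outcome exceeded with 90%
# probability, i.e. the 10th percentile of the sampled distribution.
p90, p50, p10 = np.percentile(recoverable, [10, 50, 90])
print(f"P90 {p90:.3e}  P50 {p50:.3e}  P10 {p10:.3e} m^3")
```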

Geostatistical inversion: This provides a quantitative way to integrate well data, seismic data, and the vertical and lateral property variability described by geostatistical tools called variograms. Using HPC, a large number of acoustic impedance model traces are generated at the location of each seismic trace by conditional simulation, minimizing an objective function between synthetic and real seismic traces. These high-resolution models can then be used to better quantify risk, providing P10, P50 and P90 estimates of volumetrics or, when coupled to stochastic reservoir simulation, of production.
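A much-simplified, single-trace sketch of the stochastic part of this workflow follows: many candidate impedance traces are drawn around a low-frequency trend, each is converted to a synthetic and scored against the observed trace, and the best-fitting realizations are kept and summarized. Real geostatistical inversion conditions the simulations with variograms and well data, which this toy skips; the trend, wavelet and perturbation scheme are all assumptions.

```python
import numpy as np

# Per-trace stochastic impedance search (toy): draw many impedance
# realizations, score each by synthetic-vs-observed misfit, keep the best.
rng = np.random.default_rng(1)
n, dt, f0 = 300, 0.002, 25.0
t = np.arange(-0.064, 0.064, dt)
wavelet = (1 - 2 * (np.pi * f0 * t) ** 2) * np.exp(-(np.pi * f0 * t) ** 2)


def synthetic(imp):
    refl = np.zeros_like(imp)
    refl[1:] = (imp[1:] - imp[:-1]) / (imp[1:] + imp[:-1])
    return np.convolve(refl, wavelet, mode="same")


# "Observed" trace for the toy, from a hidden true impedance profile.
true_imp = np.full(n, 5.0e6)
true_imp[150:] = 6.5e6
observed = synthetic(true_imp)

trend = np.full(n, 5.5e6)                       # low-frequency trend (assumed)
realizations = []
for _ in range(5000):
    # Unconditional "simulation": smooth random perturbation of the trend.
    perturb = np.convolve(rng.standard_normal(n), np.ones(15) / 15, mode="same")
    candidate = trend * (1.0 + 0.15 * perturb)
    err = np.sum((synthetic(candidate) - observed) ** 2)
    realizations.append((err, candidate))

realizations.sort(key=lambda pair: pair[0])
top = np.array([c for _, c in realizations[:100]])  # 100 best-fitting traces
p10, p50, p90 = np.percentile(top, [10, 50, 90], axis=0)
print(p50[140:160])                             # median impedance near the boundary
```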

To address the challenges faced by our industry, in terms of advanced technology applications and enhanced drilling success, supercomputing needs have grown, and will continue to grow. The fastest of the world’s supercomputers currently has a rating of 33.86 petaflops. Future supercomputers will probably have exaflop capability (1 exaflop = 1,000 petaflops), which will significantly reduce the time required for data processing.

About the Author
Satinder Chopra
Contributing Editor
Satinder Chopra works as chief geophysicist, Reservoir, at Arcis Seismic Solutions, TGS, in Calgary, and specializes in processing, interactive interpretation of seismic data and reservoir characterization. He was the 2010/11 CSEG Distinguished Lecturer, the 2011/12 AAPG/SEG Distinguished Lecturer and the 2014/15 EAGE e-Distinguished Lecturer. He is a well-published author and has won several awards.