April 2020
Features

Seismic toolbox reaches deep into data to identify hidden intricacies

New algorithms from an innovative seismic toolbox enable a different approach to stratigraphic and structural event studies, combining large datasets to identify hidden trends and subtle changes around geologic features.
Luis Vernengo / Pan American Energy Group
Eduardo Trinchero / Pan American Energy Group

The rule of thumb for geophysical interpreters using onshore 3D seismic is that data can be reprocessed three times, once every four years; after that, roughly 16 years into the cycle, it is time to record new data. This cycle takes advantage of the continuing evolution of processing algorithms, which steadily improves imaging and results. By the end of it, history indicates that recording technologies will have advanced enough to make acquiring new 3D data profitable.

To set a baseline, assign regular 3D onshore seismic an arbitrary cost of 100 monetary units. Pre-stack depth migration (PSDM) processing would be approximately 6% of that value, with special processing, such as simultaneous inversion, accounting for roughly another 4% of the recording cost. Reprocessing is therefore relatively inexpensive: applying, for example, tomographic static corrections and new migration processes can have a large impact on results for a small fraction of the acquisition cost.
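To make that arithmetic concrete, a minimal Python sketch follows, using the article's arbitrary baseline and approximate percentages (all figures illustrative):

```python
# Minimal sketch of the cost arithmetic above, using the article's
# arbitrary baseline of 100 monetary units for new 3D onshore seismic.
ACQUISITION = 100.0                    # new 3D acquisition (baseline)
psdm = 0.06 * ACQUISITION              # PSDM processing, ~6% of baseline
special = 0.04 * ACQUISITION           # e.g. simultaneous inversion, ~4%

# Three reprocessing cycles (one every four years) before re-recording:
reprocessing_total = 3 * (psdm + special)
print(f"one processing pass:     {psdm + special:.0f} units")
print(f"three reprocess cycles:  {reprocessing_total:.0f} units "
      f"vs. {ACQUISITION:.0f} units for new acquisition")
```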

Beyond that, interpreters have a technology-based toolbox (regardless of the software they use) that is capable of calculating many attributes and processes to ease the interpretation sequence. The quantitative analysis of seismic information also requires extremely careful processing, in addition to reliable, high-quality data. It requires, too, that geophysicists make assumptions about the nature of geological events and earth elements that are not fully evident.

COST VS REWARD

The challenge that all asset managers must address is when, and how much, to invest in new seismic data to gain a more detailed image and/or better knowledge of their reservoirs. It has become common to attempt to develop, or better define, reservoirs using legacy seismic data of insufficient quality to produce a robust geologic model. If the quality and parameters of the available data are not adequate to reasonably resolve the reservoir’s architecture, it may not be feasible to gain an advantage by applying innovative processing algorithms. New data may be required, particularly in the presence of subtle traps or in areas of high tectonic complexity, which demand better horizontal and vertical resolution.

An example of the improvement that can be obtained by combining new-generation algorithms with new recording arises when the seismic data document large intrusive bodies that degrade information quality. These bodies are a natural barrier to seismic frequencies and amplitudes. To achieve a consistent, coherent seismic image, it is necessary to focus technical effort on acquisition designs that use innovative criteria, and on special algorithms for data processing.

The presence of these bodies determines the choice of parameters, particularly those focused on increasing horizontal continuity. Outside the intrusive zone, the information is generally of good quality, with regular bandwidth similar to other areas. But in the critical zone, the frequency range is very poor. If bandwidth is increased by post-processing below the intrusive, the information takes on a noisy appearance and loses continuity. To optimize the choice of parameters, it was necessary to perform multiple tests at each step of the work sequence. Picking first arrivals presented no difficulty outside the intrusive zone, but the task was far more demanding within it. The tomographic static solution must be selected with great care to obtain good results in both long- and short-period statics.

5D PROCESS

The implementation of the 5D interpolation process in the common-offset-vector domain gave the pre-stack migration geometry homogeneous subsurface coverage, prior to the pre-stack migration in the offset planes. In this case, a multi-dimensional (5D) regularization and interpolation method was used, based on an extension of the anti-leakage fast Fourier transform (FFT) algorithm, in which the data are processed in overlapping space-time blocks. Each temporal sample is transformed to the spatial frequency domain through the FFT; the inverse transform then reconstructs the energy at the desired locations, using all the information available in the four spatial dimensions simultaneously.
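To make the idea concrete, below is a deliberately simplified, one-spatial-dimension Python sketch of anti-leakage Fourier reconstruction. The production algorithm works per temporal frequency over four spatial dimensions in overlapping blocks; everything here (function name, iteration count, wavenumber range) is an assumption for illustration:

```python
import numpy as np

def antileakage_fourier_interp(x_irreg, d_irreg, x_out, k_max=10, n_iter=50):
    """1D illustration of anti-leakage Fourier reconstruction:
    iteratively estimate the strongest wavenumber component from the
    irregularly sampled data, subtract it from the residual (to limit
    spectral leakage), and accumulate it into the Fourier model."""
    ks = np.arange(-k_max, k_max + 1)           # candidate wavenumbers
    coeffs = np.zeros(len(ks), dtype=complex)   # Fourier-domain model
    resid = d_irreg.astype(complex).copy()
    for _ in range(n_iter):
        # Nonuniform DFT of the residual at each candidate wavenumber.
        spec = np.array([np.mean(resid * np.exp(-2j * np.pi * k * x_irreg))
                         for k in ks])
        j = np.argmax(np.abs(spec))             # strongest remaining component
        coeffs[j] += spec[j]
        resid -= spec[j] * np.exp(2j * np.pi * ks[j] * x_irreg)
    # Evaluate the Fourier model on the desired regular output grid.
    return sum(c * np.exp(2j * np.pi * k * x_out)
               for c, k in zip(coeffs, ks)).real

# Irregular samples of a smooth signal, reconstructed on a regular grid:
x = np.sort(np.random.rand(40))
d = np.sin(2 * np.pi * 3 * x)
d_regular = antileakage_fourier_interp(x, d, np.linspace(0, 1, 101))
```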

Fig. 1. An east-west seismic section shows the impact of a shallow intrusive body on the seismic data. (a) This section had little chance of being interpreted consistently in the area below the event. (b) After new recording, in the zone of silence the image was improved considerably. This enabled interpretation of the main reservoirs and the location of new drilling prospects.

The regular disposition of the input data yields a good-quality migrated product, with well-positioned events and a better structural image. Comparing images from the old process in the Golfo San Jorge basin, Argentina (Fig. 1a), to the new acquisition and processing technique (Fig. 1b), the improvement in information quality is easy to observe. The improved visualization is made possible by better spatial sampling and a higher density of traces per square kilometer, as well as by a truck-mounted vibroseis sweep with better low- and high-frequency content than that used previously.

It is important for the rules to be adaptable across all areas and stages of development. Workflows should be designed to obtain the best results in the least amount of time, applying techniques appropriate to the problem to be solved. Recording, processing and interpreting seismic data follow established sequences that can be adapted to each situation.

The power of visualization. Characterizing an earth model to understand the reservoir requires geoscientists to identify the mechanisms that enable the next step: visualizing the geometries and shapes of geo-bodies in their lateral and vertical arrangement.

Studies of stratigraphic and structural events, from subtle traps to structures of varying complexity and magnitude, require specific strategies and new sequences in the interpreter’s tasks. These strategies must be supported by algorithms from the seismic toolbox that generate different approaches to the data.

An interpreter can visualize the data in different ways. The seismic volume is formed by traces in vertical time or depth. Amplitude and phase define the representation, and if the two characteristics can be separated, each will show what it represents. Amplitude attributes, of which there are many tens, can indicate hydrocarbons directly, or reservoir lithology. Phase and frequency attributes, apart from amplitude, point the way to visualizing other characteristics of the rocks, such as porosity and the architecture of the reservoir.
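The amplitude/phase separation described here is commonly computed from the analytic signal; a minimal sketch using SciPy's Hilbert transform, on a toy trace rather than real data, would be:

```python
import numpy as np
from scipy.signal import hilbert

dt = 0.004                                     # 4-ms sampling (assumed)
t = np.arange(0, 1.0, dt)
trace = np.sin(2 * np.pi * 30 * t) * np.exp(-3 * t)   # toy seismic trace

analytic = hilbert(trace)                      # trace + i * (Hilbert transform)
envelope = np.abs(analytic)                    # instantaneous amplitude
phase = np.unwrap(np.angle(analytic))          # instantaneous phase (radians)
inst_freq = np.diff(phase) / (2 * np.pi * dt)  # instantaneous frequency (Hz)
```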

Different approaches developed by the interpreter, or combinations of them, facilitate the search for new ways to view the information that the seismic data contain. This analysis must be incorporated into the interpreter’s routine. In general, attributes are valuable both in their association with the continuity of reflectors and in the visual rendering of geo-body geometry.

It is useful for the interpreter to work on the desktop with raw data. After seismic processing or reprocessing, one of the mandatory deliverables from the processing company is a copy of the final data without filters and gain. Every amplitude attribute must be extracted from that raw cube, although the interpretation itself can be performed on gain- and filter-applied data. Seismic interpreters must use their virtual toolbox to select the attributes and algorithms most consistent with the geological environment in which they are working.

Preparing data for different needs. For example, when an intrusive body is degrading the signal with noise, the interpreter could run a sequence with filters emphasizing the low frequencies. Visualizing the data through special attributes then provides a visual sense that strengthens the analysis and understanding of the lateral and vertical geometric relationships between the different events that make up the geological column of the study area’s subsurface. To meet these specific challenges, novel types of interactive visualization systems and techniques are required to reflect the state of special processing results.
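As one possible realization of such a low-frequency-emphasis sequence, the sketch below applies a zero-phase Butterworth low-pass; the 15-Hz corner frequency is an assumed value that, in practice, would come from spectral analysis of the noisy zone:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def low_emphasis(trace, dt, f_corner=15.0, order=4):
    """Zero-phase low-pass that emphasizes the low frequencies most
    likely to survive beneath an intrusive body (corner assumed)."""
    sos = butter(order, f_corner, btype="low", fs=1.0 / dt, output="sos")
    return sosfiltfilt(sos, trace)            # filtfilt -> no phase shift

# Example on a toy trace sampled at 2 ms:
filtered = low_emphasis(np.random.randn(1500), dt=0.002)
```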

This choice must be directed and focused on the target. For example, it is possible to identify the special attributes and processes linked to fluvial environments, and to characterize their geo-bodies through spectral decomposition. This is formally defined as a continuous time-frequency analysis of a seismic trace, yielding a frequency spectrum for each time sample.
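A minimal sketch of that definition, correlating the trace with complex Gabor atoms so that every time sample receives a spectrum, is given below; the window width and frequency list are illustrative assumptions:

```python
import numpy as np

def spectral_decomposition(trace, dt, freqs, sigma=0.02):
    """Toy time-frequency analysis: for each frequency, correlate the
    trace with a complex Gabor atom (Gaussian-windowed sinusoid), so
    each time sample gets a spectral magnitude. sigma (s) sets the
    window width and trades time resolution against frequency resolution."""
    n = len(trace)
    tfr = np.zeros((len(freqs), n))
    atom_t = np.arange(-4 * sigma, 4 * sigma + dt, dt)
    window = np.exp(-0.5 * (atom_t / sigma) ** 2)
    for i, f in enumerate(freqs):
        atom = window * np.exp(2j * np.pi * f * atom_t)
        tfr[i] = np.abs(np.convolve(trace, atom, mode="same")) * dt
    return tfr   # shape (n_freqs, n_samples)

# e.g. the frequencies used later in the Fig. 3 RGB display:
# tfr = spectral_decomposition(trace, dt=0.002, freqs=[26, 28, 56])
```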

Fig. 2. A seismic section showing the imprint of a wide channel near the top of Castillo formation (a). A 3D visualization of the channel zone was enhanced by extracting seismic character attributes and geo-body recognition tools (b).

This technique improves the definition of prospects significantly beyond the seismic resolution, enabling the interpreter to understand geological details that cannot be resolved in the time domain, Figs. 2 and 3. The results of high-resolution spectral decomposition provide images that often identify small geo-bodies and their geometry, beyond the limits of traditional seismic analysis. Where amplitudes in the time domain cannot separate thicknesses, layers of different thicknesses can be identified in the frequency domain, with independent amplitudes. Using these criteria, it can be stated that the classical definition of the resolution limit does not apply in the frequency domain.

Fig. 3. The high-resolution spectral decomposition provides important details of analyzed geo-bodies in events linked to a fluvial environment.

Figure 3 shows a high-resolution spectral decomposition horizon with an RGB composite display of the 26-Hz, 28-Hz and 56-Hz frequencies (red circle = productive well). Panels (a) and (b) show high-sinuosity geo-bodies; (b) is a detailed depiction of (a), in which the overflow lobes, the central channel, a lower sinuous channel and an abandoned meander are visible and recognizable. Panel (c) is an isometric view of the channel area and the associated seismic horizon, next to a seismic section that intersects the channel area. Panel (d) is a horizon slice of the coherence attribute, which shows a complex network of meandriform channels in the middle of the Castillo formation, Golfo San Jorge basin, Argentina.
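The RGB composite itself is simple to reproduce in outline. The sketch below maps three spectral-decomposition horizon slices to the red, green and blue channels and normalizes each channel independently; random arrays stand in for the real 26-, 28- and 56-Hz slices:

```python
import numpy as np
import matplotlib.pyplot as plt

def rgb_composite(slice_r, slice_g, slice_b):
    """Blend three horizon slices (2D magnitude maps) into one RGB
    image, normalizing each channel so no single band dominates."""
    rgb = np.stack([slice_r, slice_g, slice_b], axis=-1).astype(float)
    for c in range(3):
        ch = rgb[..., c]
        rgb[..., c] = (ch - ch.min()) / (ch.max() - ch.min() + 1e-12)
    return rgb

# Toy stand-ins for the 26-, 28- and 56-Hz horizon slices:
r, g, b = (np.random.rand(200, 200) for _ in range(3))
plt.imshow(rgb_composite(r, g, b))
plt.title("RGB composite: 26 Hz / 28 Hz / 56 Hz")
plt.show()
```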

The architectural seismic elements of the subsurface structural geometry must be linked to a reliable static model, and so must the variations and behavior of the reservoirs, including the distribution of fluids and matrix. The seismic character of structural and stratigraphic discontinuities almost always involves lateral variations in waveform, dip, amplitude and other seismic characteristics.

The persistence of these patterns, detected by various processes and innovative attribute calculations, allows the delineation of areas that may have been subject, for example, to varying degrees of tectonic stress, diagenesis and faulting. These features contribute to the construction of a static model of the reservoir, and to the identification of options to consider in the design of future wells or hydraulic fractures.

When data are multiplied. The fundamental goal of Big Data methodology is to present, transform and convert all data into an efficient and effective visual representation, so that users from different disciplines can rapidly and intuitively understand and analyze the information in all its detail.

High-quality immersive visualization can enhance the understanding, interpretation and modeling of Big Data. The combination of advanced analytical methodologies, and a flexible visualization toolkit, will allow the industry to deliver efficient and effective insight, characterization and control of very complex heterogeneous systems that make up an oil and/or gas reservoir.

As a result, applying the new methodologies to metadata, or to a large data set, transforms raw data into usable information, and ultimately into knowledge to quantify geological uncertainties in a complex, heterogeneous subsurface system. This minimizes risk in field development strategies and tactics. For example, during the pre-stack seismic processing sequence, attributes are generated at each common depth point. The critical information obtained from such seismic images is directional or azimuthal and offset-related, and the pre-stack classification of clusters of multiple traces creates a large amount of data. It is essential to adopt new methodologies, such as a Big Data and/or Big Analytics suite of strategies, that can hold a plethora of seismic attributes in memory and work through them with a suite of correlation and regression processes. The objective is to perform exploratory analytics on subsurface trends, recognizing and understanding hidden patterns and tendencies.
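In outline, such an in-memory correlation pass over stored attributes might look like the following; the column names and synthetic values are purely illustrative, not a real schema:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 100_000   # e.g. one row per CDP/azimuth/offset bin (illustrative)
attrs = pd.DataFrame({
    "near_amp":  rng.normal(size=n),           # near-offset amplitude
    "far_amp":   rng.normal(size=n),           # far-offset amplitude
    "azimuth":   rng.uniform(0, 180, size=n),  # source-receiver azimuth
    "coherence": rng.uniform(0, 1, size=n),    # continuity attribute
})
# Derived attribute: a simple AVO-gradient proxy from near/far amplitudes.
attrs["avo_gradient"] = attrs["far_amp"] - attrs["near_amp"]

corr = attrs.corr()      # all-pairs correlation matrix held in memory
print(corr.round(2))     # scan for unexpected attribute couplings
```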

Big Data technologies integrate common and disparate data sets, to deliver the right information, at the appropriate time, to the correct decision-maker. These capabilities help firms act on large volumes of data, transforming decision-making from reactive to proactive, and optimizing all phases of exploration, development and production. Furthermore, Big Data offers multiple opportunities to ensure safer and more responsible operations.

There is an opportunity to build a large repository of seismic data and its attributes. The repository can also include well information, along with production and completion data related to perforations, cores and the various tests performed in wells. By combining all these data in a system that can run realizations to find patterns, critical relationships can be identified between datasets that are very difficult to see with conventional techniques. Artificial intelligence (AI) algorithms can also be applied, learning from the results for application in offset fields, as well as in remote areas far from the initial work.

Fig. 4. An example of machine learning methodology and the construction of petrophysical properties with seismic information and well log data. The porosity in an extended zone of an important productive field and the surrounding prospective areas were mapped (a). A cross-plot derived from well logs of the “P” impedances and the “S” impedances in the context of the associated shale volume (b). This large amount of data required special treatment and a specific type of visualization.

The example shown illustrates how machine learning methodology provides holistic solutions for building data-driven models of petrophysical properties from well log and seismic data. The use of seismic data allows petrophysical and geomechanical properties to be modeled throughout the reservoir, enabling more informed decision-making. The workflow includes data QC, as well as bringing the seismic information and well data to the same resolution. Dimensionality reduction is then conducted via a correlation matrix and principal component analysis (PCA), finishing with normalization. Using this technique, a porosity cube was obtained while working on a fluvial reservoir, Fig. 4a.
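A hedged sketch of that workflow follows, with synthetic arrays standing in for the attribute/log pairs; the random-forest estimator is an assumption, since the article does not name the model used:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
X = rng.normal(size=(5000, 12))   # 12 seismic attributes at well locations
y = 0.25 - 0.03 * X[:, 0] + 0.01 * X[:, 3] \
    + rng.normal(0, 0.01, 5000)   # synthetic porosity "log"

model = make_pipeline(
    StandardScaler(),             # normalization step
    PCA(n_components=5),          # dimensionality reduction
    RandomForestRegressor(n_estimators=200, random_state=0),  # assumed model
)
model.fit(X, y)
# Applying model.predict() to the attributes of every trace and sample
# would yield a porosity cube analogous to Fig. 4a.
```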

A cross-plot of a large data set derived from well logs was used to analyze the interrelation between different reservoir properties, Fig. 4b. The “P” impedance data were plotted against the “S” impedance data and colored by shale volume. Presenting a massive quantity of data in this manner facilitates comprehension and analysis of the main tendencies and data associations.
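A minimal reproduction of that display style, with synthetic impedances in place of the well-log data set, might be:

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(2)
n = 50_000
vsh = rng.uniform(0, 1, n)                       # shale volume fraction
ip = 7000 + 3000 * vsh + rng.normal(0, 400, n)   # P impedance (toy units)
i_s = 4000 + 1500 * vsh + rng.normal(0, 300, n)  # S impedance (toy units)

sc = plt.scatter(ip, i_s, c=vsh, s=1, cmap="viridis", alpha=0.3)
plt.colorbar(sc, label="shale volume (Vsh)")
plt.xlabel("P impedance")
plt.ylabel("S impedance")
plt.show()
```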

SHALE RESERVOIRS

The new Big Data strategic toolbox also can be applied to the analysis of unconventional resources. It is especially useful when geoscientists are analyzing the large dataset associated with special-attribute processing. In this case, seismic geomechanics helps to characterize the reservoir, recognizing and identifying brittle and ductile zones through special attributes calculated with geometric and geomechanical algorithms.

Fig. 5. A consistency analysis between geometric and geomechanical attributes, calculated with Big Data analysis from seismic values, inside an 8-millisecond window in the most productive zone of the Vaca Muerta formation.

This information is then visualized and evaluated in detail. The large dataset plotted in Fig. 5 shows cross-plots that compare geometric attributes against geomechanical attributes in the basal sequence of the Vaca Muerta formation, Neuquén basin, Argentina. In all the panels, the dotted circles indicate the most brittle and ductile zones. In (a), edge detection is plotted against lambda*rho (Lamé’s first elastic coefficient times density). In (b), edge detection is plotted against mu*rho (Lamé’s second elastic coefficient, or shear modulus, times density). In (c), coherence is plotted against dip azimuth. In (d), dip azimuth is plotted against mu*rho. These attributes reinforce one another in the same direction, defining recognizable areas in the selected and analyzed seismic window.
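For reference, lambda*rho and mu*rho are routinely derived from inverted P and S impedances via the standard LMR relations (lambda*rho = Ip² − 2·Is² and mu*rho = Is², with Ip = rho·Vp and Is = rho·Vs); a short sketch with toy values:

```python
import numpy as np

def lmr_attributes(ip, i_s):
    """Lamé attributes from inverted impedances via the standard LMR
    relations: lambda*rho = Ip**2 - 2*Is**2 and mu*rho = Is**2."""
    mu_rho = i_s ** 2
    lambda_rho = ip ** 2 - 2.0 * i_s ** 2
    return lambda_rho, mu_rho

# Toy impedance values; in LMR space, brittle zones tend toward high
# mu*rho and low lambda*rho.
lr, mr = lmr_attributes(np.array([9000.0, 11000.0]),
                        np.array([4500.0, 6200.0]))
```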

Although most geoscientists are good at their jobs, this new process has involved a learning curve. In most cases, AI models must learn how an interpreter analyzes a seismic section through interactive interpretation programs. This enables the AI to understand how geoscientists make decisions. Different professionals take different paths, which complicates the training of AI systems, because geoscientists tend to reason in a non-linear manner.

VALUE ADDED

By using new applications, software and techniques, seismic interpreters can identify hidden trends and previously unseen geological intricacies. The new seismic toolbox enables geoscientists to work collaboratively in cross-functional teams, fully exploiting all available data to optimize borehole placement and enhance drilling operations.

The interpreter should select from the virtual toolbox an appropriate seismic algorithm that is directly sensitive to the geological features of interest. The selection should take into account the petrophysical properties of the reservoir, to optimize the definition of depositional environments and the structural context.

Properly using the toolbox during the interpretive problem/solution workflow will optimize planning of the analysis strategy and contribute substantially to the definition of new prospects and challenges.

CONCLUSION

Versions of this application, and its data manipulation tools, can be applied using existing data. Training of personnel is ongoing, and software capabilities continue to advance. The seismic toolbox enables interpreters to improve their models by selecting the appropriate algorithms to recognize seismic patterns associated with improved reflector continuity and the visual appearance of geo-body edges, in order to identify fault zones. It also helps in observing stratigraphic limits and borders, lateral variations, and their association with various seismofacies, without altering the basic seismic response of the geological features.

Applying these systematic methodologies to enhance the comprehension of new technologies will reduce risk and aid in formulating creative strategies that improve investment decisions and, ultimately, return on investment.

 

ACKNOWLEDGEMENT

The authors would like to thank Pan American Energy Group for permission to publish the seismic data shown in this article.

About the Authors
Luis Vernengo
Pan American Energy Group
Luis Vernengo is head of geophysics at Pan American Energy, Argentina, where he has worked for 24 of the 33 years of his career. He holds a degree in geophysics from the Facultad de Ciencias Astronómicas y Geofísicas of the University of La Plata (Argentina). Mr. Vernengo’s experience is mainly in seismic interpretation, attributes and seismic inversion. He has worked in basins around the world, primarily in Latin America, in various structural and stratigraphic geological scenarios.
Eduardo Trinchero
Pan American Energy Group
Eduardo Trinchero is a geophysical consultant with 37 years of experience in different aspects of seismic exploration and resource exploitation. He has a strong background in development geophysics, reservoir seismic interpretation, attributes, seismic stratigraphic interpretation, new seismic technology applications, and seismic acquisition and processing. He has worked in different geological contexts in basins around the world, and he served as operations country manager in Maracaibo, Venezuela, for three years and as country reservoir manager in Comodoro Rivadavia, southern Patagonia. He has published articles in various association publications, in World Oil, and at several congresses in Argentina, Brazil and the U.S. Mr. Trinchero received a postgraduate degree in geophysics from the Universidad de Buenos Aires and a postgraduate degree in seismic interpretation from the Universidad Nacional de Cuyo, Mendoza, Argentina.