Boosting the power of Big Data for completions
The world runs on hydrocarbons, and big data is helping power their potential now and into the future. This now-ubiquitous term has marked a turning point for hydrocarbon production, unlocking opportunities in frontiers, such as unconventional plays, that were once inaccessible. Every day, operators capture and analyze huge amounts of real-time data from the wellsite to build geomodels and run sophisticated simulations, in the push to deliver wells that maximize estimated ultimate recovery. The key in this data-driven environment, however, is to augment traditional simulation and modeling techniques to ensure sound decision-making.
While the ability to effectively mine and analyze meaningful downhole data while drilling has helped revolutionize that side of the business, there is still room for improvement when it comes to completion optimization, which is more challenging for a variety of reasons. The data are plentiful, but methods for mining them efficiently, in ways that bring value, have remained elusive.
This is especially the case in unconventional basins, characterized by tight, low-permeability formations, heterogeneity and complex fluid-flow mechanisms. The sheer number of variables, the complexity and limitations of fracture modeling, and the expense of seismic analysis are among the many factors that inhibit development of models for fast, accurate and cost-effective completion design. Typically, workflows for unconventional well completion and production optimization require extensive earth modeling, plus fracture and production simulations, to determine a wide range of parameters, including optimal cluster spacing, proppant loading and well spacing for each well. This expensive process can take anywhere from a few weeks to several months.
Although operators recognize the importance of subsurface characterization, geomechanics analysis, hydraulic fracture simulation and numerical modeling of the reservoir response, less than 5% of the wells fractured in North America today are designed using advanced simulation. This is due to the level of data required, the large number of variables and long computing times.
The fast pace of development prior to the 2014 downturn, and subsequent market volatility and uncertainty, prompted the industry to seek innovative solutions for achieving an optimal hydraulic fracturing design and better predicting production for a given well. Such solutions can lower development costs, potentially improve recovery and maximize return on investment (ROI) in field development campaigns.
A new workflow, which combines earth sciences with data analytics, makes well completion optimization operational by building a predictive proxy model that facilitates sensitivity analysis on completion designs within a matter of hours, Fig. 1. Field-tested in the Eagle Ford play during 2015–2016, the workflow was developed using parallel fracture and reservoir simulations in the cloud, combined with data analytics and artificial intelligence (AI) algorithms.
CREATING A PROXY MODEL
The foundation of the workflow is machine learning, a branch of AI dating back to 1959, where machines learn and adapt through experience to automate analytical model building. The workflow applies a unique methodology of data science and machine learning for unconventional reservoir completion by developing a proxy model. This serves as an alternative to conventional, discrete numerical modeling. The approach effectively mimics the combination of a physical model and a simulation model.
As described in SPE paper 189790, “Need for speed: Data analytics coupled to reservoir characterization fast tracks well completion optimization,” the integrated methodology uses vast amounts of data to bridge traditional reservoir understanding with simulation methods, creating a predictive proxy model that allows operators to make fast, effective decisions for optimizing completions.
The proxy equation combines multiple data points, such as logs, geology, geomechanics, natural and hydraulic fractures, stress field simulation and reservoir simulation, into a single model. The more data points for simulation, the more comprehensive and reliable the model becomes to predict future performance. When the calibrated proxy model is placed in the context of uncertain completion parameters and variables, it generates hundreds of thousands of realizations for measuring the impact of change of one variable and predicting cumulative production.
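The idea of generating large numbers of realizations from a calibrated proxy to gauge one variable's impact can be sketched as follows. The proxy function and its coefficients below are invented for illustration; they are not the study's actual equation.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical proxy: a fitted response surface mapping completion
# variables to predicted cumulative production (illustrative only;
# the functional form and coefficients are assumptions, not from
# the Eagle Ford study).
def proxy_cum_production(stages, clusters, proppant_lb_per_ft):
    return (120.0 * np.log(stages)
            + 15.0 * clusters
            + 0.05 * proppant_lb_per_ft)

# Generate many realizations over the uncertain parameter ranges.
n = 100_000
stages = rng.integers(15, 31, size=n)          # 15..30 stages
clusters = rng.integers(4, 9, size=n)          # 4..8 clusters/stage
proppant = rng.uniform(1_000, 2_500, size=n)   # lb/ft of lateral

pred = proxy_cum_production(stages, clusters, proppant)

# One-at-a-time impact: perturb one variable while the others keep
# their sampled values, and look at the change in prediction.
delta = proxy_cum_production(stages + 1, clusters, proppant) - pred
print(f"mean predicted production: {pred.mean():.1f}")
print(f"mean effect of one extra stage: {delta.mean():.2f}")
```

Because the proxy is a closed-form equation rather than a full simulation, evaluating hundreds of thousands of such realizations takes seconds rather than weeks.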
By using advanced analytic tools to run a set of completion variables in multiple dimensions and calibrating the combinations in a single model, it is possible to create a roadmap of operating parameters from one well to the next. This eliminates the need to build numerical models for each well.
A predictive proxy model requires three sets of data: a geological earth model, a completion strategy and a fracture treatment schedule. The key objective is to translate a completion strategy directly and immediately into predicted performance, in a simple workflow that does not require fresh data acquisition, modeling and recalibration for each well.
Once developed, the proxy model can be used to predict production for multiple wells by changing and/or plugging in specific completion parameters. This includes such things as number of stages or pump rates, assuming the geology and petrophysical properties are similar to the original earth model. If the geology is significantly different, a new proxy model will likely need to be created.
MORE THAN 2,000 DATA POINTS
For the Eagle Ford field test, a predictive proxy model collected more than 2,000 data points from a single well, to determine completion designs for future wells in the area. First, a numerical 3D earth model for the existing wells covering the boundaries of the drainage area was created on the Petrel* E&P software platform, which integrates data using enhanced multi-disciplinary workflows, Fig. 2. The platform incorporates the Kinetix* shale reservoir-centric stimulation-to-production software, integrating geophysics, geology, petrophysics, completion engineering, reservoir engineering and geomechanics in a repeatable process to ensure data integrity, Fig. 3.
Additionally, the INTERSECT* high-resolution reservoir simulator, in the cloud, modeled the flow of fluids in the reservoir and calculated the resulting pressure and saturation changes to perform 3D static or 4D measurements. The simulator can be implemented for field development planning with improved accuracy and efficiency.
The earth model was based on a single well that had originally been fractured with 21 stages and six clusters per stage. The well’s TVD was 7,780 ft, and the lateral length was 5,850 ft. The hydraulic fracture geometries for the well were calibrated using microseismic data and pressure history matching.
Proppant mass per stage for the original well completion was about 400,000 lb, or 1,435 lb/ft along the lateral, with a pump rate of 90 bbl/min. Natural fractures were incorporated into the earth model as a discrete fracture network, using existing image log data for the area. A complex fracture model was used to simulate the hydraulic fractures, which were pressure-matched with the actual treatment data. An unstructured grid captured the footprint of the hydraulic fractures and proppant distribution, and served as the basis for the numerical simulation used to obtain the production history-match and production forecast. The reservoir was calibrated by altering properties, such as reservoir and fracture relative permeabilities, to match the observed production data. This calibrated model became the basis for creating the proxy model.
The analytics dataset was generated using eight completion parameters: number of stages; clusters per stage; fluid type; proppant type and size; proppant loading; pump rates; and flowing bottomhole pressure (BHP). The BHP, pump rate and schedules, and proppant/fluid ratio were sampled using a uniform distribution, while a discrete distribution method was applied to sample the number of stages and clusters per stage. Sixty different pump schedules were generated to capture the variability in proppant types and sizes, and in fluid types, which included slickwater, linear gel and crosslinked gelled fluids at different gel loadings. Proppant types included simple white sand, resin-coated sand, ceramic sand and resin-coated ceramic sand, with sizes ranging from 100-mesh to 12/20-mesh. Changing the proppant/fluid ratio for each pump schedule provided further variation.
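The sampling scheme described above, uniform distributions for the continuous parameters and discrete sampling for stage and cluster counts, can be sketched as below. The parameter bounds and case count are assumptions for illustration, not the study's actual values.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n_cases = 2000  # roughly the number of experiments run in the study

# Illustrative experimental design; the ranges below are assumed,
# not taken from the Eagle Ford dataset.
design = pd.DataFrame({
    # Continuous parameters: sampled from uniform distributions.
    "flowing_bhp_psi":      rng.uniform(1_500, 3_500, n_cases),
    "pump_rate_bpm":        rng.uniform(60, 100, n_cases),
    "proppant_fluid_ratio": rng.uniform(0.5, 2.0, n_cases),
    # Discrete parameters: sampled from integer choices.
    "n_stages":             rng.choice(np.arange(15, 31), n_cases),
    "clusters_per_stage":   rng.choice(np.arange(4, 9), n_cases),
    # Categorical: one of the 60 generated pump schedules, each
    # encoding a fluid type plus proppant type and size.
    "pump_schedule_id":     rng.choice(60, n_cases),
})
print(design.head())
```

Each row of such a table defines one simulation case; the simulated production responses become the outputs against which the proxy is trained.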
Once built, the proxy model became a predictive tool to perform sensitivity analysis on completion designs and pump schedules, within minutes, to translate user-controlled parameters, such as completion strategy and pump schedule, to production performance.
PREDICTING PRODUCTION PERFORMANCE
Approximately 2,000 experiments were conducted with numerical engines and fundamental reservoir engineering platforms in a multi-threaded cluster environment, so that multiple samples could be tested in parallel in a short period of time. The objective was to determine production performance at different timeframes through numerical simulations, which were run in groups of 160 cases at a time to obtain the results within one week.
These results were then tabulated with completion sensitivities, or inputs, and production results, or outputs, to predict the best 1-, 3- and 12-month and cumulative five-year production of oil and gas, Fig. 4. Predictive analytics techniques also can be applied to the results, to further identify trends and more precisely measure the relationship between input and output parameters. New data, gathered as the drilling and completion cycles progress, can further improve the accuracy of the predictive proxy model.
Because some of the experiments failed, 1,867 cases contained useful results to feed into the process for creating the proxy. Data analytics algorithms were tested on the results of 1,572 experiments. A ten-fold 70/30 train/test approach was used to cross-validate the model performance. The machine learning technique of gradient boosting showed 90% accuracy between the proxy model and actual results, validating that the proxy could predict the wellbore productivity accurately for any given change in completion design.
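The validation approach described above, gradient boosting evaluated over ten repeated 70/30 train/test splits, can be sketched with scikit-learn. The synthetic dataset below stands in for the study's simulation results, which are not public; only the case count and feature count mirror the article.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import ShuffleSplit, cross_val_score

# Stand-in data: the study used ~1,572 simulation cases with eight
# completion inputs; synthetic regression data is used here.
X, y = make_regression(n_samples=1572, n_features=8, noise=10.0,
                       random_state=0)

model = GradientBoostingRegressor(random_state=0)

# Ten repeated 70/30 train/test splits, mirroring the paper's
# "ten-fold 70/30 train/test" cross-validation approach.
cv = ShuffleSplit(n_splits=10, test_size=0.3, random_state=0)
scores = cross_val_score(model, X, y, cv=cv, scoring="r2")
print(f"mean R^2 across splits: {scores.mean():.3f}")
```

Averaging the score across the ten splits guards against an optimistic result from any single lucky train/test partition.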
Fig. 5. The gradient boosting technique shows the greatest degree of accuracy in predicting the response relative to the actual results. Image: Schlumberger.
The study also revealed that prediction accuracy increases as the allowed margin of error widens, as observed when using the gradient boosting method to predict cumulative five-year production: only 52.73% of the predictions were within 5% of the target, while 98.77% were within 25% of the target, Fig. 5.
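A tolerance-band accuracy metric of this kind, the share of predictions whose relative error falls within a given margin, is straightforward to compute. The numbers below are toy values for illustration, not the study's data.

```python
import numpy as np

def within_tolerance(actual, predicted, tol):
    """Fraction of predictions whose relative error is within tol."""
    rel_err = np.abs(predicted - actual) / np.abs(actual)
    return float(np.mean(rel_err <= tol))

# Toy actual vs. predicted cumulative production values.
actual = np.array([100.0, 200.0, 150.0, 120.0])
predicted = np.array([104.0, 190.0, 180.0, 121.0])

print(within_tolerance(actual, predicted, 0.05))  # share within 5%
print(within_tolerance(actual, predicted, 0.25))  # share within 25%
```

Reporting the metric at several tolerances, as the study does at 5% and 25%, gives a fuller picture than a single accuracy number.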
Additionally, the completion parameters did not have equal impact on production in the proxy model. Stage length, number of clusters and proppant amount had the most influence, while proppant size and proppant type had minimal impact.
A RELIABLE PREDICTOR OF PRODUCTION
The Eagle Ford pilot project demonstrates that using a calibrated numerical model to create a proxy equation can serve as a highly reliable predictor of production for the best 1-, 3- and 12-month and cumulative five-year performance targets, with higher accuracy achieved for longer-term predictions.
The project also showed the value of using a single, calibrated proxy model, based on data science, in achieving a more engineered approach to completing wells in factory mode. The proxy can serve as an effective plug-and-play tool to determine the impact on production when making quick, unplanned completion design changes, such as adjusting the amount of fluid or proppant or altering the pump rate. In this way, operators can make more confident, high-level, real-time decisions regarding the impact of ineffective stimulation, versus the conventional approach of time-consuming numerical simulations and post-job analysis, Fig. 6.
By eliminating the need to repeat the process of creating a separate numerical model for each well, a proxy model used to predict performance in multiple wells has the potential to deliver significant ROI improvement through cumulative time-savings, improved productivity and quick decision-making.
The workflow, using single-well simulation and modeling to develop a predictive proxy model, also can be applied to other disciplines, such as drilling and artificial lift, to streamline the model-to-decision process.