EBookClubs

Read Books & Download eBooks Full Online


Book Novel Sampling Techniques for Reservoir History Matching Optimisation and Uncertainty Quantification in Flow Prediction

Download or read book Novel Sampling Techniques for Reservoir History Matching Optimisation and Uncertainty Quantification in Flow Prediction written by Lina Mahgoub Yahya Mohamed. This book was released in 2011. Available in PDF, EPUB and Kindle. Book excerpt: Modern reservoir management has an increasing focus on accurately predicting the likely range of field recoveries. A variety of assisted history matching techniques has been developed across the research community concerned with this topic. These techniques are based on obtaining multiple models that closely reproduce the historical flow behaviour of a reservoir. The resulting set of history-matched models is then used to quantify uncertainty in predicting the future performance of the reservoir and to provide economic evaluations for different field development strategies. The key step in this workflow is to employ algorithms that sample the parameter space in an efficient but appropriate manner. The algorithm choice has an impact on how fast a model is obtained and how well the model fits the production data. The sampling techniques that have been developed to date include, among others, gradient-based methods, evolutionary algorithms, and the ensemble Kalman filter (EnKF). This thesis has investigated and further developed the following sampling and inference techniques: Particle Swarm Optimisation (PSO), Hamiltonian Monte Carlo, and Population Markov Chain Monte Carlo. The investigated techniques are capable of navigating the parameter space and producing history-matched models that can be used to quantify the uncertainty in the forecasts in a faster and more reliable way. The analysis of these techniques, compared with the Neighbourhood Algorithm (NA), has shown how the different techniques affect the predicted recovery from petroleum systems and the benefits of the developed methods over the NA.
The history matching problem is multi-objective in nature, with the production data possibly consisting of multiple types, coming from different wells, and collected at different times. Multiple objectives can be constructed from these data and be explicitly optimised in the multi-objective scheme. The thesis has extended PSO to handle multi-objective history matching problems in which a number of possibly conflicting objectives must be satisfied simultaneously. The benefits and efficiency of the innovative multi-objective particle swarm scheme (MOPSO) are demonstrated for synthetic reservoirs. It is demonstrated that the MOPSO procedure can provide a substantial improvement in finding a diverse set of good-fitting models with fewer of the very costly forward simulation runs than the standard single-objective case, depending on how the objectives are constructed. The thesis has also shown how to tackle a large number of unknown parameters through the coupling of high-performance global optimisation algorithms, such as PSO, with model reduction techniques such as kernel principal component analysis (PCA) for parameterising spatially correlated random fields. The results of the PSO-PCA coupling applied to a recent SPE benchmark history matching problem have demonstrated that the approach is indeed applicable to practical problems. A comparison of PSO with the EnKF data assimilation method has been carried out and concluded that both methods obtained comparable results on the example case. This reinforces the need to use a range of assisted history matching algorithms for more confidence in predictions.
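The particle-swarm machinery described above can be sketched in a few lines. The following is a minimal, self-contained PSO loop minimising a toy two-parameter misfit; the objective, bounds, and tuning constants are illustrative assumptions, not the thesis's settings:

```python
import random

def pso_history_match(misfit, bounds, n_particles=20, n_iter=60, seed=0):
    """Minimal particle swarm optimisation sketch: minimise a data-misfit
    function over box-bounded reservoir parameters (toy setup)."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                     # personal bests
    pbest_f = [misfit(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]        # global best
    w, c1, c2 = 0.7, 1.5, 1.5                       # inertia, cognitive, social
    for _ in range(n_iter):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            f = misfit(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

# Toy misfit with a (porosity-like, permeability-like) optimum at (0.2, 150)
truth = (0.2, 150.0)
mis = lambda p: (p[0] - truth[0]) ** 2 + ((p[1] - truth[1]) / 100.0) ** 2
best, best_f = pso_history_match(mis, [(0.05, 0.35), (10.0, 500.0)])
```

In a real workflow `misfit` would wrap a full reservoir simulation, which is why the number of misfit evaluations is the cost that matters.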

Book History Matching and Uncertainty Quantification Using Sampling Method

Download or read book History Matching and Uncertainty Quantification Using Sampling Method written by Xianlin Ma. This book was released in 2010. Available in PDF, EPUB and Kindle. Book excerpt: Uncertainty quantification involves sampling the reservoir parameters correctly from a posterior probability function that is conditioned to both static and dynamic data. Rigorous sampling methods like Markov Chain Monte Carlo (MCMC) are known to sample from the distribution but can be computationally prohibitive for high-resolution reservoir models. Approximate sampling methods are more efficient but less rigorous for nonlinear inverse problems. There is a need for an approach to uncertainty quantification that is both efficient and rigorous for nonlinear inverse problems. First, we propose a two-stage MCMC approach using sensitivities for quantifying uncertainty in history matching geological models. In the first stage, we compute the acceptance probability for a proposed change in reservoir parameters based on a linearized approximation to flow simulation in a small neighborhood of the previously computed dynamic data. In the second stage, those proposals that passed the selected criterion of the first stage are assessed by running full flow simulations to ensure rigorousness. Second, we propose a two-stage MCMC approach using response surface models for quantifying uncertainty. The formulation allows us to history match three-phase flow simultaneously. The built response surface exists independently of the expensive flow simulation, and provides efficient samples for the reservoir simulation and MCMC in the second stage. Third, we propose a two-stage MCMC approach using upscaling and non-parametric regressions for quantifying uncertainty. A coarse grid model acts as a surrogate for the fine grid model through flow-based upscaling.
The response correction of the coarse-scale model is performed by error modeling via non-parametric regression to approximate the response of the computationally expensive fine-scale model. Our proposed two-stage sampling approaches are computationally efficient and rigorous, with a significantly higher acceptance rate compared to traditional MCMC algorithms. Finally, we developed a coarsening algorithm to determine an optimal reservoir simulation grid by grouping fine-scale layers in such a way that the heterogeneity measure of a defined static property is minimized within the layers. The optimal number of layers is then selected based on a statistical analysis. The power and utility of our approaches have been demonstrated using both synthetic and field examples.
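The two-stage idea above — screen each proposal with a cheap approximation and spend a full simulation only on survivors — can be sketched as a delayed-acceptance Metropolis sampler. The 1-D target and the biased "proxy" likelihood below are toy stand-ins for illustration, not the sensitivity-based or response-surface approximations of the work:

```python
import math
import random

def two_stage_mcmc(cheap_loglike, exact_loglike, x0, n_steps=3000, step=0.5, seed=1):
    """Two-stage (delayed-acceptance) Metropolis sketch: a proposal is first
    screened with a cheap approximate likelihood; only proposals that pass
    stage one pay for the exact, expensive likelihood."""
    rng = random.Random(seed)
    x, lc, le = x0, cheap_loglike(x0), exact_loglike(x0)
    samples, exact_calls = [], 0
    for _ in range(n_steps):
        y = x + rng.gauss(0.0, step)                 # symmetric random walk
        lcy = cheap_loglike(y)
        # Stage 1: accept/reject against the cheap approximation
        if math.log(rng.random() + 1e-300) < lcy - lc:
            exact_calls += 1
            ley = exact_loglike(y)
            # Stage 2: correct with the exact likelihood so the chain
            # still targets the exact posterior
            if math.log(rng.random() + 1e-300) < (ley - le) - (lcy - lc):
                x, lc, le = y, lcy, ley
        samples.append(x)
    return samples, exact_calls

exact = lambda x: -0.5 * x * x                        # stand-in "full simulation"
cheap = lambda x: -0.5 * x * x + 0.1 * math.sin(3 * x)  # biased fast proxy
chain, calls = two_stage_mcmc(cheap, exact, x0=2.0)
```

The higher effective acceptance rate comes from the fact that `exact_loglike` is only evaluated `calls` times rather than once per step.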

Book Population based Algorithms for Improved History Matching and Uncertainty Quantification of Petroleum Reservoirs

Download or read book Population based Algorithms for Improved History Matching and Uncertainty Quantification of Petroleum Reservoirs written by Yasin Hajizadeh. This book was released in 2011. Available in PDF, EPUB and Kindle. Book excerpt: In modern field management practices, there are two important steps that shed light on a multimillion-dollar investment. The first step is history matching, where the simulation model is calibrated to reproduce the historical observations from the field. In this inverse problem, different geological and petrophysical properties may provide equally good history matches. Such diverse models are likely to show different production behaviors in the future. This ties history matching to the second step, uncertainty quantification of predictions. Multiple history-matched models are essential for a realistic uncertainty estimate of the future field behavior. These two steps facilitate decision making and have a direct impact on the technical and financial performance of oil and gas companies. Population-based optimization algorithms have recently enjoyed growing popularity for solving engineering problems. Population-based systems work with a group of individuals that cooperate and communicate to accomplish a task that is normally beyond the capabilities of each individual. These individuals are deployed with the aim of solving the problem with maximum efficiency. This thesis introduces the application of two novel population-based algorithms for history matching and uncertainty quantification of petroleum reservoir models. Ant colony optimization and differential evolution algorithms are used to search the space of parameters to find multiple history-matched models and, using a Bayesian framework, the posterior probability of the models is evaluated for prediction of reservoir performance.
It is demonstrated that by bringing in the latest developments in computer science, such as ant colony optimization, differential evolution and multiobjective optimization, we can improve the history matching and uncertainty quantification frameworks. This thesis provides insights into the performance of these algorithms in history matching and prediction and develops an understanding of their tuning parameters. The research also brings a comparative study of these methods with a benchmark technique, the Neighbourhood Algorithm. This comparison reveals the superiority of the proposed methodologies in areas such as computational efficiency and match quality.
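Of the two population-based algorithms named above, differential evolution is the quicker to sketch. The following is a minimal DE/rand/1/bin loop on a toy two-parameter misfit; the objective, bounds, and control parameters F and CR are illustrative assumptions, not the thesis's configuration:

```python
import random

def differential_evolution(misfit, bounds, pop_size=20, n_gen=80,
                           F=0.7, CR=0.9, seed=2):
    """Minimal DE/rand/1/bin sketch for a history-matching-style misfit:
    mutate with a scaled difference of two members, cross over, and keep
    the trial only if it does not worsen the target member."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [misfit(p) for p in pop]
    for _ in range(n_gen):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(dim)          # at least one mutated gene
            trial = pop[i][:]
            for d in range(dim):
                if rng.random() < CR or d == jrand:   # binomial crossover
                    v = pop[a][d] + F * (pop[b][d] - pop[c][d])
                    lo, hi = bounds[d]
                    trial[d] = min(max(v, lo), hi)
            f = misfit(trial)
            if f <= fit[i]:                            # greedy selection
                pop[i], fit[i] = trial, f
    best = min(range(pop_size), key=lambda i: fit[i])
    return pop[best], fit[best]

best, best_f = differential_evolution(
    lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2,
    [(-5.0, 5.0), (-5.0, 5.0)])
```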

Book DEVELOPMENT OF AN ASSISTED HISTORY MATCHING AND UNCERTAINTY QUANTIFICATION TOOL BASED ON GAUSSIAN PROCESSES PROXY MODELS AND VARIOGRAM BASED SENSITIVITY ANALYSIS

Download or read book DEVELOPMENT OF AN ASSISTED HISTORY MATCHING AND UNCERTAINTY QUANTIFICATION TOOL BASED ON GAUSSIAN PROCESSES PROXY MODELS AND VARIOGRAM BASED SENSITIVITY ANALYSIS written by Sachin Rana. This book was released in 2017. Available in PDF, EPUB and Kindle. Book excerpt: History matching is an inverse solution process in which uncertain parameters of the numerical reservoir model are tuned in an effort to minimize the mismatch between simulated production and observed production data. The history matching problem can be solved as an optimization or data assimilation problem. In this research, the history matching problem is solved from the optimization point of view. Currently, many commercial history matching tools use evolutionary optimization algorithms such as differential evolution, particle swarm optimization, etc. to find solutions of history matching. However, these algorithms usually require a large number of numerical simulation runs in order to converge to acceptable solutions. If each numerical simulation takes an extensive time to complete, these algorithms become inefficient. In this research, a new assisted history matching tool named GP-VARS is presented that can provide multiple solutions of history matching with fewer numerical simulations. GP-VARS uses Gaussian process (GP) based proxy models to provide fast approximate forward solutions, which are used in Bayesian optimization to find history match solutions in an iterative manner. VARS-based sensitivity analysis is applied to the forward GP model to calculate the sensitivity index for uncertain reservoir parameters. The results of the sensitivity analysis are used to regulate the lower and upper bounds of different reservoir parameters in order to achieve faster convergence. A second GP model is used to provide an inverse solution, which also provides temporary history match solutions.
Since the history matching problem has non-unique solutions, the uncertainty in reservoir parameters is quantified using Markov Chain Monte Carlo (MCMC) sampling from the trained forward GP model. The collected MCMC samples are then passed to a third GP model that is trained to predict the EUR values for any combination of reservoir parameters. The GP-VARS methodology is applied to three different heterogeneous reservoir case studies, including the benchmark PUNQ-S3 reservoir located in the North Sea and the M4.1 reservoir located in the Gulf of Mexico. The results show that history matching can be performed in approximately four times fewer numerical simulation runs than with the state-of-the-art differential evolution algorithm. In addition, it was found that the P50 estimates of EUR are in close agreement with the true values in the presented case studies.
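The role of the forward GP proxy can be illustrated with a minimal posterior-mean computation. Everything here is a toy assumption (a 1-D parameter, an RBF kernel, and f(x) = x² standing in for the simulator); it only shows how a GP trained on a handful of expensive runs can answer cheap approximate queries:

```python
import math

def solve(A, b):
    """Gaussian elimination with partial pivoting (small systems only)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][c] * x[c] for c in range(k + 1, n))) / M[k][k]
    return x

def gp_mean(x_train, y_train, x_new, length=1.0, noise=1e-6):
    """Posterior mean of a zero-mean GP with an RBF kernel: a cheap proxy
    trained on a few expensive 'simulation' outputs."""
    k = lambda a, b: math.exp(-0.5 * ((a - b) / length) ** 2)
    K = [[k(xi, xj) + (noise if i == j else 0.0)
          for j, xj in enumerate(x_train)] for i, xi in enumerate(x_train)]
    alpha = solve(K, y_train)                       # K^-1 y
    return sum(a * k(x_new, xi) for a, xi in zip(alpha, x_train))

# Proxy trained on five "simulation" outputs of f(x) = x**2
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [x * x for x in xs]
pred = gp_mean(xs, ys, 1.25)                        # query between runs
```

In GP-VARS-style workflows the proxy prediction (and its variance, omitted here) would drive Bayesian optimization and MCMC instead of the simulator itself.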

Book Uncertainty Quantification of Unconventional Reservoirs Using Assisted History Matching Methods

Download or read book Uncertainty Quantification of Unconventional Reservoirs Using Assisted History Matching Methods written by Esmail Mohamed Khalil Eltahan. This book was released in 2019 with a total of 368 pages. Available in PDF, EPUB and Kindle. Book excerpt: A hallmark of unconventional reservoirs is characterization uncertainty. Assisted History Matching (AHM) methods provide attractive means for uncertainty quantification (UQ) because they yield an ensemble of qualifying models instead of a single candidate. Here we integrate the embedded discrete fracture model (EDFM), one of the techniques for modeling fractured reservoirs, with a commercial AHM and optimization tool. We develop a new parameterization scheme that allows for altering individual properties of multiple wells or fracture groups. The reservoir is divided into three types of regions: formation matrix; EDFM fracture groups; and stimulated rock volume (SRV) around fracture groups. The method is developed in a sleek, stand-alone form and is composed of four main steps: (1) reading parameters exported by the tool; (2) generating an EDFM instance; (3) running the instance on a simulator; and (4) calculating a pre-defined objective function. We present two applications. First, we test the method on a hypothetical case with synthetic production data from two wells. Using 20 history-matching parameters, we compare the performance of five AHM algorithms: two based on a Bayesian approach, two stochastic particle-swarm optimization (PSO) variants, and the commercial DECE algorithm. Performance is measured with metrics such as solution sample size, total simulation runs, marginal parameter posterior distributions, and distributions of estimated ultimate recovery (EUR). In the second application, we assess the effect of natural fractures on UQ of a single horizontal well in the middle Bakken.
This is achieved by comparing four AHM scenarios with increasing natural-fracture intensity. Results of the first study show that, based on pre-set acceptance criteria, DECE fails to generate any satisfying solutions. The Bayesian methods are noticeably superior to PSO, although PSO is capable of generating a large number of solutions. PSO tends to focus on narrow regions of the posteriors and seems to significantly underestimate uncertainty. Bayesian Algorithm I, a method with a proxy-based acceptance/rejection sampler, ranks first in efficiency but evidently underperforms in accuracy. Results from the second study reveal that, even though varying the intensity of natural fractures can significantly alter other model parameters, this appears not to have an influence on UQ (or on long-term production).

Book Multi objective Methods for History Matching Uncertainty Prediction and Optimisation in Reservoir Modelling

Download or read book Multi objective Methods for History Matching Uncertainty Prediction and Optimisation in Reservoir Modelling written by Junko Jhonson Juntianus Hutahaean. This book was released in 2017. Available in PDF, EPUB and Kindle. Book excerpt:

Book Ensemble based Reservoir History Matching Using Hyper reduced order Models

Download or read book Ensemble based Reservoir History Matching Using Hyper reduced order Models written by Seonkyoo Yoon. This book was released in 2016 with a total of 106 pages. Available in PDF, EPUB and Kindle. Book excerpt: Subsurface flow modeling is an indispensable task for reservoir management, but the associated computational cost is burdensome owing to model complexity and the fact that many simulation runs are required for applications such as production optimization, uncertainty quantification, and history matching. To relieve the computational burden in reservoir flow modeling, a reduced-order modeling procedure based on hyper-reduction is presented. The procedure consists of three components: state reduction, constraint reduction, and nonlinearity treatment. State reduction based on proper orthogonal decomposition (POD) is considered, and the impact of state reduction, with different strategies for collecting snapshots, on accuracy and predictability is investigated. Petrov-Galerkin projection is used for constraint reduction, and a hyper-reduction that couples the Petrov-Galerkin projection and a 'gappy' reconstruction is applied for the nonlinearity treatment. The hyper-reduction method is a Gauss-Newton framework with approximated tensors (GNAT), and the main contribution of this study is the presentation of a procedure for applying the method to subsurface flow simulation. A fully implicit oil-water two-phase subsurface flow model in three-dimensional space is considered, and the application of the proposed hyper-reduced-order modeling procedure achieves a runtime speedup of more than 300 relative to the full-order method, which cannot be achieved when only constraint reduction is adopted. In addition, two types of sequential Bayesian filtering for history matching are considered to investigate how well the developed hyper-reduced-order model relieves the associated computational cost.
First, an ensemble Kalman filter (EnKF) is considered for Gaussian systems, and a procedure embedding the hyper-reduced model (HRM) into the EnKF is presented. The use of the HRM within the EnKF significantly reduces the computational cost without much loss of accuracy, but the combination requires a few remedies, such as clustering to find an optimal reduced-order model according to the spatial similarity of geological conditions, which incurs additional computation. For non-Gaussian systems, an advanced particle filter, known as the regularized particle filter (RPF), is considered because it does not make any distributional assumptions. Particle filtering has rarely been applied to reservoir history matching because it is hard to locate the initial particles in highly probable regions of the state space, especially for large-scale systems, which makes the required number of particles scale exponentially with the model dimension. To resolve these issues, reparameterization is adopted to reduce the order of the geological parameters. For the reparameterization, principal component analysis (PCA) is used to compute a reduced space of the model parameters, and by constraining the filtering analysis to the computed subspace the required number of initial particles can be reduced to a manageable level. Consequently, a large computational saving is achieved by embedding the HRM into the RPF. Furthermore, the additional cost of clustering required to identify the geospatially optimal reduced-order model is saved, because the advanced particle filter makes it easy to identify groups of geospatially similar particles.
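A single EnKF analysis step, the building block that the hyper-reduced model accelerates here, can be sketched for a scalar parameter observed through a toy linear forward model. All numbers and the forward model are illustrative assumptions:

```python
import random

def enkf_update(ensemble, hobs, d_obs, obs_std, seed=3):
    """One stochastic EnKF analysis step for a scalar parameter and a
    scalar datum: each member is shifted by the sample Kalman gain times
    its innovation, with the observation perturbed per member."""
    rng = random.Random(seed)
    n = len(ensemble)
    m_mean = sum(ensemble) / n
    h_mean = sum(hobs) / n
    # Sample cross-covariance and data covariance from the ensemble
    c_mh = sum((m - m_mean) * (h - h_mean)
               for m, h in zip(ensemble, hobs)) / (n - 1)
    c_hh = sum((h - h_mean) ** 2 for h in hobs) / (n - 1)
    K = c_mh / (c_hh + obs_std ** 2)                # Kalman gain
    return [m + K * (d_obs + rng.gauss(0, obs_std) - h)
            for m, h in zip(ensemble, hobs)]

# Toy forward model h(m) = 2m; true m = 1.0 observed with noise
rng = random.Random(0)
prior = [rng.gauss(0.0, 1.0) for _ in range(200)]
forecast = [2.0 * m for m in prior]                 # "simulation" outputs
posterior = enkf_update(prior, forecast, d_obs=2.0, obs_std=0.1)
```

In the reduced-order setting, the expensive part being replaced is the computation of `forecast` for each member; the analysis step itself is cheap.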

Book Introduction to Geological Uncertainty Management in Reservoir Characterization and Optimization

Download or read book Introduction to Geological Uncertainty Management in Reservoir Characterization and Optimization written by Reza Yousefzadeh and published by Springer Nature. This book was released on 2023-04-08 with a total of 142 pages. Available in PDF, EPUB and Kindle. Book excerpt: This book explores methods for managing uncertainty in reservoir characterization and optimization. It covers the fundamentals, challenges, and solutions to tackle the challenges posed by geological uncertainty. The first chapter discusses the types and sources of uncertainty and the challenges in different phases of reservoir management, along with general methods to manage it. The second chapter focuses on geological uncertainty, explaining its impact on field development and methods to handle it using prior information, seismic and petrophysical data, and geological parametrization. The third chapter deals with reducing geological uncertainty through history matching and the various methods used, including closed-loop management, ensemble assimilation, and stochastic optimization. The fourth chapter presents dimensionality reduction methods to tackle high-dimensional geological realizations. The fifth chapter covers field development optimization using robust optimization, including solutions for its challenges, such as high computational cost and risk attitudes. The final chapter introduces different types of proxy models in history matching and robust optimization, discussing their pros and cons and their applications. The book will be of interest to researchers and professors, geologists and professionals in oil and gas production and exploration.

Book Parameter Estimation and Uncertainty Quantification in Water Resources Modeling

Download or read book Parameter Estimation and Uncertainty Quantification in Water Resources Modeling written by Philippe Renard and published by Frontiers Media SA. This book was released on 2020-04-22 with a total of 177 pages. Available in PDF, EPUB and Kindle. Book excerpt: Numerical models of flow and transport processes are heavily employed in the fields of surface, soil, and groundwater hydrology. They are used to interpret field observations, analyze complex and coupled processes, or support decision making related to large societal issues such as the water-energy nexus or sustainable water management and food production. Parameter estimation and uncertainty quantification are two key features of modern science-based predictions. When applied to water resources, these tasks must cope with many degrees of freedom and large datasets. Both are challenging and require novel theoretical and computational approaches to handle complex models with a large number of unknown parameters.

Book Quantification of Uncertainty During History Matching

Download or read book Quantification of Uncertainty During History Matching written by Martin Guillermo Alvarado. This book was released in 2003. Available in PDF, EPUB and Kindle. Book excerpt: This study proposes a new, easily applied method to quantify uncertainty in production forecasts based on reservoir simulation. The new method uses only observed data and the mismatches between simulated and observed values as history matches progress to a final "best" match. The method is applicable even when only limited information is available from a field. Previous methods suggested in the literature require more information than our new method. Quantifying uncertainty in production forecasts (i.e., reserve estimates) is becoming increasingly important in the petroleum industry. Many current investment opportunities in reservoir development require large investments, often in harsh exploration environments, with intensive technology requirements and possibly marginal investment indicators. Our method of quantifying uncertainty uses a set of history-match runs and includes a method to determine the probability density function (pdf) of future oil production (reserves) while the history match is evolving. We applied our method to the lower-Pleistocene 8-Sand reservoir in the Green Canyon 18 field, Gulf of Mexico. This field was a challenge to model because of its complicated geometry and stratigraphy. We objectively computed the mismatch between observed and simulated data using an objective function and developed quantitative matching criteria that we used during history matching. We developed a method based on errors in the mismatches to assign a likelihood to each run, and from these results we determined the pdf of reservoir reserves and thus quantified the uncertainty in the forecast. In our approach, we assigned no preconceived likelihoods to the distribution of variables.
Only the production data and history matching errors were used to assess uncertainty. Thus, our simple method enabled us to estimate uncertainty during the history-matching process using only the dynamic behavior of a reservoir.
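The error-based weighting idea can be sketched directly: each history-match run gets a likelihood weight that decays with its mismatch, and the normalised weights define a discrete pdf over the forecast reserves. The Gaussian weighting and the numbers below are illustrative assumptions, not the study's exact scheme:

```python
import math

def reserve_pdf(mismatches, reserves):
    """Turn history-match errors into normalised likelihood weights
    (Gaussian in the mismatch) and a weighted mean of forecast reserves."""
    w = [math.exp(-0.5 * m * m) for m in mismatches]
    s = sum(w)
    w = [wi / s for wi in w]                    # discrete pdf over the runs
    mean = sum(wi * r for wi, r in zip(w, reserves))
    return w, mean

# Three history-match runs: a lower mismatch earns a higher weight
weights, expected = reserve_pdf([0.5, 1.5, 3.0], [100.0, 120.0, 90.0])
```

With real runs the weights (and hence the reserves pdf) can be recomputed after every new simulation, which is what allows the uncertainty estimate to evolve alongside the history match.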

Book Reservoir Characterization and History Matching with Uncertainty Quantification Using Ensemble based Data Assimilation with Data Re parameterization

Download or read book Reservoir Characterization and History Matching with Uncertainty Quantification Using Ensemble based Data Assimilation with Data Re parameterization written by Mingliang Liu. This book was released in 2021 with a total of 153 pages. Available in PDF, EPUB and Kindle. Book excerpt: Reservoir characterization and history matching are essential steps in various subsurface applications, such as petroleum exploration and production and geological carbon sequestration, aiming to estimate the rock and fluid properties of the subsurface from geophysical measurements and borehole data. Mathematically, both tasks can be formulated as inverse problems, which attempt to find optimal earth models that are consistent with the true measurements. The objective of this dissertation is to develop a stochastic inversion method to improve the accuracy of predicted reservoir properties, as well as the quantification of the associated uncertainty, by assimilating both surface geophysical observations and borehole production data using the Ensemble Smoother with Multiple Data Assimilation. To avoid the common phenomenon of ensemble collapse, in which the model uncertainty would be underestimated, we propose to re-parameterize the high-dimensional geophysical data with order-reduction methods, for example singular value decomposition and deep convolutional autoencoders, and then perform the model updates efficiently in the low-dimensional data space. We first apply the method to seismic and rock physics inversion for the joint estimation of elastic and petrophysical properties from pre-stack seismic data. In the production or monitoring stage, we extend the proposed method to seismic history matching for the prediction of porosity and permeability models by integrating both time-lapse seismic and production data.
The proposed method is tested on synthetic examples and successfully applied to petroleum exploration and production and to carbon dioxide sequestration.
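The Ensemble Smoother with Multiple Data Assimilation used above can be sketched in scalar form: the same datum is assimilated several times, with the observation-error variance inflated by the number of assimilation steps each time. The toy linear forward model and all numbers are assumptions for illustration:

```python
import random

def esmda(prior, forward, d_obs, obs_std, n_assim=4, seed=4):
    """Scalar ES-MDA sketch: repeat the ensemble-smoother update n_assim
    times with the observation-error variance inflated by n_assim, so the
    inflation factors sum (in inverse) to one."""
    rng = random.Random(seed)
    ens = prior[:]
    n = len(ens)
    for _ in range(n_assim):
        infl_var = n_assim * obs_std ** 2           # inflated error variance
        h = [forward(m) for m in ens]               # predicted data
        m_mean, h_mean = sum(ens) / n, sum(h) / n
        c_mh = sum((m - m_mean) * (y - h_mean)
                   for m, y in zip(ens, h)) / (n - 1)
        c_hh = sum((y - h_mean) ** 2 for y in h) / (n - 1)
        K = c_mh / (c_hh + infl_var)
        ens = [m + K * (d_obs + rng.gauss(0, infl_var ** 0.5) - y)
               for m, y in zip(ens, h)]
    return ens

rng = random.Random(0)
prior = [rng.gauss(0.0, 1.0) for _ in range(200)]
post = esmda(prior, lambda m: 3.0 * m, d_obs=3.0, obs_std=0.2)
```

In the dissertation's setting the data vector is high-dimensional, which is exactly why the update is performed in a reduced data space rather than on the raw geophysical data.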

Book Comparison of Sampling Methods for Uncertainty Evaluation in Reservoir Flow Predictions

Download or read book Comparison of Sampling Methods for Uncertainty Evaluation in Reservoir Flow Predictions written by Soraya Sofia Betancourt Pocaterra. This book was released in 2000 with a total of 178 pages. Available in PDF, EPUB and Kindle. Book excerpt:

Book History Matching and Uncertainty Characterization

Download or read book History Matching and Uncertainty Characterization written by Alexandre Emerick and published by LAP Lambert Academic Publishing. This book was released in April 2012 with a total of 264 pages. Available in PDF, EPUB and Kindle. Book excerpt: In the last decade, ensemble-based methods have been widely investigated and applied for data assimilation of flow problems associated with atmospheric physics and petroleum reservoir history matching. Among these methods, the ensemble Kalman filter (EnKF) is the most popular for history-matching applications. The main advantages of EnKF are computational efficiency and easy implementation. Moreover, because EnKF generates multiple history-matched models, it can provide a measure of the uncertainty in reservoir performance predictions. However, because of its inherent assumptions of linearity and Gaussianity and the use of limited ensemble sizes, EnKF does not always provide an acceptable history match, nor an accurate characterization of uncertainty. In this work, we investigate the use of ensemble-based methods, with emphasis on the EnKF, and propose modifications that allow us to obtain a better history match and a more accurate characterization of the uncertainty in reservoir description and reservoir performance predictions.

Book Re sampling the Ensemble Kalman Filter for Improved History Matching and Characterizations of Non gaussian and Non linear Reservoir Models

Download or read book Re sampling the Ensemble Kalman Filter for Improved History Matching and Characterizations of Non gaussian and Non linear Reservoir Models written by Siavash Nejadi. This book was released in 2014 with a total of 203 pages. Available in PDF, EPUB and Kindle. Book excerpt: Reservoir simulation models play an important role in production forecasting and field development planning. To enhance their predictive capabilities and capture the uncertainties in model parameters, stochastic reservoir models should be calibrated to both geologic and flow observations. The relationship between production performance and model parameters is vastly non-linear, rendering the history matching process a challenging task. The Ensemble Kalman Filter (EnKF) is a Monte Carlo based technique for assisted history matching and real-time updating of reservoir models. EnKF works efficiently with Gaussian variables, but it often fails to honor the reference probability distribution of the model parameters when the distribution of the model parameters is non-Gaussian and the system dynamics are strongly nonlinear. In this thesis, novel sampling procedures are proposed to honor geologic information in reservoirs with non-Gaussian model parameters. The methodologies include generating multiple geological models and updating the uncertain parameters from dynamic flow responses using an iterative EnKF technique. Two new re-sampling steps are presented for the characterization of multiple-facies reservoirs. After a certain number of assimilation steps, the updated ensemble is used to generate a new ensemble that is conditional to both the geological information and the early production data. Probability field simulation and a novel probability-weighted re-sampling scheme are introduced to re-sample the new ensemble. After the re-sampling step, the iterative EnKF is again applied to the ensemble members to assimilate the remaining production history.
A new automated dynamic data integration workflow is implemented for the characterization and uncertainty assessment of fractured reservoir models. This new methodology includes generating multiple discrete fracture network (DFN) models, upscaling the models for flow simulation, and updating the DFN model parameters using dynamic flow responses. The assisted history matching algorithm entails combining probability-weighted sampling with the iterative EnKF. The performance of the introduced methodologies is evaluated through various simulation studies on synthetic and field cases. The quality of the final matching results is assessed by examining the geological realism of the updated ensemble against the reference probability distribution of the model parameters and by computing the predicted dynamic data mismatch.
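The probability-weighted re-sampling step can be illustrated with a standard systematic resampler, used here as a stand-in for the thesis's scheme: members with higher weights are copied more often into the new ensemble, and low-weight members tend to be dropped. The members and weights below are toy values:

```python
import random

def weighted_resample(ensemble, weights, seed=6):
    """Systematic resampling sketch: draw a same-size ensemble with
    replacement, with probability proportional to each member's weight.
    A member with weight w appears at least floor(n * w) times."""
    rng = random.Random(seed)
    n = len(ensemble)
    total = sum(weights)
    cdf, acc = [], 0.0
    for w in weights:                       # cumulative normalised weights
        acc += w / total
        cdf.append(acc)
    u0 = rng.random() / n                   # single random offset
    out, j = [], 0
    for i in range(n):
        u = u0 + i / n                      # evenly spaced probes
        while cdf[j] < u:
            j += 1
        out.append(ensemble[j])
    return out

new = weighted_resample(["A", "B", "C", "D"], [0.7, 0.2, 0.05, 0.05])
```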

Book Data space Approaches for Efficient Uncertainty Quantification in Subsurface Flow Problems

Download or read book Data space Approaches for Efficient Uncertainty Quantification in Subsurface Flow Problems written by Wenyue Sun. This book was released in 2018. Available in PDF, EPUB and Kindle. Book excerpt: Uncertainty quantification for subsurface flow problems is typically accomplished through the use of model inversion procedures in which multiple posterior (history-matched) geological models are generated and used for flow predictions. These procedures can be demanding computationally, and it is not always straightforward to maintain geological realism in the resulting history-matched models. In some applications, it is the flow predictions themselves (and the uncertainty associated with these predictions), rather than the posterior geological models, that are of primary interest. This is the motivation for the data-space inversion (DSI) procedures developed in this work. In the DSI framework, an ensemble of prior model realizations, honoring prior geostatistical information and hard data at wells, is generated and then (flow) simulated. The resulting reservoir responses (e.g., time series of flow-rate data at wells, and/or limited spatial saturation fields) are assembled into data vectors that represent prior 'realizations' in the data space. The conditional distribution of data variables given observed data is then constructed within a Bayesian framework. This distribution is directly sampled using a data-space randomized maximum likelihood method. Due to the non-Gaussian characteristics of the data variables, we introduce pattern-based mapping operations, or histogram transformation, along with principal component analysis. These treatments allow us to represent the data variables using a set of low-dimensional variables that are closer to multivariate Gaussian, which is shown to improve the performance of DSI.
We present extensive numerical results for two example cases involving oil-water flow in a bimodal channelized system and oil-water-gas flow in a Gaussian permeability system, in which the quantities of interest (QoI) are time-series data at wells. DSI results, with pattern-based mapping operations, for uncertainty quantification (e.g., P10, P50, P90 posterior predictions) are compared with those obtained from a strict rejection sampling (RS) procedure. Reasonable agreement between the DSI and RS results is consistently achieved, even when the (synthetic) true data to be matched fall near the edge of the prior distribution. Computational savings using DSI are very substantial in that RS requires O(10^5--10^6) flow simulations, in contrast to 500 for DSI, for the cases considered. We then apply the DSI procedure, with the histogram transformation treatment for data reparameterization, for naturally fractured reservoirs (NFRs), represented as general discrete-fracture-matrix (DFM) models. This DSI procedure is first tested on two-dimensional DFM systems involving multiple fracture scenarios. Comparison with an approximate rejection sampling procedure for this case indicates the DSI results for the P10, P50 and P90 responses are again consistent with RS results. The DSI method is then applied to a realistic NFR that has undergone 15 years of primary production and is under consideration for waterflooding. To construct the DSI representation, around 400 prior DFM models, which correspond to different geologic concepts and properties, are simulated. Two different reference 'true' models, along with different data-assimilation durations, are considered. In all cases, the DSI predictions are shown to be consistent with the forecasts from the 'true' model, and to provide reasonable quantification of forecast uncertainty.
Finally, we investigate the application of DSI to quantify the uncertainty associated with carbon storage operations, in which the QoI is the spatial distribution of CO2 saturation in the top layer of a storage aquifer, and the observed data are pressure and CO2 saturation measurements from a few monitoring wells. We also introduce a procedure to optimize the locations of monitoring wells using only prior-model simulation results. This approach is based on analytical DSI results, and determines monitoring well locations such that the reduction in expected posterior variance of a relevant quantity is maximized. The new DSI procedure is applied to three-dimensional heterogeneous aquifer models involving uncertainties in a wide range of geological parameters, including variogram orientation, porosity and permeability fields, and regional pressure gradient. Multiple monitoring scenarios, involving four to eight monitoring wells, are considered in this evaluation. Application of DSI with optimal monitoring wells is shown to consistently reduce the posterior variance in predictions of the average CO2 saturation in the top layer, and to provide detailed saturation fields in reasonable correspondence with the 'true' saturation distribution.

Book Data Analytics in Reservoir Engineering

Download or read book Data Analytics in Reservoir Engineering written by Sathish Sankaran and published by . This book was released on 2020-10-29 with total page 108 pages. Available in PDF, EPUB and Kindle. Book excerpt: Data Analytics in Reservoir Engineering describes the relevance of data analytics for the oil and gas industry, with particular emphasis on reservoir engineering.

Book Uncertainty Quantification in Unconventional Reservoirs Using Conventional Bootstrap and Modified Bootstrap Methodology

Download or read book Uncertainty Quantification in Unconventional Reservoirs Using Conventional Bootstrap and Modified Bootstrap Methodology written by Chukwuemeka Okoli and published by . This book was released on 2020 with total page 238 pages. Available in PDF, EPUB and Kindle. Book excerpt: Various uncertainty quantification methodologies are presented using a combination of several deterministic decline curve analysis models and two bootstrapping algorithms. The bootstrapping algorithms are the conventional bootstrapping method (CBM) and the modified bootstrapping method (MBM). The combined deterministic-stochastic models are applied to 126 sample wells from the Permian Basin. Results are presented for 12 to 72 months of production hindcast given an average well production history of 120 months. Previous researchers used the Arps model and both conventional and modified bootstrapping with block re-sampling techniques to reliably quantify uncertainty in production forecasts. In this work, we applied both stochastic techniques to other decline curve analysis models, namely the Duong and the Stretched Exponential Production Decline (SEPD) models. The algorithms were applied to sample wells spread across the three main sub-basins of the Permian. A description of how the deterministic and stochastic methods can be combined is provided, along with pseudo-code describing the methodologies applied in this work so that readers can replicate the results if necessary. Based on the average forecast error plot for the 126 active wells in the Permian Basin, we conclude that the MBM-Arps, CBM-Arps, and MBM-SEPD combinations produce P50 forecasts that best match cumulative production regardless of the sub-basin and the amount of production hindcast used.
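The conventional bootstrapping idea described in this excerpt, resampling the observed production history with replacement, refitting a decline model to each resample, and reading P10/P50/P90 forecasts off the resulting distribution, can be sketched as follows. This is an illustrative sketch using the Arps hyperbolic model on synthetic data, not the author's software; the well parameters, noise level, and resample count are all assumptions introduced here.

```python
import numpy as np
from scipy.optimize import curve_fit

def arps_hyperbolic(t, qi, di, b):
    """Arps hyperbolic decline rate: q(t) = qi / (1 + b*di*t)^(1/b)."""
    return qi / np.power(1.0 + b * di * t, 1.0 / b)

def bootstrap_forecast(t, q, t_forecast, n_boot=200, seed=0):
    """Conventional bootstrap (CBM-style): resample (t, q) pairs with
    replacement, refit the decline model, forecast the rate at t_forecast."""
    rng = np.random.default_rng(seed)
    n = len(t)
    p0 = (q[0], 0.1, 0.5)
    bounds = ([1e-6, 1e-6, 1e-3], [np.inf, 10.0, 2.0])
    forecasts = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)
        try:
            popt, _ = curve_fit(arps_hyperbolic, t[idx], q[idx],
                                p0=p0, bounds=bounds, maxfev=5000)
        except RuntimeError:
            continue  # skip resamples where the fit fails to converge
        forecasts.append(arps_hyperbolic(t_forecast, *popt))
    forecasts = np.asarray(forecasts)
    # P10/P50/P90 in the petroleum convention (P10 = optimistic high case)
    return np.percentile(forecasts, [90, 50, 10])

# Synthetic well (an assumption): 24 months of noisy hyperbolic decline
t = np.arange(1.0, 25.0)
rng = np.random.default_rng(42)
q_true = arps_hyperbolic(t, 1000.0, 0.15, 0.8)
q_obs = q_true * np.exp(0.05 * rng.standard_normal(t.size))
p10, p50, p90 = bootstrap_forecast(t, q_obs, t_forecast=36.0)
```

Swapping `arps_hyperbolic` for a Duong or SEPD rate function is the kind of extension the thesis describes; the modified bootstrap (MBM) additionally resamples blocks of residuals rather than raw data points.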
Regardless of concerns about the coverage rate, the CBM-Arps, MBM-Arps, CBM-SEPD, and MBM-SEPD algorithm combinations produce cumulative P50 predictions within 20% of the true cumulative production value using only a 24-month hindcast. With a 12-month hindcast, the MBM-Arps combined model produced cumulative P50 predictions with a forecast error of approximately 20%, while the CBM-SEPD and MBM-SEPD models were within 30% of the true cumulative production. Another important result is that all the deterministic-stochastic method combinations studied under-predicted the true cumulative production to varying degrees. The CBM-Duong combination, however, was found to severely under-predict cumulative production, especially for the 12-month hindcast. It is not a suitable model combination based on forecast error, especially when hindcast fractions on the low end of the spectrum are used; accordingly, the CBM-Duong combination is not recommended when no more than 24 months of production history is available for hindcasting. As expected, the coverage rate increased and the forecast error decreased for all algorithm combinations with increasing hindcast duration. The novelty of this work lies in its extension of the bootstrapping technique to other decline curve analysis models. The software developed can also be used to analyze many wells quickly on a standard engineering computer. This research is also important because realistic estimates of reserves can be obtained in plays like the Permian Basin when uncertainty is correctly quantified.