EBookClubs

Read Books & Download eBooks Full Online

Book Uncertainty Quantification in Seismic Imaging

Download or read book Uncertainty Quantification in Seismic Imaging written by Iga Pawelec and published by . This book was released in 2018 with a total of 73 pages. Available in PDF, EPUB and Kindle. Book excerpt:

Book Time-lapse Seismic Imaging and Uncertainty Quantification

Download or read book Time-lapse Seismic Imaging and Uncertainty Quantification written by Maria Kotsi and published by . This book was released in 2020. Available in PDF, EPUB and Kindle. Book excerpt: Time-lapse (4D) seismic monitoring is to date the most commonly used technique for estimating changes in a reservoir under production. Full-Waveform Inversion (FWI) is a high-resolution technique that delivers Earth models by iteratively matching synthetic prestack seismic data to the observed data. Over the past decade the application of FWI to 4D data has been studied extensively, and a variety of strategies are currently available. However, 4D FWI still has unsolved challenges, and the standard outcome of a 4D FWI scheme is a single image without any measure of the associated uncertainty. These issues raise two questions: (1) Can we go beyond the current FWI limitations and deliver more accurate 4D imaging? (2) How well do we know what we think we know? In this thesis, I take steps to answer both questions. I first compare the performance of three common 4D FWI approaches in the presence of model uncertainties. These results provide a preliminary understanding of the underlying uncertainty, but also highlight some of the limitations of pixel-by-pixel uncertainty quantification. I then introduce a hybrid inversion technique that I call Dual-Domain Waveform Inversion (DDWI), whose objective function joins traditional FWI with Image-Domain Wavefield Tomography (IDWT). The new objective function combines diving-wave information in the data-domain FWI term with reflected-wave information in the image-domain IDWT term, resulting in more accurate 4D model reconstructions. Working with 4D data provides an ideal setting for testing and developing new algorithms: because surveys are repeated at the same location, the surrounding geology is well known, the results of interest are localized in small regions, and errors can be analyzed more thoroughly. Uncertainty quantification is very valuable for building knowledge but is not commonly performed, because exploring the range of all possible models that could fit the data is computationally challenging. I exploit the structure of the 4D problem and propose the use of a focused modeling technique for a fast Metropolis-Hastings inversion. The proposed framework calculates time-lapse uncertainty in a targeted way that is computationally feasible. Building on the resulting ground-truth 4D probability distributions, I propose a local 4D Hamiltonian Monte Carlo (HMC) method - a more advanced uncertainty quantification technique - that can handle higher dimensionality while offering faster convergence.
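The targeted Metropolis-Hastings idea described above can be illustrated with a minimal sketch; the misfit function, parameter names, and numbers below are hypothetical stand-ins for the thesis's focused time-lapse modeling, not its actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the focused time-lapse modeling: the data misfit
# depends only on a handful of velocity perturbations inside the target zone.
def negative_log_likelihood(dm, dm_true, sigma=0.05):
    # Toy Gaussian misfit between predicted and observed time-lapse responses.
    return 0.5 * np.sum((dm - dm_true) ** 2) / sigma**2

dm_true = np.array([0.10, -0.05, 0.02])  # "true" velocity changes in the target cells
dm = np.zeros_like(dm_true)              # start from "no time-lapse change"
step = 0.02                              # proposal standard deviation
samples = []

current_nll = negative_log_likelihood(dm, dm_true)
for _ in range(20000):
    proposal = dm + step * rng.standard_normal(dm.size)
    proposal_nll = negative_log_likelihood(proposal, dm_true)
    # Metropolis-Hastings acceptance test (symmetric Gaussian proposal).
    if np.log(rng.uniform()) < current_nll - proposal_nll:
        dm, current_nll = proposal, proposal_nll
    samples.append(dm.copy())

samples = np.array(samples[5000:])       # discard burn-in
print("posterior mean:", samples.mean(axis=0))
print("posterior std: ", samples.std(axis=0))
```

Because the sampler only walks over the few parameters of the target region, each iteration stays cheap even when the surrounding model is large, which is the point of the targeted formulation.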

Book Joining Statistics and Geophysics for Assessment and Uncertainty Quantification of Three-dimensional Seismic Earth Models

Download or read book Joining Statistics and Geophysics for Assessment and Uncertainty Quantification of Three-dimensional Seismic Earth Models written by Carène Larmat and published by . This book was released in 2017 with a total of 13 pages. Available in PDF, EPUB and Kindle. Book excerpt: Seismic inversions produce seismic models, which are 3-dimensional (3D) images of wave velocity throughout the planet, retrieved by fitting seismic measurements made on records of past earthquakes or other seismic events. The computing power of the teraflop era, along with the data flow from new, very dense seismic arrays, has led to a new generation of 3D seismic Earth models with an unprecedented level of resolution. Here we compare two recent models of the western United States from the Dynamic North America (DNA) seismic imaging effort. The two models differ only in the wave-propagation theory used in their inversion: one is based on ray theory (RT) and the other on finite-frequency (FF) theory. We evaluate the two models using an independent numerical method and statistical tests. We show that they differ in how well they reproduce seismic signals from a subset of earthquakes that were used in the original inversion and were recorded on the USArray. This is especially true for measurements made in the Yellowstone area, which has a large negative seismic anomaly. This result is of importance for seismologists, who have been debating the practical benefit of using FF theory in ill-posed Earth inversions. Model evaluation, such as the one reported here, represents an opportunity for collaboration between the geophysical and statistical communities. More opportunities should arise with the upcoming exascale era, which will provide enough computational power to jointly explore several sources of error in models with thousands of parameters, opening the way to uncertainty quantification of seismic models.

Book Large Scale Inverse Problems and Quantification of Uncertainty

Download or read book Large Scale Inverse Problems and Quantification of Uncertainty written by Lorenz Biegler and published by John Wiley & Sons. This book was released on 2011-06-24 with a total of 403 pages. Available in PDF, EPUB and Kindle. Book excerpt: This book focuses on computational methods for large-scale statistical inverse problems and provides an introduction to statistical Bayesian and frequentist methodologies. Recent research advances for approximation methods are discussed, along with Kalman filtering methods and optimization-based approaches to solving inverse problems. The aim is to cross-fertilize the perspectives of researchers in the areas of data assimilation, statistics, large-scale optimization, applied and computational mathematics, high performance computing, and cutting-edge applications. The solution to large-scale inverse problems critically depends on methods to reduce computational cost. Recent research approaches tackle this challenge in a variety of different ways. Many of the computational frameworks highlighted in this book build upon state-of-the-art methods for simulation of the forward problem, such as fast Partial Differential Equation (PDE) solvers, reduced-order models and emulators of the forward problem, stochastic spectral approximations, and ensemble-based approximations, as well as exploiting the machinery for large-scale deterministic optimization through adjoint and other sensitivity analysis methods. Key Features: Brings together the perspectives of researchers in the areas of inverse problems and data assimilation. Assesses the current state of the art and identifies needs and opportunities for future research. Focuses on the computational methods used to analyze and simulate inverse problems. Written by leading experts in inverse problems and uncertainty quantification. Graduate students and researchers working in statistics, mathematics and engineering will benefit from this book.

Book Uncertainty Quantification in Seismic Interferometry

Download or read book Uncertainty Quantification in Seismic Interferometry written by Daniella Ayala-Garcia and published by . This book was released in 2022. Available in PDF, EPUB and Kindle. Book excerpt:

Book Computationally Efficient Methods for Uncertainty Quantification in Seismic Inversion

Download or read book Computationally Efficient Methods for Uncertainty Quantification in Seismic Inversion written by Georgia K. Stuart and published by . This book was released in 2020. Available in PDF, EPUB and Kindle. Book excerpt: Full waveform inversion is an iterative optimization technique used to estimate subsurface physical parameters in the Earth. A seismic energy source is generated in a borehole or on the surface of the Earth, which causes a seismic wave to propagate into the subsurface. The transmitted wave then reflects off material interfaces (rocks and fluids), and the returning wave is recorded at geophones. The inverse problem involves estimating parameters that describe this wave propagation (such as velocity) to minimize the misfit between the measured data and data we simulate from our mathematical model. The seismic velocity inversion problem is difficult because it contains sources of uncertainty from both the instruments used to record the data and our mathematical model of seismic wave propagation. Using uncertainty quantification (UQ), we construct distributions of Earth velocity models. Distributions give information about how probable an Earth model is, given the recorded seismic data. This rich information impacts real-world decision making, such as where to drill a well to produce oil and gas. UQ methods based on repeated sampling to construct estimates of the distribution, such as Markov chain Monte Carlo (MCMC), are desirable because they do not impose restrictions on the shape of the distribution. However, MCMC methods are computationally expensive because they require solving the wave equation repeatedly to generate simulated seismic wave data. This dissertation focuses on techniques to reduce the computational expense of MCMC methods for the seismic velocity inversion problem. Two-stage MCMC uses an inexpensive filter to cheaply reject unacceptable velocity models. The operator upscaling method, an inexpensive surrogate for the wave equation, is one such filter. We find that two-stage MCMC with the operator upscaling filter produces the same uncertainty information as traditional one-stage MCMC while reducing the computational cost by between 20% and 45%. A neural network, in conjunction with operator upscaling, is another choice of filter. We find that the neural network filter reduces the computational cost of MCMC by 65% in our experiment, including the time needed to generate the training set and train the network. The size of the problem we can solve using two-stage MCMC is limited by the random-walk sampler. Hamiltonian Monte Carlo (HMC) and the No-U-Turn Sampler (NUTS) use gradient information and Hamiltonian dynamics to steer the sampler, thereby eliminating the inefficient random-walk behavior. Discretizing Hamiltonian dynamics requires two user-specified parameters: trajectory length and step size. The NUTS algorithm avoids setting the trajectory length in advance by constructing variable-length paths. We find that, for seismic inversion, NUTS yields a greater decrease in the residual than traditional HMC while removing the need for costly tuning runs. However, constructing the gradient for the seismic inverse problem is computationally expensive. In two-stage, neural-network-enhanced HMC, we replace the costly gradient computation with a neural network. Additionally, we use the neural network to reject unacceptable samples, as in two-stage MCMC. We find that the two-stage neural network HMC scheme reduces the computational cost by over 80% compared with traditional HMC for a 100-unknown layered problem.
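The two-stage (delayed-acceptance) MCMC idea can be sketched in a few lines. The quadratic misfits below are toy stand-ins for the operator-upscaled surrogate and the full wave-equation solve; all names and numbers are illustrative rather than taken from the dissertation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-ins: "fine" plays the role of the expensive full wave-equation
# misfit, "coarse" the cheap operator-upscaled (or neural-network) surrogate.
def nll_fine(v):
    return 0.5 * np.sum((v - 2.0) ** 2) / 0.1**2

def nll_coarse(v):
    return 0.5 * np.sum((v - 2.05) ** 2) / 0.12**2   # slightly biased surrogate

v = np.array([1.5])      # current velocity parameter (one unknown for brevity)
step = 0.05
stage2_evals = 0
samples = []

for _ in range(10000):
    prop = v + step * rng.standard_normal(v.size)
    # Stage 1: screen the proposal with the cheap surrogate only.
    if np.log(rng.uniform()) < nll_coarse(v) - nll_coarse(prop):
        # Stage 2: delayed-acceptance correction with the expensive misfit,
        # so the chain still targets the true (fine-model) posterior.
        # (A real implementation would cache nll_fine(v) from the last accept.)
        stage2_evals += 1
        log_alpha = (nll_fine(v) - nll_fine(prop)) + (nll_coarse(prop) - nll_coarse(v))
        if np.log(rng.uniform()) < log_alpha:
            v = prop
    samples.append(v[0])

print(f"stage-2 (expensive) evaluations: {stage2_evals} of 10000 proposals")
print("posterior mean velocity:", np.mean(samples[2000:]))
```

The savings come from stage 1: most bad proposals are rejected using only the surrogate, so the expensive forward solve runs for a fraction of the proposals while the stationary distribution remains that of the fine model.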

Book Epistemic Uncertainty Quantification of Seismic Damage Assessment

Download or read book Epistemic Uncertainty Quantification of Seismic Damage Assessment written by Hesheng Tang and published by . This book was released in 2017. Available in PDF, EPUB and Kindle. Book excerpt: Damage-based structural seismic performance evaluations are widely used in the seismic design and risk evaluation of civil facilities. Because of the large uncertainties rooted in this procedure, the application of damage quantification results remains a challenge for researchers and engineers. Uncertainties in the damage assessment procedure are therefore an important consideration in the performance evaluation and design of structures against earthquakes. Because the information available for modeling, simulation, and design is often lacking, incomplete, inaccurate, or unclear, a single framework (probability theory) is limited in its ability to quantify uncertainty in a system. In this work, a methodology based on evidence theory is presented for quantifying the epistemic uncertainty of the damage assessment procedure. The proposed methodology is applied to the seismic damage assessment procedure while considering various sources of uncertainty emanating from experimental force-displacement data of reinforced concrete columns. To alleviate the computational difficulties in evidence-theory-based uncertainty quantification (UQ), a differential-evolution-based computational strategy for efficient calculation of the propagated belief structure is presented. Finally, a seismic damage assessment example is investigated to demonstrate the effectiveness of the proposed method.
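A minimal sketch of how an evidence (Dempster-Shafer) structure can be propagated with differential evolution, assuming a toy one-parameter damage model; the intervals, basic probability assignments, and threshold are hypothetical and only illustrate the general strategy, not values from this work.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Hypothetical damage-index model: response of one uncertain stiffness parameter.
def damage_index(x):
    k = x[0]
    return 1.0 / (1.0 + k)        # toy monotone response in place of a real model

# Evidence structure on the stiffness: focal intervals with basic probability
# assignments (BPAs) that sum to one.
focal_elements = [((0.5, 1.0), 0.3), ((0.8, 1.5), 0.5), ((1.2, 2.0), 0.2)]
threshold = 0.5                    # event A: damage_index >= threshold

belief = plausibility = 0.0
for (lo, hi), bpa in focal_elements:
    # Propagate each focal interval by bounding the response over it with
    # differential evolution (overkill in 1-D, but this is the kind of search
    # needed for expensive, non-monotone structural models).
    d_min = differential_evolution(damage_index, [(lo, hi)], seed=0).fun
    d_max = -differential_evolution(lambda x: -damage_index(x), [(lo, hi)], seed=0).fun
    if d_min >= threshold:         # the whole response interval lies in A
        belief += bpa
    if d_max >= threshold:         # the response interval intersects A
        plausibility += bpa

print(f"Bel(damage >= {threshold}) = {belief:.2f}")
print(f"Pl(damage >= {threshold})  = {plausibility:.2f}")
```

Belief and plausibility bracket the probability of exceeding the threshold, which is how evidence theory expresses epistemic uncertainty that a single probability distribution cannot.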

Book Uncertainty Quantification for Prestack Time-lapse Seismic Tomography

Download or read book Uncertainty Quantification for Prestack Time-lapse Seismic Tomography written by Valentin Tschannen and published by . This book was released in 2014. Available in PDF, EPUB and Kindle. Book excerpt:

Book Deep Learning for Automatic Geophysical Interpretation with Uncertainty Quantification

Download or read book Deep Learning for Automatic Geophysical Interpretation with Uncertainty Quantification written by Nam Phuong Pham and published by . This book was released in 2022. Available in PDF, EPUB and Kindle. Book excerpt: Geophysical interpretation, such as picking faults and geobodies, analyzing well logs, and picking arrivals, is a tedious, manual, and time-consuming process. Deep learning is a data-driven technique that has recently been getting more attention in fields such as medical imaging and computer vision. With large volumes of data of different types available and with advances in computing technology, geophysics is a promising field for applying deep learning. Applying deep learning to geophysical interpretation can make the process faster and the workflow less subjective. Decision-making based on interpretation is uncertain; therefore, uncertainties in geophysical interpretation are very important. To use deep learning models effectively, uncertainties from the data and from the models' parameters need to be quantified. In this dissertation, I address this problem by including uncertainties in several deep learning-based interpretation algorithms and show the feasibility of applying them to various geophysical interpretation problems on different types of data. First, I develop a generative adversarial network to produce data that carry the style of a particular region in the field data. Different styles make it possible to generate different data sets to train various convolutional neural networks for automatic fault picking in 2D seismic images. I use a bootstrapping method to generate prediction scenarios and quantify the uncertainties arising from the training data. Second, I introduce an end-to-end network for picking channel geobodies in 3D seismic volumes that includes uncertainties from both the data and the model's parameters. This workflow is fast and makes it easy to quantify uncertainties not only from the data but also from the parameters of the neural network. I then apply a similar workflow to quantify the uncertainties from the model's parameters when picking channel facies and faults simultaneously in 3D seismic volumes, and I analyze the relationship between the quantified uncertainties and geologic features in the seismic volumes. Beyond segmentation, I design a recurrent style network for predicting missing sonic logs from gamma-ray, density, and neutron-porosity logs. This is a regression problem with two outputs, the compressional and shear sonic logs; the workflow generates a mean prediction together with quantile values for upper and lower bounds. In the last chapter, I apply a transformer-based network for picking arrivals in earthquake data. I change the discrete labels of 0 and 1, where ones mark the picks, to continuous distributions with peaks at the picks, which makes it possible to quantify the uncertainty of the picking algorithm over time. Finally, I discuss some limitations and suggest possible future research topics.
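The last step, replacing discrete 0/1 pick labels with continuous peaked distributions, can be sketched as follows; the helper names, Gaussian width, and toy trace are assumptions for illustration, not the dissertation's transformer or its training targets.

```python
import numpy as np

def soft_pick_label(n_samples, pick_index, width=20.0):
    # Turn a discrete arrival pick (a single 1 among 0s) into a smooth target
    # that peaks at the pick, normalized to a maximum of 1.
    t = np.arange(n_samples)
    label = np.exp(-0.5 * ((t - pick_index) / width) ** 2)
    return label / label.max()

def pick_with_uncertainty(predicted, threshold=0.3):
    # Interpret a network's continuous output (assumed normalized to [0, 1])
    # as a pick plus a time spread: the wider the spread, the less certain.
    pick = int(np.argmax(predicted))
    above = np.flatnonzero(predicted >= threshold)
    spread = (int(above.min()), int(above.max())) if above.size else (pick, pick)
    return pick, spread

# Toy usage: a noisy "prediction" around sample 300 of a 1000-sample trace.
rng = np.random.default_rng(2)
target = soft_pick_label(1000, pick_index=300)
prediction = np.clip(target + 0.05 * rng.standard_normal(1000), 0.0, 1.0)
pick, (lo, hi) = pick_with_uncertainty(prediction)
print(f"pick at sample {pick}, spread {lo}-{hi} (wider spread = higher uncertainty)")
```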

Book A First Course in Applied Mathematics

Download or read book A First Course in Applied Mathematics written by Jorge Rebaza and published by John Wiley & Sons. This book was released on 2012-04-24 with a total of 458 pages. Available in PDF, EPUB and Kindle. Book excerpt: Explore real-world applications of selected mathematical theory, concepts, and methods. Exploring related methods that can be utilized in various fields of practice from science and engineering to business, A First Course in Applied Mathematics details how applied mathematics involves predictions, interpretations, analysis, and mathematical modeling to solve real-world problems. Written at a level that is accessible to readers from a wide range of scientific and engineering fields, the book masterfully blends standard topics with modern areas of application and provides the needed foundation for transitioning to more advanced subjects. The author utilizes MATLAB® to showcase the presented theory and illustrate interesting real-world applications to Google's web page ranking algorithm, image compression, cryptography, chaos, and waste management systems. Additional topics covered include linear algebra, ranking web pages, matrix factorizations, least squares, image compression, ordinary differential equations, dynamical systems, and mathematical models. Throughout the book, theoretical and applications-oriented problems and exercises allow readers to test their comprehension of the presented material. An accompanying website features related MATLAB® code and additional resources. A First Course in Applied Mathematics is an ideal book for mathematics, computer science, and engineering courses at the upper-undergraduate level. The book also serves as a valuable reference for practitioners working with mathematical modeling, computational methods, and the applications of mathematics in their everyday work.

Book Geophysics and Geosequestration

Download or read book Geophysics and Geosequestration written by Thomas L. Davis and published by Cambridge University Press. This book was released on 2019-05-09 with a total of 391 pages. Available in PDF, EPUB and Kindle. Book excerpt: An overview of the geophysical techniques and analysis methods for monitoring subsurface carbon dioxide storage, for researchers and industry practitioners.

Book An Introduction to Geophysical Exploration

Download or read book An Introduction to Geophysical Exploration written by Philip Kearey and published by John Wiley & Sons. This book was released on 2013-04-16 with a total of 292 pages. Available in PDF, EPUB and Kindle. Book excerpt: This new edition of the well-established Kearey and Brooks text is fully updated to reflect the important developments in geophysical methods since the previous edition. The broad scope of previous editions is maintained, with even greater clarity in the revised text and extensively revised figures. Each of the major geophysical methods is treated systematically, developing the theory behind the method and detailing the instrumentation, field data acquisition techniques, data processing, and interpretation methods. The practical application of each method to such diverse exploration areas as petroleum, groundwater, engineering, environmental, and forensic work is shown by case histories. The mathematics required to understand the text is purposely kept to a minimum, so the book is suitable for undergraduate courses in geophysics. It will also be of use to postgraduate students who wish to include geophysics in their studies and to professional geologists who wish to discover the breadth of the subject in connection with their own work.

Book Introduction to Uncertainty Quantification

Download or read book Introduction to Uncertainty Quantification written by T. J. Sullivan and published by . This book was released in 2015 with a total of 342 pages. Available in PDF, EPUB and Kindle. Book excerpt:

Book Modeling Uncertainty in the Earth Sciences

Download or read book Modeling Uncertainty in the Earth Sciences written by Jef Caers and published by John Wiley & Sons. This book was released on 2011-05-25 with a total of 294 pages. Available in PDF, EPUB and Kindle. Book excerpt: Modeling Uncertainty in the Earth Sciences highlights the various issues, techniques, and practical modeling tools available for modeling the uncertainty of complex Earth systems and the impact this has on practical situations. The aim of the book is to provide an introductory overview covering a broad range of tried-and-tested tools. Descriptions of concepts, philosophies, challenges, methodologies, and workflows give the reader an understanding of the best way to make decisions under uncertainty for Earth Science problems. The book covers key issues such as spatial and temporal aspects; large complexity and dimensionality; computational power; the costs of 'engineering' the Earth; and uncertainty in the modeling and decision process. Focusing on reliable and practical methods, this book provides an invaluable primer for the complex area of decision making with uncertainty in the Earth Sciences.

Book Quantitative Analysis of Geopressure for Geoscientists and Engineers

Download or read book Quantitative Analysis of Geopressure for Geoscientists and Engineers written by Nader C. Dutta and published by Cambridge University Press. This book was released on 2021-03-11 with a total of 582 pages. Available in PDF, EPUB and Kindle. Book excerpt: Geopressure, or pore pressure in subsurface rock formations, impacts hydrocarbon resource estimation, drilling, and drilling safety in operations. This book provides a comprehensive overview of geopressure analysis, bringing together rock physics, seismic technology, quantitative basin modeling, and geomechanics. It provides a fundamental physical and geological basis for understanding geopressure by explaining the coupled mechanical and thermal processes. It also brings together state-of-the-art tools and technologies for the analysis and detection of geopressure, along with the associated uncertainty. Prediction and detection of shallow geohazards and gas hydrates are also discussed, and field examples are used to illustrate how models can be applied in practice. With supplementary MATLAB® codes and exercises available online, this is an ideal resource for students, researchers, and industry professionals in geoscience and petroleum engineering looking to understand and analyse subsurface formation pressure.

Book Quantitative Seismic Interpretation

Download or read book Quantitative Seismic Interpretation written by Per Avseth and published by Cambridge University Press. This book was released on 2010-06-10 with a total of 524 pages. Available in PDF, EPUB and Kindle. Book excerpt: Quantitative Seismic Interpretation demonstrates how rock physics can be applied to predict reservoir parameters, such as lithologies and pore fluids, from seismically derived attributes. The authors provide an integrated methodology and practical tools for quantitative interpretation, uncertainty assessment, and characterization of subsurface reservoirs using well-log and seismic data. They illustrate the advantages of these new methodologies while providing advice about the limitations of the methods and traditional pitfalls. This book is aimed at graduate students, academics, and industry professionals working in the areas of petroleum geoscience and exploration seismology. It will also interest environmental geophysicists seeking a quantitative subsurface characterization from shallow seismic data. The book includes problem sets and a case study, for which seismic and well-log data and MATLAB® codes are provided on a website (http://www.cambridge.org/9780521151351). These resources will allow readers to gain a hands-on understanding of the methodologies.

Book Applications of a Fast Helmholtz Solver in Exploration Seismology

Download or read book Applications of a Fast Helmholtz Solver in Exploration Seismology written by Gregory Tsiang Ely and published by . This book was released in 2019 with a total of 150 pages. Available in PDF, EPUB and Kindle. Book excerpt: Seismic imaging techniques rely on a velocity model inverted from noisy data via a non-linear inverse problem. This inferred velocity model may be inaccurate and lead to incorrect interpretations of the subsurface. In this thesis, I combine a fast Helmholtz solver, the field expansion method, with a reduced velocity model parameterization to address the impact of an uncertain or inaccurate velocity model. I modify the field expansion framework to accurately simulate the acoustic field for velocity models that commonly occur in seismic imaging. The field expansion method describes the acoustic field in a periodic medium in which the velocity model and source repeat infinitely in the horizontal direction, much like a diffraction grating. This Helmholtz solver achieves significant computational speed by restricting the velocity model to consist of a number of non-overlapping piecewise layers. I modify this restricted framework to allow for the modeling of more complex velocity models with dozens of parameters instead of the thousands or millions of parameters used to characterize pixelized velocity models. This parameterization, combined with the speed of the forward solver, allows me to examine two problems in seismic imaging: uncertainty quantification and benchmarking global optimization methods. With the rapid speed of the forward solver, I use Markov chain Monte Carlo methods to estimate the non-linear probability distribution of a 2D seismic velocity model given noisy data. Although global optimization methods have recently been applied to the inversion of seismic velocity models from raw waveform data, it has been impossible to compare various types of algorithms and the impact of their parameters on convergence. The reduced forward model presented in this thesis allows me to benchmark these algorithms and objectively compare their performance to one another. I also explore the application of these and other geophysical methods to a medical ultrasound dataset that is well approximated by a layered model.
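A reduced, piecewise-layered parameterization of the kind described above can be sketched as follows; the function name, grid spacing, and layer values are hypothetical and only illustrate how a handful of parameters can stand in for a pixelized model fed to a solver such as the field expansion method.

```python
import numpy as np

def layered_velocity_model(thicknesses, velocities, dz=10.0, depth=2000.0):
    # Expand a reduced parameterization (a few layer thicknesses and interval
    # velocities, with the last velocity acting as the underlying half-space)
    # into a depth profile on a regular grid.
    z = np.arange(0.0, depth, dz)
    tops = np.concatenate(([0.0], np.cumsum(thicknesses)))
    v = np.full_like(z, velocities[-1])          # half-space below the last layer
    for top, bottom, vel in zip(tops[:-1], tops[1:], velocities[:-1]):
        v[(z >= top) & (z < bottom)] = vel
    return z, v

# Toy usage: three layers over a half-space, described by 7 numbers instead of
# the thousands of cells a pixelized model would need.
z, v = layered_velocity_model(
    thicknesses=[400.0, 600.0, 500.0],
    velocities=[1500.0, 2000.0, 2500.0, 3200.0],  # last entry = half-space
)
print(f"{len(v)} grid cells controlled by 7 parameters; v(z=1000 m) = {v[z == 1000.0][0]:.0f} m/s")
```

With so few unknowns, sampling methods such as MCMC or population-based global optimizers become affordable, which is what makes the uncertainty quantification and benchmarking studies described in the abstract feasible.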