EBookClubs

Read Books & Download eBooks Full Online

Book Quantifying Uncertainty in Earthquake Rupture Models

Download or read book Quantifying Uncertainty in Earthquake Rupture Models written by Morgan T. Page and published by ProQuest. This book was released on 2007 with total page 210 pages. Available in PDF, EPUB and Kindle. Book excerpt: Using dynamic and kinematic models, we analyze the ability of GPS and strong-motion data to recover the rupture history of earthquakes. By analyzing the near-source ground-motion generated by earthquake ruptures through barriers and asperities, we determine that both the prestress and yield stress of a frictional inhomogeneity can be recovered. In addition, we find that models with constraints on rupture velocity have less ground motion than constraint-free, spontaneous dynamic models with equivalent stress drops. This suggests that kinematic models with such constraints overestimate the actual stress heterogeneity of earthquakes.

Book Source Parameter Estimation and Related Uncertainties of Small Earthquakes in Southern California

Download or read book Source Parameter Estimation and Related Uncertainties of Small Earthquakes in Southern California written by Deborah Lynn Kane and published by . This book was released on 2011 with total page 142 pages. Available in PDF, EPUB and Kindle. Book excerpt: Understanding the physics of earthquake rupture is critical to providing accurate estimates of seismic hazard and to effectively mitigating those hazards. Matching physical models to seismic data in order to better understand earthquake rupture processes requires accurate and precise estimates of earthquake source properties. Measuring source properties such as rupture size and stress drop requires accounting for the effects of seismic wave propagation and making proper assumptions about the rupture process. This thesis focuses on methods of estimating source properties of small earthquakes and on the application of these methods to earthquakes in two distinctly different seismogenic regions of California. In the San Jacinto Fault Zone, earthquakes recorded by a small-aperture array allow quantification of source parameter uncertainties using empirical Green's functions and frequency-domain techniques. These uncertainties are frequently overlooked in source parameter estimation, and this study constrains them to ~30% of estimated values. A non-parametric time-domain method using a set of empirical Green's functions is described and applied to a series of example earthquakes. This approach minimizes assumptions regarding the rupture process and can be used to study more complex ruptures. Correcting for the effects of seismic wave propagation is an important aspect of source parameter estimation, and the conditions necessary to effectively use nearby earthquakes as path corrections are tested and quantified.
At the San Andreas Fault near Parkfield, the high degree of waveform similarity among closely spaced earthquakes is used to apply spatially averaged propagation path corrections and search for rupture directivity effects. This analysis shows that the population of small earthquakes in this region does not have a consistent unilateral rupture direction, but 70% of M>3 earthquakes exhibit characteristics of southeast-directed rupture. Computational models featuring a fault interface separating two materials for Parkfield-like conditions agree with the preferential southeast-directed rupture and present potential implications for earthquake source physics. Combined, these studies of earthquake source parameter estimation can be used to improve future source parameter estimates, offer appropriate metrics for establishing uncertainty bounds, and contribute to the study of earthquake source physics.
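The ~30% parameter uncertainty quoted above compounds sharply in stress drop, which scales with the cube of corner frequency. As a hedged illustration (not drawn from the thesis; the Brune-model constant, shear-wave speed, and example event values are assumptions), the standard arithmetic looks like:

```python
def brune_source_radius(fc_hz, beta_m_s=3500.0, k=0.37):
    """Source radius from corner frequency (Brune model): r = k * beta / fc."""
    return k * beta_m_s / fc_hz

def stress_drop_pa(m0_n_m, radius_m):
    """Circular-crack stress drop: delta_sigma = (7/16) * M0 / r**3."""
    return 7.0 / 16.0 * m0_n_m / radius_m ** 3

# Hypothetical M ~3 event: M0 ~ 3.5e13 N-m with a 10 Hz corner frequency.
m0 = 3.5e13
r = brune_source_radius(10.0)            # ~130 m
dsigma = stress_drop_pa(m0, r)           # ~7 MPa

# A +/-30% error in corner frequency maps to roughly a factor-of-6 spread in
# stress drop, since delta_sigma scales with fc cubed.
low = stress_drop_pa(m0, brune_source_radius(7.0))
high = stress_drop_pa(m0, brune_source_radius(13.0))
```

Because of the cubic dependence, even modest corner-frequency errors dominate the stress-drop error budget, which is why constraining these uncertainties matters.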

Book Improving Uncertainty Quantification and Visualization for Spatiotemporal Earthquake Rate Models for the Pacific Northwest

Download or read book Improving Uncertainty Quantification and Visualization for Spatiotemporal Earthquake Rate Models for the Pacific Northwest written by Max Schneider and published by . This book was released on 2021 with total page 0 pages. Available in PDF, EPUB and Kindle. Book excerpt: The Pacific Northwest (PNW) has substantial earthquake risk, due both to the offshore Cascadia megathrust fault and to other fault systems that produce earthquakes under the region's population centers. Forecasts of aftershocks following large earthquakes are thus highly desirable and require statistical models of a catalog of the PNW’s past earthquakes and aftershock sequences. This is complicated by the fact that the PNW contains multiple tectonic regimes hypothesized to have different aftershock dynamics, as well as two types of earthquake clustering (aftershock sequences and swarms). The Epidemic-Type Aftershock Sequence (ETAS) model is a top-performing spatiotemporal point process model that describes the dynamics of earthquakes and aftershocks in a seismic region using a set of parameters. Typically, maximum likelihood estimation is used to fit ETAS to an earthquake catalog; however, the ETAS likelihood suffers from flatness near its optima, parameter correlation, and numerical instability, making likelihood-based estimates less reliable. We present a Bayesian procedure for ETAS estimation, such that parameter estimates and uncertainty can be robustly quantified, even for small and complex catalogs like the PNW's. The procedure is conditional on knowing which earthquakes triggered which aftershocks; this latent structure and the ETAS parameters are estimated iteratively. The procedure uses a Gibbs sampler to conditionally estimate the posterior distributions of each part of the model. We simulate several synthetic catalogs and test the modelling procedure, showing well-mixed posterior distributions centered on true parameter values.
We also use the procedure to model the continental PNW, using a new catalog formed by algorithmically combining US and Canadian data sources and then identifying and removing earthquake swarms. While MLEs are unstable and depend on both the optimization procedure and its initial values, Bayesian estimates are insensitive to these choices. Bayesian estimates also fit the catalog better than do MLEs. We use the Bayesian method to quantify the uncertainty in ETAS estimates when including swarms in the model or modelling across different tectonic regimes, as well as from catalog measurement error. Seismicity rate estimates and the earthquake forecasts they yield vary spatially and are usually represented as heat maps. While the visualization literature suggests that displaying forecast uncertainty improves understanding in users of forecast maps, research on uncertainty visualization (UV) is missing from earthquake science. In a pre-registered online experiment, we test the effectiveness of three UV techniques for displaying uncertainty in aftershock forecasts. Participants completed two map-reading tasks and a comparative judgment task, which demonstrated how successful a visualization was in reaching two key communication goals: indicating where many aftershocks and no aftershocks are likely (sure bets) and where the forecast is low but the uncertainty is high enough to imply potential risk (surprises). All visualizations performed equally well in the goal of communicating sure-bet situations. However, the visualization mapping the lower and upper bounds of an uncertainty interval was substantially better than the other map designs at communicating potential surprises. We discuss the implications of these experimental results for the communication of uncertainty in aftershock forecast maps.
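The ETAS model discussed above has a standard conditional-intensity form. A minimal temporal-only sketch (the parameter values below are illustrative assumptions, not estimates from the dissertation) shows how past events add Omori-law triggering on top of a background rate:

```python
import numpy as np

def etas_intensity(t, times, mags, mu=0.2, K=0.05, alpha=1.5, c=0.01, p=1.2, m0=3.0):
    """Temporal ETAS conditional intensity (events/day):
    lambda(t) = mu + sum over past events of K*exp(alpha*(m_i - m0))*(t - t_i + c)**(-p)
    """
    times, mags = np.asarray(times, float), np.asarray(mags, float)
    past = times < t
    trig = K * np.exp(alpha * (mags[past] - m0)) * (t - times[past] + c) ** (-p)
    return mu + trig.sum()

# Before an M 5 event the rate is just the background; a day after the event,
# Omori-law triggering dominates.
before = etas_intensity(5.0, times=[10.0], mags=[5.0])
after = etas_intensity(11.0, times=[10.0], mags=[5.0])
```

In the Bayesian procedure described, a Gibbs sampler alternates between sampling the latent branching structure (which event triggered which) and parameters like these, conditional on one another.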

Book Best Practices in Physics-based Fault Rupture Models for Seismic Hazard Assessment of Nuclear Installations

Download or read book Best Practices in Physics based Fault Rupture Models for Seismic Hazard Assessment of Nuclear Installations written by Luis A. Dalguer and published by Birkhäuser. This book was released on 2017-12-20 with total page 333 pages. Available in PDF, EPUB and Kindle. Book excerpt: This volume collects several extended articles from the first workshop on Best Practices in Physics-based Fault Rupture Models for Seismic Hazard Assessment of Nuclear Installations (BestPSHANI). Held in 2015, the workshop was organized by the IAEA to disseminate the use of physics-based fault-rupture models for ground motion prediction in seismic hazard assessments (SHA). The book also presents a number of new contributions on topics ranging from the seismological aspects of earthquake cycle simulations for source scaling evaluation, seismic source characterization, source inversion and physics-based ground motion modeling to engineering applications of simulated ground motion for the analysis of seismic response of structures. Further, it includes papers describing current practices for assessing seismic hazard in terms of nuclear safety in low seismicity areas, and proposals for physics-based hazard assessment for critical structures near large earthquakes. The papers validate and verify the models by comparing synthetic results with observed data and empirical models. The book is a valuable resource for scientists, engineers, students and practitioners involved in all aspects of SHA.

Book Quantifying Uncertainties in Ground Motion Simulations for Scenario Earthquakes on the Hayward-Rodgers Creek Fault System Using the USGS 3D Velocity Model and Realistic Pseudodynamic Rupture Models

Download or read book Quantifying Uncertainties in Ground Motion Simulations for Scenario Earthquakes on the Hayward-Rodgers Creek Fault System Using the USGS 3D Velocity Model and Realistic Pseudodynamic Rupture Models written by and published by . This book was released on 2008 with total page 5 pages. Available in PDF, EPUB and Kindle. Book excerpt: This project seeks to compute ground motions for large (M>6.5) scenario earthquakes on the Hayward Fault using realistic pseudodynamic ruptures, the USGS three-dimensional (3D) velocity model, and anelastic finite difference simulations on parallel computers. We will attempt to bound ground motions by performing simulations with suites of stochastic rupture models for a given scenario on a given fault segment. The outcome of this effort will provide the average, spread, and range of ground motions that can be expected from likely large earthquake scenarios. The resulting ground motions will be based on first-principles calculations and will include the effects of slip heterogeneity, fault geometry, and directivity; however, they will be band-limited to relatively low-frequency (
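Bounding ground motions with suites of stochastic rupture models reduces, at each site, to summary statistics over the suite. A toy sketch with synthetic placeholder values (not output from these simulations):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical peak ground velocities (cm/s) at one site from 50 stochastic
# rupture realizations of the same scenario (synthetic placeholders).
pgv = rng.lognormal(mean=np.log(30.0), sigma=0.5, size=50)

geo_mean = np.exp(np.log(pgv).mean())   # the "average" motion, taken in log space
spread = np.log(pgv).std()              # log-space standard deviation (the spread)
lo, hi = pgv.min(), pgv.max()           # full range across the suite
```

Ground-motion amplitudes are conventionally treated as lognormal, which is why the average and spread are computed on the logarithms.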

Book Parameter Estimation and Uncertainty Quantification in Water Resources Modeling

Download or read book Parameter Estimation and Uncertainty Quantification in Water Resources Modeling written by Philippe Renard and published by Frontiers Media SA. This book was released on 2020-04-22 with total page 177 pages. Available in PDF, EPUB and Kindle. Book excerpt: Numerical models of flow and transport processes are heavily employed in the fields of surface, soil, and groundwater hydrology. They are used to interpret field observations, analyze complex and coupled processes, or support decision making related to large societal issues such as the water-energy nexus or sustainable water management and food production. Parameter estimation and uncertainty quantification are two key features of modern science-based predictions. When applied to water resources, these tasks must cope with many degrees of freedom and large datasets. Both are challenging and require novel theoretical and computational approaches to handle complex models with a large number of unknown parameters.

Book Probabilistic Estimates and Theoretical Controls of Earthquake Size

Download or read book Probabilistic Estimates and Theoretical Controls of Earthquake Size written by Jeremy Maurer and published by . This book was released on 2018 with total page pages. Available in PDF, EPUB and Kindle. Book excerpt: Earthquakes are complex, stochastic processes. They occur throughout the world, and even though we know a great deal about them, we often learn something new each time a large event occurs. In California, where much time and money has been spent on monitoring, measuring, and forecasting seismicity, there is little evidence to suggest when the next San Andreas-rupturing earthquake will occur, or exactly how big it will be. However, earthquakes do obey the laws of physics. Some of these laws, friction in particular, are stochastic in their very nature; it is imperative to couple physics-based understanding with quantitative stochastic modeling to obtain a complete description of how earthquakes work. To do this, I look at two problems. The first is to develop a method to quantify uncertainty in the moment deficit rate (MDR) for regional-scale fault systems. There are three sources of uncertainty: in the data itself, in how well our models of the crust reflect its actual properties, and in the sensitivity of the data to the fault at depth. Prior to this work, no rigorous method existed to measure how uncertain any particular MDR estimate is. I show how a method termed the Constrained Optimization Bounding Estimator (COBE) rigorously quantifies uncertainty in the MDR related to the data and resolution of a given model, and that the current MDR in southern California is larger than expected given the historic moment release. The second problem I address is what controls the size of earthquakes in the context of induced seismicity.
I consider how the restriction of seismicity to the pressurized zone would alter the frequency-magnitude statistics of induced events, and whether or not such changes have been or could be observed in real data. The last chapter uses fully dynamic rupture simulations on 2-D rough faults to characterize maximum earthquake size with and without pressure perturbations. The results indicate that earthquake hazard is governed more by the background stress conditions and less by injection-related parameters.
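The moment deficit rate in the first problem is, at its simplest, shear modulus times fault area times the coupled slip-rate deficit, which can then be compared against the Hanks-Kanamori moment of historic earthquakes. A back-of-the-envelope sketch (the fault dimensions, slip rate, and shear modulus are illustrative assumptions, not values from the thesis):

```python
MU = 3.0e10  # crustal shear modulus in Pa (assumed value)

def moment_deficit_rate(area_m2, slip_deficit_m_per_yr, coupling=1.0):
    """Moment accumulation rate (N-m/yr): mu * fault area * coupled slip deficit."""
    return MU * area_m2 * slip_deficit_m_per_yr * coupling

def moment_from_mw(mw):
    """Hanks-Kanamori moment-magnitude relation: M0 = 10**(1.5*Mw + 9.05) N-m."""
    return 10 ** (1.5 * mw + 9.05)

# Hypothetical fault: 300 km long, 15 km seismogenic depth, 30 mm/yr deficit.
mdr = moment_deficit_rate(300e3 * 15e3, 0.030)
# Years of accumulation balanced by a single Mw 7.8 earthquake:
t_balance = moment_from_mw(7.8) / mdr
```

Comparing the accumulated deficit against cataloged moment release is the kind of balance that motivates the observation that southern California's MDR exceeds its historic moment release.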

Book Seismic Hazard Analysis

Download or read book Seismic Hazard Analysis written by Rodrigo Araya Montoya and published by . This book was released on 1988 with total page 326 pages. Available in PDF, EPUB and Kindle. Book excerpt:

Book Mega Quakes: Cascading Earthquake Hazards and Compounding Risks

Download or read book Mega Quakes: Cascading Earthquake Hazards and Compounding Risks written by Katsuichiro Goda and published by Frontiers Media SA. This book was released on 2018-03-15 with total page 314 pages. Available in PDF, EPUB and Kindle. Book excerpt: Large-scale earthquake hazards pose major threats to modern society, generating casualties, disrupting socioeconomic activities, and causing enormous economic loss across the world. Events such as the 2004 Indian Ocean tsunami and the 2011 Tohoku earthquake highlighted the vulnerability of urban cities to catastrophic earthquakes. Accurate assessment of earthquake-related hazards (both primary and secondary) is essential to mitigate and control disaster risk exposure effectively. To date, various approaches and tools have been developed in different disciplines. However, they are fragmented across research disciplines, and their underlying assumptions are often inconsistent. Our society and infrastructure are subject to multiple types of cascading earthquake hazards; therefore, an integrated hazard assessment and risk management strategy is needed to mitigate the potential consequences of multi-hazards. Moreover, uncertainty modeling and its impact on hazard prediction and anticipated consequences are essential parts of probabilistic earthquake hazard and risk assessment. This Research Topic focuses on modeling and impact assessment of cascading earthquake hazards, including mainshock ground shaking, aftershocks, tsunami, liquefaction, and landslides.

Book Quantifying Earthquake Collapse Risk of Tall Steel Moment Frame Buildings Using Rupture-to-Rafters Simulations

Download or read book Quantifying Earthquake Collapse Risk of Tall Steel Moment Frame Buildings Using Rupture-to-Rafters Simulations written by Hemanth Siriki and published by . This book was released on 2015 with total page 348 pages. Available in PDF, EPUB and Kindle. Book excerpt: Credible source models from past large-magnitude earthquakes are sparse. A stochastic source model generation algorithm thus becomes necessary for robust risk quantification using scenario earthquakes. We present an algorithm that combines the physics of fault ruptures as imaged in laboratory earthquakes with stress estimates on the fault constrained by field observations to generate stochastic source models for large-magnitude (Mw 6.0-8.0) strike-slip earthquakes. The algorithm is validated through a statistical comparison of synthetic ground motion histories from a stochastically generated source model for a magnitude 7.9 earthquake and a kinematic finite-source inversion of an equivalent-magnitude past earthquake on a geometrically similar fault. The synthetic dataset comprises three-component ground motion waveforms, computed at 636 sites in southern California, for ten hypothetical rupture scenarios (five hypocenters, each with two rupture directions) on the southern San Andreas fault. A similar validation exercise is conducted for a magnitude 6.0 earthquake, the lower magnitude limit for the algorithm. Additionally, ground motions from the Mw 7.9 earthquake simulations are compared against predictions by the Campbell-Bozorgnia NGA relation as well as the ShakeOut scenario earthquake. The algorithm is then applied to generate fifty source models for a hypothetical magnitude 7.9 earthquake originating at Parkfield, with rupture propagating from north to south (towards Wrightwood), similar to the 1857 Fort Tejon earthquake.
Using the spectral element method, three-component ground motion waveforms are computed in the Los Angeles basin for each scenario earthquake and the sensitivity of ground shaking intensity to seismic source parameters (such as the percentage of asperity area relative to the fault area, rupture speed, and risetime) is studied. Under plausible San Andreas fault earthquakes in the next 30 years, modeled using the stochastic source algorithm, the performance of two 18-story steel moment frame buildings (UBC 1982 and 1997 designs) in southern California is quantified. The approach integrates rupture-to-rafters simulations into the PEER performance-based earthquake engineering (PBEE) framework. Using stochastic sources and computational seismic wave propagation, three-component ground motion histories at 636 sites in southern California are generated for sixty scenario earthquakes on the San Andreas fault. The ruptures, with moment magnitudes in the range of 6.0-8.0, are assumed to occur at five locations on the southern section of the fault. Two unilateral rupture propagation directions are considered. The 30-year probabilities of all plausible ruptures in this magnitude range and in that section of the fault, as forecast by the United States Geological Survey, are distributed among these 60 earthquakes based on proximity and moment release. The response of the two 18-story buildings hypothetically located at each of the 636 sites under 3-component shaking from all 60 events is computed using 3-D nonlinear time-history analysis. Using these results, the probability of the structural response exceeding Immediate Occupancy (IO), Life-Safety (LS), and Collapse Prevention (CP) performance levels under San Andreas fault earthquakes over the next thirty years is evaluated.
Furthermore, the conditional and marginal probability distributions of peak ground velocity (PGV) and displacement (PGD) in Los Angeles and surrounding basins due to earthquakes occurring primarily on the mid-section of the southern San Andreas fault are determined using Bayesian model class identification. Simulated ground motions at sites within 55-75 km of the source from a suite of 60 earthquakes (Mw 6.0-8.0) rupturing primarily the mid-section of the San Andreas fault are considered for the PGV and PGD data.
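Distributing 30-year rupture probabilities over discrete scenarios and combining them with per-scenario exceedance probabilities, as described above, reduces to a simple probability union if scenario occurrences are treated as independent. A hedged sketch (the scenario probabilities and conditional exceedance values are invented placeholders, not results from the thesis):

```python
import numpy as np

rng = np.random.default_rng(1)
p_event = np.full(60, 0.002)              # hypothetical 30-yr scenario probabilities
p_cp_given = rng.uniform(0.0, 0.3, 60)    # hypothetical P(CP exceeded | scenario)

# 30-year probability of exceeding the Collapse Prevention level at one site,
# assuming the 60 scenario occurrences are independent:
p_cp_30yr = 1.0 - np.prod(1.0 - p_event * p_cp_given)
```

For small per-scenario probabilities this is close to the simple sum of the products, which is the union bound.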

Book Epistemic Uncertainty Quantification of Seismic Damage Assessment

Download or read book Epistemic Uncertainty Quantification of Seismic Damage Assessment written by Hesheng Tang and published by . This book was released on 2017 with total page pages. Available in PDF, EPUB and Kindle. Book excerpt: Damage-based structural seismic performance evaluations are widely used in seismic design and risk evaluation of civil facilities. Due to the large uncertainties rooted in this procedure, applying damage quantification results remains a challenge for researchers and engineers. Uncertainties in the damage assessment procedure are an important consideration in the performance evaluation and design of structures against earthquakes. Because the data and knowledge underlying modeling, simulation, and design are often incomplete, inaccurate, or unclear, a single framework (probability theory) is not sufficient to quantify uncertainty in a system. In this work, a methodology based on evidence theory is presented for quantifying the epistemic uncertainty of the damage assessment procedure. The proposed methodology is applied to a seismic damage assessment procedure while considering various sources of uncertainty emanating from experimental force-displacement data of a reinforced concrete column. To alleviate the computational difficulties in evidence theory-based uncertainty quantification (UQ) analysis, a differential evolution-based computational strategy for efficient calculation of the propagated belief structure is presented. Finally, a seismic damage assessment example is investigated to demonstrate the effectiveness of the proposed method.
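In evidence (Dempster-Shafer) theory, a belief structure brackets an event's probability between belief and plausibility rather than committing to a single value. A minimal sketch (the damage states and mass assignments are invented for illustration, not taken from the study):

```python
def belief(masses, a):
    """Bel(A): total mass of focal elements wholly contained in A."""
    return sum(m for fe, m in masses.items() if fe <= a)

def plausibility(masses, a):
    """Pl(A): total mass of focal elements that intersect A."""
    return sum(m for fe, m in masses.items() if fe & a)

# Hypothetical belief structure over damage states (mass assignments invented).
masses = {
    frozenset({"slight"}): 0.4,
    frozenset({"moderate"}): 0.3,
    frozenset({"moderate", "severe"}): 0.2,            # imprecise evidence
    frozenset({"slight", "moderate", "severe"}): 0.1,  # total ignorance
}
a = frozenset({"moderate", "severe"})
bel, pl = belief(masses, a), plausibility(masses, a)   # bracket: Bel <= P(A) <= Pl
```

The [Bel, Pl] interval is what the propagated belief structure in the study delivers; the differential-evolution strategy is about computing that propagation efficiently, which this toy example does not attempt.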

Book Grand Challenges in Earthquake Engineering Research

Download or read book Grand Challenges in Earthquake Engineering Research written by National Research Council and published by National Academies Press. This book was released on 2011-09-30 with total page 102 pages. Available in PDF, EPUB and Kindle. Book excerpt: As geological threats become more imminent, society must make a major commitment to increase the resilience of its communities, infrastructure, and citizens. Recent earthquakes in Japan, New Zealand, Haiti, and Chile provide stark reminders of the devastating impact major earthquakes have on the lives and economic stability of millions of people worldwide. The events in Haiti continue to show that poor planning and governance lead to long-term chaos, while nations like Chile demonstrate steady recovery due to modern earthquake planning and proper construction and mitigation activities. At the request of the National Science Foundation, the National Research Council hosted a two-day workshop to give members of the community an opportunity to identify "Grand Challenges" for earthquake engineering research that are needed to achieve an earthquake resilient society, as well as to describe networks of earthquake engineering experimental capabilities and cyberinfrastructure tools that could continue to address ongoing areas of concern. Grand Challenges in Earthquake Engineering Research: A Community Workshop Report explores the priorities and problems regions face in reducing consequent damage and spurring technological preparedness advances. Over the course of the Grand Challenges in Earthquake Engineering Research workshop, 13 grand challenge problems emerged and were summarized in terms of five overarching themes including: community resilience framework, decision making, simulation, mitigation, and design tools. Participants suggested 14 experimental facilities and cyberinfrastructure tools that would be needed to carry out testing, observations, and simulations, and to analyze the results. 
The report also reviews progressive steps that have been made in research and development, and considers what factors will accelerate transformative solutions.

Book Seismic Hazard Analysis

Download or read book Seismic Hazard Analysis written by Rodrigo Araya M. and published by . This book was released on 1988 with total page 172 pages. Available in PDF, EPUB and Kindle. Book excerpt:

Book Geostatistical and Network Analysis of Non-stationary Spatial Variation in Ground Motion Amplitudes

Download or read book Geostatistical and Network Analysis of Non-stationary Spatial Variation in Ground Motion Amplitudes written by Yilin Chen and published by . This book was released on 2021 with total page pages. Available in PDF, EPUB and Kindle. Book excerpt: When an earthquake causes shaking in a region, the amplitude of shaking varies spatially. Ground motion models have been developed to predict the median and standard deviation of ground motion intensity measures. However, the remaining variation in ground motion predictions ("residuals") is significant and shows spatial correlations at scales of tens of kilometers in separation distance. These correlations are important when assessing the risk to spatially distributed infrastructure or portfolios of properties. The state of the art today is to assume that these spatial correlations depend mainly on separation distance (the stationarity assumption). This dissertation aims to advance spatial correlation models of ground motions by conducting a comprehensive correlation study on various data sets, evaluating key assumptions of current models, and proposing a novel framework for modeling spatial correlations. First, this dissertation proposes a method of site-specific correlation estimation and techniques for quantifying non-stationary spatial variations. Applying these methods to various data sets, factors related to non-stationary spatial correlations are investigated. Using physics-based ground motion simulations, it studies the dependency of non-stationary spatial correlations on source effects, path effects, and relative location to the rupture. Using data from recent well-recorded earthquakes in New Zealand, it analyzes site-specific and region-specific correlations in ground motion amplitude for Wellington and Christchurch, and observes strong non-stationarity in spatial correlations. Results suggest that heterogeneous geologic conditions are associated with the non-stationary spatial correlations.
Second, this dissertation formulates a framework for detecting and modeling non-stationary correlations. By utilizing network analysis techniques, it proposes a community detection algorithm to find regions in spatial data with higher correlations. Applying this algorithm to physics-based ground motion simulations, it detects communities of earthquake stations with high correlation to uncover underlying reasons for non-stationarity in spatial correlations. Factors associated with the communities of high correlation are identified. Results suggest that communities of high correlation in ground shaking tend to be associated with common geological conditions and relative location along the rupture strike direction. In addition, it applies the algorithm to a mixed-source data set from the simulations, and compares correlation characteristics of simulations and instrumental data. Results suggest that the mixed-source data tend to average out the non-stationary influence of source and path effects from a single rupture. Finally, this dissertation presents a framework for quantifying uncertainty in the estimation of correlations, and true variability in correlations from earthquake to earthquake. A procedure for evaluating estimation uncertainty is proposed and used to evaluate several methods that have been used in past studies to estimate correlations. The proposed procedure is also used to distinguish between estimation uncertainty and the true variability in model parameters that exist in a given data set. Results suggest that a Weighted Least Squares fitting method is most effective for correlation model estimation. Fitted correlation model parameters are shown to have substantial estimation uncertainty even for well-recorded earthquakes, and underlying true variability is relatively stable among well-recorded and poorly recorded earthquakes.
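The stationary baseline that these analyses test against is commonly an exponential correlation model in separation distance. A small sketch (the 40 km range and the site layout are illustrative assumptions) of evaluating such a model and drawing correlated residuals from it:

```python
import numpy as np

def exp_correlation(h_km, range_km=40.0):
    """Stationary exponential model: rho(h) = exp(-3h / range)."""
    return np.exp(-3.0 * np.asarray(h_km, float) / range_km)

# Four sites along a 1-D transect (km); build the correlation matrix and draw
# one realization of spatially correlated within-event residuals.
sites = np.array([0.0, 10.0, 25.0, 60.0])
h = np.abs(sites[:, None] - sites[None, :])   # pairwise separation distances
C = exp_correlation(h)
L = np.linalg.cholesky(C + 1e-10 * np.eye(len(sites)))  # jitter for stability
residuals = L @ np.random.default_rng(2).standard_normal(len(sites))
```

Non-stationary models of the kind proposed in the dissertation replace the single distance-dependent rho(h) with correlations that also depend on location, geology, and position relative to the rupture.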

Book Calculating Catastrophe

Download or read book Calculating Catastrophe written by G. Woo and published by World Scientific. This book was released on 2011 with total page 368 pages. Available in PDF, EPUB and Kindle. Book excerpt: 1. Natural hazards. 1.1. Causation and association. 1.2. Extra-terrestrial hazards. 1.3. Meteorological hazards. 1.4. Geological hazards. 1.5. Geomorphic hazards. 1.6. Hydrological hazards -- 2. Societal hazards. 2.1. Political violence. 2.2. Infectious disease pandemics. 2.3. Industrial and transportation accidents. 2.4. Fraud catastrophe -- 3. A sense of scale. 3.1. Size scales of natural hazards. 3.2. Hazard spatial scales. 3.3. The human disaster toll. 3.4. Models of a fractal world -- 4. A measure of uncertainty. 4.1. The concept of probability. 4.2. The meaning of uncertainty. 4.3. Aleatory and epistemic uncertainty. 4.4. Probability ambiguity. 4.5. The weighing of evidence -- 5. A matter of time. 5.1. Temporal models of hazards. 5.2. Long-term data records. 5.3. Statistics of extremes -- 6. Catastrophe complexity. 6.1. Emergent catastrophes. 6.2. Financial crashes. 6.3. Ancillary hazards -- 7. Terrorism. 7.1. A thinking man's game. 7.2. Defeating terrorist networks. 7.3. Counter-radicalization -- 8. Forecasting. 8.1. Earthquake forecasting. 8.2. Verification. 8.3. River flows and sea waves. 8.4. Accelerating approach to criticality. 8.5. Evidence-based diagnosis -- 9. Disaster warning. 9.1. Decision in the balance. 9.2. Evacuation. 9.3. The wisdom of experts -- 10. Disaster scenarios. 10.1. Scenario simulation. 10.2. Footprints and vulnerability. 10.3. Fermi problems -- 11. Catastrophe cover. 11.1. Probable maximum loss. 11.2. Coherent risk measures. 11.3. The Samaritan's dilemma -- 12. Catastrophe risk securitization. 12.1. Catastrophe bonds. 12.2. The price of innovation -- 13. Risk horizons. 13.1. Ecological catastrophe. 13.2. Climate change. 13.3. War and conflict resolution

Book The Uniform California Earthquake Rupture Forecast, Version 2 (UCERF 2)

Download or read book The Uniform California Earthquake Rupture Forecast, Version 2 (UCERF 2) written by Working Group on California Earthquake Probabilities and published by . This book was released on 2008 with total page 108 pages. Available in PDF, EPUB and Kindle. Book excerpt: Accompanying CD-ROM has same title as book.