NHERI Computational Symposium

May 28-29, 2026

Seismic hazard models and their influence on damage predictions

Session 3D: Brayton Community Room #162, 1:00 pm. Chair: Matthew DeJong

 


Mersad Fathizadeh

PhD Student, University of Arkansas

Computational Workflow for Batch Processing HVSR Data and Developing 3D Bedrock Surface Maps

Co-Authors: Clinton M. Wood (University of Arkansas), Hosna Kianfar (University of Arkansas), and Mohammadyar Rahimi (Keller North America)

Abstract: Microtremor horizontal-to-vertical spectral ratio (mHVSR) measurements provide a cost-effective approach for estimating fundamental site resonance frequencies, which can characterize sediment thickness. The technique records microtremors using a three-component seismometer and computes the ratio of horizontal to vertical spectral amplitudes, yielding the fundamental frequency f_0. This frequency is related to the thickness h of the layer above an impedance contrast (e.g., bedrock) and the average shear-wave velocity V_s through f_0 = V_s/(4h). This study presents a reproducible computational workflow for batch processing HVSR data that integrates spatial metadata with site-specific V_s information to rapidly generate 3D bedrock depth maps. The methodology employs automated and user-guided peak detection algorithms to process multiple HVSR records throughout a site. The workflow supports two complementary mapping scenarios: (1) when V_s is determined from site-specific measurements (e.g., surface-wave surveys), HVSR-derived frequencies are converted to bedrock depths and spatially interpolated; and (2) when borehole logs establish bedrock depth, HVSR peaks enable back-calculation of profile-averaged V_s values, which are then used to spatially interpolate bedrock depth. The resulting 3D bedrock surface maps offer a transparent, reproducible basis for mass excavation planning on civil engineering projects and for seismic microzonation studies, particularly in regions with sparse borehole coverage. This workflow demonstrates how integrating HVSR measurements with site-specific constraints yields robust subsurface maps, particularly for areas where terrain or surface conditions make typical geophysical and drilling methods challenging to conduct.
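The quarter-wavelength relation in the abstract lends itself to a direct computation. The sketch below (Python; function names and values are illustrative, not part of the authors' workflow) shows both mapping scenarios: converting f_0 to depth when V_s is known, and back-calculating V_s when borehole depth is known.

```python
import numpy as np

# Quarter-wavelength relation from the abstract: f0 = Vs / (4 h),
# so depth to the impedance contrast is h = Vs / (4 f0).
def bedrock_depth(f0_hz, vs_mps):
    """Depth to bedrock (m) from fundamental frequency f0 (Hz)
    and average shear-wave velocity Vs (m/s)."""
    return vs_mps / (4.0 * np.asarray(f0_hz))

# Second mapping scenario: borehole depth known, solve the same
# relation for the profile-averaged Vs.
def average_vs(f0_hz, depth_m):
    return 4.0 * np.asarray(depth_m) * np.asarray(f0_hz)

# Hypothetical values for illustration only.
print(bedrock_depth(2.5, 400.0))   # 40.0 (m)
print(average_vs(2.5, 40.0))       # 400.0 (m/s)
```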

Maijia Su

Postdoctoral Scholar, University of California, Berkeley

Site-based Stochastic Ground Motion Generation for Uncertainty Quantification in Earthquake Engineering

Co-Authors: Marco Broccardo (University of Trento) and Ziqi Wang (University of California Berkeley)

Abstract: Stochastic ground motion models (GMMs) are essential for representing seismic input for uncertainty quantification in earthquake engineering. This study presents a site-based stochastic GMM for simulating site-specific ground motion time series, in which source, path, and site effects are implicitly captured through statistical relationships derived from recorded seismic data. Specifically, a large set of recorded ground motions collected at multiple stations and across diverse earthquake scenarios is selected to represent future seismic input at a given site. This representation relies on a form of statistical equilibrium, whereby spatial variability in past recordings is treated as representative of the temporal variability that would occur at the site. Training the site-based stochastic GMM involves two steps. First, each selected record is fitted using an 11-parameter modulated filtered white noise model, enabling a compact yet flexible description of key temporal and spectral characteristics. Second, vector-based linear regression analysis (i.e., generalized vector ground motion prediction equations) is used to predict the stochastic model parameters, including Arias intensity, significant duration, predominant frequency, bandwidth measures, and parameters governing the temporal evolution of intensity. The regression analysis yields a conditional joint probability density function of the ground motion parameters, characterizing how their means, variabilities, and correlations evolve with changing earthquake scenarios. The resulting synthetic motions are well suited for computational simulation workflows in earthquake engineering, supporting uncertainty quantification and scenario-based analyses in cases where seismic data are lacking, particularly for large-magnitude events at small rupture distances.
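As a rough illustration of the modulated filtered white-noise idea, the sketch below passes white noise through a damped-oscillator filter (predominant frequency and bandwidth) and scales it by a time-modulating envelope. All parameter names and values are illustrative; the abstract's 11-parameter model is not reproduced here.

```python
import numpy as np

# Minimal sketch of a modulated, filtered white-noise ground motion.
rng = np.random.default_rng(1)
dt, n = 0.01, 2000
t = np.arange(n) * dt
wn = rng.standard_normal(n)          # white-noise core

# Filter: damped-oscillator impulse response controlling the
# predominant frequency and bandwidth (illustrative values).
fp, zeta = 2.0, 0.3                  # Hz, damping ratio
wp = 2.0 * np.pi * fp
h = np.exp(-zeta * wp * t) * np.sin(wp * np.sqrt(1 - zeta**2) * t)
x = np.convolve(wn, h)[:n] * dt      # filtered white noise

# Envelope: gamma-type modulation governing the temporal
# evolution of intensity.
env = (t / 5.0) ** 2 * np.exp(-t / 5.0)
acc = env * x / np.std(x)            # modulated, unit-variance core
```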

Sangwoo Kim

Researcher, Moody’s RMS

Machine Learning-Based Ground Motion Models for Shallow Crustal Earthquakes in California Using the NGA-West3 Database

Co-Authors: Yenan Cao (Moody’s RMS) and Emel Seyhan (Moody’s RMS)

Abstract: This study proposes machine learning (ML)-based ground motion models for California, utilizing the recently published NGA-West3 database applicable to active crustal regions. Two models are developed: the first is an ergodic model that uses input variables such as moment magnitude, Joyner-Boore distance, hypocentral depth, and Vs30 (the time-averaged shear-wave velocity in the top 30 meters of the site), while the second is a non-ergodic model that incorporates these parameters along with site- and event-specific coordinates (latitude and longitude). Both models are trained using an ensemble of 100 Artificial Neural Networks (ANNs) and 100 XGBoost models, employing stratified K-fold cross-validation to ensure robust performance. Aleatory and epistemic uncertainties are quantified using Gaussian negative log-likelihood (NLL) loss for the ANN models and quantile regression for the XGBoost models. The performance of these ML-based models is validated against NGA-West2 models in terms of median predictions and uncertainty estimates. Additionally, event-specific comparisons are conducted for independent post-NGA-West2 earthquakes, including the 2019 Ridgecrest Earthquake sequence and the 2014 South Napa Earthquake. Preliminary findings highlight the potential of ML-based approaches to enhance probabilistic seismic hazard assessment (PSHA) by integrating spatial and source-to-site dependencies, resulting in improved prediction accuracy.
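The Gaussian NLL loss mentioned in the abstract trains a network to predict both a median and an aleatory standard deviation; epistemic uncertainty then comes from the spread across ensemble members. A minimal NumPy sketch of both ideas (not the authors' implementation):

```python
import numpy as np

# Gaussian negative log-likelihood for a model that outputs a
# median (mu) and an aleatory standard deviation (sigma) per record.
def gaussian_nll(y, mu, sigma):
    sigma = np.maximum(sigma, 1e-6)      # keep sigma strictly positive
    return np.mean(0.5 * np.log(2.0 * np.pi * sigma**2)
                   + 0.5 * ((y - mu) / sigma) ** 2)

# Epistemic spread from an ensemble: mean and standard deviation
# of the member medians at each record.
def ensemble_uncertainty(member_mus):
    member_mus = np.asarray(member_mus)  # shape (n_members, n_records)
    return member_mus.mean(axis=0), member_mus.std(axis=0)
```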

Xiaolei Chu

PhD Student, UC Berkeley

The Foundation Model of Ground Motion Generation

Co-Authors: Maijia Su (UC Berkeley, Presenter) and Ziqi Wang (UC Berkeley)

Abstract: High-fidelity ground motion generation is a cornerstone of performance-based earthquake engineering, yet data scarcity and limited controllability hinder current methods. We propose a novel foundation model framework that integrates physical laws with generative artificial intelligence to synthesize realistic seismic records. First, we utilize physics-based simulations to generate a massive dataset, capturing fundamental wave propagation mechanics across diverse fault scenarios. We then pre-train a conditional diffusion model on this synthetic corpus, enabling the network to learn the underlying physics of ground motion. To bridge the simulation-to-reality gap, the model is subsequently fine-tuned on real-world strong motion recordings. Finally, to ensure the generated motions meet specific engineering requirements, we employ Group Relative Policy Optimization (GRPO) to align the model’s outputs with designer-specified spectral acceleration targets. This multi-stage approach yields a robust foundation model capable of producing physically consistent, realistic, and purpose-aligned ground motions, offering a transformative tool for seismic hazard analysis and structural design.

EE-UQ, R2D, DesignSafe HPC, RAPID

Maha Kenawy

Assistant Professor, Oklahoma State University

Optimization of Physics-Based Regional Earthquake Simulations using Sequential Bayesian Experimental Design

Co-Authors: Melisa Herrera (Oklahoma State University) and Malak Baya (Oklahoma State University)

Abstract: Regional-scale physics-based earthquake simulations have undergone significant advances over the past decade, creating new opportunities for engineering assessments of the consequences of large and rare earthquake events. However, because of their computational costs, representative earthquake simulation scenarios are typically selected by expert judgment, without considering the estimation of engineering risk measures and associated uncertainties. In this study, we address this problem by using concepts from the design of computer experiments to optimize the acquisition of data from budget-constrained sets of regional-scale simulations. We propose to guide the design of regional physics-based earthquake simulations using a sequential Bayesian experimental design technique, which optimizes the allocation of computational resources by sequentially identifying the most informative simulations to conduct. The sequential design method is driven by quantities of interest relevant to seismic risk assessment, such that the selected earthquake simulations directly reduce the uncertainty associated with those quantities. The proposed technique is used to analyze the seismic demands on building structures across the San Francisco Bay Area in California due to a magnitude 7.0 rupture on the Hayward fault, and demonstrates improved performance over conventional Monte Carlo sampling. This approach may reduce the computational costs associated with site-specific regional seismic risk assessment and improve the quantification of uncertainties associated with physics-based fault-rupture simulations.
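A toy version of the sequential acquisition idea can be sketched with a Gaussian-process surrogate and uncertainty sampling: at each step, the next "simulation" is run at the candidate scenario where the surrogate is most uncertain. Everything below (the stand-in quantity of interest, kernel, and length scale) is illustrative, not the authors' method:

```python
import numpy as np

# RBF kernel over 1D scenario parameters (illustrative length scale).
def rbf(a, b, ls=0.2):
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

# GP posterior variance at candidate points given training points
# (unit prior variance, small noise for numerical stability).
def gp_variance(x_train, x_cand, noise=1e-4):
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    k = rbf(x_cand, x_train)
    return 1.0 - np.sum(k @ np.linalg.inv(K) * k, axis=1)

cands = np.linspace(0.0, 1.0, 50)   # candidate rupture scenarios
x = np.array([0.0, 1.0])            # initial budget-constrained designs
for _ in range(5):                  # sequentially acquire 5 more runs
    var = gp_variance(x, cands)
    x = np.append(x, cands[np.argmax(var)])
print(np.sort(x))
```

In the authors' framework the acquisition is driven by risk quantities of interest rather than plain predictive variance, but the sequential select-run-update loop has the same shape.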

DesignSafe HPC

Amirreza Mohammadi (GSC Member)

PhD Student, University of Maryland

A Probabilistic Framework for Assessing Distributed Infrastructure Performance under Spatially Correlated Sequential and Concurrent Hazards

Co-Authors: Uichan Seok (Seoul National University) and Michelle Bensi (University of Maryland)

Abstract: Distributed infrastructure systems face risks from a range of natural hazards that can occur individually and in combination. Multi-hazard events may involve hazards that occur simultaneously or in close succession, such that the effects of one hazard remain unresolved when the next hazard strikes. Such overlapping or sequential events can cause individual component failures and lead to widespread system impacts, resulting in significant financial and functional losses. While probabilistic risk assessments have been widely used for individual hazards, integrated studies addressing inter-hazard relationships and combined risk modeling remain relatively limited.

This study introduces a modular probabilistic framework designed to assess the fragility of spatially distributed infrastructure systems under multi-hazard scenarios. The study uses seismic and wind hazards to illustrate the proposed framework, but it is intended to be more generally applicable. The framework leverages Bayesian Networks (BNs) to incorporate spatial correlations of hazard intensities (hazard module), quantify component-level vulnerabilities (fragility module), and evaluate system-level functionality (system performance module). This presentation focuses primarily on the development of hazard and fragility BN modules. The hazard module accounts for spatially correlated intensity measures of seismic and wind hazards, while the fragility module characterizes component response based on multiple hazard parameters. An illustrative system example is used to demonstrate how the framework models exposure to concurrent and sequential hazard scenarios and propagates these effects to estimate risk across the system.

By offering a unified, probabilistic approach to multi-hazard risk analysis, the proposed framework supports more informed decision-making for infrastructure resilience planning and emergency preparedness under multi-hazard scenarios.
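The modular hazard-fragility-system logic can be illustrated with a toy discrete example: a joint distribution over correlated hazard states at two sites (hazard module), per-state component failure probabilities (fragility module), and a series-system rule (system performance module). All probabilities below are hypothetical:

```python
import numpy as np

# Hazard module: joint distribution over discrete hazard states at
# two sites; mass on the diagonal encodes spatial correlation.
states = ["low", "high"]
p_joint = np.array([[0.5, 0.1],    # P(site1 state, site2 state)
                    [0.1, 0.3]])   # high-high likely to co-occur

# Fragility module: component failure probability per hazard state.
p_fail = {"low": 0.02, "high": 0.30}

# System module: series system fails if either component fails.
p_sys_fail = 0.0
for i, s1 in enumerate(states):
    for j, s2 in enumerate(states):
        p_survive = (1 - p_fail[s1]) * (1 - p_fail[s2])
        p_sys_fail += p_joint[i, j] * (1 - p_survive)
print(round(p_sys_fail, 4))
```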

Ioannis Vouvakis Manousakis

Practicing Engineer, UC Berkeley

Coupled seismic hazard and risk analysis for the treatment of time-dependent rupture effects

Co-Author: Dimitrios Konstantinidis (UC Berkeley)

Abstract: Most seismic risk evaluation applications are based on the assumption of time-independent earthquake occurrence. Although practical, this assumption has limitations. This study explores the use of stochastic Earthquake Rupture Catalogs (ERCs) generated by UCERF3-ETAS, an advanced earthquake rupture forecasting model for California, to conduct seismic hazard and risk analysis under a unified framework. We examine a three-story steel Special Moment-Resisting Frame (SMRF) archetype building with idealized damage consequences, and explore the impact of time-dependent effects on structural response and seismic loss exceedance rates. Our results show that spatiotemporal clustering can increase the mean annual loss rates associated with a particular asset across the entire spectrum of potential shaking intensities. We demonstrate a viable path towards accounting for these effects by tightly integrating advanced seismic hazard analysis with component-level damage and loss estimation. Our application demonstrates that such a setup is feasible and can offer valuable insight into how time-dependent effects may bias seismic risk estimates.
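The catalog-based loss exceedance computation can be sketched simply: with stochastic catalogs of fixed duration, the exceedance rate of a loss threshold is the mean number of exceedances per catalog divided by the catalog duration. The sketch below uses synthetic numbers, not UCERF3-ETAS output:

```python
import numpy as np

# Each inner list holds the losses from events in one simulated
# catalog of fixed duration (synthetic values, in $M).
def loss_exceedance_rate(catalog_losses, duration_yr, thresholds):
    rates = []
    for t in thresholds:
        counts = [np.sum(np.asarray(c) > t) for c in catalog_losses]
        rates.append(np.mean(counts) / duration_yr)
    return np.array(rates)

catalogs = [[0.2, 1.5], [0.7], []]     # three 1-year catalogs
print(loss_exceedance_rate(catalogs, 1.0, [0.5, 1.0]))
```

Because each catalog is a full spatiotemporal history, clustering effects (aftershock sequences raising loss counts) flow directly into the estimated rates.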

Pelicun, DesignSafe HPC