NHERI Computational Symposium

February 1-2, 2024

Presenters: Session 2


Neetesh Sharma

Postdoctoral Scholar
Stanford University

Presentation Title: Optimal scenario selection for probabilistic multi-hazard analyses

Co-Author: Jack Baker

Abstract: Regional multi-hazard risk analysis requires evaluating the likelihood and socioeconomic impacts of combined hazard events. Accounting for the uncertainty in future hazard characteristics requires evaluating a large number of events. However, assessments of infrastructure functionality loss and the ensuing socioeconomic impact are computationally intensive, making brute-force simulation approaches impractical. Current practices for selecting representative hazard scenarios primarily focus on the individual marginal distributions of hazards, disregarding their relative impacts on the region of interest and the correlations between distinct intensity measures across locations. This oversight may underestimate the composite risks in specific areas and lead to sub-optimal allocation of mitigation resources. Our work introduces a novel framework for optimal multi-hazard scenario selection that addresses the intricacies of multivariate intensity measure distributions, vector intensity measures, and data non-Gaussianity. We present algorithmic tools, including a customized objective function and selection algorithm, that accommodate dimension reduction, proxy metrics, and customizable weights. Additionally, we developed an open-source Python package that allows users to implement the proposed methodology. A case study highlights the approach's effectiveness in evaluating combined flood and earthquake risk in a region based on an optimally selected suite of scenarios. By incorporating spatial and inter-intensity-measure correlations, our methodology advances holistic risk analysis, informing resilience strategies that consider interconnected hazards, infrastructure, and regional responses. This work enhances understanding of multi-hazard risks, empowering comprehensive risk management.
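To make the scenario-selection idea concrete, the sketch below greedily picks a small suite of events and re-weights them so that the weighted mean and covariance of site intensity measures approximate those of the full event set. This is an illustrative toy, not the authors' released Python package: the data shapes, discrepancy metric, weighting, and greedy search are all assumptions made for illustration.

```python
# Illustrative scenario-selection sketch (assumed data and metric, not the actual package)
import numpy as np

rng = np.random.default_rng(0)
n_events, n_sites = 500, 40
im_maps = rng.lognormal(mean=0.0, sigma=0.5, size=(n_events, n_sites))  # IM per event/site
rates = np.full(n_events, 1.0 / n_events)                               # occurrence rates

# Targets from the full stochastic event set
full_mean = rates @ im_maps / rates.sum()
full_cov = np.cov(im_maps, rowvar=False, aweights=rates)

def suite_error(selected):
    """Discrepancy between the full event set and the re-weighted selected suite."""
    w = rates[selected] / rates[selected].sum()
    err = np.linalg.norm(full_mean - w @ im_maps[selected])
    if len(selected) > 1:  # covariance term needs at least two scenarios
        sel_cov = np.cov(im_maps[selected], rowvar=False, aweights=w)
        err += 0.1 * np.linalg.norm(full_cov - sel_cov)
    return err

selected = []
for _ in range(20):  # greedily add 20 scenarios, one at a time
    errors = {c: suite_error(selected + [c]) for c in range(n_events) if c not in selected}
    selected.append(min(errors, key=errors.get))

print(f"Selected scenarios: {sorted(selected)}")
print(f"Final discrepancy: {suite_error(selected):.3f}")
```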

Pouria Kourehpaz

PhD Student
University of British Columbia

Presentation Title: How important are parameter choices in seismic loss and recovery time estimation?

Co-Authors: Carlos Molina Hutt; Adam Zsarnóczay

Abstract: Seismic risk quantification in buildings involves substantial uncertainty. This study employs a probabilistic sensitivity analysis to investigate the impact of uncertain model parameters on earthquake-induced economic loss and recovery time estimates. We adopt a variance-based sensitivity analysis procedure to compute the relative importance of model parameters, in terms of the Sobol index, for two seismic loss measures, i.e., the probability of irreparable damage and the mean repair cost, and two recovery measures, i.e., downtime to re-occupancy and downtime to functional recovery. The assessments are conducted on modern reinforced concrete shear wall building archetypes ranging from 8 to 16 stories subjected to ground motion shaking in Seattle with return periods of 100, 475, 975, and 2475 years. Results indicate that the modeling uncertainty associated with the simulated demand distribution has the most significant impact on the variance in seismic losses at all but the 2475-year hazard level. Additionally, the findings suggest that at low hazard levels, i.e., 100-year and 475-year, the uncertainties in the capacity of structural components (e.g., slab-column connections) and nonstructural components (e.g., elevators) are the main contributors to variance in downtime to re-occupancy and downtime to functional recovery, respectively. At the 2475-year intensity level, the uncertainty in building replacement cost or replacement time becomes the primary driver of variance in the seismic loss and recovery time outputs due to the high probability of irreparable damage. The analyses presented in this study offer valuable insights into which parameters deserve the most attention when conducting seismic loss and post-earthquake recovery assessments.
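As a pointer to the underlying procedure, the sketch below estimates first-order Sobol indices for a toy loss model using a Saltelli-type pick-freeze estimator. The model, parameter names, and sample sizes are placeholders rather than the study's actual assessment pipeline.

```python
# Minimal first-order Sobol index sketch (toy model, assumed parameter names)
import numpy as np

rng = np.random.default_rng(1)
n = 50_000
param_names = ["demand_model", "component_capacity", "replacement_cost"]

def loss_model(x):
    """Toy stand-in for a loss/recovery-time simulator with three uncertain inputs."""
    return 2.0 * x[:, 0] + x[:, 1] ** 2 + 0.5 * x[:, 0] * x[:, 2]

A = rng.standard_normal((n, 3))            # two independent sample matrices
B = rng.standard_normal((n, 3))
y_A, y_B = loss_model(A), loss_model(B)
var_y = np.var(np.concatenate([y_A, y_B]))

for i, name in enumerate(param_names):
    AB_i = A.copy()
    AB_i[:, i] = B[:, i]                   # re-sample only parameter i
    y_ABi = loss_model(AB_i)
    # First-order index: share of output variance explained by parameter i alone
    S_i = np.mean(y_B * (y_ABi - y_A)) / var_y
    print(f"First-order Sobol index, {name}: {S_i:.2f}")
```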

Jianhua Xian

Postdoctoral Scholar
University of California, Berkeley

Presentation Title: Physics and data co-driven surrogate modeling for high-dimensional rare event simulation

Co-Author: Ziqi Wang

Abstract: In this talk, we present a physics and data co-driven surrogate modeling framework for the efficient rare event simulation of mechanical and civil systems with high-dimensional input uncertainties. The framework embodies a fusion of interpretable low-fidelity physical models with data-driven error corrections. The overarching hypothesis is that a well-designed low-fidelity physical model can extract and preserve the fundamental features of the original high-fidelity model, while machine learning techniques can fill the remaining gaps between the high- and low-fidelity models. The coupled physics-data-driven surrogate model is adaptively trained using active learning, aiming to maximize the correlation between the high- and low-fidelity model responses in the parametric region critical to a rare event. Owing to the strong correlation between the well-trained surrogate and the original model, an importance sampling step is finally introduced to drive the probability estimates toward theoretically guaranteed solutions. Numerical examples of static and dynamic problems with high-dimensional input uncertainties (up to 1,000 input random variables) are studied to demonstrate the proposed framework.
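The final importance-sampling step can be illustrated schematically: once a surrogate has located the critical region (here, an assumed design point in standard-normal space), samples are drawn around it, the original limit state is evaluated, and the rare-event probability is recovered by re-weighting. The limit-state function, design point, and sample sizes below are illustrative assumptions, not the authors' implementation.

```python
# Schematic surrogate-informed importance sampling (assumed limit state and design point)
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
d = 100                                    # high-dimensional standard-normal input

def g_true(u):
    """Stand-in for the expensive high-fidelity limit state (failure when g <= 0)."""
    return 3.5 - u.sum(axis=1) / np.sqrt(u.shape[1])

# Suppose the trained physics+data surrogate has located this approximate design point
# (the most probable failure point in standard-normal space).
u_star = 3.45 * np.ones(d) / np.sqrt(d)

# Importance sampling: sample around u_star, evaluate the ORIGINAL model, re-weight
n = 5_000
u = u_star + rng.standard_normal((n, d))
log_w = (stats.multivariate_normal.logpdf(u, mean=np.zeros(d))
         - stats.multivariate_normal.logpdf(u, mean=u_star))
pf = np.mean((g_true(u) <= 0) * np.exp(log_w))
print(f"IS estimate of failure probability: {pf:.2e} (exact: {stats.norm.cdf(-3.5):.2e})")
```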

Arthriya Subgranon

Assistant Professor
University of Florida

Presentation Title: Uncertainty quantification of wind-tunnel-informed translation models for simulation of non-Gaussian stochastic wind pressures on buildings

Co-Authors: Thays G.A. Duarte; Srinivasan Arunachalam; Arthriya Subgranon; Seymour M.J. Spence

Abstract: Methods that effectively simulate realizations of multivariate wind processes are essential for the probabilistic assessment of building systems subject to wind excitation. In particular, stochastic wind models based on the translation process can generate non-Gaussian stationary wind processes that exactly match the target marginal distribution. The translation process can be integrated with the proper orthogonal decomposition (POD) method to reduce computational cost through mode truncation while preserving adequate accuracy. Recent studies have explored the use of experimental wind tunnel data to calibrate such a stochastic wind model, as it can capture complex building-specific aerodynamic phenomena for general wind directions and building geometries. However, there is still a lack of guidance and information on the potential errors associated with calibrating the model with typical short-duration wind tunnel records. To quantify such errors, an extensive experimental campaign was conducted at the NHERI Boundary Layer Wind Tunnel at the University of Florida, considering wind tunnel records of different lengths, wind directions, and surrounding configurations. Errors associated with wind tunnel variability, the calibration process, the numerical model, and mode truncation were evaluated. The results provide the requirements and guidance needed to confidently use wind tunnel data to calibrate the data-informed stochastic wind model, which is a new addition to the SimCenter tools.
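For readers unfamiliar with translation models, the following minimal sketch shows the core mapping: a stationary Gaussian process is passed through the standard normal CDF and the inverse CDF of a target non-Gaussian marginal, so the simulated pressure coefficients match that marginal exactly. The spectrum, target distribution, and parameters are assumed for illustration; the wind-tunnel-calibrated, POD-based model discussed in the talk is considerably more involved.

```python
# Minimal translation-process sketch (assumed spectrum and target marginal)
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n, dt = 2**14, 0.01                          # samples and time step (s)

# 1) Stationary Gaussian process via spectral synthesis, normalized to zero mean, unit variance
freqs = np.fft.rfftfreq(n, dt)
S = 1.0 / (1.0 + (freqs / 2.0) ** 2)         # assumed one-sided spectrum shape
df = freqs[1] - freqs[0]
amp = np.sqrt(S * df / 2) * (rng.standard_normal(freqs.size)
                             + 1j * rng.standard_normal(freqs.size))
amp[0] = 0.0
x = np.fft.irfft(amp, n=n) * n
x = (x - x.mean()) / x.std()

# 2) Translate: map through the Gaussian CDF and the inverse CDF of the target marginal
target = stats.gumbel_l(loc=-0.8, scale=0.25)  # assumed suction-pressure-like marginal
cp = target.ppf(stats.norm.cdf(x))             # non-Gaussian process with exact marginal

print(f"Cp mean = {cp.mean():.2f}, skewness = {stats.skew(cp):.2f}, "
      f"kurtosis = {stats.kurtosis(cp):.2f}")
```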

Ahsan Kareem

Professor
University of Notre Dame

Presentation Title: Multi-scale simulation of typhoon wind field at building scale utilizing mesoscale model with nested large eddy simulation

Co-Authors: Mingfeng Huang; Sunce Liao; Wenjuan Lou; Wei Lin

Abstract: Based on the mesoscale Weather Research and Forecasting (WRF) model and large eddy simulation (LES), a coupled numerical simulation framework has been developed to resolve typhoon wind fields around urban building blocks. The holistic multi-scale simulation spans four scales: macroscale (~1000 km), mesoscale (~100 km), microscale (~1 km), and building scale (~0.1-100 m). By interactively sharing meteorological data among the different scales, this simulation framework addresses the challenge of downscaling typhoon effects in urban environments. Typhoon weather reanalysis and urban wind field simulation were conducted with the WRF model using nested computational domains that incorporate high-resolution topography data. The wind profiles generated by the WRF model, with augmented turbulent fluctuations, were applied at the interface as the inlet boundary condition for the downscaled LES model of the urban building blocks. Wind velocity and pressure fields around the K11 building in Hong Kong during Typhoon Kammuri were simulated with the proposed multi-scale framework, and the simulation results were compared to meteorological observations and full-scale measurements at the top of the K11 building. The interaction of the wind field with the building clusters, resulting in vortex shedding, flow separation, and channeling effects, has been resolved in detail. Probability distributions and higher-order statistics, i.e., skewness and kurtosis, are presented to highlight the non-Gaussian features of the simulated local wind pressures on the K11 building. The wind-induced vibration of the K11 building simulated using this framework was analyzed and compared to full-scale measurement data.
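The downscaling handoff between the mesoscale and LES domains can be pictured with the simple sketch below, in which a coarse WRF-like mean wind profile is interpolated to a fine inlet grid and augmented with synthetic turbulent fluctuations at a target intensity. The profile values, turbulence intensity, and grids are placeholders, not output of the actual WRF/LES framework.

```python
# Illustrative mesoscale-to-LES inlet handoff (assumed profile, intensity, and grids)
import numpy as np

rng = np.random.default_rng(4)

# Coarse mesoscale profile: height (m) vs. mean wind speed (m/s)
z_wrf = np.array([10., 50., 100., 200., 400., 800.])
U_wrf = np.array([18., 25., 29., 33., 37., 40.])

# Fine LES inlet grid down to building scale
z_les = np.linspace(5.0, 600.0, 120)
U_mean = np.interp(z_les, z_wrf, U_wrf)      # interpolated mean profile

# Augment with synthetic fluctuations (here simple Gaussian noise at 15% intensity;
# practical inflow generators also enforce target spectra and spatial coherence)
Iu, n_steps = 0.15, 2000
u_inlet = (U_mean[None, :]
           + rng.standard_normal((n_steps, z_les.size)) * (Iu * U_mean)[None, :])

k100 = np.argmin(np.abs(z_les - 100.0))
print(f"Inlet mean wind speed at z = 100 m: {u_inlet[:, k100].mean():.1f} m/s")
```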

Justin Bonus

Postdoctoral Scholar
University of California, Berkeley

Presentation Title: Bringing Disney-esque Approaches to Tsunamis and Storm-Surge Design / Uncertainty Quantification via the NHERI SimCenter's HydroUQ

Co-Authors: Pedro Arduino; Mike Motley; Frank McKenna; Marc Eberhard

Abstract: Tsunamis, storm surges, landslides, and avalanches driving debris fields are computationally complex, coinciding hazards. Yet, in recent 3D animated films these events are rendered to an incredible visual benchmark at computational scales exceeding what most engineers can match. Disney’s Frozen and Moana, in an effort to tell resonating stories with simulated snow and sand media (history-dependent, topology-changing, large-deformation, and nonlinear), championed an interesting although expensive numerical tool: the Material Point Method (MPM). Their software optimizations bypassed a computational barrier that had limited engineering use. Just as animators use engineering techniques to improve physical accuracy, we show that engineers can adopt the visual storytelling of 3D artists and the optimized codes of computer graphics developers. Doing so accelerates hazard simulations by 10-100x, grows their scale by 2-10x, and may amplify the public impact of our numerical work. At the NHERI SimCenter, we are introducing vetted, Disney-esque approaches into HydroUQ to aid students and researchers in pursuit of effective yet intuitive natural hazard simulations. HydroUQ, a user-facing portal for uncertainty quantification (UQ) in waterborne events, provides an abstracted GUI to wield a multi-physics coupling of Claymore MPM, OpenFOAM CFD, OpenSees FEA, and GeoClaw SWE. Scaling across multi-CPU and multi-GPU supercomputers (TACC Frontera, Lonestar6), HydroUQ is suited for both quick results and massive performance-based UQ workflows. The tool is validated against analytical examples and open-source/DesignSafe datasets from wave flumes at Oregon State University, Waseda University, and the University of Washington.
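For readers new to MPM, the toy sketch below shows the particle-to-grid and grid-to-particle transfers at the heart of the method in one dimension, with stresses and constitutive updates omitted. It is only a conceptual illustration; production codes such as Claymore implement these transfers with GPU-optimized kernels, and all names and parameters here are assumptions.

```python
# Toy 1D MPM transfer cycle: particles scatter mass/momentum to a grid, the grid applies
# gravity, and velocities are gathered back (PIC update). No stresses or constitutive
# model are included; this is only a conceptual illustration.
import numpy as np

n_cells, dx, dt, g = 20, 0.05, 1e-4, -9.81
x_p = np.linspace(0.2, 0.4, 50)          # particle positions (m)
v_p = np.zeros_like(x_p)                 # particle velocities (m/s)
m_p = np.full_like(x_p, 1e-3)            # particle masses (kg)

def linear_weights(xp):
    """Linear (tent) shape-function weights for the two nodes bracketing xp."""
    i = int(xp / dx)
    frac = xp / dx - i
    return (i, 1.0 - frac), (i + 1, frac)

for step in range(100):
    m_g = np.zeros(n_cells + 1)          # nodal mass
    p_g = np.zeros(n_cells + 1)          # nodal momentum
    # Particle-to-grid (P2G): scatter mass and momentum
    for xp, vp, mp in zip(x_p, v_p, m_p):
        for i, w in linear_weights(xp):
            m_g[i] += w * mp
            p_g[i] += w * mp * vp
    # Grid update: nodal velocities plus gravity on active nodes
    active = m_g > 0
    v_g = np.zeros_like(m_g)
    v_g[active] = p_g[active] / m_g[active] + dt * g
    # Grid-to-particle (G2P): gather velocities and advect particles
    for k, xp in enumerate(x_p):
        v_new = sum(w * v_g[i] for i, w in linear_weights(xp))
        v_p[k] = v_new
        x_p[k] += dt * v_new

print(f"Mean particle velocity after {100 * dt:.3f} s: {v_p.mean():.3f} m/s")
```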

Rachel Hamburger

PhD Student
University of Notre Dame

Presentation Title: A unifying framework and a shared model library for hurricane wind damage and loss simulation

Co-Authors: Adam Zsarnóczay; Tracy Kijewski-Correa

Abstract: Increasingly severe and frequent hurricane events coupled with rising urbanization have led to rapid risk escalation and catastrophic losses in coastal regions. Reliable simulation of hurricane-induced damage and losses is essential to inform community-based mitigation strategies, assess the efficacy of coastal codes and construction practices, and provide public risk information to citizens and stakeholders. While conventional models are available in open-source software (e.g., Hazus, the Florida Public Hurricane Loss Model), these approaches were designed and are used for large-scale, low-resolution studies. The more advanced models, and the datasets used to calibrate them, are often part of proprietary systems that restrict public access to important know-how. There are many cases of research progressing beyond these conventional models, but the organic evolution of this area has led to several co-existing state-of-the-art techniques without a common vocabulary and framework to describe and compare them. It is often unclear where approaches overlap and how an existing model could be leveraged in new analyses. This study presents a unifying framework for hurricane catastrophe modeling that facilitates community-wide discovery of gaps, needs, and capabilities, and accelerates future research in this area. Through an extensive review of the relevant literature, existing methods were mapped into a cohesive schema with a clearly defined vernacular that aids data discovery, information sharing, and future expansion. Methods and model parameters were integrated into the SimCenter's computational simulation ecosystem using its Pelicun framework and have been added to its publicly available Damage and Loss Database.

Francisco A. Galvis

Project Engineer
Thornton Tomasetti

Presentation Title: Using Functional Recovery Simulations to Inform Stakeholder Decisions

Co-Authors: Barbara Gao; Julio Tupayachi; Julie Pietrzak; David Ojala; Andrew Altamirano

Abstract: Since the NIST-FEMA Report to Congress on Functional Recovery was issued in 2021, private and public institutions have become more interested in understanding the potential impacts of natural disasters on their ability to resume operations after an acute event. Recognizing this need, the resilience practice at Thornton Tomasetti has been using state-of-the-art functional recovery simulations, leveraging SimCenter tools, to facilitate decision-making in the prioritization of investments to enhance the resilience of assets exposed to earthquakes and coastal flooding. This presentation discusses two recent case studies, of a new and an existing building, to highlight three major lessons learned in the effort to translate complex simulations into useful decision-making tools on real projects. The first lesson is that it is imperative to conduct a resilience workshop with all the stakeholders at the beginning of the project. This workshop should focus on understanding the stakeholders' risk appetite, their constraints, and their requirements for function. The second lesson is that loss metrics by themselves are of little use for decision-making; a menu of incremental strategies that progressively reduce those losses, along with their associated costs, is the most useful result. Lastly, we found that informative cost-benefit analyses that facilitate decision-making look different for each stakeholder and each project type. Some stakeholders care most about the absolute cost of the enhancements, others about the relative cost of the enhancements compared to a code-minimum investment, and others need a service-life cash-flow analysis that considers insurance costs and discount rates.
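As a hypothetical illustration of the service-life cash-flow comparison mentioned in the third lesson, the sketch below compares the discounted cost of a code-minimum design with that of an enhanced design using assumed expected annual losses, insurance premiums, and a discount rate; all dollar figures and rates are placeholders.

```python
# Hypothetical service-life cash-flow comparison (all figures and rates assumed)

def npv_of_costs(upfront, annual_loss, annual_insurance, rate=0.05, years=50):
    """Present value of upfront cost plus discounted annual losses and premiums."""
    annual = annual_loss + annual_insurance
    pv_annual = sum(annual / (1 + rate) ** t for t in range(1, years + 1))
    return upfront + pv_annual

code_minimum = npv_of_costs(upfront=0.0,   annual_loss=250_000, annual_insurance=120_000)
enhanced     = npv_of_costs(upfront=2.0e6, annual_loss=60_000,  annual_insurance=70_000)

print(f"Code-minimum 50-yr cost:    ${code_minimum / 1e6:.1f}M")
print(f"Enhanced design 50-yr cost: ${enhanced / 1e6:.1f}M")
print(f"Net benefit of enhancement: ${(code_minimum - enhanced) / 1e6:.1f}M")
```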

Barbara Gao

Project Engineer
Thornton Tomasetti

Presentation Title: Using Functional Recovery Simulations to Inform Stakeholder Decisions

Co-Authors: Francisco Galvis; Julio Tupayachi; Julie Pietrzak; David Ojala; Andrew Altamirano

Abstract: Since the NIST-FEMA Report to Congress on Functional Recovery was issued in 2021, private and public institutions have become more interested in understanding the potential impacts of natural disasters on their ability to resume operations after an acute event. Recognizing this need, the resilience practice at Thornton Tomasetti has been using state-of-the-art functional recovery simulations, leveraging SimCenter tools, to facilitate decision-making in the prioritization of investments to enhance the resilience of assets exposed to earthquakes and coastal flooding. This presentation discusses two recent case studies, of a new and an existing building, to highlight three major lessons learned in the effort to translate complex simulations into useful decision-making tools on real projects. The first lesson is that it is imperative to conduct a resilience workshop with all the stakeholders at the beginning of the project. This workshop should focus on understanding the stakeholders' risk appetite, their constraints, and their requirements for function. The second lesson is that loss metrics by themselves are of little use for decision-making; a menu of incremental strategies that progressively reduce those losses, along with their associated costs, is the most useful result. Lastly, we found that informative cost-benefit analyses that facilitate decision-making look different for each stakeholder and each project type. Some stakeholders care most about the absolute cost of the enhancements, others about the relative cost of the enhancements compared to a code-minimum investment, and others need a service-life cash-flow analysis that considers insurance costs and discount rates.