NHERI Computational Symposium

May 28-29, 2026

Simulation of wind and water hazards

Session 8D: Brayton Community Room #162, 9:50am. Host: Seymour M.J. Spence

 


Daniel Padron


PhD Student, University of Hawaii at Manoa

Site-specific Joint Depth–Velocity Tsunami Hazard Accounting for Spatial, Phase, and Magnitude Dependence

Co-Author: Mohammad Alam (University of Hawaii at Manoa)

Abstract:

Developing site-specific joint hazards for vector-valued tsunami intensity measures (IMs) is critical for accurate prediction of tsunami-induced structural risk and loss. This study presents a probabilistic framework for developing site-specific joint tsunami hazards of inundation depth and flow velocity using numerical tsunami simulations of full-rupture scenarios of the Cascadia Subduction Zone. Copula-based models are employed to quantify the dependence between depth and velocity, capturing its variation across space, flow phase, and earthquake magnitude. Joint depth-velocity distributions are developed using multiple copula families and appropriate marginal IM densities, with optimal copula model selection guided by log-likelihood, the Akaike Information Criterion (AIC), and the Bayesian Information Criterion (BIC). Joint mean annual rates (MARs) of IMs are evaluated for flow speeds (neglecting flow direction), run-up, and drawdown at multiple cross-shore and along-shore locations with varying topographic characteristics. Results reveal substantial spatial variability in joint MARs and non-monotonic dependence of IMs across flow phases and tsunami magnitudes, underscoring the need to account for these effects explicitly in site-specific joint tsunami hazard development and in reliable tsunami risk and loss estimation for distributed civil infrastructure.
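The copula-selection step described in the abstract can be sketched minimally as follows. This is an illustrative example, not the authors' code: it fits only the one-parameter Clayton family (one of several families the study compares) by maximum likelihood on rank-based pseudo-observations of synthetic, hypothetical depth-velocity data, and reports the log-likelihood, AIC, and BIC used for model selection.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import rankdata

def clayton_loglik(theta, u, v):
    """Log-likelihood of the Clayton copula density at pseudo-observations (u, v)."""
    return np.sum(
        np.log1p(theta)
        - (1.0 + theta) * (np.log(u) + np.log(v))
        - (2.0 + 1.0 / theta) * np.log(u ** -theta + v ** -theta - 1.0)
    )

def fit_clayton(depth, velocity):
    """Fit the Clayton dependence parameter by MLE; report log-likelihood, AIC, BIC."""
    n = len(depth)
    # Rank-based pseudo-observations map the marginals onto (0, 1).
    u = rankdata(depth) / (n + 1.0)
    v = rankdata(velocity) / (n + 1.0)
    res = minimize_scalar(lambda t: -clayton_loglik(t, u, v),
                          bounds=(1e-3, 20.0), method="bounded")
    ll, k = -res.fun, 1  # one copula parameter
    aic = 2 * k - 2 * ll
    bic = k * np.log(n) - 2 * ll
    return res.x, ll, aic, bic

# Hypothetical positively dependent depth/velocity sample, for illustration only.
rng = np.random.default_rng(0)
z = rng.normal(size=(500, 2)) @ np.linalg.cholesky([[1.0, 0.7], [0.7, 1.0]]).T
theta, ll, aic, bic = fit_clayton(z[:, 0], z[:, 1])
print(theta, aic, bic)
```

In practice the same log-likelihood/AIC/BIC comparison would be repeated over the candidate families (e.g., Gaussian, Frank, Gumbel) and the family with the best criterion retained at each site, phase, and magnitude bin.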

Data Depot, DesignSafe HPC

Benjamin Tsai


Assistant Professor, Oregon State University

Large Eddy Simulation of Wave-Backwash Interactions and the Effects on Morphological Change

Abstract: Wave-backwash interactions play a crucial role in the morphological change of beaches. These interactions occur when wave runup meets the preceding swash event, leading to intense mixing and significant erosion. A 3D large eddy simulation coupled with a free-surface tracking scheme was used to simulate the cross-shore hydrodynamics of wave-backwash interactions observed in a large wave flume experiment. The primary objective was to improve understanding of wave-backwash interactions and their effects on the observed morphodynamics. Two simulation cases were carried out to elucidate key processes of wave-backwash interactions across two distinct stages of the early portion of a modeled storm event: berm erosion and sandbar formation. The major difference between the two cases was the bathymetry: one featured a berm without a sandbar (Case I), and the other a sandbar without a berm (Case II), at similar water depth. Good agreement between simulations and measurements in free-surface elevation, wave spectra, and flow velocities (overall Willmott's index of agreement greater than 0.8) validated the model skill. The findings indicated that the bottom shear stress was significant in both cases, potentially contributing to substantial sediment transport. Notably, intense wave-backwash interactions occurred more frequently in the absence of a sandbar. These intense interactions produced a pronounced horizontal pressure gradient exceeding the criterion for momentary bed failure. These insights are pivotal for understanding the mechanisms underlying berm erosion and how sandbar formation protects against further beach erosion.
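The validation metric cited in the abstract, Willmott's index of agreement, is straightforward to compute. A minimal sketch, using a synthetic free-surface signal in place of the flume data (the sine signals below are illustrative, not the experiment's measurements):

```python
import numpy as np

def willmott_d(obs, pred):
    """Willmott's index of agreement: 1 is perfect agreement, 0 is no skill."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    obar = obs.mean()
    num = np.sum((obs - pred) ** 2)
    den = np.sum((np.abs(pred - obar) + np.abs(obs - obar)) ** 2)
    return 1.0 - num / den

# Synthetic check: a prediction with small error scores near 1.
t = np.linspace(0, 4 * np.pi, 200)
eta_obs = np.sin(t)                         # stand-in "measured" elevation
eta_sim = np.sin(t) + 0.05 * np.cos(3 * t)  # stand-in "simulated" elevation
print(willmott_d(eta_obs, eta_sim))
```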

Yucheng Zhang


PhD Student, UT Austin

Learning the Exact Flux: Neural Riemann Solvers for Hyperbolic Conservation Laws

Co-Authors: Chayanon (Namo) Wichitrnithed (UT Austin), Shukai Cai (UT Austin), Sourav Dutta (UT Austin), and Clint Dawson (UT Austin)

Abstract: Riemann solvers are widely used to compute numerical fluxes in a variety of schemes, including finite volume and discontinuous Galerkin methods, for solving hyperbolic conservation laws. These laws, including the Shallow Water and Euler equations, underpin large-scale simulations in natural hazard research, such as storm surge, tsunami, and extreme atmospheric events. Accurately capturing the intercell fluxes is essential for resolving the nonlinear wave interactions that govern these events.

The exact Riemann solver faithfully reflects the underlying physics but requires solving a nonlinear root-finding problem that is prohibitively expensive for large-scale models. For this reason, most large-scale simulations employ approximate solvers that trade accuracy for speed. This work presents a hard-physics-constrained neural exact Riemann solver that learns the exact flux function for hyperbolic systems. The model is designed to preserve key physical properties, ensuring that the learned flux remains physically meaningful and stable when integrated into a Godunov-type scheme.

We show that, when integrated into a finite volume scheme, the neural solver produces solutions that closely match the exact Riemann solution on classical test cases for the Shallow Water and Euler equations. At the same time, it reduces the computational cost of exact flux evaluations, achieving runtimes comparable to approximate Riemann solvers such as Lax-Friedrichs. These results demonstrate the potential of machine-learned flux functions to enhance the accuracy, efficiency, and scalability of large-scale natural hazard simulations.
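To make the role of the numerical flux concrete, here is a minimal sketch of the baseline the abstract mentions: a first-order finite volume update for the 1D shallow water equations using the (local) Lax-Friedrichs flux, on a standard dam-break test. This is the approximate-solver baseline, not the neural solver itself; the neural approach would replace `lf_flux` with a learned approximation of the exact flux. All parameter values are illustrative.

```python
import numpy as np

G = 9.81  # gravitational acceleration

def swe_flux(q):
    """Physical flux F(q) for the 1D shallow water equations, q = (h, hu)."""
    h, hu = q
    return np.array([hu, hu * hu / h + 0.5 * G * h * h])

def lf_flux(ql, qr):
    """Local Lax-Friedrichs numerical flux between left/right cell states."""
    a = max(abs(ql[1] / ql[0]) + np.sqrt(G * ql[0]),
            abs(qr[1] / qr[0]) + np.sqrt(G * qr[0]))
    return 0.5 * (swe_flux(ql) + swe_flux(qr)) - 0.5 * a * (qr - ql)

def step(q, dx, dt):
    """One first-order Godunov-type finite volume update (interior cells only)."""
    qn = q.copy()
    for i in range(1, q.shape[1] - 1):
        fr = lf_flux(q[:, i], q[:, i + 1])
        fl = lf_flux(q[:, i - 1], q[:, i])
        qn[:, i] = q[:, i] - dt / dx * (fr - fl)
    return qn

# Dam-break initial condition: h = 2 on the left half, h = 1 on the right.
nx, dx = 200, 1.0 / 200
q = np.zeros((2, nx))
q[0] = np.where(np.arange(nx) < nx // 2, 2.0, 1.0)
mass0 = q[0].sum() * dx
for _ in range(50):
    q = step(q, dx, dt=0.5 * dx / np.sqrt(G * 2.0))  # CFL-limited time step
print(q[0].min(), q[0].sum() * dx)
```

Because the update is in conservation form, total water mass is preserved while the dam-break wave evolves, which is the structural property a learned flux must also respect inside a Godunov-type scheme.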

DesignSafe HPC

Max Beeman


PhD Student, Stanford University

Numerical simulations of wind loading in the presence of waves

Co-Authors: Hanul Hwang (Stanford University), Jianyu Wang (Stanford University), and Catherine Gorle (Stanford University)

Abstract:

When modeling wind loads on coastal buildings, the tropical cyclone (TC) wind flow approaching a structure is typically conditioned over regularly spaced roughness elements, in both numerical simulations and physical experiments. This does not accurately represent the complex environment during TCs, when wind, waves, and storm surge occur simultaneously. Under these conditions, phenomena such as flow separation over wave crests, periodic momentum transfer between wind and waves, and injection of turbulence into the wind by wave breaking are expected to occur. The interactions of these flow physics could substantially alter the expected wind loading on coastal structures.

The aim of this work is to investigate the suitability of various Large Eddy Simulation (LES) approaches for modeling hurricane wind loading in the presence of waves. We compare a two-way coupled interface modeling approach (a two-phase phase-field method) with a one-way coupled approach (such as an immersed boundary method, in which the waves influence the wind but the wind has no effect on the waves) to evaluate how each captures the incoming wind profile and the loading on a sample house structure. We make this comparison under both old-wave (low wind speed relative to wave speed) and young-wave (higher wind speed, more TC-like) conditions to identify the advantages and disadvantages of each modeling approach across a range of wind-wave interaction regimes.

Haifeng Wang


Assistant Professor, Washington State University

Location-Conditioned HyperNetwork Modeling for Hurricane Track Simulation

Abstract: Accurate simulation of hurricane tracks is essential for regional hazard assessment and resilience analysis, yet hurricane motion exhibits strong spatial nonstationarity due to changes in large-scale atmospheric conditions, land interaction, and regional flow regimes. Conventional data-driven track models typically rely on fixed parameters, limiting their ability to adapt dynamically as storms move across different geographic regions. This study presents a HyperNetwork-based modeling framework for hurricane track simulation that explicitly accounts for location-dependent behavior.

In the proposed approach, a HyperNetwork conditions model parameters on storm location and environmental context, enabling continuous adaptation of track dynamics as the hurricane evolves. Rather than employing a single globally stationary trajectory model, the framework generates location-conditioned parameters that allow the model to transition smoothly between distinct dynamical regimes, such as open-ocean motion, coastal approach, and land interaction. This design captures spatial variability in hurricane behavior while maintaining a unified computational representation.
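The conditioning mechanism described above can be sketched in a few lines. This toy example is not the authors' model: weights are random rather than trained, and the state and network sizes are hypothetical. It only illustrates the core idea that a hypernetwork maps a conditioning input (here, storm location) to the parameters of the target track model, so those parameters vary continuously as the storm moves.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypernetwork weights (random here; trained on historical tracks in practice).
W1 = rng.normal(0, 0.1, (16, 2))          # hidden layer; input = (lat, lon)
b1 = np.zeros(16)
W2 = rng.normal(0, 0.1, (4 * 2 + 2, 16))  # outputs target-model weights + bias
b2 = np.zeros(4 * 2 + 2)

def hypernet(location):
    """Generate location-conditioned parameters of the target track model."""
    h = np.tanh(W1 @ location + b1)
    p = W2 @ h + b2
    A = p[:8].reshape(2, 4)  # maps state (lat, lon, dlat, dlon) -> displacement
    b = p[8:]
    return A, b

def track_step(state, location):
    """Advance the track one step with parameters generated for this location."""
    A, b = hypernet(location)
    return A @ state + b

state = np.array([25.0, -70.0, 0.1, -0.2])  # hypothetical (lat, lon, dlat, dlon)
dxy = track_step(state, state[:2])
print(dxy)
```

Because the target model's parameters are a smooth function of location, the framework can interpolate between regimes (open ocean, coastal approach, landfall) without switching between separately fitted models.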

The HyperNetwork-driven model supports efficient generation of hurricane track ensembles, making it suitable for uncertainty-aware hazard simulation and downstream impact assessment workflows. Case studies using historical hurricane data demonstrate that the proposed framework reproduces key statistical characteristics of observed tracks while offering improved flexibility compared to fixed-parameter models.

By introducing spatially adaptive model behavior through HyperNetworks, this work advances computational approaches for hurricane track simulation and provides a scalable tool for regional hazard analysis. The proposed framework aligns with NHERI and SimCenter objectives by supporting data-driven, uncertainty-aware simulation of natural hazards and their impacts on the built environment.

Teng Wu


Professor, University at Buffalo

Knowledge-Enhanced Autoencoder for High-Resolution Downscaling of Hurricane Precipitation

Abstract: Operational forecasts of high-resolution hurricane precipitation are critical for pre-disaster emergency preparedness and response. However, the accuracy of current operational forecasting systems (e.g., the Integrated Forecast System of the European Centre for Medium-Range Weather Forecasts or the Global Forecast System of the National Oceanic and Atmospheric Administration) is limited by their coarse spatial resolution (~0.25 degree), which cannot resolve important fine-scale structures of hurricane eyewalls and rainbands. While dynamical downscaling through numerical weather prediction (NWP) models shows great promise for improving accuracy, its high computational cost makes real-time ensemble forecasting impractical. Traditional statistical downscaling methods offer efficiency but rely on predefined input-output relationships that fail to capture the complex nonlinearities of hurricane dynamics and thermodynamics. To address this challenge, a specially designed autoencoder (consisting of an encoder and a decoder) is used in this study to integrate multi-resolution (coarse, medium, and high) data from the nested grids of an NWP model into the training process. Specifically, the coarse-, medium-, and high-resolution portions of the symmetric autoencoder are trained sequentially to fully leverage the multi-resolution NWP data for network regularization. The resulting decoder is then used to map low-resolution hurricane precipitation fields to high-resolution ones. Domain knowledge, in the form of the water vapor conservation equation, is incorporated into the loss function to further improve data efficiency. A case study suggests that the developed knowledge-enhanced autoencoder can efficiently and accurately generate high-resolution (~2 km) hurricane precipitation fields from low-resolution (~0.25 degree) global weather forecasts.

Data Depot

Yanmo Weng


Postdoctoral Scholar, Rice University

A Hybrid Physics-based and Machine Learning Approach for AI-generated Tropical Cyclone Rainfall Prediction

Co-Author: Avantika Gori (Rice University)

Abstract: Tropical cyclones (TCs), with their high-speed winds, storm surges, and heavy rainfall, are extreme weather events that severely impact coastal areas across the US and globally. Accurately predicting TC-induced precipitation is essential for understanding flood risk and guiding emergency response and infrastructure planning. Owing to their low computational cost, machine learning (ML) models have become a popular approach for TC rainfall prediction. However, common ML models often yield deterministic predictions and operate as black-box systems, neglecting the underlying physics; this lack of physical interpretability undermines the trustworthiness of their predictions. To address this, the authors adopt a Bayesian Neural Network (BNN) framework, which estimates not only the mean rainfall but also the associated uncertainty through variance outputs. Building on the BNN, the authors integrate a reduced physics-based model (TCRM). By coupling the BNN with TCRM, the resulting TCR-BNN model imposes physics-informed constraints on the learning process, improving both prediction accuracy and interpretability. Leveraging the strengths of both models, the proposed TCR-BNN framework offers a trustworthy, uncertainty-aware approach to TC rainfall prediction, supporting more reliable hazard estimation and future climate adaptation strategies. This work provides a valuable tool for the extreme-climate research community in advancing risk-informed decision making under uncertainty.
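The "mean plus variance" output described in the abstract is commonly trained with a heteroscedastic Gaussian negative log-likelihood. A minimal sketch with hypothetical rainfall numbers (this illustrates the loss mechanism only, not the authors' TCR-BNN architecture):

```python
import numpy as np

def gaussian_nll(y, mean, log_var):
    """Heteroscedastic Gaussian negative log-likelihood (up to a constant).
    Minimizing this trains a network to output both a predictive mean and a
    predictive variance, which is how variance-output models quantify uncertainty."""
    return np.mean(0.5 * (log_var + (y - mean) ** 2 * np.exp(-log_var)))

y = np.array([3.0, 5.0, 4.0])     # hypothetical observed rainfall (mm/h)
mean = np.array([2.8, 5.2, 4.1])  # hypothetical predicted mean
# Small predicted variance fits these small errors; a huge variance is penalized.
good = gaussian_nll(y, mean, np.log(np.full(3, 0.05)))
bad = gaussian_nll(y, mean, np.log(np.full(3, 100.0)))
print(good, bad)
```

The log-variance term penalizes both overconfidence (small variance with large errors) and underconfidence (inflated variance), so the learned uncertainty is calibrated against the data.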

Data Depot, DesignSafe HPC

Kristen Blowes


Postdoctoral Scholar, University of Colorado Boulder

Future wind risk using nonstationary probability models

Co-Author: Abbie Liel (CU Boulder)

Abstract: Design standards in the United States currently rely on historical data to determine design loads for buildings, potentially leaving buildings vulnerable to evolving climate hazards. The ASCE 7 Subcommittee for Future Conditions for Environmental Hazards recently proposed using climate projections to determine design wind speeds in ASCE 7-28. This study explores the use of nonstationary probability models to simulate maximum yearly wind speeds from 2030 to 2080 and examines how the failure probability is expected to change over this period. We then compare the future performance of buildings designed for wind speeds based on historical data with that of buildings designed using climate projections for 10 US cities, highlighting the potential underestimation of risk when relying solely on historical data. Lastly, we assess the change in design wind speed needed to meet structural safety targets. Overall, this research underscores the limitations of assuming climate stationarity when ensuring long-term building performance.
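A nonstationary extreme-value model of the kind described can be sketched with a Gumbel distribution whose location parameter drifts in time. All parameter values below (`mu0`, `trend`, `beta`, the 60 m/s design speed) are hypothetical placeholders, not the study's fitted values; the example only shows how a drifting distribution makes the yearly and cumulative exceedance probabilities grow over 2030-2080.

```python
import numpy as np
from scipy.stats import gumbel_r

def exceedance_prob(v_design, year, mu0=40.0, trend=0.05, beta=4.0):
    """Yearly probability that the annual maximum wind speed exceeds v_design,
    under a Gumbel model whose location parameter drifts linearly in time.
    mu0, trend, and beta (m/s) are illustrative, not fitted, values."""
    mu_t = mu0 + trend * (year - 2030)
    return gumbel_r.sf(v_design, loc=mu_t, scale=beta)

years = np.arange(2030, 2081)
p = exceedance_prob(60.0, years)
# Probability of at least one exceedance over the 50-year window,
# assuming independence between years:
p_any = 1.0 - np.prod(1.0 - p)
print(p[0], p[-1], p_any)
```

Under stationarity the yearly exceedance probability would be constant at its 2030 value; the drifting location parameter is what produces the growing failure probability the abstract examines.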

Data Depot