NHERI Computational Symposium

February 5-7, 2025

Spotlight Presentations

Session 11


Lauren Mudd


Senior Engineer
Applied Research Associates

Abstract: Coming Soon

Saeed Moghimi


NOS Storm Surge Modeling Team Lead
NOAA

Success Stories from NSF Computing Infrastructure Supporting NOAA National Ocean Service Coastal Modeling Initiatives

Co-Authors: Tim Cockerill (TACC), Greg Seroka (NOAA), Ayumi Fujisaki-Manome (Univ. of Michigan), Ed Myers (NOAA), Shachak Pe'eri (NOAA), and Corey Allen (NOAA)

Abstract: Access to shared and collaborative high-performance computing (HPC) environments is essential for advancing NOAA’s scientific research and operational capabilities. However, heightened security requirements have limited NOAA’s internal HPC capacity for such collaboration. The support provided by the National Science Foundation’s DesignSafe initiative and the Texas Advanced Computing Center (TACC) has been pivotal in enabling NOAA’s National Ocean Service (NOS) to enhance its collaborative modeling capabilities. These initiatives have fostered partnerships across academic institutions, federal and local governments, and the private sector.

This presentation will showcase key successes from this enduring partnership, including NOAA Water Initiative projects, the development and pre-operational testing of the Surge and Tide Operational Forecast System (STOFS), and the UFS Coastal Application Team—a multi-university effort evaluating coastal models for diverse applications. Looking ahead, NOAA’s National Ocean Service seeks to broaden these collaborations through formal MOUs and agency-level partnerships, taking these successful initiatives to new heights.

Henry Burton

Associate Professor
UCLA

Benchmarking the Performance of Alternative Strategies for Gaining Computational Efficiency in Regional Seismic Risk and Resilience Assessments

Co-Authors: Laxman Dahal (Arup) and Kuanshi Zhong (University of Cincinnati)

Abstract: Regional-scale seismic performance assessments are useful in a range of application contexts. These different use cases, however, impose different computational demands. For instance, evaluating alternative recovery-based design solutions requires a building-specific approach because of the need to understand how changes in specific seismic design parameters affect performance. On the other hand, if the goal is to identify vulnerable neighborhoods to inform emergency preparedness programs, a building-specific approach may not be necessary. As such, a one-size-fits-all solution for regional seismic risk and resilience quantification is not advisable.

This study comparatively evaluates the performance of four strategies for gaining computational efficiency in regional risk and resilience assessment. The approaches vary in the level of fidelity and resolution used in the simulation. Fidelity describes the level of detail used to simulate the structural response and performance of a given asset (e.g., a 3D MDOF versus an SDOF structural model). Resolution refers to the extent to which spatially distributed hazards and assets are aggregated within the simulation environment. An inventory of over 15,000 single- and multi-family woodframe residences in the City of Los Angeles is used as the testbed. Regional seismic performance is quantified using the probabilistic distributions of economic losses and functional recovery times. The four approaches are benchmarked against the results from a “high-fidelity”, “high-resolution” simulation-based assessment that uses 3D nonlinear structural modeling, component-level damage evaluation, and site-specific hazard and building performance characterization. Both scenario- and stochastic event set-based regional assessments are considered in the benchmark performance evaluations.
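
To make the fidelity/resolution trade-off concrete, the minimal sketch below contrasts a building-specific loss calculation with one in which assets are aggregated into coarse spatial groups before evaluation. The synthetic inventory, the lognormal vulnerability curve, and all parameter values are illustrative assumptions for exposition only; they are not the models, data, or code used in the study.

import numpy as np
from scipy.stats import lognorm

rng = np.random.default_rng(0)

# Synthetic inventory (hypothetical): a ground-motion intensity measure
# (e.g., spectral acceleration in g) and a replacement cost for 1,000
# dwellings, each assigned to one of 20 coarse spatial groups ("tracts").
n = 1_000
im = rng.lognormal(mean=np.log(0.4), sigma=0.5, size=n)
cost = rng.uniform(2e5, 8e5, size=n)
tract = rng.integers(0, 20, size=n)

def loss_ratio(im_value, median=0.8, beta=0.6):
    """Illustrative lognormal vulnerability curve: mean damage ratio vs. IM."""
    return lognorm.cdf(im_value, s=beta, scale=median)

# Higher resolution: evaluate every building at its own site.
loss_building = float(np.sum(loss_ratio(im) * cost))

# Lower resolution: evaluate once per tract, using the tract-mean intensity
# and the tract's total exposure.
loss_tract = 0.0
for t in np.unique(tract):
    mask = tract == t
    loss_tract += loss_ratio(im[mask].mean()) * cost[mask].sum()

print(f"Building-specific loss estimate: {loss_building / 1e6:.1f} M$")
print(f"Tract-aggregated loss estimate:  {loss_tract / 1e6:.1f} M$")

Because the vulnerability curve is nonlinear in the intensity measure, the two resolutions generally yield different totals; quantifying such discrepancies against a high-fidelity, high-resolution baseline is the kind of benchmarking the study describes.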