Session Abstracts:
Postdoctoral Scholar
Stanford University
From Uncertainty Toward Action: Climate Risk Modeling and Robust Decision-Making Under Uncertainty
Co-Authors: Jack W. Baker (Stanford University)
Abstract: Extreme weather events such as tropical cyclones, heat waves, or floods pose an imminent threat to the environmental and social systems exposed to these hazards. Climate change is expected to increase the frequency and intensity of many such events, worsening their potential impacts on society, from direct economic losses to human displacement. We need to understand and quantify these climate-related risks to improve the resilience of our societies in the face of extreme weather and climate change. However, these phenomena are complex, and there is often no consensus in the scientific community on how best to model them. Instead, we end up with a multitude of different models or model simulations representing the same system. These uncertainties pose challenges for risk modelers and model developers, who strive to refine and evaluate their models, as well as for decision-makers, who require reliable insights for planning and adaptation. My research focuses on quantifying regional economic losses and human displacement due to climate risks, with particular emphasis on uncertainty and sensitivity quantification. To ensure that these climate risk assessments are actionable, I incorporate frameworks for decision-making under uncertainty. This work aims to bridge the gap between the inherent uncertainties in climate risk assessments and the need for robust decision-making.
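As a toy illustration of how decisions can be made robust to disagreement across climate models, the sketch below applies a minimax-regret rule to a small ensemble of loss projections. The loss values, adaptation options, and the choice of criterion are illustrative assumptions, not the framework used in this research.

```python
# Minimal sketch of a robust decision rule over an ensemble of climate risk models.
# All numbers and option names below are invented for illustration.
import numpy as np

# losses[i, j]: projected loss under climate model/simulation i if option j is chosen
losses = np.array([
    [120.0,  90.0,  70.0],   # model 1
    [200.0, 140.0, 150.0],   # model 2
    [ 80.0,  85.0,  60.0],   # model 3
])
options = ["no action", "levee upgrade", "managed retreat"]

# Regret: extra loss relative to the best option under each model realization
regret = losses - losses.min(axis=1, keepdims=True)

# Minimax regret: pick the option whose worst-case regret across models is smallest
worst_case_regret = regret.max(axis=0)
robust_choice = options[int(np.argmin(worst_case_regret))]
print(worst_case_regret, robust_choice)
```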
Graduate Student Researcher
University of California, San Diego
Long Short-Term Memory Networks as Emulators for Nonlinear Structural Dynamic Finite Element (FE) Models
Co-Authors: Zhen Hu (University of Michigan-Dearborn) and Joel P. Conte (University of California, San Diego)
Abstract: Performance-based seismic design of structural systems relies on computationally expensive nonlinear time history analyses of high-fidelity FE models for predicting structural response to seismic excitation. For risk-based assessment, these FE simulations must be run thousands of times across different realizations of the uncertainty sources, making the computational cost significant and often prohibitive. Consequently, data-driven machine learning surrogate models have gained prominence as fast emulators for predicting structural responses in probabilistic analyses. This paper presents a comprehensive study of long short-term memory (LSTM) networks as global surrogate models for nonlinear dynamic systems, ranging from academic examples to real-world applications with varying degrees of material and geometric nonlinearities. While LSTM performs well as a surrogate model for academic (basic) examples, it struggles with the nonlinear dynamics of higher-dimensional, realistic FE models. To address this limitation, this paper introduces an enhanced LSTM model, Recursive Averaged Multistep Sequence-to-Sequence Forecasting (RAMuSS), which improves performance in several key aspects.
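For context, the sketch below shows a generic sequence-to-sequence LSTM surrogate (in PyTorch, with illustrative layer sizes) that maps a ground-acceleration history to response histories at a few DOFs. It is a baseline of the kind studied here, not the RAMuSS or dilated formulations introduced in this work.

```python
# Minimal sketch of a sequence-to-sequence LSTM surrogate for structural dynamics:
# input is a ground-acceleration history, output is the response history at a few DOFs.
# Layer sizes and variable names are illustrative, not the RAMuSS architecture.
import torch
import torch.nn as nn

class LSTMSurrogate(nn.Module):
    def __init__(self, n_outputs=3, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden,
                            num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, n_outputs)  # per-step map to response DOFs

    def forward(self, ground_accel):              # (batch, time, 1)
        features, _ = self.lstm(ground_accel)     # (batch, time, hidden)
        return self.head(features)                # (batch, time, n_outputs)

model = LSTMSurrogate()
accel = torch.randn(8, 2000, 1)                   # 8 records, 2000 time steps
response = model(accel)                           # predicted response histories
print(response.shape)                             # torch.Size([8, 2000, 3])
```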
Alongside RAMuSS, a dilation strategy for the LSTM is introduced to capture the varying memory lengths in nonlinear structural dynamic systems. We demonstrate the effectiveness and versatility of our methods across a range of applications, from basic benchmark examples to realistic FE models of real-world structures, including an experimentally validated FE model of a full-scale bridge column tested on a shake table.
Additionally, we introduce a new approach to selecting earthquake ground motion records with diverse key characteristics for the input-output training set, aiming to maximize the accuracy and reliability of predictions from the calibrated surrogate models. This is achieved by combining (i) a convolutional autoencoder (CAE) and (ii) an LSTM network employing the nonlinear autoregressive model with exogenous input (NARX) formulation.
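The abstract does not detail how record diversity is measured, so the sketch below illustrates one plausible reading only: encode candidate records with a 1-D convolutional autoencoder and greedily pick latent vectors that are far apart. The encoder layout and the max-min selection rule are assumptions for illustration, not the authors' procedure.

```python
# Sketch of selecting diverse ground motion records via a convolutional autoencoder
# (CAE) latent space. The encoder is untrained here (reconstruction training omitted);
# the greedy farthest-point rule is an assumed stand-in for a diversity criterion.
import torch
import torch.nn as nn

encoder = nn.Sequential(                     # 1-D CAE encoder (decoder omitted)
    nn.Conv1d(1, 8, kernel_size=16, stride=4), nn.ReLU(),
    nn.Conv1d(8, 16, kernel_size=16, stride=4), nn.ReLU(),
    nn.AdaptiveAvgPool1d(1), nn.Flatten(),   # -> 16-dimensional latent vector
)

records = torch.randn(500, 1, 4000)          # 500 candidate records, 4000 samples each
with torch.no_grad():
    z = encoder(records)                     # (500, 16) latent features

selected = [0]                               # greedy max-min (farthest-point) selection
for _ in range(49):                          # choose 50 diverse records in total
    d = torch.cdist(z, z[selected]).min(dim=1).values
    d[selected] = -1.0                       # never re-pick an already selected record
    selected.append(int(d.argmax()))
print(sorted(selected)[:10])
```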
Professor
University of Florence
Leveraging Multi-Source Data for Enhanced Flood Damage Modeling with Explicit Input Uncertainty Management
Co-Authors: Pradeep Acharya (University of L'Aquila), Daniela Molinari (Politecnico di Milano), and Anna Rita (University of L'Aquila)
Abstract: Accurate flood damage models are vital for supporting risk management decisions, but they rely heavily on detailed and reliable data on hazard, vulnerability, and exposure. Therefore, limited data availability, accessibility, and completeness can introduce substantial uncertainty into damage estimates.
This study introduces INSYDE 2.0, a multi-variable, physically based flood damage model for residential buildings and their contents, capable of handling the uncertainty arising from missing input data. Although originally developed and validated for Italy, the framework can be applied to other regions with appropriate modifications.
INSYDE 2.0 includes built-in functions that replace missing values with data sampled from probability distributions designed to account for the hazard and building characteristics of the specific area under investigation. These distributions are based on a combination of official data, numerical simulations, and virtual surveys of households listed on real estate platforms.
A key advantage of this probabilistic approach is the possibility of explicitly accounting for uncertainty in computed damage, offering more informative estimations for decision-makers. This contrasts with deterministic models, which can sometimes provide a false sense of precision by delivering a single, definitive output.
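To make the probabilistic handling of missing inputs concrete, the sketch below shows the general Monte Carlo idea: unknown building attributes are sampled from assumed distributions and propagated through a placeholder damage function, yielding a damage distribution instead of a single value. The distributions and the damage relation are invented for illustration and are not the INSYDE 2.0 functions.

```python
# Minimal sketch: propagate missing-input uncertainty to a flood damage estimate.
# Distributions, parameter values, and the damage relation are illustrative only.
import numpy as np

rng = np.random.default_rng(seed=1)
n_samples = 10_000

water_depth = 1.2                                               # observed hazard input [m]
footprint = rng.normal(110.0, 25.0, n_samples).clip(40, 300)    # missing: area [m2]
basement = rng.random(n_samples) < 0.35                         # missing: basement present?

def damage(depth, area, has_basement):
    """Placeholder stage-damage relation, not the INSYDE 2.0 functions."""
    base = 180.0 * area * min(depth, 3.0) / 3.0                 # structure + finishes [EUR]
    return base + np.where(has_basement, 0.2 * base, 0.0)

losses = damage(water_depth, footprint, basement)
print(f"mean EUR {losses.mean():,.0f}, 5-95% range "
      f"EUR {np.percentile(losses, 5):,.0f} - {np.percentile(losses, 95):,.0f}")
```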
This study was conducted within the RETURN Extended Partnership and received funding from the European Union Next-Generation EU (National Recovery and Resilience Plan – NRRP, Mission 4, Component 2, Investment 1.3 – D.D. 1243 2/8/2022, PE0000005).
Graduate Student Researcher
University of Michigan
Metamodeling of Dynamic Nonlinear Systems with Uncertainties Through Graph Neural Networks
Co-Authors: Seymour Spence (University of Michigan)
Abstract: In evaluating high-dimensional nonlinear dynamic structural systems under natural hazards, the incorporation of uncertainty is a challenging computational task. While neural networks have emerged as a promising metamodeling approach to address this challenge, significant hurdles remain. Studies have successfully incorporated uncertainties related to external loads from natural hazards, but few have simultaneously addressed model/parameter uncertainties within the structural system itself. Additionally, predicting the full system response across all degrees of freedom (DOF), rather than a limited subset, remains difficult. This study aims to introduce and validate a Graph Neural Network (GNN) framework combined with Long Short-Term Memory (LSTM) networks, capable of accurately modeling the time history response at all DOFs of structural systems while explicitly considering uncertainties related to both the loading and the system, namely material uncertainties. Specifically, the structural system is first represented as a graph, where structural joints serve as nodes and members as edges, with the properties of the structural members encoded. The GNN’s message-passing mechanism is then used to generate a graph-level representation of the system, which is subsequently combined with the uncertain loading in an LSTM network to learn the system’s dynamic behavior. The framework’s effectiveness is demonstrated through its application to a steel moment-resisting frame subjected to wind loads. The proposed framework exhibits high accuracy in simulating various responses of interest in the test dataset. Future research will extend the framework to the prediction of general systems that can be effectively represented as graph structures, such as power and water network systems.
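A minimal sketch of the graph-plus-sequence idea follows, assuming one round of message passing over joint features, mean pooling to a graph-level embedding, and an LSTM conditioned on that embedding alongside the load history. The dimensions, pooling choice, and single-layer design are illustrative assumptions, not the framework's actual architecture.

```python
# Minimal sketch: structural joints as nodes, members as edges, one message-passing
# step to a graph-level embedding, then an LSTM over the wind-load history.
# Sizes and names are illustrative assumptions.
import torch
import torch.nn as nn

class GraphLSTM(nn.Module):
    def __init__(self, node_dim=4, hidden=32, load_dim=1, n_out=3):
        super().__init__()
        self.msg = nn.Linear(node_dim, hidden)        # message from each neighbor
        self.upd = nn.Linear(node_dim + hidden, hidden)
        self.lstm = nn.LSTM(load_dim + hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_out)          # response at monitored DOFs

    def forward(self, x, adj, load):
        # x: (nodes, node_dim) member/material features at joints,
        # adj: (nodes, nodes) adjacency, load: (batch, time, load_dim) wind history
        m = adj @ torch.relu(self.msg(x))             # one message-passing step
        h = torch.relu(self.upd(torch.cat([x, m], dim=-1)))
        g = h.mean(dim=0)                             # graph-level embedding (hidden,)
        g_seq = g.expand(load.shape[0], load.shape[1], -1)
        out, _ = self.lstm(torch.cat([load, g_seq], dim=-1))
        return self.head(out)                         # (batch, time, n_out)

x = torch.randn(12, 4); adj = (torch.rand(12, 12) > 0.7).float()
model = GraphLSTM()
print(model(x, adj, torch.randn(2, 600, 1)).shape)    # torch.Size([2, 600, 3])
```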
Associate Professor
University of Michigan
Metamodeling of High-Dimensional Nonlinear Stochastic Systems through Autoencoders and LSTM Networks
Co-Authors: Liuyun Xu
Abstract: Performance quantification and probabilistic analysis of structural systems against severe natural hazards generally require repeated evaluation of high-dimensional nonlinear dynamic systems to adequately propagate uncertainties. Notwithstanding recent advancements in high-fidelity modeling techniques, this still leads to intractable computational problems. To address this challenge, metamodeling is a potential means of accelerating these evaluations. In this work, a metamodeling approach that combines deep learning with a nonlinear model order reduction technique is proposed for characterizing nonlinear system performance subject to general stochastic excitation. The high-dimensional system within the physical space is first reduced by means of autoencoders that infer a latent space from the output responses; the autoencoders serve as an invertible reduction basis for the nonlinear system. Subsequently, a long short-term memory (LSTM) deep learning network is trained to mimic the mapping from the excitation to the responses of the reduced model. To improve the efficiency of the autoencoder and LSTM networks, wavelet approximations of the excitation and the reduced output responses are incorporated. The potential of the metamodeling framework is illustrated through its application to a seismically excited steel building. Compared to reduction based on proper orthogonal decomposition, the autoencoder reduction scheme can capture essential nonlinear features and reconstruct the full space using lower-dimensional latent spaces. The calibrated metamodel is also demonstrated to possess remarkable accuracy in reproducing the highly nonlinear dynamic response of the application problem.
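A minimal sketch of the reduction-plus-sequence idea is given below, assuming a dense autoencoder over the DOF response vector and an LSTM that maps the excitation to latent-response histories. Dimensions are illustrative, and the wavelet approximation step mentioned above is omitted.

```python
# Minimal sketch: autoencoder compresses the full DOF response into a small latent
# vector; an LSTM maps the excitation history to the latent-response history; decoding
# recovers the full-space response. All sizes are illustrative assumptions.
import torch
import torch.nn as nn

n_dof, latent = 120, 6

encoder = nn.Sequential(nn.Linear(n_dof, 32), nn.Tanh(), nn.Linear(32, latent))
decoder = nn.Sequential(nn.Linear(latent, 32), nn.Tanh(), nn.Linear(32, n_dof))

lstm = nn.LSTM(input_size=1, hidden_size=64, batch_first=True)
to_latent = nn.Linear(64, latent)

def predict(excitation):                       # (batch, time, 1) ground acceleration
    h, _ = lstm(excitation)
    z = to_latent(h)                           # predicted latent response history
    return decoder(z)                          # (batch, time, n_dof) full-space response

# Training (not shown) would fit encoder/decoder on full-response snapshots, then fit
# the LSTM branch to reproduce the encoded (latent) trajectories; here we check shapes.
full_response = torch.randn(4, 1500, n_dof)    # placeholder training responses
target_latent = encoder(full_response)         # what the LSTM branch would learn to match
print(predict(torch.randn(4, 1500, 1)).shape)  # torch.Size([4, 1500, 120])
```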