NHERI Computational Symposium

February 5-7, 2025

Social and Health Impacts of Natural Hazards

Poster Presentations


Godfred Ababio

Graduate Student Researcher
UCLA

Sociodemographic Disparities in the Risk Reduction Benefits Derived from Cripple Wall Retrofits in the City of Los Angeles

Co-Authors: Henry Burton (UCLA)

Abstract: The recent increase in earthquake damage and disruption globally underscores the need for a more comprehensive understanding of the human impacts of such disasters. This article explores the inventory-level enhancements provided by retrofitting unbolted, unbraced cripple wall buildings in the City of Los Angeles (LA) and investigates how the distribution of the benefits derived from these retrofits aligns with the sociodemographic characteristics of the affected neighborhoods. The study focused on approximately 16,000 single- and two-family residential buildings. The relevant data were gathered from multiple sources, including the LA Department of Building and Safety (LADBS) and the LA Open Data Portal (LAOPD).

The Regional Resilience Determination Tool (R2D) is used to simulate the impact of an M7.1 earthquake on the Puente Hills fault. We use 255 archetypes to represent the inventory and evaluate losses for both the retrofitted and non-retrofitted inventory. The archetypes and loss functions used for the study were developed as part of the PEER-CEA project that assessed the building-specific performance enhancements provided by the cripple wall retrofits. A combination of spatial and statistical approaches is implemented at the regional, neighborhood, and census tract scales. At each scale, the primary dependent variables are the number of retrofitted buildings normalized by the total number of pre-1980 one- and two-family residential buildings (the retrofit rate) and the benefit derived from the retrofit. The latter is measured as the reduction in earthquake-induced economic losses due to the structural enhancements provided by the retrofit.
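
A minimal sketch of how the two dependent variables could be tabulated at a chosen scale is given below; the building-inventory field names (tract, pre_1980, retrofitted, loss_asis, loss_retrofit) are hypothetical and not part of the R2D workflow.

    # Illustrative only: retrofit rate and retrofit benefit aggregated to census tracts.
    # All column names are assumptions; losses are the simulated economic losses per building.
    import pandas as pd

    def tract_metrics(buildings: pd.DataFrame) -> pd.DataFrame:
        """Aggregate building-level simulation results to census tracts."""
        grouped = buildings[buildings["pre_1980"]].groupby("tract")
        return pd.DataFrame({
            # Retrofit rate: retrofitted buildings / all pre-1980 one- and two-family buildings.
            "retrofit_rate": grouped["retrofitted"].mean(),
            # Benefit: reduction in earthquake-induced economic loss due to the retrofit.
            "benefit": grouped["loss_asis"].sum() - grouped["loss_retrofit"].sum(),
        })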

Amin Enderami

Postdoctoral Researcher
University of Kansas

Measuring Post-Disaster Accessibility to Essential Goods and Services: Proximity, Availability, Adequacy, and Acceptability Dimensions

Co-Authors: Elaina Sutley (University of Kansas), Jennifer Helgeson (National Institute of Standards and Technology), Leonardo Dueñas-Osorio (Rice University), Maria Watson (University of Florida), and John W. van de Lindt (Colorado State University)

Abstract: Rapid restoration of access to essential goods and services is vital for community recovery, yet defining and measuring access remains challenging. This study defines accessibility as the ability to use goods and services with reasonable effort and cost, evaluated across six dimensions: proximity, availability, adequacy, acceptability, affordability, and awareness. This paper introduces a spatio-temporal accessibility metric that combines four of these dimensions (proximity, acceptability, adequacy, and availability) while considering uncertainty and both user and provider perspectives. The metric's variation over time serves as a proxy for community recovery. The metric aligns with common engineering-oriented, functionality-based resilience frameworks because the functionality level of the providers is incorporated in its development. Operating at the household level, the metric calculates the ratio of post-disruption to pre-disruption access time, yielding a unitless ratio from zero (signifying that the product is no longer available with reasonable effort and cost) to one (indicating the same level of accessibility as pre-disruption). Scientifically principled yet practical, this metric is easily understood by non-experts and considers the user's perspective, which is often overlooked by common access metrics that focus merely on physical access and travel time. The metric is illustrated for schools and pharmacies using the Lumberton Testbed and data collected following the 2016 flood in Lumberton, North Carolina, after Hurricane Matthew. Findings offer new insights into recovery plan prioritization and can trigger protective actions. The paper concludes by discussing challenges in developing and validating accessibility metrics and highlights areas for future research.
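
To make the ratio concrete, a minimal household-level sketch is shown below; the function name, the handling of unreachable providers, and the inversion used to keep the value between zero and one are illustrative assumptions, not the paper's implementation.

    import math

    # Illustrative sketch of the accessibility ratio for one household and one service.
    # Assumes access times already reflect provider availability and functionality.
    def accessibility_ratio(pre_access_time: float, post_access_time: float) -> float:
        """Return a unitless value in [0, 1]: 1 = pre-disruption access, 0 = no reasonable access."""
        if math.isinf(post_access_time):
            return 0.0  # service no longer reachable with reasonable effort and cost
        return min(1.0, pre_access_time / post_access_time)

Tracking this value over time for each household yields the trajectory that serves as a proxy for community recovery.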

Pallab Mozumder

Associate Professor
Florida International University

Developing a Composite Index to Measure Health Disparity to Hydroclimatic Disasters in U.S. Coastal Areas

Co-Authors: Yu Han (University of Twente), Indranil Sengupta (FIU), Bafisa Halim (BU), and Shahnawaz Rafi (FIU)

Abstract: Hydroclimatic disasters (including hurricanes, storm surges, floods, and extreme heat events) significantly impact livelihoods and exacerbate health disparities in U.S. coastal regions. Vulnerable communities often face disproportionate risks due to limited access to essential services and facilities. Our objective is to develop a composite Social Disparity Index (SDI) to assess social and health disparities related to hydroclimatic disasters. Utilizing remote sensing datasets from Google Earth Engine, critical facility data, and U.S. Census information, we will construct a composite SDI based on accessibility measurements to essential facilities such as healthcare centers, gas stations, emergency shelters, and food supply locations. Factors including travel distance, transportation availability, and service capacity will be considered to quantify accessibility. We will integrate this SDI with field survey results from three major U.S. hurricanes (Harvey, Irma, and Maria, all of which made landfall in 2017, in Texas, Florida, and Puerto Rico, respectively) to validate our findings regarding the health disparities resulting from these disasters. This approach allows for the quantification of social and health disparities across diverse spatial and temporal dimensions, capturing dynamic vulnerability at the community level. The expected outcome is a composite index that reflects intra-community social disparities and can enhance the strategic deployment of public health resources to the most vulnerable populations. By identifying areas with significant health disparities, policymakers and emergency management agencies can prioritize interventions and allocate resources more effectively and equitably. This index can be useful in promoting equitable adaptive capacities in disaster-prone regions and strengthening community resilience against future hydroclimatic shocks.
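
A hedged sketch of how accessibility-related factors could be combined into a composite SDI per census tract follows; the min-max normalization, equal weighting, and column names are assumptions for illustration rather than the project's final design.

    # Illustrative composite Social Disparity Index (SDI) from accessibility-related factors.
    # Column names, normalization, and equal weighting are assumptions for this sketch.
    import pandas as pd

    FACTORS = ["travel_distance", "transit_unavailability", "capacity_shortfall"]

    def composite_sdi(tracts: pd.DataFrame, weights: dict | None = None) -> pd.Series:
        """Higher values indicate greater disparity in access to essential facilities."""
        weights = weights or {f: 1.0 / len(FACTORS) for f in FACTORS}
        spans = tracts[FACTORS].max() - tracts[FACTORS].min()
        normalized = (tracts[FACTORS] - tracts[FACTORS].min()) / spans
        return sum(weights[f] * normalized[f] for f in FACTORS)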

Yi Victor Wang

Assistant Professor
Massachusetts Maritime Academy

Simulation for Verifying Empirical Predictive Models for Social Vulnerability

Co-Authors: Seung Hee Kim (Chapman University) and Menas Kafatos (Chapman University)

Abstract: Previously, a methodological paradigm of empirical predictive modeling (EPM) was proposed for quantifying social vulnerability as the expected loss rate of a societal entity given its potential local experience of hazard strength, such as peak ground acceleration. With the EPM approach, machine learning (ML) models can be trained on historical data to express social vulnerability as a function of vulnerability indicators. Such an empirically derived social vulnerability can be directly used to predict specific future losses due to a hazard. Testing EPM models of social vulnerability experimentally is problematic, however, because field experiments on any societal system of humans are strictly forbidden. To bypass this restriction, this presentation introduces a Monte Carlo simulation-based approach to experimentally verify ML models of social vulnerability within the EPM paradigm. An eigenvalue-based method is adopted to simulate data on local experiences of hazard strengths as well as pre-event vulnerability indicators. With data on these input variables, true models are designed to generate output data on loss rates of an event. ML models are then trained on the simulated input and output data to predict social vulnerability. The results confirm the utility of the EPM approach, as the trained ML models show predictive performance highly similar to that of the true models, particularly for interpolation of a vulnerability curve. For extrapolation to large hazard strengths, however, simple models tend to perform better. Further sensitivity analyses suggest that at least hundreds of data points are needed for the EPM approach to work properly with real-world data.
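
A minimal Monte Carlo sketch of the verification idea appears below: simulate hazard strengths and vulnerability indicators, generate loss rates from a designed "true" model, train an ML model, and compare its predictions with the true model. The functional form of the true model, the learner, and the sample sizes are assumptions for illustration; the eigenvalue-based simulation of correlated indicators is omitted here.

    # Illustrative check of empirical predictive modeling (EPM) for social vulnerability.
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor

    rng = np.random.default_rng(0)
    n = 1000

    pga = rng.uniform(0.05, 1.0, n)          # simulated local hazard strength (e.g., PGA, g)
    indicators = rng.normal(size=(n, 3))     # simulated pre-event vulnerability indicators

    def true_model(pga, ind):
        """Designed 'true' vulnerability relationship: expected loss rate in [0, 1]."""
        logits = -3.0 + 4.0 * pga + 0.5 * ind[:, 0] - 0.3 * ind[:, 1]
        return 1.0 / (1.0 + np.exp(-logits))

    loss_rate = np.clip(true_model(pga, indicators) + rng.normal(0.0, 0.02, n), 0.0, 1.0)

    # Train an ML model on the simulated inputs and outputs (the EPM step).
    X = np.column_stack([pga, indicators])
    ml = GradientBoostingRegressor().fit(X, loss_rate)

    # Interpolation check: agreement between ML predictions and the true model on new draws.
    pga_new = rng.uniform(0.05, 1.0, 200)
    ind_new = rng.normal(size=(200, 3))
    pred = ml.predict(np.column_stack([pga_new, ind_new]))
    print(np.corrcoef(pred, true_model(pga_new, ind_new))[0, 1])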