NHERI Computational Symposium

February 5-7, 2025

Lifeline systems: From inventory to recovery

Session 10C Chair: Matthew DeJong

 


Neetesh Sharma

Assistant Professor
Florida A&M University and Florida State University

Generating Synthetic Power Distribution Inventory for Regional Risk Assessment

Abstract: Infrastructure inventories are a fundamental prerequisite for conducting regional risk assessments, with power infrastructure inventories being among the most critical. However, the availability of detailed power distribution data is often limited due to security concerns and data ownership by private companies. While some techniques for generating synthetic transmission-level power infrastructure data are available in the literature, there is a significant gap in modeling distribution networks or feeder lines, which are crucial for building-level risk assessments, particularly in the face of hazards such as wildfires and storms. This research addresses this gap by combining confidential distribution-level data with statistical models that leverage publicly accessible datasets—such as structure density, transportation networks, demographic data, and other open resources—to predict the locations of feeder lines. By integrating these predictions with network analytics, we can create synthetic topologies that statistically represent actual infrastructure without compromising data security or proprietary interests. These synthetic power inventories provide a valuable resource for testing regional resilience methodologies, estimating economic losses, and assessing building functionality and community resilience, particularly in areas with data scarcity. This approach not only offers a secure method for developing comprehensive power distribution models but also enhances the ability to conduct robust regional risk assessments and improve community resilience against various hazards.
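
A minimal sketch of the kind of two-step approach the abstract describes: classify which road segments are likely to host distribution feeders from publicly available features, then assemble the predicted segments into a synthetic topology. The classifier, feature set, and threshold below are illustrative assumptions, not the authors' actual model.

```python
# Minimal sketch (not the authors' model): classify which road segments are
# likely to host distribution feeders, then assemble a synthetic topology.
import networkx as nx
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical per-segment features: [structure density, road class, population density]
X_train = np.array([[120, 2, 800], [5, 4, 30], [300, 1, 2500], [15, 3, 90]])
y_train = np.array([1, 0, 1, 0])  # 1 = segment observed to carry a feeder (confidential training data)

clf = LogisticRegression().fit(X_train, y_train)

# Score candidate segments from public data and keep the likely feeder corridors.
road_graph = nx.Graph()
road_graph.add_edge("A", "B", features=[200, 2, 1500])
road_graph.add_edge("B", "C", features=[10, 4, 40])

feeder_graph = nx.Graph()
for u, v, data in road_graph.edges(data=True):
    p = clf.predict_proba([data["features"]])[0, 1]
    if p > 0.5:  # illustrative threshold
        feeder_graph.add_edge(u, v, feeder_prob=p)

print(feeder_graph.edges(data=True))
```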

Emma Russin

Graduate Student Researcher
University of Kansas

Synthesizing drinking water distribution networks based on readily available data 

Co-Authors: Edward Peltier (University of Kansas) and Justin M. Hutchison (University of Kansas)

Abstract: Healthy communities require the delivery of safe and clean drinking water. Many utilities use open-source software like EPANET to create models of their systems; however, EPANET requires high-resolution data that may not be readily available to all communities. To address this lack of high-resolution data, we have developed a methodology that synthesizes a drinking water distribution network using readily available data.

The main assumption underlying this methodology is that water distribution pipes run parallel to road networks. Road network information was collected from OpenStreetMap and converted to links and nodes that represent the distribution system. Additional inputs to the model included elevation data, population data, and commercial data. A Python architecture was created to integrate these data sources with EPANET. The initial methodology was developed and tested using an artificial community (CLARC). Currently, the method is being used to evaluate three systems in western Kansas (Garden City, Dodge City, and Liberal). The final model can simulate the system's demands, water age, pressure, velocity, and flow rate. Water quality parameters such as residual disinfectant concentration can be simulated with the use of the Multi-Species eXtension (MSX).
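
A minimal sketch, under assumed tooling, of the road-to-pipe idea described above: intersections become junctions, road segments become pipes, and the result is run as an EPANET model. The open-source WNTR package, the placeholder diameters and demands, and the stand-in source node are assumptions for illustration; the project's actual Python architecture is not reproduced here.

```python
# Minimal sketch (assumed tooling, not the project's actual pipeline):
# treat each road segment as a candidate pipe and each intersection as a junction,
# then build an EPANET-compatible model with the open-source WNTR package.
import wntr

# Toy "road network": intersections with elevation (m) and estimated demand (m^3/s)
# derived from population/commercial data; in practice these would come from
# OpenStreetMap, a DEM, and census/business datasets.
intersections = {
    "J1": {"elevation": 230.0, "demand": 0.002, "xy": (0, 0)},
    "J2": {"elevation": 228.0, "demand": 0.003, "xy": (500, 0)},
    "J3": {"elevation": 226.0, "demand": 0.001, "xy": (500, 400)},
}
road_segments = [("J1", "J2", 500.0), ("J2", "J3", 400.0)]

wn = wntr.network.WaterNetworkModel()
wn.add_reservoir("SRC", base_head=260.0, coordinates=(-300, 0))  # stand-in source
for name, attrs in intersections.items():
    wn.add_junction(name, base_demand=attrs["demand"],
                    elevation=attrs["elevation"], coordinates=attrs["xy"])
wn.add_pipe("P0", "SRC", "J1", length=300.0, diameter=0.3, roughness=100)
for i, (u, v, length) in enumerate(road_segments, start=1):
    # Assumption: pipes follow roads; diameter and roughness are placeholder values.
    wn.add_pipe(f"P{i}", u, v, length=length, diameter=0.25, roughness=100)

wn.options.time.duration = 24 * 3600  # 24-hour extended-period simulation
results = wntr.sim.EpanetSimulator(wn).run_sim()
print(results.node["pressure"].head())
```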

This work is part of a larger National Science Foundation funded effort called Adaptive and Resilient Infrastructure driven by Social Equity (ARISE). The ARISE project seeks to understand interrelated power, water, wastewater, and transportation systems in terms of resilience and equity. The goals of this project include the creation of tools that communities can use to assess the resilience and equity of their infrastructure systems. 

Bo Li

Graduate Student Researcher
Texas A&M University

Time-series clustering method reveals key properties of power system resilience curves

Co-Author: Ali Mostafavi (Texas A&M University)

Abstract: This study examines hundreds of empirical resilience curves constructed from observational data of recent power outages during extreme weather events to reveal fundamental characteristics of power system resilience. Most existing studies examine the characteristics of infrastructure resilience curves using analytical models built on simulated system performance; little empirical work has been done to delineate infrastructure resilience curve archetypes and their fundamental properties from observational data. This dearth of empirical studies has hindered our ability to fully understand and predict resilience characteristics in infrastructure systems. To address this gap, this study examined more than two hundred power-grid resilience curves related to power outages in three major extreme weather events in the United States. Using a time-series clustering method, we identified distinct curve archetypes as well as the fundamental properties of each archetype. The results show two primary archetypes for power-grid resilience curves: triangular curves and trapezoidal curves. Triangular curves characterize resilience behavior through three fundamental properties: (1) critical functionality threshold, (2) critical functionality recovery rate, and (3) recovery pivot point. Trapezoidal curves explain resilience behavior through (1) the duration of sustained function loss and (2) a constant recovery rate; the longer the duration of sustained function loss, the slower the constant rate of recovery. The findings of this study provide novel perspectives enabling better understanding and prediction of the resilience performance of power system infrastructure in extreme weather events.
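
A minimal, illustrative sketch of time-series clustering applied to synthetic resilience curves: two generated archetypes (triangular and trapezoidal) are recovered by dynamic-time-warping k-means. The curve generator, distance measure, and tslearn package are assumptions; the study's actual data and clustering choices are not reproduced here.

```python
# Illustrative only: cluster synthetic outage/restoration curves into archetypes
# with dynamic-time-warping k-means from tslearn.
import numpy as np
from tslearn.clustering import TimeSeriesKMeans

t = np.linspace(0, 1, 50)

def triangular(depth, pivot):
    # V-shaped curve: linear functionality loss until the recovery pivot point,
    # then linear recovery back to full functionality.
    down = 1 - depth * t[t <= pivot] / pivot
    up = (1 - depth) + depth * (t[t > pivot] - pivot) / (1 - pivot)
    return np.concatenate([down, up])

def trapezoidal(depth, start, end):
    # Sudden loss, sustained function loss between start and end,
    # then recovery at a constant rate.
    curve = np.ones_like(t)
    curve[(t >= start) & (t <= end)] = 1 - depth
    recover = t > end
    curve[recover] = (1 - depth) + depth * (t[recover] - end) / (1 - end)
    return curve

rng = np.random.default_rng(0)
curves = [triangular(rng.uniform(0.4, 0.9), rng.uniform(0.2, 0.5)) for _ in range(20)]
curves += [trapezoidal(rng.uniform(0.4, 0.9), 0.1, rng.uniform(0.3, 0.6)) for _ in range(20)]
X = np.stack(curves)

labels = TimeSeriesKMeans(n_clusters=2, metric="dtw", random_state=0).fit_predict(X)
print(np.bincount(labels))  # sizes of the two recovered archetype groups
```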

Sina Naeimi

Postdoctoral Scholar
UC Berkeley

Integration of REWET into R2D Tool, Alameda Case Study

Abstract: The Restoration of Water Network after Event Tool (REWET) is an open-source tool developed to facilitate the assessment of water distribution networks damaged by extreme events such as earthquakes, landslides, or failures in dependent infrastructure such as power networks. REWET can be used in conjunction with other tools that assess hazards. Additionally, the Regional Resilience Determination (R2D) Tool provides a flexible workflow to determine damage to infrastructure, including transportation, power, and water networks. REWET has been integrated into the R2D Tool, and a testbed network has been developed for Alameda, CA, to evaluate its capabilities. The network layout is based on the transportation network and engineered to meet water demand based on census-tract population data. A restoration plan has been created to restore the network following potential damage. The R2D Tool has been used to generate hazard and damage scenarios, and a series of simulations has been conducted. The results demonstrate the impact of restoration policies under various hazard and damage scenarios.
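
A generic, hypothetical illustration (not REWET's API or the Alameda testbed) of the kind of restoration plan such a workflow evaluates: damaged pipes are prioritized, assigned to repair crews, and the resulting recovery of water service is traced over time.

```python
# Generic illustration: given damaged pipes and a crew-based restoration plan,
# compute a simple recovery curve of restored demand over time.
from dataclasses import dataclass

@dataclass
class Repair:
    pipe: str
    hours: float      # crew-hours to repair
    demand: float     # demand (m^3/s) restored once the pipe is back in service

# Hypothetical damage scenario; prioritize repairs by restored demand per hour.
repairs = [Repair("P12", 6, 0.010), Repair("P07", 4, 0.004), Repair("P31", 8, 0.020)]
repairs.sort(key=lambda r: r.demand / r.hours, reverse=True)

crews, total = 2, sum(r.demand for r in repairs)
crew_free = [0.0] * crews
timeline = []
for r in repairs:
    i = crew_free.index(min(crew_free))   # assign to the next available crew
    crew_free[i] += r.hours
    timeline.append((crew_free[i], r.pipe, r.demand))

restored = 0.0
for finish, pipe, demand in sorted(timeline):
    restored += demand
    print(f"t={finish:5.1f} h  repaired {pipe}  service restored: {restored / total:.0%}")
```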

Amin Enderami

Postdoctoral Researcher
University of Kansas

Simulating Communities using Hetero-Functional Graph Theory: An Interconnected Model of Physical, Social, Economic and Governance Systems

Co-Authors: Elaina Sutley (University of Kansas), Adaeze Okeukwu (Kansas State University), George Amariucai (Kansas State University), and Jason Bergtold (Kansas State University)

Abstract: This study demonstrates Hetero-Functional Graph (HFG) theory as a robust methodology to model and investigate interactions among a community’s physical (including buildings and infrastructure), economic, social, and governance systems. Simulating (inter)dependencies across these systems poses significant challenges, particularly in complex systems such as communities that include both physical and non-physical components. HFGs reduce this complexity by mapping the functionality of resources as a network. Unlike traditional graph models that treat nodes and edges merely as resources, in HFGs nodes represent functions of resources within a community, whether physical or non-physical, and edges depict the linkages and relationships among these functions, including physical, geographic, cyber, and logical (inter)dependencies. This approach enables the integration of various graphs and networks into a unified model. The proposed methodology is illustrated through a toy model. This toy model leverages the simplicity of the Centerville virtual testbed and a novel hypothetical employment-allocation approach, and includes only a few organizations contributing to human capital generation: an elementary school, a hospital, a grocery store, and a neighborhood. Operands captured in the toy model include Students, Schools, Hospitals, and Commercial Staff, as well as Food and Medical Suppliers. The resulting HFG comprises 132 nodes and 150 edges, capturing the (inter)dependency of organizational functionality and staff-supply chain availability. This methodology holds potential for creating high-resolution virtual testbeds or digital twins of real communities, offering invaluable tools for planning and enhancing community resilience through holistic evaluations and tailored mitigation and recovery plans.
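
A minimal sketch of the HFG idea described above, assuming a plain networkx digraph: each node is a capability of a resource ("function @ resource"), each edge is a dependency between capabilities, and graph reachability exposes cascading impacts. The capabilities and dependencies listed are illustrative, not the authors' 132-node toy model.

```python
# Minimal sketch (illustrative, not the authors' testbed): in a hetero-functional
# graph, each node is a capability "function @ resource" and edges are the
# dependencies between those capabilities.
import networkx as nx

hfg = nx.DiGraph()

# Capabilities: (function, resource) pairs, physical and non-physical alike.
capabilities = [
    ("supply power", "substation"),
    ("treat water", "water plant"),
    ("provide care", "hospital"),
    ("sell food", "grocery store"),
    ("educate students", "elementary school"),
    ("house residents", "neighborhood"),
]
for func, res in capabilities:
    hfg.add_node(f"{func} @ {res}", function=func, resource=res)

# Dependencies among capabilities (physical, geographic, cyber, or logical).
hfg.add_edge("supply power @ substation", "treat water @ water plant", kind="physical")
hfg.add_edge("supply power @ substation", "provide care @ hospital", kind="physical")
hfg.add_edge("treat water @ water plant", "provide care @ hospital", kind="physical")
hfg.add_edge("house residents @ neighborhood", "educate students @ elementary school", kind="logical")
hfg.add_edge("sell food @ grocery store", "house residents @ neighborhood", kind="logical")

# Which capabilities are (transitively) affected if the substation loses power?
print(nx.descendants(hfg, "supply power @ substation"))
```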

Andrea Martí

Graduate Student Researcher
University of British Columbia

Optimization of Complex Interdependent Systems during Wildfire Response

Co-Author: José Martí (University of British Columbia)

Abstract: We have developed a simulator of interdependencies (i2SIM-RT) that works across multiple critical infrastructures (power grid, water system, transportation, social services, communities, etc.) to optimize the response of the actors involved in large disaster events, such as earthquakes and wildfires. For a given situation, i2SIM-RT finds the optimum response actions required to minimize the impact of the disaster and restore the livability of the community as soon as possible. i2SIM-RT uses the concept of a Digital Twin that holds minute-by-minute information on the state of the system. The AI agent in the Digital Twin is pre-trained with real-time consequence scenarios. As the disaster evolves in real time and the actual on-the-ground situation becomes known, i2SIM’s optimizer, with the help of the AI agent, determines the best possible actions. The suggested trajectory is continuously reassessed as the situation evolves. In i2SIM, each Critical Infrastructure (CI) unit (physical, social, or human) is modelled as a non-linear input-output transfer function in which the inputs (e.g., electricity, level of human training, the asset’s physical damage) are required for the infrastructure’s operation and the output is the CI unit’s functionality. The functional relationship between each input and the output is either physical or probabilistic. The optimization algorithm achieves very fast convergence for large, complex systems. The recommendations from the AI agent are evaluated by the event commander, who makes the final decision and forwards the recommendation to the CIs and response agencies. i2SIM-RT is currently applied within a Wildfire Digital Twin framework for early monitoring, detection, and response.
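
A minimal, illustrative sketch of a critical-infrastructure unit modelled as an input-output transfer function, as described above: each input level maps to an attainable output through a piecewise relationship, and the unit's functionality is limited by the most constraining input. The tables, the min operator, and the hospital example are assumptions, not i2SIM-RT's implementation.

```python
# Illustrative sketch (not i2SIM-RT's implementation): a critical-infrastructure
# unit as an input-output transfer function, where each input level maps to an
# attainable output and the unit's functionality is limited by the most
# constraining input.
import numpy as np

def ci_output(inputs, tables):
    """inputs: dict of input name -> current level (0-1).
    tables: dict of input name -> (input breakpoints, attainable output levels)."""
    attainable = []
    for name, level in inputs.items():
        x, y = tables[name]
        attainable.append(np.interp(level, x, y))  # piecewise-linear transfer
    return min(attainable)  # output limited by the scarcest input

# Hypothetical hospital unit: functionality vs. electricity, water, and staff.
tables = {
    "electricity": ([0.0, 0.5, 1.0], [0.0, 0.6, 1.0]),
    "water":       ([0.0, 0.5, 1.0], [0.1, 0.7, 1.0]),
    "staff":       ([0.0, 0.5, 1.0], [0.0, 0.8, 1.0]),
}

print(ci_output({"electricity": 0.4, "water": 0.9, "staff": 0.7}, tables))
```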