I. Keywords
The following are keywords to be used by search engines and document catalogues.
climate change, climate change impact, FAIR, climate service, climate resilience, disaster management, natural hazard, weather event, risk management, Geo AI, typhoon, landslide, extreme heat, urban heat island, wildfire, open standards, OGC, OGC API, Analysis Ready Data, Decision Ready Information, Geospatial Reporting Indicator, data integration, impact analysis, hazard modeling, open science, earth observation
II. Preface
In this phase of the Climate and Disaster Resilience Pilot (CDRP), solutions were developed for the following challenges:
Developing workflows and components for tracking typhoons and extreme atmospheric events using analysis-ready data (ARD) compliant with OGC standards to improve predictions and early warnings of such emergency events. To achieve this, efforts focused on improving real-time data integration, dynamic forecast models, and air-sea energy exchange representations, and on integrating machine learning models and typhoon data for better prediction, tracking, and predictive analytics.
Improving the use of satellite imagery and sensor data to predict landslides and map affected areas. Work in this Pilot implemented scientific algorithms and deep-learning models to enhance detection and prediction accuracy. All algorithms and models use OGC API Processes to standardize execution, ensure reproducibility, and facilitate the sharing of models.
Developing workflows for assessing flood risks using high-resolution Digital Elevation Models (DEMs) and hydrological models to improve resolution and accuracy in flood simulations and impact assessment on a regional basis. This work leveraged GeoCube concepts and implementations for managing large-scale geospatial data and real-time observational stream computing to deliver improved results.
Modeling extreme heat events and their impacts on human health. By modeling the urban heat island effect and population demographics in Northern Manhattan, New York, spatial data and analytics were used to create ARD and decision-ready indicators (DRI) to simulate various warming scenarios, including the effect of extreme heat on artificial urban environments. The combined effects of power blackouts were also considered. This work highlighted the need for improved communication, backup power solutions, and government policies to mitigate heat-related risks, and it points to proactive measures that can be taken.
Developing weather event workflows that integrate weather sensor data for urban digital twins, enhancing real-time climate risk assessments and scenario analysis. These workflows provide critical environmental information to support, among other things, the modeling and reporting of extreme heat in New York City. All workflows are accessible via OpenAPI standards.
Exploring advanced metadata retrieval methods for environmental data, promoting more efficient data discovery and integration across domains. This work included a review of approaches for transitioning from Spatial Data Infrastructures (SDIs) to Geospatial Knowledge Infrastructures (GKIs) using AI and knowledge graphs to support scalable, cross-sectoral climate analytics.
Engaging stakeholder communities and supporting the uptake of tools and services developed through effective visualizations, interactive websites, community hubs, and user needs assessments. This project involved simulating carbon emissions from 3D building datasets using real-time weather profiles. It also included creating 3D visualizations of urban environments, focusing on the role of vegetation in reducing urban heat islands and improving public understanding through interactive models. Additionally, the project aimed to streamline 3D visualization workflows using OGC standards.
III. Contributors
All questions regarding this document should be directed to the editor or the contributors:
Name | Organization | Role | ORCID |
---|---|---|---|
Dean Hintz | Safe Software | Editor | |
Rachel Opitz | OGC | Editor | |
Nils Hempelmann | OGC | Editor | |
Kailin Opaleychuk | Safe Software | Contributor | |
Vyron Antoniou | Hartis | Contributor | 0000-0002-7365-9995 |
Loukas Katikas | Hartis | Contributor | 0000-0003-1886-4125 |
Stelios Contarinis | Hartis | Contributor | 0000-0002-5789-4098 |
Thunyathep Santhanavanich | WiTech GmbH | Contributor | 0000-0001-9852-7000 |
Rushikesh Padsala | WiTech GmbH | Contributor | 0000-0003-2866-8443 |
Volker Coors | WiTech GmbH | Contributor | 0000-0002-0413-8977 |
Sarah Chang | Feng Chia University | Contributor | |
Simeon Wetzel | TU Dresden | Contributor | 0000-0001-7144-3376 |
Stephan Mäs | TU Dresden | Contributor | 0000-0002-9016-1996 |
Klaus Schneeberger | alpS Consult | Contributor | |
Jan Schmider | alpS Consult | Contributor | |
Franziska Allerberger | alpS Consult | Contributor | |
Joel Cline | NOAA | Contributor | |
David Green | S&T Innovation | Contributor | |
Ajay Gupta | HSR.health | Contributor | |
Paul Churchyard | HSR.health | Contributor | |
Bandana Kar | U.S. Department of Energy | Contributor | |
Jiin Wen | GISMO | Contributor | |
Theo Goetemann | Basil Labs | Contributor | |
Josh Lieberman | Josh Lieberman | Contributor | |
Alan Leidner | GISMO | Contributor | |
Matt Tricomi | Xentity Corporation | Contributor | |
Kevin Hope | Xentity Corporation | Contributor | |
Alex Bostic | Voyager Search | Contributor | |
Brian Goldin | Voyager Search | Contributor | |
Eldrich Frazier | USGS/FGDC | Contributor | |
Ian Tobia | FGDC | Contributor | |
Maria Gilmore | FGDC | Contributor | |
Nathan McEachen | Terraframe | Contributor | |
Justin Lewis | Terraframe | Contributor | |
Gérald Fenoy | GeoLabs | Contributor | |
Chetan Mahajan | Indian Institute of Technology Bombay, India | Contributor | |
Rajat Shinde | University of Alabama in Huntsville, US / OSGeo | Contributor | |
Surya S Durbha | Indian Institute of Technology Bombay, India | Contributor | |
Ingo Simonis | OGC | Contributor | |
IV. Overview
The Open Geospatial Consortium (OGC) Climate and Disaster Resilience Pilot 2024 (CDRP24) was motivated by the critical need for new methods, tools, and systems to better understand, predict, and address phenomena, including intensification and changing patterns of typhoons, landslides, flooding, and extreme heat events.
To address these needs, the CDRP24 initiative brought together private sector, research, and government organizations to develop and prototype new information frameworks, data and analytical services, and software tools that can be used to improve forecasting and preparedness. In particular, this pilot addressed crucial gaps in Analysis-Ready Data (ARD) that are needed to support understanding natural hazard risks like landslides and extreme heat. It also looked at how these can be addressed by applying open standards to value-added data flows so that decision-makers and the wider community can be provided with the information they need to act in a timely manner, both to improve resilience and mitigate these hazards’ effects.
Geospatial information and technologies are essential to improving the climate resilience of communities worldwide. They also play a central role in designing and operationalizing the next generation of emergency and disaster management systems. This solutions-oriented pilot explored both aspects, leveraging OGC standards, OGC’s Agile Reference Architecture for geospatial systems, and the efforts of OGC’s member organizations.
Three main lessons can be learned from the CDRP24 project:
Some technical components required to assemble workflows that integrate diverse datasets are relatively mature and have been proven useful in practice. This is evidenced by their implementation in multiple application-specific workflows for tasks such as enhanced typhoon monitoring, landslide detection and prediction, and weather event modeling. Solidifying the shift to API-based architectures, pervasive throughout the applications developed in the CDRP, is vital to this maturation.
The expectation that model outputs will be available as Analysis-Ready Data (ARD), Decision-Ready Indicators (DRI), contextualized information, and comprehensible representations to non-technical/geospatial specialists has continued to increase. Integrating AI into the geospatial toolkit reshapes expectations for communicating analysis and modeling results.
The effective incorporation of contextual and domain-specific knowledge within information systems is considered necessary to improve the discoverability of relevant data and information. Effective incorporation includes ‘true’ semantic interoperability of workflows and interpretability of processes. Multiple methods of integrating contextual and domain information are being explored and have proven viable, including knowledge graphs, AI-generated syntheses of extensive bodies of knowledge, and processes by which human experts produce documentation.
V. Value Proposition
CDRP24 ultimately makes communities more climate resilient and supports the upgrade of emergency and disaster management systems to cope with the increasing volume, complexity, and velocity of hazard-related data streams. The results provide value to industry, government, science, and research users by facilitating the discovery of exemplary workflows and data services. The pilot also created added value for new entrants into these domains by reviewing and evaluating existing technologies, providing an understanding of the state of the art from an open-standards perspective.
This pilot addressed a number of overarching value themes. First, climate and disaster impact management systems increasingly need to incorporate real-time data streams, along with the ability to integrate observations with combined model outputs in order to better assess both current and future risks. To support better analysis and communication of impact risks, a variety of analysis-ready data (ARD) structures and decision-ready indicators (DRI) were explored, including time series of storm tracks, urban heat distribution patterns, and DRI using time-interpolated heat distributions rendered on 3D buildings in NYC. Leading-edge AI and knowledge graph approaches were explored in the context of natural hazard assessments such as wildfire risks. These approaches will be crucial to amplifying the capability of humans to assimilate and assess greater volumes and complexities of environmental data. Finally, various methods were explored to interconnect related hazard impact models using open standards. This ability to automate the cumulative effects of combined climate and disaster impacts will be crucial to support integrated decision-making and planning to better mitigate these combined effects.
VI. Future Outlook
OGC Climate and Disaster Resilience Pilots will continue to promote shared technical advances based on collaborative research and support organizations to develop and demonstrate applications built on the community’s standards. Current activities will produce evaluations of the state-of-the-art in application-specific geospatial workflows and generate new workflow implementations, built on OGC standards. Through this work, demonstrations of new approaches to several key issues will be highlighted, including the production of decision-ready indicators (DRI), integration and communication of contextualized information, and the integration of AI into the geospatial toolkit.
AI tools, particularly generative AI (GenAI) tools, have particular requirements for the data provided. In the following CDRP stage, the pilot will develop artificial intelligence virtual assistants to explore these requirements, understand the capabilities of current generative AI components, and learn how the necessary context can be provided to make generative AI tools work. The virtual assistants will interact with various sources, service offerings, and platforms, which requires a set of standards-based APIs and information models to ensure reliable interaction and communication between all components.
To effectively use GenAI, all data services must be robust and highly interoperable. As part of the following pilot stages, GenAI functionality will be improved by analyzing existing services and their information models. This analysis will cover current operational services, Analysis-Ready Data products, and ontologies. The pilot will investigate whether current services and data products support the FAIR principles at the level that GenAI requires, which will involve crosswalks between ontologies and standard API data models.
The challenge of successfully using generative AI tools is more than purely technical. It is a matter of integrating a wide range of interests and requirements, and applying them in an ecosystem of users, policies, services and data. Therefore, this initiative aims to engage a broad cross-section of stakeholders, including data and service providers and users, researchers, responders, and community members, to shape a connected ecosystem of data, technologies and practices. We encourage anyone interested in improving understanding and increasing capacity for informed action by integrating spatial information and technologies across climate, disaster and emergency management-related domains to participate in this pilot.
Going forward, additional work is needed to continue developing open standards that support data collaboration, integration, and automation. In upcoming pilots, wrapping components with OpenAPI-based OGC APIs such as OGC API Processes or OGC API EDR (Environmental Data Retrieval) is expected to become the norm, helping make pilot components more easily sharable, modular, and interchangeable long after the pilots wrap up. Significant work was already done to this effect for several components of this pilot, so the shift towards deploying components via an open API enterprise service architecture is well underway.
Besides making pilot component services more accessible, much work still needs to be done to develop domain-specific standards related to climate change impacts and disaster management. Several domain-specific open standards were utilized in this pilot, such as OASIS CAP XML (Common Alerting Protocol), OGC API EDR, and the C3S Climate Data Store, to name a few. However, thus far the OGC has only just begun to explore climate and disaster domain standards related to Analysis Ready Data (ARD) and Decision Ready Information or Indicators (DRI), for example. Continued work by relevant domain working groups such as Climate Resilience, Emergency and Disaster Management, and new groups such as Geospatial Reporting Indicators will be key to moving this forward and providing continuity across pilots. Collaboration with other related sustainability organizations at the state and international level will also be important, such as UNEP (e.g., UNCCD), the EU (e.g., CLINT), and North American and other environmental or disaster management agencies.
1. Supplementary Documents
In addition to this report, supplementary documents have been produced that provide more detail on specific technical aspects. These documents will be served at URLs that may change with the latest relaunch of the OGC website; all will be made available via the OGC website and potentially other locations such as Zenodo and OSF.
OGC Disaster and Resilience Pilot 2024: Generative AI in wildland fire management
OGC Disaster and Resilience Pilot 2024: Extreme Heat Management
OGC Climate Disaster and Resilience Pilot 2024: Landslide demonstrators
OGC Climate and Disaster Resilience Pilot 2024: ARD to DRI Supplementary Report — Safe Software
2. Introduction
Organizations and individuals worldwide strive to address the challenges caused by climate change and increasingly frequent and severe disasters associated with extreme weather events. While the links between increasing climate resilience and managing disasters and emergencies are widely acknowledged, the geospatial data, services, and systems used in applications across these domains are poorly integrated. The central goal of this pilot was to build the capacity to create products, services, and platforms that leverage open standards to provide actionable information relevant to emergency and disaster management and feed lessons learned into climate resilience planning. Building on the results of previous pilot initiatives, the OGC Climate and Disaster Resilience Pilot 2024 (CDRP24) assembled a consortium of partners to address these challenges and to bridge gaps by using geospatial data and technologies in the wider Climate Resilience and Emergency and Disaster Management context. On the climate impact side, the focus was on the nexus of typhoon activity and the increased risk of flooding and landslides. Use cases included study areas in Taiwan and Greece, where data integration and data processing pipelines were established to produce the climate resilience information that feeds early warning systems.
In terms of climate and health, one focus of the CDRP24 was extreme heat events. These events are associated with increasing mortality, especially in cities, and are identified as one of the most severe impacts of climate change in the near future. Besides these explorations, there was also a strong focus on real-time systems for observations to monitor extreme events and alert affected local communities.
3. Work Areas
3.1. Extreme Weather Event monitoring workflows applied to Typhoon, Flooding, Heat and Landslides
Prototype workflows that use OGC standards and standards-based APIs to provide data-driven services, including modeling and analytical services, can now be developed and deployed rapidly. This maturation is evidenced by Hartis’ review of Typhoons and Atmospherics models, GeoLab’s implementation of a Landslide model, and further examples under development.
Hartis’ Typhoons and Atmospherics Report documents their approach to developing key components and workflows for typhoons, atmospheric rivers (AR), and other extreme atmospheric phenomena. They provide a comprehensive review, encompassing:
analysis ready data (compliant with OGC standards)
workflow models (based on meteorological data and process effectiveness)
identification of key processing steps (data aggregation, normalization, accuracy)
Examining Tropical Cyclones (TCs) through extensive global simulations covering multiple decades necessitates the objective and automated identification and monitoring of these storms, a task achieved through specialized “TC trackers”. These trackers are algorithms designed to recognize cyclonic formations characterized by warm cores within gridded datasets and connect them into coherent trajectories (Bourdin et al. 2022). Key parameters and modeling schemes according to Liu et al. (2023) can be identified as:
Typhoon track: Large-scale weather patterns, the sea surface and atmosphere temperatures, features of land topography, structure, and intensity.
Typhoon intensity: Maximum wind speed near typhoon center at standard 10-m height and minimum sea level pressure (MSLP) at typhoon center. Wind and gust forecasting and early warning information are very important.
Typhoon-induced rainfall: Quantitative Precipitation Estimation (QPE) and Quantitative Precipitation Forecasts (QPF) for hydrological forecasting modeling are critical to forecasting the quantity, location, time, and duration of TCs.
Their work focuses on analyzing different Numerical Weather Prediction (NWP) datasets and TC trackers used for predicting TC paths and intensities. A range of TC trackers has been tested for performance, including those detailed in the following reports: (Arthur et al. 2021; Bourdin et al. 2022; Kitamoto et al. 2024; Pérez-Alarcón et al. 2024). Their literature review demonstrates that there is a large pool of observed TCs that all trackers detect: of the 3,510 tracks in the IBTrACS catalog, about 2,400 are detected by almost all trackers. Notably, the detection skills of all trackers are identical and nearly perfect for the strongest tracks in a simulation. This means that no detected track of category 2 or above is a false alarm, and TCs observed in categories 4-5 are found regardless of the tracker used. This suggests the need for interoperable TC models that can be compared or used as an ensemble.
To progress this, workflows and tools for numerical weather prediction data fusion and interoperability are being examined by the Hartis team. This review of OGC Standards, libraries, and APIs focuses on the capabilities needed to support User-Defined Processes, utilizing OGC API Processes to allow users to define and deploy custom processing algorithms in the cloud. This is particularly useful for researchers and meteorologists who need to run custom models and simulations based on specific needs or experimental setups. It enables the machine learning models used in the TC trackers reported above (see the figure below) to be integrated directly through an API endpoint, enhancing the capabilities of OGC API Processes and/or the OGC Environmental Data Retrieval (EDR) API to support deploying and managing machine learning models that can predict typhoon behavior based on historical data.
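As a concrete illustration, the sketch below shows how such a user-defined tracker might be executed through OGC API Processes once deployed. The endpoint, process identifier, and input names are hypothetical placeholders, not details of any implementation in this pilot.

```python
import requests

# Hypothetical OGC API - Processes endpoint and user-defined process ID.
BASE = "https://example.org/ogcapi"
PROCESS_ID = "ml-tc-tracker"

# Execution request (OGC API - Processes, Part 1: Core). The inputs are
# illustrative: a reference to an NWP dataset and a basin of interest.
body = {
    "inputs": {
        "nwp_data": {"href": "https://example.org/data/nwp_subset.nc"},
        "basin": "western-pacific",
        "datetime": "2024-07-01T00:00:00Z/2024-07-10T00:00:00Z",
    }
}

resp = requests.post(
    f"{BASE}/processes/{PROCESS_ID}/execution",
    json=body,
    headers={"Prefer": "respond-async"},  # request asynchronous execution
)
resp.raise_for_status()

# For async execution, the job status URL is returned in the Location header.
job_url = resp.headers["Location"]
print("Submitted tracking job:", job_url)
```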
The detailed report and Review of Typhoons and Atmospheric Models is published on Zenodo. The Flood Impact Analysis Workflow, developed to calculate and analyze the areas that would be affected by a flood incident, is available on an Open Science Framework (OSF) node. It can be downloaded and used for further experimentation.
Figure 1 — Potential User-Defined Processes for Track Forecasts (red rectangle), utilizing OGC API Processes to allow users to define and deploy custom processing algorithms, for example by deploying machine learning in typhoon forecasts in different TC trackers (Liu et al. 2023)
Also focused on this challenge, Wuhan University’s Early Warning Flood Demonstrator deploys an overlapping set of tools to meet region-specific requirements in China. Due to complex geographic and climatic conditions in the region, flash floods exhibit characteristics of wide distribution, large quantity, strong suddenness, and high destructiveness. The extensive monitoring of floods poses challenges in organizing and managing geospatial observation data. The team proposed a new geospatial infrastructure called GeoCube (Gao et al. 2022), which facilitates the management and analysis of massive Earth observation data. Compared to previous work on data cubes, GeoCube extends the capacity of data cubes to accommodate large vector and raster data volumes from multiple sources, making it suitable for complex flood warning analysis. Additionally, to support data stream processing, the team also proposed a real-time observational stream computing model (Shangguan et al. 2019). This model integrates the sensor web with models in a streaming computing environment to provide timely decision support information.
Building on previous work, the team proposed a system architecture (see diagram below) supporting an automated data retrieval and preprocessing workflow that fetches data from multiple sources and stores it in different databases by type. The system uses the Spark distributed computing framework and a variety of third-party geographic analysis libraries to enrich its operations. Operators and data are provided to the frontend via OGC APIs, where users can interact with a visualization interface based on Cesium to obtain thematic map results as needed. This process uses OGC API Features to provide web services for vector data, OGC API Coverages for raster data, and OGC API Processes for operators. With these services, users can call their own data and create their own workflows on the frontend of the system.
Figure 2 — Overview of the system developed by Wuhan University
Their workflow establishes an interface for hourly data retrieval and preprocessing for the warning system, enabling the system to automatically update with the latest observation data, ensuring the timeliness of the alert system. It sets up a flood simulation workflow that implements several hydrological models based on a foundational geographic computing library and simulates floods by coupling these models together. This process enabled tracking changes in surface water accumulation during rainfall events.
On this basis, the team developed comprehensive flood warning indicators. For the indicators, ponding analysis was conducted using depressions of different scales. The analysis considered that during rainfall events, surface water first accumulates in local depressions and then gradually converges towards larger depressions across the entire area. This work allowed the simulation of the process of water accumulation from small to large depressions and calculated a weighted average of multi-scale ponding risks based on this process. Consequently, the system is capable of producing a comprehensive flood risk index for the entire area.
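The multi-scale weighting described above can be summarized in a few lines. The following is a simplified sketch of a weighted average over per-scale ponding risk grids; the grids, weights, and number of scales are invented stand-ins, as the report does not publish the team's actual scheme.

```python
import numpy as np

# Stand-in ponding risk grids (0..1) for three depression scales,
# from small local depressions to the largest basins.
rng = np.random.default_rng(0)
risk_by_scale = [rng.random((100, 100)) for _ in range(3)]

# Assumed weights: local depressions fill first, so they weigh most.
weights = np.array([0.5, 0.3, 0.2])

# Composite index: weighted average of the per-scale risks per cell.
composite = sum(w * r for w, r in zip(weights, risk_by_scale)) / weights.sum()

# An area-wide flood risk index can then be summarized from the grid.
print("Mean flood risk index:", float(composite.mean()))
```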
Figure 3 — Wuhan U’s Early Warning Flood Detection Model.
Hartis’s Early Warning Flood demonstrator highlights another path, building a workflow in ESRI’s software suite and publishing the final workflow as an OGC standards conformant service. This work illustrates use of regional higher resolution DEMs in automated workflows developed for flood modeling at a regional scale, which may be adapted and reused in new regions. To support reuse, their objective is to present end-users with an abstraction of the workflow, requiring only a few essential inputs to run in a python environment.
Figure 4 — Hartis’s early warning flood detection workflow.
This approach was paralleled by Safe Software. Their weather event workflow component took existing publicly available weather service data streams, such as from the National Weather Service, and enriched them with the context of local hazard information to increase the resolution and relevance of warning products for local residents and responders. This experimental application was implemented for North Manhattan in New York City as part of Safe’s contribution to the GISMO-led NYC Extreme Heat project. It could also be readily deployed in other urban settings given the appropriate foundational data layers and services.
For example, temperature observations and forecasts were combined with urban heat index information so that areas more affected by urban heat island effects could receive localized warnings that take these temperature stresses into account. This workflow was authored on the FME platform. It accepted GeoRSS requests from client applications, consumed weather observations from the National Weather Service API at api.weather.gov, and queried the Urban Heat Index ARD component via OGC API Features for locations of interest. Alert notifications were transmitted back to the GeoRSS client and were also available as OASIS Common Alerting Protocol (CAP) XML or HTML messages.
Figure 5 — Weather Event Service & ARD workflow to support urban heat scenarios.
Weather Event Service Workflow (see the sketch after this list):
Accept client requests by location
Query NWS API to get area observations for requested location.
Query CDRP Urban Heat ARD service via OGC Features API for requested location
Combine NWS observation with urban heat index from ARD service to determine projected localized heat level for requested location.
Apply heat threshold DRI rules to determine appropriate warning level for requested location.
Transmit appropriate warning via GeoRSS feed response, HTML or CAP XML message.
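A minimal sketch of steps 2-5 follows. The NWS calls match the public api.weather.gov interface; the urban-heat ARD endpoint, its property names, and the 95 F threshold are hypothetical stand-ins for Safe Software's component and DRI rules.

```python
import requests

HEADERS = {"User-Agent": "cdrp-weather-sketch"}  # api.weather.gov requires a UA
lat, lon = 40.82, -73.95  # North Manhattan

# Step 2: find the nearest NWS station and fetch its latest observation.
point = requests.get(f"https://api.weather.gov/points/{lat},{lon}",
                     headers=HEADERS).json()
stations = requests.get(point["properties"]["observationStations"],
                        headers=HEADERS).json()
station = stations["features"][0]["properties"]["stationIdentifier"]
obs = requests.get(
    f"https://api.weather.gov/stations/{station}/observations/latest",
    headers=HEADERS,
).json()
ambient_f = obs["properties"]["temperature"]["value"] * 9 / 5 + 32  # degC -> degF

# Step 3: query the urban heat ARD component (OGC API Features); the URL
# and the 'uhi_offset_f' property are assumed names.
ard = requests.get(
    "https://example.org/ogcapi/collections/urban-heat-index/items",
    params={"bbox": f"{lon-0.005},{lat-0.005},{lon+0.005},{lat+0.005}"},
).json()
uhi_offset_f = ard["features"][0]["properties"]["uhi_offset_f"]

# Steps 4-5: localized estimate, then a DRI threshold rule.
local_f = ambient_f + uhi_offset_f
warning = "HEAT WARNING" if local_f >= 95 else "NO ALERT"
print(f"Localized heat estimate {local_f:.1f} F -> {warning}")
```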
This localized urban weather notification tool builds on the work Safe Software contributed to previous OGC disaster and climate pilots related to heat, drought, and flood modeling. Safe Software’s ARD to DRI contribution and component provides further details on how data from this weather event service feeds decision indicators and notifications.
GeoLabs’ standardized approach to identifying potential landslide-affected areas, utilizing satellite imagery from Sentinel-1 and Sentinel-2 in GeoDataCube format, provides another example. They use sensor data on rainfall and displacement to predict future displacements, and Sentinel-1 and Sentinel-2 imagery to build a deep-learning model that predicts landslide-damage risk zones. The output is a thematic map of areas potentially affected by landslides. Utilizing data cubes makes data management easier and further helps in analysis.
Their work provides a regional implementation in Taiwan, which is highly prone to earthquakes due to the intersection of three tectonic plates. The area lies on the “Ring of Fire”, the most seismically active zone in the world. The region’s highly mountainous landscape magnifies ground shaking, leading to landslides.
The proposed prototype is a server implementation of a standardized framework built on the Common Workflow Language (CWL) to support workflows for landslide susceptibility mapping and for analyzing corresponding Decision Ready Indicators. The workflows comprise primary components supporting OGC API Processes — Part 2: Deploy, Replace, Undeploy, as sketched below.
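The Part 2 interaction pattern can be sketched as follows, assuming a hypothetical server; the media type and packaging details vary by implementation (the GeoLabs prototype builds on CWL application packages).

```python
import requests

# Hypothetical OGC API - Processes endpoint supporting Part 2.
BASE = "https://example.org/ogcapi"

# Deploy: POST the CWL application package to the processes collection.
with open("landslide_susceptibility.cwl") as f:
    package = f.read()

resp = requests.post(
    f"{BASE}/processes",
    data=package,
    headers={"Content-Type": "application/cwl+yaml"},  # media type may differ
)
resp.raise_for_status()
print("Deployed at:", resp.headers.get("Location"))

# Replace and Undeploy use the same resource with PUT and DELETE:
#   PUT    {BASE}/processes/{processId}  -> replace the deployed package
#   DELETE {BASE}/processes/{processId}  -> undeploy the process
```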
Figure 6 — Schematic view of a landslide event.
Processes for landslide detection and prediction are implemented in an inference-as-a-service framework that passes inputs to the model to generate landslide detection maps and rainfall-induced displacement predictions.
Figure 7 — A conceptual workflow (left) and an implementation (right), showing the processes description for the services required to be developed for this landslide modeling task.
Two weather event modeling workflows consolidate the maturing of this toolset. WiTech GmbH developed a prototype capable of simulating carbon emissions from existing 3D building stock datasets using future weather profiles in real time on the web, leveraging the SimStadt simulation engine integrated through OGC API Processes and taking OGC API 3D GeoVolumes as an input data format.
On opening the web application, the client fetches default OGC API – 3D GeoVolumes endpoints, displaying available collections and their extents. Users can add custom 3D GeoVolumes endpoints. On selecting a 3D GeoVolume, the client retrieves the associated 3D city model from the server, centering the map on its spatial extent. Users can inspect building information in the 3D Tiles batch table. They can then initiate a simulation process for registered city models, defining process parameters and styling preferences. Configuration parameters are submitted via an HTTP POST request to the OGC API – Processes endpoint, where the process is executed server-side and the results are returned to the client.
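The client-server exchange just described might look like the sketch below; the service URLs, process name, and parameter names are hypothetical stand-ins for the WiTech deployment.

```python
import requests

GEOVOLUMES = "https://example.org/ogcapi-geovolumes"  # assumed endpoint
PROCESSES = "https://example.org/ogcapi"              # assumed endpoint

# Fetch available 3D GeoVolumes collections and their spatial extents.
collections = requests.get(f"{GEOVOLUMES}/collections").json()["collections"]
for c in collections:
    print(c["id"], c.get("extent", {}).get("spatial"))

# Submit a simulation run for a selected city model via OGC API Processes.
config = {
    "inputs": {
        "cityModel": collections[0]["id"],
        "weatherProfile": "rcp85-2050",        # assumed parameter name/value
        "styling": {"colorBy": "co2_per_m2"},  # assumed styling preference
    }
}
job = requests.post(f"{PROCESSES}/processes/simstadt-co2/execution", json=config)
job.raise_for_status()
print("Simulation result:", job.json())
```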
The overall benefit of this central standard interface is that it can help streamline multiple data operations into one online end-to-end web client which ultimately will allow even non-simulation experts such as architects or urban designers to make faster and better decisions about the effects of climate change on carbon emissions of building stocks, including for future climatic scenarios.
Figure 8 — User Interaction Workflow Diagram for WiTech’s model (Santhanavanich et al., 2023)
Feng Chia University’s weather event data flow component is designed to provide information on typhoons and precipitation, specifically typhoon tracks, location, intensity, and time ranges of interest, using OpenAPI and OGC API Processes. By utilizing historical typhoon data and integrating it with historical meteorological data such as wind speed and direction, their aim is to train a model that can predict typhoon paths and estimate storm intensity and radius. This model is crucial to enhancing the ability to prepare for and respond to typhoons, ultimately reducing the potential impact on communities and infrastructure. The approach involved analyzing patterns and trends in past typhoon movements and the corresponding meteorological conditions to develop a predictive model. Using historical data, they developed two models: a time series model and an LSTM model.
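As an illustration of this kind of LSTM track model, the sketch below predicts the next storm position from a short history of track features. The architecture, features, and training data are assumptions for illustration only; the report does not specify Feng Chia University's model details.

```python
import numpy as np
import tensorflow as tf

T, F = 8, 4  # eight past 6-hourly steps; features: lat, lon, wind, pressure

# A minimal LSTM regressor: past track sequence -> next (lat, lon).
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(T, F)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(2),
])
model.compile(optimizer="adam", loss="mse")

# Random stand-in data; in practice, sequences would come from the
# historical typhoon database described above.
X = np.random.rand(256, T, F).astype("float32")
y = np.random.rand(256, 2).astype("float32")
model.fit(X, y, epochs=2, verbose=0)

next_position = model.predict(X[:1], verbose=0)  # predicted (lat, lon)
```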
Figure 9 — Feng Chia University’s prototype typhoon modeling system
The advantages of this approach are the integration of information into a detailed database of typhoon events and the ability to use historical typhoon information to train models. An API interface was provided, allowing users to input keywords to obtain detailed typhoon information, retrieve similar typhoon paths, and access predicted typhoon intensity data. This also facilitates future research. Several of these efforts underscore the role of a growing set of OGC standards-based APIs. Hartis’ study of typhoon trackers illustrates the shift to API-based architectures. Their review of multiple implemented workflows highlighted uses of:
OGC API Coverages for the exchange of geospatial information as ‘coverages’, including GetCapabilities, which returns from a WCS server a list of the operations and services (“capabilities”) offered, and DescribeCoverage, which returns additional information (the CRS, metadata, domain, range, and available formats).
OGC API – Environmental Data Retrieval v1.1 for accessing a wide range of data (for TCs: in-situ, satellite, and NWP) and for processing capabilities, e.g., data fusion, quantile correction methods, cross-validation, distribution fitting, and wind extreme value analysis (GEV and GPD fitting, Mean Excess Plots (MEP), Peaks Over Threshold (POT)); see the sketch after this list.
C3S Climate Data Store (CDS, OGC-compliant interface): Toolbox infrastructure/workflows for processing options such as computation of statistics, sub-setting, averaging, workflows that combine the output of tools as input into other tools.
GeoWATCH API (OGC-compliant interface, Eylander et al. 2023): GeoWATCH API enables access to GeoWATCH products through the GeoWATCH web console.
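For example, an EDR position query for TC-relevant parameters takes the following form. The server, collection, and parameter names below are hypothetical; only the query pattern follows the EDR specification.

```python
import requests

BASE = "https://example.org/edr"  # hypothetical EDR endpoint

resp = requests.get(
    f"{BASE}/collections/nwp-forecast/position",
    params={
        "coords": "POINT(121.5 25.0)",  # WKT point (lon lat), per EDR
        "parameter-name": "wind_speed_10m,mean_sea_level_pressure",
        "datetime": "2024-07-01T00:00:00Z/2024-07-03T00:00:00Z",
        "f": "CoverageJSON",
    },
)
resp.raise_for_status()
coverage = resp.json()  # CoverageJSON time series at the requested point
```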
GeoLabs’ landslides demonstrator likewise highlights the usability of the APIs. It uses OGC API Processes Parts 1, 2, and 3 and the OGC GDC API (an OGC Testbed-19 output) as the basis for its landslide detection and prediction models, together with the OGC GeoDataCube and OGC Training Data Markup Language standards. Its core contributions include implementing scientific algorithms as CWL processes, enabling standardized execution of algorithms over the web (utilizing OGC API Processes), and executing process chains based on the OGC API Processes Part 3: Workflows draft specification.
Use of OGC API Processes is seen in most ‘maturing’ implementations, including those developed by Feng Chia University (diagram below) and WiTech.
Figure 10 — Feng Chia University’s prototype typhoon modeling system
3.2. From Analysis Ready Data to Decision Ready Indicators: Visualization of Climate and Disaster Impacts to Support Decision Making
With 3D interactive environments increasingly common, cyclical interest in AR and VR on the rise, and AI creating volumes of visual information from text, expectations for the visual and graphical communication of geospatial analysis and modeling results are in focus again. WiTech’s visualization work demonstrates a 3D visualization capability through system architectures for 3D visualization applications based on OGC API web services (see diagram below).
Figure 11 — WiTech’s 3D visualization system.
Their system integrates diverse data sources (1), which undergo processing via an ETL or middleware tool (2) for tasks such as data cleaning, conflation, and conversion. Some datasets pass through data analytics and processing layers (3); OGC API Processes may be deployed as a wrapper around these layers. Following analysis, data typically returns to the second component (2) for delivery as OGC API web services, such as the Features or 3D GeoVolumes APIs. The client application (4) consumes the OGC API web services.
They deployed a 3D web-based application for visualizing flood risk zones in Germany, developed using the CesiumJS application framework. The application currently utilizes three datasets: flood risk zones in Germany, served through the pygeoapi Python library as OGC API Features and integrated into the client; 3D building models at Level of Detail 2, prepared in 3D Tiles format and served through the OGC API 3D GeoVolumes service; and 3D terrain data, prepared in Quantized Mesh format and served through the OGC API 3D GeoVolumes.
Figure 12 — WiTech’s 3D visualization interface.
Connected to the Extreme Heat team’s collaboration, Navteca used 3D visualization of data in the context of urban areas to synthesize information from across data sources to better represent the risk of extreme heat in cities. The urban heat island phenomenon is a serious threat to health across the planet, and understanding these risks has critical implications. One important factor in understanding the health risks associated with urban heat island effects and extreme heat is a person’s exposure through their vertical placement in a building. Using OpenStreetMap building footprints and realistic textures, their system overlaid heat index information and modeled heat exposure data from Safe Software’s UHI ARD component, rendering this for New York City’s Manhattan Island using the Unity game engine.
Figure 13 — Navteca’s visualization of a heat map from Safe Software’s urban heat ARD component draped over a 3D rendering of buildings in NYC.
By “bringing the data to life” within a visual context, their system aims to enable decision makers, city planners, and other stakeholders to better understand the risk of urban heat, and better prepare for extreme heat events in New York City and beyond. This 3D animation helps illustrate the propagation of urban heat over 12 hours of daylight. A time series of urban heat grids was draped over NYC buildings from CityGML and rendered via Unity.
Navteca also undertook related work representing information about landslides in Taiwan for decision makers. Taiwan faces serious challenges with landslides and has a robust sensor network in place for early detection and early warning. Beneath Taiwan, the intersection of three tectonic plates forms a highly active seismic zone. The region’s mountainous landscape magnifies the effects of ground shaking, which in turn leads to landslides. The dramatic terrain lends itself to 3D visualization to better understand seismic activity and landslide risk in the region.
For this work, they evaluated OGC API 3D GeoVolumes to convey Taiwan landslide data in a mountainous context, shown below. An additional challenge involved ingesting near real-time data from site sensors for visualization. To accomplish this, data needed to be converted into a format where it could be processed on the GPU so that the simulation could be run in real-time. In practice, this required embedding data in the channels of 2D or 3D textures so it could be manipulated by compute shaders and integrated into the rendering pipeline. Conversion into the pcache format, which has a native importer in Unity, means data could be used to drive shader graphs and other effects, but was a move away from open standards-based workflows.
Figure 14 — A 3D render of a landscape for use in CDRP applications.
Likewise focused on visualization, Laubwerk used a partly overlapping set of tools to achieve different communication objectives. Building on their work for the Climate Resilience Pilot 2023, their prototype extended users’ ability to visualize vegetation in an urban context using more generic inputs and additional information layers describing ecosystem services, with applications in real-time and interactive scenarios.
Their system uses terrain and building geometry from different data sources as well as plant inventory data as a starting point. Detailed 3D plant models, enriched with a broad range of metadata from Laubwerk’s own extensive plant database, are then inserted into the terrain and building visualization with as little manual intervention as possible. This can be used to create detailed representations of the plant cover of any urban area, as long as the source data is available. Beyond an interactive photorealistic representation, this can also be used to add information about ecosystem services and consequently the impact plants have in urban environments.
Figure 15 — A flowchart showing how different elements come together for a high-quality 3D visualization of urban scenarios with a data-rich vegetation layer.
The system interfaces with OGC standards such as 3D Tiles for terrain geometry and GeoJSON to ingest plant inventory information. While their work shows these standards are well-established and easy to use, it also highlights two key areas of future work.
3D Tiles enables feature classification, which is useful for the identification and replacement of plant geometries. However, most data sources the team found either lacked this kind of classification or provided it at insufficient quality for this application, preventing automation of data pipelines and increasing manual processing.
GeoJSON enables a user to attach arbitrary properties to features, and consequently different data sources use different properties or different names for the same properties. Systematization or cross-mapping could enable the use of GeoJSON data from multiple sources through semantic alignment of key properties, as sketched below.
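A cross-mapping of this kind could be as simple as an alias table applied while ingesting features. The target schema and alias names below are invented examples of the variation encountered, not a proposed standard.

```python
# Aliases observed across hypothetical plant-inventory sources, mapped
# onto one common schema for downstream visualization.
PROPERTY_ALIASES = {
    "species": ["species", "botanical_name", "latin_name", "art"],
    "height_m": ["height_m", "height", "baumhoehe_m"],
    "crown_width_m": ["crown_width_m", "crown_diameter_m", "kronenbreite_m"],
}

def normalize_properties(feature: dict) -> dict:
    """Rewrite a GeoJSON feature's properties to the common schema."""
    props = feature.get("properties", {})
    normalized = {}
    for target, aliases in PROPERTY_ALIASES.items():
        for alias in aliases:
            if props.get(alias) is not None:
                normalized[target] = props[alias]
                break
    feature["properties"] = normalized
    return feature

# Example: a feature using German property names maps cleanly.
tree = {"type": "Feature",
        "properties": {"art": "Tilia cordata", "baumhoehe_m": 14}}
print(normalize_properties(tree)["properties"])
# {'species': 'Tilia cordata', 'height_m': 14}
```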
Figure 16 — Example 3D visualizations of trees in urban modeling and planning applications.
Figure 17 — Examples for integrating ecosystem services information into visualizations.
Making ecosystem service estimations readily available in visualization and planning environments through the Laubwerk tool could help leverage plants to mitigate change in local climate, more effectively identify and avoid urban heat islands, and plan cities to be more resilient in the face of weather events such as strong heat, drought, or intense rain. The visualization, especially when presented interactively, can be a powerful way to communicate such planning considerations to stakeholders and to build public support. These visualization systems could also improve ecosystem services planning, because factors such as height, crown width, or biomass can be represented more accurately.
Beyond visualization, communication via DRI is a priority to support applications in Climate and Disaster Resilience. The development of systems to produce trustworthy DRIs is an ongoing sector-wide challenge.
In the context of this pilot, Safe Software also explored key aspects and challenges related to building data value chains, from raw data to analysis-ready data (ARD) and then to decision-ready indicators (DRI), in order to better support disaster response and climate resilience applications. While ARD principles originated in the earth observation community, ARD approaches hold promise for developing data products that are more readily sharable and broadly useful for assessing a range of impacts in disaster and climate scenarios.
To evaluate and validate these approaches, Safe included an urban heat ARD component to test against. This ARD test component supports ARD and DRI generation related to disaster and climate resilience indicators such as urban heat island effects. This adds to past contributions made in DP21, DP23, and CRP23, where Safe Software gained experience building flood, drought, and heat indicators.
For this pilot, Safe Software supplied an ARD component that supports the Weather Event service described above. Weather observations were combined with other data related to the local climate and natural hazard context. The FME platform was used to aggregate relevant data from a range of sources relevant to the chosen climate resilience scenario, namely extreme urban heat in NYC, and the resulting ARD information was shared via OGC API Features. Surface type, impervious surfaces, and building data such as orientation, height, density, and materials were some of the source data incorporated. The goal was to construct a limited urban digital twin focused on providing extreme heat warnings for urban areas, including evaluation of effects related to urban heat islands.
Figure 18 — Urban Heat Index grid based on urban heat factors including surface type, building type, and cooling factors such as green and blue spaces. The color ramp shows the degree of urban heat island effect. Blue and green areas have minimal effect (0-2 F) while the red and purple areas have a much larger urban heat impact (8-11 F). The location of interest in red is estimated to be 8.3 F hotter than the predicted ambient air temperature in Central Park.
The above provides a simple but useful example of how an ARD to DRI data value chain can support disaster response decision making. The National Weather Service already has a comprehensive set of services and APIs for distributing weather observations and forecasts. However, there are still many potential gaps between this available weather and climate data and those who could most benefit from it. By combining these real-time data streams with richer local contextual information such as urban heat effects, and then applying business rules related to potential health hazards, these observations can be transformed into actionable information targeted to specific user groups. Warnings provided for specific locations, in a form suited to the target audience and delivered in a timely manner, are more likely to enable that audience to act in time to minimize adverse effects. Ultimately, it is hoped this will help develop more standardized ARD that can be used by a variety of components to drive decision-ready indicators in support of timely decision making. These approaches and associated workflows could also be readily deployed in a variety of other urban settings.
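The "business rule" step of this value chain can be pictured as a small classification function: ambient forecast plus local UHI offset in, warning level out. The thresholds below are illustrative placeholders, not Safe Software's actual DRI rules.

```python
def heat_warning_level(ambient_f: float, uhi_offset_f: float) -> str:
    """Map a localized temperature estimate to a warning level
    (assumed thresholds, for illustration only)."""
    local_f = ambient_f + uhi_offset_f
    if local_f >= 105:
        return "EXTREME HEAT WARNING"
    if local_f >= 95:
        return "HEAT ADVISORY"
    return "NO ALERT"

# Using the figure's example: a location running 8.3 F above the Central
# Park ambient forecast crosses the advisory threshold much earlier.
print(heat_warning_level(ambient_f=90.0, uhi_offset_f=8.3))  # HEAT ADVISORY
print(heat_warning_level(ambient_f=90.0, uhi_offset_f=0.5))  # NO ALERT
```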
For more information on exploring applications of Analysis Ready Data approaches to climate resilience during this pilot, see the ARD to DRI Supplementary Report by Safe Software.
Following the 2023 work on the US Climate Mapping for Resilience and Adaptation (CMRA) portal and its incorporation into the OGC Climate Resilience Pilot, NOAA, US GCRP, and ESRI are collaborating to provide additional applications with new data and workflows to support the broad needs of the climate resilience community. The Climate Resilience Information System (CRIS) improves access to both climate and non-climate data via cloud services that support and adhere to open science standards. Multiple federal initiatives, such as CMRA and the NCA, can access this authoritative source, which is also made openly accessible to empower community initiatives. The data and APIs utilize OGC APIs where appropriate. The processing workflows (ARD to DRI) and application code will be provided in open-source GitHub repositories with permissive licensing to encourage reuse.
Figure 19 — Schema of the ESRI system.
The data for CRIS will sit across several locations. The National Climate Organizations in ArcGIS Online (hosting CMRA, FFRMS, and the Atlas) will host the geographic summaries of all of the downscaled models (members and ensembles) and observed climatologies. These are polygon summaries for county, tribal, and HUC8 boundaries for 30 climate indices (e.g., days over 105 F). Those features will be provided as OGC API Features and Esri REST services, providing visualization, graphing, and analysis capabilities. Expanding on CMRA last year, annual data will also be provided for all geographies and variables, allowing users to build their own time-range summaries (e.g., a 10- or 30-year summary centered on the year of their choice), as sketched below. Additionally, Image Services (pixel-level data) will be provided for the Blended Ensemble (NCA5 data) and nClimGrid. For those who need more advanced capabilities, all of the harmonized data will be available from AWS as GeoTIFF or netCDF.
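A user-built summary over the annual features might look like the sketch below. The service URL, collection name, and property names are hypothetical, and property-based filtering support on the server is assumed.

```python
import requests
from statistics import mean

BASE = "https://example.org/ogcapi"  # hypothetical CRIS Features endpoint
center = 2050                        # user-chosen center year

# Fetch annual values for one county and one index across a 30-year window
# (OGC API Features; filtering on feature properties is assumed supported).
items = requests.get(
    f"{BASE}/collections/annual-climate-indices/items",
    params={
        "county_fips": "36061",     # assumed property: New York County
        "index": "days_over_105f",  # assumed property name
        "datetime": f"{center-15}-01-01/{center+14}-12-31",
        "limit": 1000,
    },
).json()

values = [f["properties"]["value"] for f in items["features"]]
print(f"30-year mean centered on {center}:", mean(values))
```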
Figure 20 — Schema of the CRIS system developed by ESRI.
One of the applications leveraging CRIS data is the National Climate Assessment Atlas. The NCA Interactive Atlas provides digital access to downscaled climate projection maps used in the Fifth U.S. National Climate Assessment (NCA5). The Atlas is an extension of NCA5, offering interactive maps that show projections of future conditions in the United States. NCA5 is a static report of limited length, and sample maps are presented within each chapter. With the NCA Interactive Atlas, users can access and explore climate data for locations across the United States, even if those data were not explicitly presented in NCA5. The NCA Interactive Atlas also includes features to help users interpret and compare maps. The Atlas shows regional climate conditions projected to occur if Earth’s long-term average temperature reaches specific levels of warming. These Global Warming Levels (GWLs) correspond to global average temperature increases of 1.5, 2, 3, and 4 degrees Celsius above pre-industrial levels. A list and description of the 15 map layers, an explanation of the 4 warming levels, and information on the models, downscaling, and validation are also available. The NCA Atlas data is available through a mapping and download site, which allows the user to filter and symbolize on 21 different attributes and to make interactive spatial selections; results can then be downloaded in numerous formats: GeoPackage, GeoJSON, SQLite Geodatabase, File Geodatabase, Feature Collection, Shapefile, Excel, CSV.
A series of surveys have been conducted to analyze the process from Analysis Ready Data (ARD) to Decision Ready Indicators (DRI), in order to identify the gaps and provide recommendations for enhancing the process. The methods used in the surveys include a literature review (survey of typical stakeholders such as NASA GES DISC data usages and application domains, USDA), a survey of existing decision-ready indicators (review of relevant policy frameworks and their decision-ready indicators to identify data and analysis requirements for decision making, monitoring, reporting, and validation), a survey of value chain mapping (analysis of existing data infrastructure, tools, and typical processes from data acquisition to DRI), and an in-depth analysis of workflows/frameworks from ARD to DRI.
One survey, for example, on GES DISC data usage over the last decade (2013-2023) highlights the important decision areas (data source: GES DISC Publications). The data center hosts a wide range of analysis-ready datasets, including many Essential Climate Variables. Figure 21 displays the ranking of application domains over the decade. Figure 22 presents the application domain ranking of GES DISC data usage for each year during 2013-2023. Figure 23 illustrates the usage changes of major applications over the period from 2013 to 2023. The survey of dataset usage reveals that air quality, floods, and droughts are among the most studied application domains that require decision-ready indicators derived from analysis-ready data.
Figure 21 — Application domain ranking of the usages of GES DISC data during 2013-2023.
Figure 22 — Application domain ranking of GES DISC data usage over each year during 2013-2023.
Figure 23 — Usage changes of major applications over the period of 2013-2023.
Surveys of existing decision-ready indicators related to climate change and disasters have been carried out. The IMF’s climate change indicators include GHG, mitigation, adaptation, transition to a low-carbon economy, climate finance, and actual climate & weather (source: IMF Climate Data). Climate readiness indicators from CGIAR cover governance, knowledge, the climate-smart agriculture framework, national capabilities, and the information system at the national level (source: CGIAR).
The Sendai Framework for disaster risk reduction identifies 38 indicators for measuring decision goal areas. These include indicators for substantially reducing global disaster mortality, the number of affected people, direct disaster economic loss, and disaster damage to critical infrastructure and disruption of basic services. They also include indicators for substantially increasing the number of countries with national and local disaster risk reduction strategies, enhancing international cooperation to developing countries through adequate and sustainable support to complement their national actions, and substantially increasing the availability of and access to multi-hazard early warning systems and disaster risk information and assessments to the people (source: UNDRR Sendai Framework).
FAO SDG (Sustainable Development Goals) indicators include areas of water use, forest, land degradation, and disaster economic loss (source: FAO SDG Data Portal). Many of the decision-ready indicators are directly served through decision tools for specialized application domains, such as the Digital Coast, the US Drought Monitor, and the Harmful Algal Bloom Tracker.
The value chain from ARD to DRI has also been studied. Figure 24 illustrates the general value chain. Figure 25 depicts a specialized derivation of decision indicators for agricultural applications, as used in CropSmart.
Figure 24 — General Value Chain from ARD to DRI.
Figure 25 — Value Chain from ARD to DRI for agricultural applications.
Analyses of survey results and the typical value chain from ARD to DRI reveal several gaps. These gaps include data availability and accessibility (such as fragmented datasets, data latency, missing data sources, and inadequate metadata), data quality and consistency (such as inaccurate data, incompatible formats, and inconsistency in standardization), processing capabilities and analytical tools (such as the unavailability of specialized processing tools and inconsistency of data), and critical time latency for real-time decision support (such as product delays and lack of interoperable access). The diagram below (Figure 26) presents the prioritized gap action plan to address the identified gaps in the processes to produce DRI from ARD.
Figure 26 — Prioritized Gap Action Plan from ARD to DRI.
3.3. Contextual, domain-specific knowledge from stakeholders, metadata and catalogs to improve discoverability, semantic interoperability, and interpretability
The integration of contextual knowledge and information beyond conventional spatial data is an important part of CDRP24. alpS Consult engaged in direct stakeholder and user survey work, illustrating the current emphasis on incorporating diverse user perspectives and context. Their project aimed to 1) identify knowledge and data gaps/needs along climate impact chains for the health and energy sectors in close collaboration with stakeholders from various fields and actor groups, 2) determine how the respective knowledge and data should be visualized and prepared in order to best support decision-making processes ranging from short-term to long-term decisions, and 3) derive Decision Ready Indicators/Information. The focus of the initiative was the local/regional scale, with particular attention given to rural/mountain communities, as they are on the one hand severely affected by climate change and on the other highly susceptible to direct and indirect climate impacts. The region of Tyrol, Austria, with its alpine-urban city Innsbruck, served as a case study.
Figure 27 — Contribution by alpS Consult - Summary of results compiled by alpS Consult 2024.
Impacts on health, e.g., due to (urban) heat waves, were one focus area of this OGC pilot, and interviewees stated that such heat waves, particularly in combination with blackouts or a so-called heat dome caused by omega weather situations, are a key climate risk for urban areas. In this regard, apart from aspects such as the need to model indoor humidity levels and more in-depth assessments of the potentially adverse effects of greening and standing water, future work should pay particular attention to compound/cascading risks. Such an approach would allow for addressing complexities in disaster and emergency planning. For instance, when dealing with heat waves, interlinkages with wildfires and potential associated panic attacks within society could be taken into account. Within the hydropower sector, there is great demand for forecast products that cover general weather situations (medium-term forecasts) or have a very fine spatio-temporal resolution. Platforms serving as “one-stop shops” providing all relevant information/data were considered important by the interviewees.
This perspective is also mirrored in the recommendations for visualizing data, which should be as interactive as possible (see e.g. en-roads). Options to compare various scenarios or models at a glance, and to allow different user groups to make connections with their own lived reality, were likewise considered crucial. According to the interview results, there may be no "one size fits all" approach for Decision Ready Indicators and Information: DRIs have to cover sector specifics on the one hand, while addressing user/target groups and their respective needs on the other. Given the need for analysis and modeling of compound/cascading risks, DRIs should also take this aspect into account.
Even though spatial data is at the core of OGC contributions, the interviews also clearly showed that non-spatial data, information, and general circumstances are important for improving disaster and risk management and planning. According to the interviews, these general data and information needs include awareness raising and training (e.g. for hospital staff and the general public), long-term strategies for adapting to and coping with heat events, legal foundations, and the communication and sharing of information (particularly among authorities). This observation raises the question of whether and how spatial data and OGC standards could best support addressing these identified gaps.
With respect to lessons learned, it should be noted that the standards and use cases developed and used in the OGC community may not be well known to relevant stakeholders. Furthermore, when incorporating a social science-oriented approach such as that applied by alpS Consult within the scope of this pilot, a strong linkage with contributions by other pilot participants and/or overall OGC objectives is crucial. The abovementioned point regarding general knowledge about, and awareness of, OGC standards should be kept in mind here.
As noted, insights provided by the interviewed stakeholders serve as crucial and key sources of information. This was complemented by a review of relevant literature on climate change with respect to mountain regions, energy, and health, as well as on climate impact chains: e.g. reports by the Intergovernmental Panel on Climate Change (IPCC), such as the Special Report on Global Warming of 1.5°C and the Special Report on the Ocean and Cryosphere in a Changing Climate, and reports by the Austrian Panel on Climate Change (APCC), given that the initiative focuses on the local/regional level in Austria.
Within the scope of this initiative, the results of this explorative analysis feed into the work carried out by other institutions participating in the pilot and can support the further development of the standards they use or develop (e.g. GISMO and HSR.Health – Health Impact Workflows, Safe Software – Weather Event Workflows). Insights gained through the interviews that might be useful for others within this initiative include, for example, feedback on the envisaged heat index and on other aspects of data, knowledge, and models that would be important for better preparing for climate-related risks in both sectors.
In a related effort, TU Dresden is evaluating how to identify relevant and domain-appropriate data for accurate analysis and decision-making. Traditionally, metadata catalogs rely on lexical search architectures to retrieve metadata records (Hervey et al., 2020). While lexical search offers high generalizability, its primary limitation lies in its inability to comprehend the semantic meaning of queries and returned records (Formal et al., 2021; Thakur et al., 2021). In contrast, state-of-the-art search engines such as Google or Bing implement AI-based language models (e.g. BERT: Devlin et al., 2019) to provide a deeper understanding of queries and search results (Singh, 2021). In dense retrieval, such models encode queries and metadata records as vectors (embeddings) and rank records by semantic similarity rather than by term overlap. This approach enables seamless cross-referencing and integration of data from different sub-domains within the SDI framework. The use of dense retrieval encourages collaboration across sub-domains, leading to better decision-making and improved resilience in managing climate-related risks and disasters.
However, the ability of dense retrieval models to generate meaningful embeddings depends heavily on the model’s pre-training (Thakur et al., 2021). For domain-specific tasks such as environmental metadata retrieval, these models will likely underperform because metadata records contain highly specialized terminology that was probably not present in their training data. Commonly used metadata cataloging software such as GeoNetwork, GeoNode, or CKAN, which adheres to the OGC CSW standard, does not provide dense retrieval capabilities by default; accommodating such functionality requires some architectural adjustments.
Figure 28 — Components required to facilitate dense retrieval in catalog services, developed by TU Dresden.
Numerous pre-trained models that can be used for information retrieval tasks are available online (e.g. distilbert/distilbert-base-uncased, sentence-transformers/all-MiniLM-L6-v2, or Sakil/sentence_similarity_semantic_search). These can be adapted to the geospatial domain using unsupervised training methods such as TSDAE (Wang, Reimers, et al., 2021) or GPL (Wang, Thakur, et al., 2021). In the context of SDIs, existing OGC standards can be used to harvest metadata via the CSW interface or via an API endpoint and then to generate a text corpus consisting of the harvested records (e.g. dataset descriptions, keywords, etc.). After domain adaptation, the dense retrieval model can be implemented in the search workflow of a catalog; to generate query and metadata embeddings, it can be deployed as an API service. These architectural adjustments to catalog retrieval should result in improved search for data and information, a better user experience, enhanced interoperability across domains, and easier advanced analytics and decision-making.
Figure 29 — Overview of the search pipeline for traditional lexical search (above) and dense retrieval (below). Each approach comprises an indexing stage (a), utilized upon uploading or updating metadata within the catalog, followed by a retrieval stage (b).
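To make the pipeline in Figure 29 concrete, the following is a minimal sketch of the indexing stage (a) and retrieval stage (b) using the sentence-transformers library and the all-MiniLM-L6-v2 model named above. The metadata record texts and the query are illustrative placeholders; a production deployment would harvest the records via CSW and persist the embeddings in a vector index rather than recomputing them per query.

```python
# Minimal dense-retrieval sketch over harvested metadata records, assuming
# the sentence-transformers library. Record texts are placeholders standing
# in for a corpus harvested from a catalog (e.g. via CSW GetRecords).
from sentence_transformers import SentenceTransformer, util

# A pre-trained (or domain-adapted) bi-encoder model.
model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

records = [
    "Digital elevation model of the Elbe river basin, 10 m resolution",
    "Daily air temperature observations from the Saxony station network",
    "Land cover classification derived from Sentinel-2 imagery",
]

# Indexing stage (a): embed all metadata records once and store the vectors.
record_embeddings = model.encode(records, convert_to_tensor=True)

# Retrieval stage (b): embed the query and rank records by cosine similarity.
query = "terrain heights for flood simulation"
query_embedding = model.encode(query, convert_to_tensor=True)
hits = util.semantic_search(query_embedding, record_embeddings, top_k=3)[0]

for hit in hits:
    print(f"{hit['score']:.3f}  {records[hit['corpus_id']]}")
```

Note that a purely lexical engine would struggle with this query, since "terrain heights" shares no terms with "digital elevation model"; an embedding model can still rank the DEM record highly because the phrases are semantically close.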
3.4. Urban Heat Effect Modeling and Health Impacts — Application for Northern Manhattan in NYC
Climate change has increased the number, extent, and intensity of extreme heat events, in which the combination of temperature and humidity can produce conditions that cause widespread illness, hospitalization, and death. NOAA recognizes that extreme heat causes more deaths in the U.S. than any other type of weather-related disaster and predicts that deaths will rise significantly over the next fifty years (Weather Related Fatality and Injury Statistics). Academic studies have pointed out the possibility of an extreme heat event accompanied by a power blackout (see here). Blackouts are more likely to occur during extreme heat events because, when electric loads are higher, electric components can overheat, transmission lines are more likely to sag and short out, and wildfires, high winds, or lightning may damage electric infrastructure.
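For readers unfamiliar with how temperature and humidity combine into felt heat, the sketch below implements the NWS Rothfusz regression, a widely used approximation of the heat index; it is included purely as illustration and is not necessarily the index used by the pilot team.

```python
# Sketch of the NWS Rothfusz regression for the heat index (felt heat) in
# degrees Fahrenheit, given air temperature t (deg F) and relative humidity
# rh (%). Shown for illustration; the pilot's own index may differ.
def heat_index_f(t: float, rh: float) -> float:
    """Approximate heat index; the regression is intended for HI >= 80 F."""
    return (
        -42.379
        + 2.04901523 * t
        + 10.14333127 * rh
        - 0.22475541 * t * rh
        - 6.83783e-3 * t * t
        - 5.481717e-2 * rh * rh
        + 1.22874e-3 * t * t * rh
        + 8.5282e-4 * t * rh * rh
        - 1.99e-6 * t * t * rh * rh
    )

# Conditions from the pilot's first scenario: 100 deg F at 50% humidity.
print(f"Heat index at 100 F / 50% RH: {heat_index_f(100, 50):.0f} F")
```

At 100 °F and 50% relative humidity the regression yields a felt heat of roughly 118 °F, which illustrates why humidity is as central to the hazard as air temperature itself.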
The growing risk and impacts of these coupled scenarios motivated NYC GISMO, HSR.Health, Safe Software, and collaborators from NASA, NOAA, DOE, Hunter College, Basil Labs, and New York State to develop an updated operational model and prototype tool for 1) assessing the impacts of coupled extreme heat and power loss events and 2) communicating actionable information (DRIs) based on the model’s outputs.
To create a model and tool that are fit for purpose, reflect recent scientific developments, and take advantage of all available data, the team undertook a critical review of available data and of algorithms currently in use to identify those suitable for the coupled heat and power loss modeling application. The data and algorithms assessed covered a broad spectrum, from those used to model temperatures, heat island effects, and felt heat in urban contexts, to census tract-level demographics, historic death and hospitalization levels, and socio-economic data. In parallel, the team worked with stakeholders in emergency management, public health, urban planning, and other groups to understand the kinds of decisions and actions that could be informed by the DRI provided by the model and prototype tool. This work is essential to designing the DRI information model and ensuring its outputs will be useful for decision-making. The current data review, modeling, and prototyping project uses New York City (NYC) as an example and is designed to be adapted by other cities where data and scenarios are comparable.
The same process underscored that current general-purpose systems do not sufficiently account for the impacts of extreme heat on populations. For example, while National Weather Service (NWS) products can include models of urban heat islands, these models do not sufficiently account for heat differentials between lower and upper floors of buildings, nor for variability caused by construction style and materials, the spatial distribution of buildings and infrastructure, and other factors.
The group’s work to date has documented the serious implications for human health of using unsuitable data and models for heat and power loss event modeling. Most people who die during extreme heat events die in non-air-conditioned apartments, or on the street when they are in direct contact with non-reflective, impervious surfaces whose temperatures can rise above 150 degrees Fahrenheit (see here). Therefore, more realistic and accurate modeling of spatial and vertical heat gradients within urban heat islands, along with modeling of surface temperatures among other factors, is essential to planning and decision-making that addresses the health risks of extreme heat events.
To create a working system accounting for the extreme heat and power outage scenario, experimental 2D and 3D models of the built environment for the New York City case study area were assembled on the FME platform; Safe Software used them, with input from Hunter College, to generate more granular spatial distribution models of temperatures. This work integrated a variety of data, including surface type, impervious surfaces, building materials, and shade, to model distributions of heat that account for more relevant factors and therefore produce more useful, localized information. Urban heat island intensity data from Climate Central was also utilized.
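As a purely illustrative aid, the sketch below shows the general idea of adjusting a coarse forecast temperature field with finer-grained surface factors. The grids, coefficients, and linear adjustment form are assumptions for exposition and do not represent Safe Software's actual FME workflow.

```python
# Hypothetical sketch of downscaling a coarse temperature field with local
# surface factors, in the spirit of the granular heat-distribution modeling
# described above. All inputs and coefficients are illustrative.
import numpy as np

rng = np.random.default_rng(0)

coarse_temp_f = np.full((100, 100), 100.0)            # forecast grid, deg F
impervious_frac = rng.uniform(0.0, 1.0, (100, 100))   # from land cover data
shade_frac = rng.uniform(0.0, 0.6, (100, 100))        # from 3D building/tree models

# Simple linear adjustment: impervious surfaces add heat, shade removes it.
local_temp_f = coarse_temp_f + 8.0 * impervious_frac - 5.0 * shade_frac

print(f"Local range: {local_temp_f.min():.1f}-{local_temp_f.max():.1f} F")
```

Even this toy adjustment spreads a uniform 100 °F forecast across several degrees of local variation, which is the kind of within-city differentiation the pilot's workflow aims to capture with real surface, material, and shade data.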
The second component in the working system, complementing the modeling workflow in development by Safe Software, was an access point to the Heat Health Risk layer, built on HSR’s GeoMD Platform, a multi-stack, cloud-native, health-focused Spatial Data Infrastructure (SDI). The Heat Health Risk Index identified, in advance, the health and medical needs of a population impacted by extreme heat. The index was calculated as a weighted combination of the prevalence of social and health factors that describe the vulnerability of populations to extreme heat, including low income, internet/phone access, age over 65, age under 5, and the prevalence of chronic kidney disease and diabetes (a minimal sketch of such a weighted combination follows the list below). Their work highlighted the diversity of the data needed to produce DRI for this project’s scenarios. Examples included data on:
Age: The age-based populations most vulnerable to extreme heat are adults aged 65 and older (Fouillet et al., 2006) and children 0-4 years of age (Wöhl et al., 2019). Young children have underdeveloped thermoregulatory systems and are more likely to dehydrate during extreme heat events, while older adults have attenuated thermoregulatory control and thus experience a decrease in sweating (Baldwin et al., 2023).
Land cover: Prior research has shown an association between extreme heat vulnerability and the percentage of non-green space. Further, there is a widely accepted association between the percentage of non-green space and poverty, which may contribute to increased vulnerability on its own (Reid et al., 2009).
Physical health conditions: People with preexisting chronic health conditions have been found to be at increased susceptibility to complications due to extreme heat events, including those with cardiovascular disease, diabetes, renal disease, nervous disorders, cerebrovascular disease, and pulmonary conditions (Reid et al., 2012).
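As referenced above, the following is a minimal sketch of a weighted combination of vulnerability factors of the kind described. The factor names, weights, and values are illustrative assumptions, not HSR.Health's calibrated model.

```python
# Hypothetical weighted heat-health risk index per census tract. Factors,
# weights, and values are placeholders illustrating the technique only.
import pandas as pd

# Normalized (0-1) prevalence of each vulnerability factor per census tract.
tracts = pd.DataFrame(
    {
        "low_income": [0.42, 0.18, 0.65],
        "age_over_65": [0.21, 0.30, 0.15],
        "age_under_5": [0.08, 0.05, 0.11],
        "ckd_prevalence": [0.04, 0.03, 0.06],
        "diabetes_prevalence": [0.12, 0.09, 0.16],
    },
    index=["tract_A", "tract_B", "tract_C"],
)

# Illustrative weights; in practice these would be calibrated against
# historic heat-related hospitalization and mortality data.
weights = {
    "low_income": 0.30,
    "age_over_65": 0.25,
    "age_under_5": 0.10,
    "ckd_prevalence": 0.15,
    "diabetes_prevalence": 0.20,
}

# The weighted sum yields a relative risk score per tract.
tracts["heat_health_risk"] = sum(tracts[f] * w for f, w in weights.items())
print(tracts["heat_health_risk"].sort_values(ascending=False))
```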
Together, the Heat Health Risk and built environment models can provide decision-ready information about the spatial distribution of populations affected by heat events. Based on this modeling work, HSR.Health developed a heat-related hospitalization estimate applied to two scenarios within an identified pilot area on Manhattan Island in New York City: 1) 100-degree Fahrenheit weather for 5 consecutive days, as measured from Central Park, with 50% humidity, and 2) the same weather conditions with the addition of 2 days without power. Outcomes of the modeling to date highlight the risk to occupants of upper floors of buildings, which are harder to cool, supply with water, and evacuate, and provide a more complete view of the effects of climate change-driven rising temperatures on health outcomes and their economic impacts.
Figure 30 — Heat Health Risk Index for the pilot region, showing the distribution of at-risk populations.
The team also developed an estimate of heat-related hospitalizations by combining the Heat Health Risk Index, extreme heat data from NOAA, and surface type and 3D building modeling data from GISMO and their collaborators. The estimate may be compared against the National Death Index and existing heat-related hospitalization data for NYC, which can be accessed on the same platform.
Figure 31 — Heat Health Risk Index and Hospitalization Estimates Workflow, developed by HSR.Health
The Extreme Heat team’s review and evaluation process highlighted that integrating multiple lines of data to create improved models of the distribution of heat, at-risk populations, power infrastructure, and power demand is essential to producing DRI for the scenario. Relevant types of spatially enabled Analysis Ready Data (ARD) include satellite imagery, terrestrial sensor information, framework data layers (imagery, elevation, streets, buildings, etc.), census data, and health information.
The completed operational prototype, connecting the components developed by Safe Software and HSR, provides a working example of how to implement the updated modeling workflows and recommended data layers. Guidance on interpreting the workflows’ outputs, along with recommendations for associated actions based on model results, is being developed to accompany the technical toolkit. Full results of the analysis for the NYC case study will be made available to illustrate the significant differences in model outcomes, and consequently in the information used for planning and decision-making, when a full range of factors including the UHIE is taken into account.
For a complete report of the activities of the Extreme Heat Team during this pilot, see the Extreme Heat Supplemental Engineering Report.
3.5. AI and Knowledge Graphs to support SDI for disaster and climate applications including wildfire and flooding
A key objective of the OGC CDRP is to support innovation that can achieve, at scale, the ability to obtain and integrate data across multiple domains, including environmental, human population, and health data, for climate resilience and emergency management analysis and decision support. AI is currently seeing intense innovation and holds promise for supporting the automated information management that empowers disaster managers and climate planners to handle increasing volumes and diversity of data. For this pilot, Xentity investigated the potential of generative AI in the context of wildfire response, and TerraFrame explored the possibilities and challenges of applying knowledge graphs to emergency management for flooding.
Xentity’s work focused on exploring where Generative AI (Gen AI), while early in its technical evolution, can, with a human-in-the-loop approach, offer transformative potential for the Wildland Fire Community by enabling scalability in data processing beyond human capabilities. This can further augment the ways in which the Wildland Fire community leverages data and advanced tools to enhance planning and operational decision-making, aiming to supplement rather than replace experiential knowledge and anecdotal evidence.
Figure 32 — Overview schema of the Wildland Fire NIMS process.
It is imperative that Gen AI is explored particularly in scenarios where significant efficiency gains are feasible. Integrating Gen AI into wildfire management necessitates a comprehensive strategy encompassing data, process, technical, and ethical considerations. The diagram above outlines the overall flow of:
Where Gen AI can support Wildland Fire as part of the NIMS and ICS frameworks, across the overall mitigation, preparedness, response, and recovery Disaster Management lifecycle.
Identification of potential Gen AI Wildland Fire field capabilities and the core datasets needed in the areas of Safety, Fuels, Topo, Weather, Fire Behavior, and Planning.
Discussion of how the current state of the art in the ever-evolving Gen AI space will require multiple technologies (RAG, AI agents, NLP, GANs, etc.) to augment LLMs with real-time and highly contextual Wildland Fire data, sourced from dozens of systems, document repositories, and APIs, so as to provide accurate and timely responses (a minimal retrieval-augmentation sketch follows this list).
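As referenced in the last item, the sketch below illustrates the basic retrieval-augmented generation (RAG) pattern: fetch current operational context and ground the LLM prompt in it. The toy keyword retriever and the placeholder generate() function are assumptions for exposition; a production pipeline would use dense retrieval over live systems and a real LLM backend.

```python
# Hypothetical RAG sketch for wildland fire questions. The retriever is a
# naive keyword-overlap stand-in and the documents are placeholders for
# snippets pulled from operational systems, repositories, and APIs.
documents = [
    "Fire weather watch upgraded to red flag warning: winds 25 mph, RH 12%.",
    "Fuel moisture sampling in District 4 recorded 6% dead fuel moisture.",
    "Division Alpha reports two spot fires north of the containment line.",
]

def retrieve(query: str, docs: list[str], top_k: int = 2) -> list[str]:
    """Rank documents by shared-word count with the query (toy retriever)."""
    q_words = set(query.lower().split())
    scored = sorted(docs, key=lambda d: -len(q_words & set(d.lower().split())))
    return scored[:top_k]

def generate(prompt: str) -> str:
    """Placeholder for whatever LLM backend is chosen; echoes the prompt."""
    return prompt

question = "What are the current fire weather conditions?"
context = "\n".join(retrieve(question, documents))

# Grounding the prompt in retrieved, up-to-date context is what lets the
# LLM answer from operational data rather than from its training memory.
answer = generate(
    f"Answer using only the context below.\nContext:\n{context}\n\n"
    f"Question: {question}"
)
print(answer)
```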
The Generative AI in Wildland Fire Management Report summarizes key considerations for Gen AI model development, challenges in leveraging Gen AI, and Wildland Fire Gen AI recommendations across stakeholder, governance, data, technology, and enterprise categories.
TerraFrame developed an alternative, systematic approach to providing context, emphasizing semantic interoperability and knowledge graphs. Information systems and data management practices used to address these climate-related challenges are typically siloed, making it difficult to share knowledge and to understand cumulative effects (CE) and regional impacts across varying geographies, environments, and populations. Their work aimed to automate the integration of the data needed for climate analytics and to publish the results of such analytics as knowledge representations, so that they can be integrated into other analytics efforts through machine-readable interfaces.
This approach is particularly important for the multiple, interrelated challenges across sectors impacted by climate change, which collectively produce a larger negative effect and require organizations to overcome barriers to cross-sectoral collaboration and analysis at scale. Their project built on prior work developing a spatial knowledge graph (SKG) for the U.S. Army Corps of Engineers (USACE) to support climate resiliency and project planning. Formally defining feature classes as geo-object types, together with their semantic relationships, using geo-ontologies provides the metadata for publishing SKGs as products that are interoperable by common geography within a knowledge sharing ecosystem. By using the spatial knowledge mesh architecture, an extension of a data mesh, organizations can independently develop their own SKGs that reference graphs published by authoritative sources and automatically receive updates as dependencies change over time (McEachen et al., 2023).
The left side of the diagram below depicts a geo-ontology defining an SKG to capture the semantic geospatial relationships of interest for determining flood cumulative effects, related to TerraFrame’s work with USACE. The right side illustrates how one can associate tabular datasets with their corresponding geo-object type and edge definitions in the geo-ontology. The Leveed Area table has a reference to Levees that maps to the edge type protects. The calculation that determines structures at risk of flooding is mapped to the contains relationship connecting Leveed Areas and Structures.
Figure 33 — Geo-ontology for defining a graph to discover the cumulative effects of flooding and how to populate it with tables published in an SDI.
The following diagram depicts an SKG defined by the example USACE geo-ontology. By populating the graph with ARD and DRI according to the mapping between tables and the geo-ontology, the transitive relationships between geo-objects (i.e., geo-features) become discoverable across those tables. In this case, although Populated Area 2 is not directly impacted by the breached levee, it has nonetheless lost access to healthcare services because it lies in the catchment area of a hospital that has been flooded (a minimal graph-traversal sketch follows the figure). Also depicted is how the concept could be extended to determine disruptions due to compromised transportation infrastructure.
Figure 34 — Spatial Knowledge Graph for discovering cumulative effects of flooding across domains.
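As referenced above, the following minimal sketch uses networkx as a stand-in for an SKG to show how the transitive impact traversal works. Node names and edge types loosely mirror the example; the versioned, typed-edge machinery of the real system is not represented.

```python
# Hypothetical spatial-knowledge-graph traversal for flood cumulative
# effects, using networkx in place of a full SKG implementation.
import networkx as nx

g = nx.DiGraph()

# Geo-objects and semantic edges from the example flood scenario.
g.add_edge("Levee_1", "LeveedArea_1", type="protects")
g.add_edge("LeveedArea_1", "Hospital_1", type="contains")
g.add_edge("Hospital_1", "PopulatedArea_2", type="serves")  # catchment area

def downstream_impacts(graph: nx.DiGraph, failed_asset: str) -> list[str]:
    """All geo-objects transitively reachable from (affected by) the asset."""
    return sorted(nx.descendants(graph, failed_asset))

# A breached levee floods the leveed area and the hospital it contains,
# and thereby cuts healthcare access for the hospital's catchment population.
print(downstream_impacts(g, "Levee_1"))
# ['Hospital_1', 'LeveedArea_1', 'PopulatedArea_2']
```

The value of the graph representation is precisely this kind of indirect discovery: Populated Area 2 never intersects the flood extent, yet the traversal surfaces it as impacted through the serves edge of the flooded hospital.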
This model may be connected to initiatives such as the OGC Rainbow, which shares the concept of using a metamodel for defining semantic structures but does not support versioned periods of validity or directed and undirected semantic edge types. More detail about this work, including an algorithm for merging SKGs and an approach for bringing geospatial knowledge to LLMs, can be found in TerraFrame’s full supplemental engineering report (https://doi.org/10.17605/OSF.IO/CE34A).
4. Conclusion
This report underlines the following aspects:
Enhanced Predictive Capabilities: The integration of geospatial data with machine learning models has improved the tracking and forecasting of extreme weather events, including typhoons, floods, landslides, and extreme heat. By leveraging real-time data integration and dynamic forecast models, the report demonstrates enhanced predictive capabilities for better preparedness and response to these hazards.
Advancements in Analysis-Ready Data (ARD) and Decision-Ready Indicators (DRI): Developing workflows to convert raw geospatial data into ARD and DRI has proven essential for decision-makers. These indicators help assess climate and disaster risks and support more informed decision-making in sectors like urban planning, public health, and emergency management.
Interoperability Through Open Standards: OGC’s emphasis on open standards enables interoperability across various data sources and analytical models. The use of APIs and standardized processes enhances data accessibility, enabling diverse stakeholders to collaboratively work on solutions for climate resilience and disaster management.
Urban Heat and Health Impacts: The project highlighted the significant public health risks associated with urban heat, especially in densely populated urban areas like Northern Manhattan. The study underscored the importance of detailed spatial modeling of heat islands and emphasized the need for emergency systems to consider combined heat and power outage scenarios.
Targeted Real-time Alerts and Warnings: The increasing likelihood of extreme weather incidents drives the need for early warnings that focus more accurately on the spatial areas and population groups likely to be affected, so that more effective steps can be taken to mitigate adverse effects. The CDRP24 pilot achieved an example of this by combining lower-resolution weather or climate forecasts with higher-resolution local spatial and demographic data. This helped tune impact estimates, such as temperature forecasts, with local variables and other contributing factors, such as the ARD models of urban heat island effects described above.
Application of AI and Knowledge Graphs: AI, particularly knowledge graphs and generative AI, holds significant potential for cross-sector climate and disaster management. These technologies facilitate enhanced data discovery, integration, and analysis, supporting complex scenarios, such as wildfire and flood management, and promoting scalable solutions.
Stakeholder Engagement and Visualization: Engaging stakeholders through interactive visualizations and accessible data has been vital in translating technical analyses into actionable insights. Tools developed within the pilot, like 3D urban models for heat exposure, have improved public and institutional understanding of climate risks and fostered support for resilience planning.
Future Outlook on AI-Driven and Real-Time Data Applications: Looking ahead, the pilot emphasizes the importance of expanding AI integration and real-time data streaming to further refine climate and disaster models. AI-driven virtual assistants and high-interoperability data services are expected to play crucial roles in future applications, improving emergency response and resilience against escalating climate risks.
Advancements in AI and blockchain technologies are expected to significantly enhance SDIs, particularly in metadata management and data discovery. These concepts will improve data quality, efficiency, and usability, providing valuable insights for climate and disaster resilience efforts. Future work will continue to pilot these approaches, expanding their application and impact.
Overall, the CDRP24 pilot demonstrates the value of cross-domain collaboration, innovative technology, and standardized data integration to enhance climate and disaster resilience across urban, rural, and at-risk regions globally.