Draft

OGC Engineering Report

OGC Climate and Disaster Resilience Pilot 2024 Report
Dean Hintz, Editor
Ingo Simonis, Editor

Document number: 24-043
Document type: OGC Engineering Report
Document subtype:
Document stage: Draft
Document language: English

License Agreement

Use of this document is subject to the license agreement at https://www.ogc.org/license



I.  Overview

The Open Geospatial Consortium (OGC) Climate and Disaster Resilience Pilot 2024 (CDRP24) was driven by the urgent need for new methods, tools, and systems to better understand, predict, and address phenomena such as the intensification and changing patterns of typhoons, landslides, flooding, and extreme heat events.

To meet these needs, the CDRP24 initiative brought together private sector, research, and government organizations to develop and prototype new information frameworks, data and analytical services, and software tools that can enhance forecasting and preparedness. In particular, this pilot addressed critical gaps in Analysis-Ready Data (ARD) needed to understand natural hazard risks like landslides and extreme heat. It also explored how these gaps can be addressed by applying open standards to value-added data flows, ensuring that decision-makers and the broader community receive the timely information they need to act—both to improve resilience and to mitigate the effects of these hazards.

Geospatial information and technologies are essential for enhancing the climate resilience of communities worldwide. They also play a central role in designing and implementing the next generation of emergency and disaster management systems. This solutions-oriented pilot explored both dimensions, leveraging OGC standards, OGC’s Agile Reference Architecture for geospatial systems, and the contributions of OGC’s member organizations.

Three main lessons can be learned from the CDRP24 project:

  • Some technical components required to assemble workflows that integrate diverse datasets are relatively mature and have been proven useful in practice. This is evidenced by their implementation in multiple application-specific workflows for tasks such as enhanced typhoon monitoring, landslide detection and prediction, and weather event modeling. Solidifying the shift to API-based architectures, pervasive throughout the applications developed in the CDRP, is vital to this maturation.

  • The expectation that model outputs will be available as Analysis-Ready Data (ARD), Decision-Ready Indicators (DRI), contextualized information, and comprehensible representations to non-technical/geospatial specialists has continued to increase. Integrating AI into the geospatial toolkit reshapes expectations for communicating analysis and modeling results.

  • The effective incorporation of contextual and domain-specific knowledge within information systems is considered necessary to improve the discoverability of relevant data and information. Effective incorporation includes ‘true’ semantic interoperability of workflows and interpretability of processes. Multiple methods of integrating contextual and domain information are being explored and have proven viable, including knowledge graphs, AI-generated syntheses of extensive bodies of knowledge, and documentation processes carried out by human experts.


II.  Executive summary

In this phase of the Climate and Disaster Resilience Pilot Project (CDRP), innovative solutions were developed to address key challenges in climate risk monitoring, prediction, and response. The following highlights summarize the major achievements:

  1. Developed workflows and components for tracking typhoons and extreme atmospheric events using analysis-ready data (ARD) compliant with OGC standards to improve predictions and early warnings of such emergency events. To achieve this, efforts focused on improvements to real-time data integration, dynamic forecast models, air-sea energy exchange representations, and the integration of machine learning models for better prediction and tracking, as well as typhoon data integration for enhanced predictive analytics.

  2. Improved the use of satellite imagery and sensor data to predict landslides and map affected areas. Work in this Pilot implemented scientific algorithms and deep-learning models to enhance detection and prediction accuracy. All algorithms and models use OGC API-Processes to standardize execution, ensure reproducibility, and facilitate the sharing of models.

  3. Created workflows for assessing flood risks using high-resolution Digital Elevation Models (DEMs) and hydrological models to improve resolution and accuracy in flood simulations and impact assessment on a regional basis. This work leveraged GeoCube concepts and implementations for managing large-scale geospatial data and real-time observational stream computing to deliver improved results.

  4. Modeled extreme heat events and their impacts on human health. By modeling the urban heat island effect and population demographics in Northern Manhattan, New York, spatial data and analytics were used to create ARD and DRI to simulate various warming scenarios, including the effect of extreme heat on artificial urban environments. The combined effects of power blackouts were also considered. This work highlighted the need for improved communication, backup power solutions, and government policies to mitigate heat-related risks and points to proactive measures that can be taken.

  5. Produced weather event workflows that integrate weather sensor data for urban digital twins, enhancing real-time climate risk assessments and scenario analysis. These workflows provide critical environmental information to support, among other things, the modeling and reporting of extreme heat in New York City. All workflows are accessible via OpenAPI standards.

  6. Explored advanced metadata retrieval methods for environmental data, promoting more efficient data discovery and integration across domains. This work included a review of approaches for transitioning from Spatial Data Infrastructures (SDIs) to Geospatial Knowledge Infrastructures (GKIs) using AI and knowledge graphs to support scalable, cross-sectoral climate analytics.

  7. Engaged stakeholder communities and supported the uptake of the tools and services developed through effective visualizations, interactive websites, community hubs, and user needs assessments. This project involved simulating carbon emissions from 3D building datasets using real-time weather profiles. It also included creating 3D visualizations of urban environments, focusing on the role of vegetation in reducing urban heat islands and improving public understanding through interactive models. Additionally, the project aimed to streamline 3D visualization workflows using OGC standards.

III.  Keywords

The following are keywords to be used by search engines and document catalogues.

climate change, climate change impact, FAIR, climate service, climate resilience, disaster management, natural hazard, weather event, risk management, Geo AI, typhoon, landslide, extreme heat, urban heat island, wildfire, open standards, OGC, OGC API, Analysis Ready Data, Decision Ready Information, Geospatial Reporting Indicator, data integration, impact analysis, hazard modeling, open science, earth observation

IV.  Contributors

All questions regarding this document should be directed to the editor or the contributors:

Name | Organization | Role | ORCID
Dean Hintz | Safe Software | Editor |
Ingo Simonis | OGC | Editor | 0000-0001-5304-5868
Kailin Opaleychuk | Safe Software | Contributor |
Vyron Antoniou | Hartis | Contributor | 0000-0002-7365-9995
Loukas Katikas | Hartis | Contributor | 0000-0003-1886-4125
Stelios Contarinis | Hartis | Contributor | 0000-0002-5789-4098
Thunyathep Santhanavanich | WiTech GmbH | Contributor | 0000-0001-9852-7000
Rushikesh Padsala | WiTech GmbH | Contributor | 0000-0003-2866-8443
Volker Coors | WiTech GmbH | Contributor | 0000-0002-0413-8977
Sarah Chang | Feng Chia University | Contributor |
Simeon Wetzel | TU Dresden | Contributor | 0000-0001-7144-3376
Stephan Mäs | TU Dresden | Contributor | 0000-0002-9016-1996
Klaus Schneeberger | alpS Consult | Contributor |
Jan Schmider | alpS Consult | Contributor |
Franziska Allerberger | alpS Consult | Contributor |
Joel Cline | NOAA | Contributor |
David Green | S&T Innovation | Contributor |
Ajay Gupta | HSR.health | Contributor |
Paul Churchyard | HSR.health | Contributor |
Bandana Kar | U.S. Department of Energy | Contributor |
Jiin Wen | GISMO | Contributor |
Theo Goetemann | Basil Labs | Contributor |
Josh Lieberman | Josh Lieberman | Contributor |
Alan Leidner | GISMO | Contributor |
Matt Tricomi | Xentity Corporation | Contributor |
Kevin Hope | Xentity Corporation | Contributor |
Brian Goldin | Voyager Search | Contributor |
Alex Bostic | Voyager Search | Contributor |
Shruti Suresh | Voyager Search | Contributor |
Jason Pardy | Voyager Search | Contributor |
Eldrich Frazier | USGS/FGDC | Contributor |
Ian Tobia | FGDC | Contributor |
Maria Gilmore | FGDC | Contributor |
Nathan McEachen | Terraframe | Contributor |
Justin Lewis | Terraframe | Contributor |
Gérald Fenoy | GeoLabs | Contributor |
Chetan Mahajan | Indian Institute of Technology Bombay, India | Contributor |
Rajat Shinde | University of Alabama in Huntsville, US / OSGeo | Contributor |
Surya S Durbha | Indian Institute of Technology Bombay, India | Contributor |
Rachel Opitz | OGC | Contributor |
Nils Hempelmann | OGC | Contributor |

V.  Value Proposition

CDRP24 ultimately enhances community climate resilience and supports the modernization of emergency and disaster management systems to handle the increasing volume, complexity, and speed of hazard-related data streams. The results deliver value to industry, government, science, and research communities by enabling the discovery of exemplary workflows and data services. The pilot also adds value for newcomers to these fields by reviewing and evaluating existing technologies, offering insights into the state of the art from an open-standards perspective.

This pilot addressed several overarching value themes. First, climate and disaster impact management systems increasingly require the integration of real-time data streams with observational and model-based outputs to better assess both current and future risks. To improve the analysis and communication of impact risks, various analysis-ready data (ARD) structures and decision-ready indicators (DRI) were explored. These included time series of storm tracks, urban heat distribution patterns, and DRIs using time-interpolated heat distributions rendered on 3D buildings in New York City.

Cutting-edge AI and knowledge graph approaches were also explored in the context of natural hazard assessments, such as wildfire risks. These technologies are essential for enhancing human capacity to process and evaluate large volumes of complex environmental data. Finally, the pilot investigated methods for interconnecting related hazard impact models using open standards. This capability to automate the assessment of cumulative climate and disaster impacts is critical for supporting integrated decision-making and planning to mitigate these combined effects.

VI.  Future Outlook

The OGC Climate and Disaster Resilience Pilots (CDRP) will continue to promote shared technical advancements through collaborative research and support organizations in developing and demonstrating applications built on community standards. Current activities will evaluate the state-of-the-art in application-specific geospatial workflows and generate new workflow implementations based on OGC standards. These efforts will showcase new approaches to several key challenges, including the production of Decision-Ready Indicators (DRI), the integration and communication of contextualized information, and the incorporation of Artificial Intelligence (AI) into the geospatial toolkit.

AI tools—particularly Generative AI (GenAI)—have specific requirements for the data they consume. In the next stage of the CDRP, the pilot will develop AI virtual assistants to explore these requirements, assess the capabilities of current GenAI components, and determine how to provide the necessary context to make these tools effective. These virtual assistants will interact with various data sources, services, and platforms, necessitating a set of standards-based APIs and information models to ensure reliable communication and interoperability among all components.

To effectively leverage GenAI, all data services must be robust and highly interoperable. In future pilot stages, GenAI functionality will be enhanced by analyzing existing services and their information models. This analysis will include operational services, Analysis-Ready Data (ARD) products, and ontologies. The pilot will assess whether current services and data products meet the FAIR principles (Findable, Accessible, Interoperable, Reusable), which are essential for GenAI. This will involve creating crosswalks between ontologies and standard API data models.

The challenge of successfully using GenAI tools extends beyond technical considerations. It involves integrating diverse interests and requirements within an ecosystem of users, policies, services, and data. Therefore, this initiative aims to engage a broad cross-section of stakeholders—including data and service providers, users, researchers, responders, and community members—to help shape a connected ecosystem of data, technologies, and practices. We encourage anyone interested in enhancing understanding and building capacity for informed action through the integration of spatial information and technologies across climate, disaster, and emergency management domains to participate in this pilot.

Looking ahead, further work is needed to develop open standards that support data collaboration, integration, and automation. In upcoming pilots, it is expected that wrapping components with OpenAPI OGC APIs—such as OGC API Processes or OGC API Environmental Data Retrieval (EDR)—will become increasingly common. This approach will make pilot components more shareable, modular, and interchangeable long after the pilots conclude. Significant progress has already been made in this direction for several components, and the trend toward deploying components via an open API enterprise service architecture is well underway.

However, while making pilot component services more accessible is important, much work remains in developing domain-specific standards related to climate change impacts and disaster management. Several domain-specific open standards were utilized in this pilot, including OASIS CAP XML (Common Alerting Protocol), OGC API EDR, and the C3S Climate Data Store. Nonetheless, the OGC has only just begun to explore standards in the climate and disaster domains related to ARD and DRI. Continued efforts by relevant domain working groups—such as Climate Resilience, Emergency and Disaster Management, and the new Geospatial Reporting Indicators group—will be essential for advancing this work and ensuring continuity across pilots. Collaboration with other sustainability-focused organizations at the national and international levels—such as UNEP (e.g., UNCCD), the EU (e.g., CLINT), and North American and other environmental or disaster management agencies—will also be critical.

1.  Supplementary Documents

In addition to this report, supplementary documents have been produced that provide more detail on specific technical aspects. These include:

2.  Introduction

Organizations and individuals worldwide are striving to address the challenges posed by climate change and the increasing frequency and severity of disasters associated with extreme weather events. While the connections between enhancing climate resilience and managing disasters and emergencies are widely recognized, the geospatial data, services, and systems used across these domains remain poorly integrated.

The central goal of this pilot was to build the capacity to develop products, services, and platforms that leverage open standards to provide actionable information relevant to emergency and disaster management, while also feeding lessons learned into climate resilience planning. Building on the outcomes of previous pilot initiatives, the OGC Climate and Disaster Resilience Pilot 2024 (CDRP24) brought together a consortium of partners to address these challenges and bridge existing gaps by utilizing geospatial data and technologies within the broader context of Climate Resilience and Emergency and Disaster Management.

On the climate impact side, the focus was on the intersection of typhoon activity and the increased risk of flooding and landslides. Use cases included study areas in Taiwan and Greece, where data integration and processing pipelines were established to generate climate resilience information, resulting in early warning systems.

In terms of climate and health, one of the key focuses of CDRP24 was extreme heat events. These events are linked to rising mortality rates, particularly in urban areas, and are identified as one of the most severe near-term impacts of climate change. In addition to these investigations, the pilot also emphasized the development of real-time observation systems to monitor extreme events and alert affected local communities.

3.  Work Areas

3.1.  Extreme Weather Event monitoring workflows applied to Typhoon, Flooding, Heat and Landslides

Prototype workflows that utilize OGC standards and standards-based APIs to deliver data-driven services—including modeling and analytical services—can now be developed and deployed rapidly. This advancement is demonstrated by Hartis’ review of typhoon and atmospheric models, GeoLabs’ implementation of a landslide model, and additional examples currently in development.

Hartis’ Typhoons and Atmospherics Report outlines their approach to developing key components and workflows for typhoons, atmospheric rivers (ARs), and other extreme atmospheric phenomena. The report provides a comprehensive review, including:

  • analysis ready data (compliant with OGC standards)

  • workflow models (based on meteorological data and process effectiveness)

  • identification of key processing steps (data aggregation, normalization, accuracy)

Examining Tropical Cyclones (TCs) through extensive global simulations covering multiple decades necessitates the objective and automated identification and monitoring of these storms, a task achieved through specialized “TC trackers”. These trackers are algorithms designed to recognize cyclonic formations characterized by warm cores within gridded datasets and connect them into coherent trajectories (Bourdin et al. 2022). Key parameters and modeling schemes according to Liu et al. (2023) can be identified as:

  • Typhoon track: Large-scale weather patterns, the sea surface and atmosphere temperatures, features of land topography, structure, and intensity.

  • Typhoon intensity: Maximum wind speed near typhoon center at standard 10-m height and minimum sea level pressure (MSLP) at typhoon center. Wind and gust forecasting and early warning information are very important.

  • Typhoon-induced rainfall: Quantitative Precipitation Estimation (QPE) and Quantitative Precipitation Forecasts (QPF) for hydrological forecasting models are critical to forecasting the quantity, location, timing, and duration of TC-induced rainfall.

The work focused on analyzing different Numerical Weather Prediction (NWP) datasets and TC trackers used for predicting TC paths and intensities. A range of TC trackers has been tested for performance, including those detailed in the following reports: (Arthur et al. 2021; Bourdin et al. 2022; Kitamoto et al. 2024; Pérez-Alarcón et al. 2024). The literature review demonstrates that there is a large pool of observed TCs that all trackers detect: of the 3510 tracks in the IBTrACS catalog, about 2400 are detected by almost all trackers. Notably, the detection skills of all trackers are essentially identical and nearly perfect for the strongest tracks in a simulation: no detected track of category 2 or above is a false alarm, and TCs observed in categories 4-5 are found regardless of the tracker used. This situation suggests the need for interoperable TC models that can be compared or used as an ensemble.

To progress this work, workflows and tools for numerical weather prediction data fusion and interoperability are being examined by the Hartis team. This review of OGC Standards, libraries, and APIs focuses on the capabilities needed to support User-Defined Processes, using OGC API Processes to allow users to define and deploy custom processing algorithms in the cloud. This is particularly useful for researchers and meteorologists who need to run custom models and simulations for specific needs or experimental setups. It enables the machine learning models used in the TC trackers reported above (see Figure 1) to be integrated directly through an API endpoint, enhancing the capabilities of OGC API Processes and/or the OGC Environmental Data Retrieval (EDR) API to support deploying and managing machine learning models that can predict typhoon behavior based on historical data.

The detailed report and Review of Typhoons and Atmospheric Models is published on Zenodo. The Flood Impact Analysis Workflow, which was developed for the realization of the calculation and analysis of areas that will be affected by a flood incident, is available on an Open Science Framework (OSF) node. It can be downloaded and used for further experimentation.

 

Figure 1 — Potential User-Defined Processes for Track Forecasts (red rectangle), utilizing OGC API Processes to allow users to define and deploy custom processing algorithms, for example by deploying machine learning in typhoon forecasts in different TC trackers (Liu et al. 2023)
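A user-defined process of the kind described above is typically invoked by POSTing an execute request to an OGC API Processes endpoint. The sketch below assembles such a request body for a hypothetical `tc-tracker` process; the process id, input names, and dataset URL are illustrative assumptions, not identifiers taken from the pilot:

```python
import json

def build_execute_request(dataset_href: str, basin: str, start: str, end: str) -> dict:
    """Assemble an execute request body for a hypothetical user-defined TC
    tracker, following the {"inputs": ..., "outputs": ...} shape defined by
    OGC API - Processes Part 1: Core."""
    return {
        "inputs": {
            # NWP dataset to scan for warm-core cyclonic features
            "dataset": {"href": dataset_href, "type": "application/x-netcdf"},
            "basin": basin,  # e.g. "WP" for the western Pacific
            "timeRange": {"start": start, "end": end},
        },
        "outputs": {
            # Request the detected tracks as a GeoJSON feature collection
            "tracks": {"format": {"mediaType": "application/geo+json"}}
        },
    }

# The client would POST this body to {api-root}/processes/tc-tracker/execution
request_body = build_execute_request(
    "https://example.org/data/era5-subset.nc", "WP",
    "2023-08-01T00:00:00Z", "2023-08-15T00:00:00Z",
)
payload = json.dumps(request_body)
```

Because the request is plain JSON against a standard endpoint, the same client code can invoke any of the TC trackers discussed above once they are deployed as processes.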

Also focused on this challenge, Wuhan University’s Early Warning Flood Demonstrator deploys an overlapping set of tools to meet region-specific requirements in China. Due to the complex geographic and climatic conditions in the region, flash floods exhibit wide distribution, large numbers, strong suddenness, and high destructiveness. The extensive monitoring of floods poses challenges in organizing and managing geospatial observation data. The team proposed a new geospatial infrastructure called GeoCube (Gao et al. 2022), which facilitates the management and analysis of massive Earth observation data. Compared to previous work on data cubes, GeoCube extends the capacity of data cubes to accommodate large vector and raster data volumes from multiple sources, making it suitable for complex flood warning analysis. Additionally, to support data stream processing, the team also proposed a real-time observational stream computing model (Shangguan et al. 2019). This model integrates the sensor web with models in a streaming computing environment to provide timely decision support information.

Building on previous work, the team proposed a system architecture (see diagram below) supporting an automated data retrieval and preprocessing workflow that fetches data from multiple sources and stores it in different databases by type. The system uses the Spark distributed computing framework and a variety of third-party geographic analysis libraries to enrich its operations. Operators and data are provided to the frontend via OGC APIs, where users can interact with a visualization interface based on Cesium to obtain thematic map results as needed. This process uses OGC API-Features to provide web services for vector data, OGC API-Coverages to provide web services for raster data, and OGC API-Processes to provide web services for operators. With these services, users can call their own data and create their own workflows on the frontend of the system.

 

Figure 2 — Overview of the system developed by Wuhan University
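The routing rule described for this system (vector data through OGC API-Features, raster data through OGC API-Coverages, operators through OGC API-Processes) can be expressed as a small dispatch table. The endpoint paths below follow the usual OGC API URL conventions but are illustrative; they are not taken from the Wuhan deployment:

```python
# Map each resource type to the OGC API building block that serves it,
# mirroring the routing described for the Wuhan University system.
OGC_API_ROUTES = {
    "vector":   "/collections/{collection}/items",      # OGC API - Features
    "raster":   "/collections/{collection}/coverage",   # OGC API - Coverages
    "operator": "/processes/{process}/execution",       # OGC API - Processes
}

def route_for(resource_type: str, name: str) -> str:
    """Return the (illustrative) endpoint path used to expose a resource."""
    template = OGC_API_ROUTES[resource_type]
    return template.format(collection=name, process=name)
```

A frontend built this way can treat gauges, rainfall rasters, and hydrological operators uniformly, differing only in which route template it fills in.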

The workflow established an interface for hourly data retrieval and preprocessing for the warning system, enabling the system to automatically update with the latest observation data, ensuring the timeliness of the alert system. It sets up a flood simulation workflow that implements several hydrological models based on a foundational geographic computing library and simulates floods by coupling these models together. This process enabled tracking changes in surface water accumulation during rainfall events.

On this basis, the team developed comprehensive flood warning indicators. For the indicators, ponding analysis was conducted using depressions of different scales. The analysis considered that during rainfall events, surface water first accumulates in local depressions and then gradually converges towards larger depressions across the entire area. This work allowed the simulation of the process of water accumulation from small to large depressions and calculated a weighted average of multi-scale ponding risks based on this process. Consequently, the system is capable of producing a comprehensive flood risk index for the entire area.

 

Figure 3 — Wuhan University’s Early Warning Flood Detection Model.
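The multi-scale ponding approach described above amounts to a weighted average of risk across depression scales, from small depressions that fill first to larger ones that accumulate later. The weights and risk values in this sketch are illustrative assumptions; the report does not specify the actual weighting scheme:

```python
def composite_flood_risk(ponding_risk: dict[float, float],
                         weights: dict[float, float]) -> float:
    """Weighted average of ponding risk across depression scales.

    ponding_risk maps a depression scale (e.g. contributing area in km^2)
    to a normalized risk value in [0, 1]; weights maps the same scales to
    their relative importance. Both the scales and the weighting scheme
    are assumptions for illustration."""
    total_w = sum(weights.values())
    return sum(ponding_risk[s] * weights[s] for s in ponding_risk) / total_w

# Small depressions fill first during a rainfall event, so near-term risk
# may weight them more heavily than the large, slowly filling depressions.
risk = composite_flood_risk(
    ponding_risk={1.0: 0.9, 10.0: 0.5, 100.0: 0.2},
    weights={1.0: 0.5, 10.0: 0.3, 100.0: 0.2},
)
```

Aggregating per-scale risks this way yields a single comprehensive flood risk index for the whole area, as the text describes.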

Hartis’ Early Warning Flood demonstrator highlights another path: building a workflow in ESRI’s software suite and publishing the final workflow as an OGC standards-conformant service. This work illustrates the use of regional, higher-resolution DEMs in automated workflows developed for flood modeling at a regional scale, which may be adapted and reused in new regions. To support reuse, the objective is to present end-users with an abstraction of the workflow that requires only a few essential inputs to run in a Python environment.

 

Figure 4 — Hartis’s early warning flood detection workflow.
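An end-user abstraction of the kind described above might look like a single Python entry point that hides the underlying workflow. The function, parameter names, and process id below are hypothetical; a real implementation would submit the inputs to the published service rather than echo a job description:

```python
from dataclasses import dataclass

@dataclass
class FloodModelInputs:
    """The few essential inputs an end-user supplies; everything else
    (projection handling, DEM conditioning, service publication) is
    assumed to be handled inside the wrapped workflow."""
    dem_path: str       # regional high-resolution DEM
    rainfall_mm: float  # design rainfall depth
    region_id: str      # study-area identifier

def run_flood_workflow(inputs: FloodModelInputs) -> dict:
    """Placeholder for invoking the OGC-conformant service; here it only
    assembles and returns a job description for illustration."""
    return {
        "process": "flood-early-warning",  # hypothetical process id
        "region": inputs.region_id,
        "inputs": {"dem": inputs.dem_path, "rainfall_mm": inputs.rainfall_mm},
        "status": "accepted",
    }
```

Keeping the public surface this small is what makes the workflow portable to new regions: only the DEM, rainfall scenario, and region change.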

This approach was paralleled by Safe Software. Their weather event workflow component took existing publicly available weather service data streams, such as those from the National Weather Service, and enriched them with the context of local hazard information to increase the resolution and relevance of warning products for local residents and responders. This experimental application was implemented for Northern Manhattan in New York City as part of Safe Software’s contribution to the NYC GISMO-led Extreme Heat project. It could also be readily deployed for other urban settings given the appropriate foundational data layers and services.

For example, temperature observations and forecasts were combined with urban heat index information so that areas more affected by urban heat island effects could receive localized warnings that take these temperature stresses into account. This workflow was authored on the FME platform. It accepted GeoRSS requests from client applications, consumed weather observations from the National Weather Service API at api.weather.gov, and queried the Urban Heat Index ARD component via the OGC API Features interface for locations of interest. Alert notifications were transmitted back to the GeoRSS client and were also available as OASIS Common Alerting Protocol (CAP) XML or HTML messages.

 

Figure 5 — Weather Event Service & ARD workflow to support urban heat scenarios.

Weather Event Service Workflow:

  1. Accept client requests by location

  2. Query NWS API to get area observations for requested location.

  3. Query CDRP Urban Heat ARD service via OGC Features API for requested location

  4. Combine NWS observation with urban heat index from ARD service to determine projected localized heat level for requested location.

  5. Apply heat threshold DRI rules to determine appropriate warning level for requested location.

  6. Transmit appropriate warning via GeoRSS feed response, HTML or CAP XML message.
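Steps 4 and 5 of the workflow above can be sketched as two small functions: one combining the NWS observation with the urban heat index, and one applying threshold rules to pick a warning level. Treating the index as a degrees-Fahrenheit offset and the specific thresholds are illustrative assumptions, not the DRI rules actually used in the pilot:

```python
def localized_heat_level(nws_temp_f: float, urban_heat_index: float) -> float:
    """Adjust the area forecast by a local urban-heat-island offset.
    Treating the index as a Fahrenheit offset is an assumption."""
    return nws_temp_f + urban_heat_index

def warning_level(local_temp_f: float) -> str:
    """Apply simple DRI-style threshold rules; thresholds are illustrative."""
    if local_temp_f >= 105.0:
        return "extreme-heat-warning"
    if local_temp_f >= 95.0:
        return "heat-advisory"
    return "no-alert"

# A block more exposed to the heat island effect gets a stronger alert
# than the citywide forecast alone would suggest: 92 F + 4.5 -> advisory.
level = warning_level(localized_heat_level(nws_temp_f=92.0, urban_heat_index=4.5))
```

The resulting level would then drive step 6, selecting the GeoRSS, HTML, or CAP XML message to transmit.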

This localized urban weather notification tool added to the work Safe Software has contributed in previous OGC disaster and climate pilots related to heat, drought and flood modeling. Safe Software’s ARD to DRI contribution and component provides further details as to how data from this weather event service feeds decision indicators and notifications.

GeoLabs’ standardized approach to identifying potential landslide-affected areas, utilizing satellite imagery from Sentinel-1 and Sentinel-2 in GeoDataCube format, provides another example. They use sensor data on rainfall and displacement to predict future displacements, and Sentinel-1 and Sentinel-2 imagery to build a deep-learning model that predicts landslide-damage risk zones. The output is a thematic map of areas potentially affected by landslides. Utilizing data cubes makes data management easier and further aids analysis.

Their work provides a regional implementation in Taiwan, which is highly prone to earthquakes due to its location at the intersection of three tectonic plates. The area is part of the “Ring of Fire,” the most seismically active zone in the world. The region’s highly mountainous landscape magnifies ground shaking, leading to landslides.

The proposed prototype is a server implementation of a standardized framework built on the Common Workflow Language (CWL) to support workflows for landslide susceptibility mapping and for analyzing the corresponding Decision Ready Indicators. The workflows comprise the following primary components supporting OGC API Processes — Part 2: Deploy, Replace, Undeploy.

 

Figure 6 — Schematic view of a landslide event.
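Under OGC API Processes Part 2, deployment works by POSTing an application package (here, a CWL description) to the `/processes` endpoint; replace and undeploy map to PUT and DELETE on `/processes/{id}`. The sketch below assembles such a deployment request; the media type choice and the stub CWL content are assumptions for illustration, and the real package would reference the Sentinel-1/2 inference steps:

```python
def build_deploy_request(cwl_text: str) -> tuple[dict, bytes]:
    """Build the headers and body used to deploy a CWL application package
    via OGC API Processes Part 2 (POST {api-root}/processes)."""
    headers = {"Content-Type": "application/cwl+yaml"}
    return headers, cwl_text.encode("utf-8")

# Minimal placeholder CWL for a landslide-susceptibility workflow.
CWL_STUB = """\
cwlVersion: v1.2
class: CommandLineTool
id: landslide-susceptibility
baseCommand: echo
"""

headers, body = build_deploy_request(CWL_STUB)
```

Once deployed this way, the process becomes visible in the server's `/processes` list and can be executed, replaced, or undeployed through the same standard interface.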

Processes for landslide detection and prediction are implemented in an inference-as-a-service framework that passes inputs to the model to generate landslide detection maps and rainfall-displacement predictions.

 

Figure 7 — A conceptual workflow (left) and an implementation (right), showing the processes description for the services required to be developed for this landslide modeling task.

Two weather event modeling workflows consolidate the maturation of this toolset. WiTech GmbH developed a prototype capable of simulating carbon emissions from existing 3D building stock datasets using future weather profiles in real time on the web, leveraging the SimStadt simulation engine integrated through OGC API Processes and taking OGC API 3D GeoVolumes as the input data format.

On opening the web application, the client fetches default OGC API – 3D GeoVolumes endpoints, displaying available collections and their extents. Users can add custom 3D GeoVolumes endpoints. On selecting a 3D GeoVolume, the client retrieves the associated 3D city model from the server, centering the map on its spatial extent. Users can inspect building information in the 3D Tiles batch table. They can then initiate a simulation process for registered city models, defining process parameters and styling preferences. Configuration parameters are submitted via an HTTP POST request to the OGC API – Processes endpoint, where the process is executed server-side and the results are returned to the client.
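The client sequence just described reduces to a handful of standard requests. The helper functions below build the URLs involved; the host and process id are placeholders, and the exact 3D GeoVolumes tileset path is an assumption based on the draft specification's conventions:

```python
API_ROOT = "https://example.org/ogcapi"  # placeholder host

def geovolumes_collections_url() -> str:
    """GET: list available 3D GeoVolumes collections and their extents."""
    return f"{API_ROOT}/collections"

def geovolume_3dtiles_url(collection_id: str) -> str:
    """GET: retrieve the 3D Tiles tileset for a selected city model
    (path layout assumed from the 3D GeoVolumes draft conventions)."""
    return f"{API_ROOT}/collections/{collection_id}/3dtiles/tileset.json"

def simulation_execute_url(process_id: str = "simstadt-carbon") -> str:
    """POST target for the server-side simulation; process id is a placeholder."""
    return f"{API_ROOT}/processes/{process_id}/execution"
```

Because each step is an ordinary OGC API request, any standards-aware client, not just the pilot's web application, can drive the same workflow.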

The overall benefit of this central standard interface is that it can help streamline multiple data operations into one online, end-to-end web client, which will ultimately allow even non-simulation experts, such as architects or urban designers, to make faster and better decisions about the effects of climate change on the carbon emissions of building stocks, including under future climatic scenarios.