Data-driven analysis and counterfactual frameworks for forward-looking risk assessment

Data are pivotal to understanding critical changes within and around Cities and organizations. Although the Covid-19 pandemic helped accelerate the journey towards a more structured use of available information for decision-making, much more must be done to build truly data-driven, resilient Cities.

A specific domain where data can be further exploited is risk assessment. Traditionally, risk is calculated as the combination of hazard, exposure, and vulnerability. This framework has powered disaster risk management for decades, and it is still applied in risk transfers and insurance-related matters. But these traditional risk assessment tools are reaching their limits in the vulnerable world we live in.
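The traditional formulation can be sketched in a few lines. Everything here is illustrative: the function name, the flood return period, and the dollar figures are assumptions for the sake of the example, not values from the article.

```python
# Classical risk formulation: risk as the combination of hazard,
# exposure, and vulnerability. All figures below are hypothetical.

def expected_loss(hazard_prob, exposure_value, vulnerability):
    """Annual expected loss for one asset class.

    hazard_prob    -- annual probability the hazard occurs (0..1)
    exposure_value -- value of the assets exposed to the hazard
    vulnerability  -- fraction of exposed value lost if it occurs (0..1)
    """
    return hazard_prob * exposure_value * vulnerability

# Hypothetical portfolio: a flood with a 1-in-50-year return period,
# $200M of exposed housing stock, 15% average damage ratio.
loss = expected_loss(1 / 50, 200e6, 0.15)
print(f"Annual expected loss: ${loss:,.0f}")  # $600,000
```

The multiplication makes the framework's limits concrete: each factor is treated as a fixed input, so a static calculation like this cannot capture a hazard rate that drifts over time or exposure that grows as the city develops.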

“We have amazing tools to understand and manage risk, but these tools need a paradigm shift. These include accounting for climate change, rapid urbanization, long-horizon planning, including ‘Black Swan’ events,” said Dr. David Lallemant, Head of the Disaster Analytics for Society Lab at Nanyang Technological University, during a recent event hosted by Resilient Cities Network.

Cities need to consider a longer timeframe when making decisions, as today’s major climate investments will shape urban development for the next 50+ years. And this is where data come into play. Data-driven dynamic models can provide smarter, forward-looking risk assessment because they work with time-varying hazard rates, urban development simulations, and time-dependent fragility models.
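A minimal sketch can show why the 50-year horizon matters. The numbers below are assumptions chosen for illustration: a "100-year" flood (1% annual probability) and a 2% yearly growth in that probability as a stand-in for climate-driven non-stationarity.

```python
# Compare the chance of seeing at least one "100-year" event over a
# planning horizon under a stationary vs. a time-varying hazard rate.

def prob_at_least_one(p0, years, annual_growth=0.0):
    """Probability of >= 1 event over `years`, with the annual event
    probability starting at p0 and growing by `annual_growth` per year."""
    p_none = 1.0
    for t in range(years):
        p_t = min(p0 * (1 + annual_growth) ** t, 1.0)  # cap at certainty
        p_none *= 1.0 - p_t
    return 1.0 - p_none

horizon = 50
stationary = prob_at_least_one(0.01, horizon)
nonstationary = prob_at_least_one(0.01, horizon, annual_growth=0.02)
print(f"Stationary hazard over {horizon} yrs:   {stationary:.1%}")
print(f"Hazard growing 2%/yr over {horizon} yrs: {nonstationary:.1%}")
```

Even this toy model shows the stationary assumption understating the long-horizon risk by a wide margin, which is the core argument for time-varying hazard rates in dynamic models.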

Moreover, Dr. Lallemant highlighted that current risk management and emergency preparedness programs are shaped mostly by past events: learning from earthquakes, floods, or extreme weather phenomena to estimate how likely they are to recur, and how to prepare for and mitigate their impact. But past experience and existing data may not be enough in a context of extreme uncertainty: Cities need a different risk assessment approach to tackle low-probability and wholly unexpected events.

A recent research paper proposes an innovative framework for incorporating downward counterfactual thought into disaster risk modeling for both natural and human-induced hazards. In a nutshell, once an event has been identified, its historical data and parameters are collected and analyzed to identify the small changes that could have produced downward (i.e., worse) consequences. The framework was tested on five case studies in Singapore, including the 1991 Mt. Pinatubo eruption, with its extensive dispersal of volcanic ash over the airspaces of Southeast Asia, and tropical cyclone Vamei, which hit the region in 2001.
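The perturb-and-evaluate idea can be sketched as follows. This is a hedged toy version, loosely inspired by the framework: the cyclone-like event, the consequence model, and every parameter value are hypothetical placeholders, not data from the Singapore case studies.

```python
# Downward counterfactual search: start from a reconstructed historical
# event, apply small parameter perturbations, and keep the variants whose
# modeled consequences are worse than what actually happened.
import itertools

def modeled_loss(landfall_offset_km, intensity_factor):
    """Toy consequence model: loss grows as the event tracks closer to
    the city (offset -> 0) and as its intensity rises."""
    proximity = max(0.0, 1.0 - abs(landfall_offset_km) / 100.0)
    return 1e8 * proximity * intensity_factor

# Historical baseline: e.g. a cyclone that made landfall 60 km away.
baseline = modeled_loss(landfall_offset_km=60, intensity_factor=1.0)

# Small, physically plausible perturbations of the historical parameters.
offsets = [0, 20, 40, 60]           # km from the city
intensities = [0.9, 1.0, 1.1, 1.2]  # relative to the observed event

downward = [
    (off, inten, loss)
    for off, inten in itertools.product(offsets, intensities)
    if (loss := modeled_loss(off, inten)) > baseline
]
worst = max(downward, key=lambda s: s[2])
print(f"Baseline loss: ${baseline:,.0f}")
print(f"Worst downward counterfactual: offset={worst[0]} km, "
      f"intensity x{worst[1]}, loss ${worst[2]:,.0f}")
```

Ranking the retained scenarios by modeled loss is what lets planners single out the plausible high-consequence variants worth preparing for, even though none of them actually occurred.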

By recreating those past events and exploring the many alternatives that might have unfolded, the researchers defined plausible high-consequence downward counterfactual scenarios and identified priorities for disaster preparedness, while also flagging areas where data are currently unavailable or scarce and further investigation is recommended.

The idea behind these studies is that we should not wait to experience a disaster before learning how to build more resilient Cities. Data-driven dynamic models, used in concert with counterfactual frameworks, can enable communities to perform better risk assessment and make better decisions for their collective good.
