Quantifying Risk Before Disasters Occur: Hazard Information for Probabilistic Risk Assessment

03 November 2014

By Manuela Di Mauro, Risk Knowledge Section, United Nations Office for Disaster Risk Reduction (UNISDR)1


Risk is a forward-looking concept: it concerns events that may yet occur. Therefore, assessing risk means looking at the possible events that could occur, quantifying how likely they are to happen and appraising the potential consequences should they occur.

Assessing risk based only on past events does not provide complete information on the current state of the risk, for several reasons:

  • Records of past events cover a limited span of time and therefore might not include infrequent but severe hazards that are present but simply have not occurred within the period covered by the catalogue;
  • Observed events do not reflect the full distribution of possible future events; no two events are exactly the same, so basing the risk assessment only on past events might not adequately anticipate future events of greater magnitude, of different durations, in different locations, etc.; and
  • Records of past events usually do not provide complete temporal and spatial information about the event, nor detailed records of the consequences, especially those linked with the local severity of the hazard.

It is important to use an approach that builds on past records but also takes into account events that may occur in the future even though they do not appear in catalogues or loss databases. Such an approach allows better coverage of the possible events and provides an improved estimation of the probability of occurrence of each event and of the associated losses. Decision-makers use probabilistic risk assessment to understand which events and losses can possibly occur, as well as their likelihood and frequency of occurrence.

Although specific applications are strongly dependent on the scale of the assessment, probabilistic risk assessment is generally used for:

  • Designing risk reduction interventions, using probabilistic information on hazard intensities, exposure and vulnerability;
  • Disaster risk reduction financing and budgeting; and
  • Cost/benefit analysis, comparing the cost of specific interventions with the reduction of losses following the implementation of these interventions.

Hazard

In a probabilistic risk assessment, the hazard is usually represented through a stochastically generated set of all the events that could possibly occur, each associated with a frequency of occurrence. In this way the model is able to statistically represent the probability of events that have not yet occurred at a given location.
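As an illustration, the following sketch (in Python; the distributions, parameters and variable names are hypothetical, not drawn from any particular model) represents such an event set as a collection of synthetic events, each pairing a magnitude with an annual frequency of occurrence:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Hypothetical stochastic event set: 10,000 synthetic events, each with a
# magnitude drawn from an assumed extreme-value (Gumbel) distribution and
# an annual frequency of occurrence. All values are illustrative only.
N_EVENTS = 10_000
magnitudes = rng.gumbel(loc=5.0, scale=1.0, size=N_EVENTS)

# Make larger events rarer: frequency decays exponentially with magnitude.
# The decay rate and the total of ~1 event per year are assumptions.
weights = np.exp(-(magnitudes - magnitudes.min()))
frequencies = weights / weights.sum()

event_set = list(zip(magnitudes, frequencies))  # [(magnitude, f_i), ...]
```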

From the set of hazard events built for the probabilistic risk assessment it is possible to reconstruct the hazard curve2, which relates a value of intensity to the probability of exceeding that value. This curve is necessary for developing local risk reduction strategies: building resilient infrastructure (roads, bridges, etc.), land use planning (identification of low-risk areas for development), defining appropriate building codes, and so on. However, these applications require a description of the hazard at good spatial resolution, subject to the quality of the input and the scale of the analysis. For example, to design an earthquake-resistant building, the magnitude of a possible earthquake at its epicentre would not suffice; it is necessary to describe the propagation of the seismic wave and the actual “ground shaking” that would affect a structure. Similarly, to design a bridge, a hazard curve describing probable rainfall at a particular point within a catchment is not sufficient; a reconstruction of how this rainfall translates into runoff, and then into the river flow that propagates through different parts of the domain, is required.


Damaged buildings after an earthquake in Sichuan Province, China in 2008 /  © Wu Zhiyi / World Bank

Designing such interventions and infrastructure also requires knowing the intensity of the hazard to be used as a reference. For example, when building infrastructure in a flood-prone area, one might ask: How wide should the bridge be? How much drainage will the road need? How far from the river should the school be situated? As different possible events correspond to different values of flood depth, we need to know the expected value of the flood depth at each point of the domain, as well as the likelihood of this value being surpassed.

In other words, designing risk reduction interventions, as well as risk-proofing infrastructure, requires knowledge of the hazard’s spatial variability, in the form of the probability of exceeding a certain value of intensity at each point, or at specific points, of the domain. This information is needed at a resolution that varies between a few centimetres and hundreds of metres, depending on the requirements of the application. This knowledge can only be achieved by reconstructing – modelling – the spatial variability of the hazard for a stochastically generated set of events.

The hazard curve at each point of the studied (modelled) domain can be built by applying this stochastically generated set of events to the domain, with each event linked to its probability of occurrence. As the events are considered independent and mutually exclusive, the resulting probability of exceeding an intensity a can be calculated as:

$$ p(a) = \sum_{i=1}^{N} P(A > a \mid e_i) \cdot f(e_i) \qquad (1) $$

Where: $p(a)$ is the probability of exceeding an intensity $a$; $P(A > a \mid e_i)$ is the probability of exceeding the intensity $a$ given the occurrence of the event $e_i$; $f(e_i)$ is the annual frequency of occurrence of the event $e_i$; and $N$ is the total number of events. Equation (1) implies that, for each event, the intensity at a point is expressed as a probability distribution; in this way, the uncertainty in the estimation of each event $e_i$ is integrated into the hazard curve. If only one value of intensity is available for the event $e_i$ (i.e. $P(A > a \mid e_i) = 1$) and only one modelled event exceeds the intensity $a$, then the intensity exceedance rate $p(a)$ equals the annual frequency of occurrence of that event.
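A minimal numerical sketch of Equation (1), assuming, purely for illustration, that each event’s intensity at the point of interest follows a lognormal distribution; the three-event set and all parameters below are hypothetical:

```python
import numpy as np
from scipy.stats import lognorm

def hazard_curve(a_values, events):
    """Intensity exceedance rate p(a) as in Equation (1).

    events: iterable of (median_intensity, log_std, annual_frequency)
    tuples, one per stochastic event; the lognormal form of
    P(A > a | e_i) is an assumption made for this sketch.
    """
    p = np.zeros_like(a_values, dtype=float)
    for median, log_std, freq in events:
        # P(A > a | e_i) is the survival function of the event's
        # intensity distribution, weighted by its annual frequency f(e_i).
        p += lognorm.sf(a_values, s=log_std, scale=median) * freq
    return p

# Illustrative event set: (median intensity, log-std, annual frequency)
events = [(0.1, 0.5, 0.20), (0.3, 0.5, 0.05), (0.6, 0.5, 0.01)]
a = np.linspace(0.01, 1.0, 100)
p_exceed = hazard_curve(a, events)  # one point of the hazard curve per a
```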

The inverse of the annual frequency of the event is the “return period.” The return period should be regarded as the inverse of the annual frequency of occurrence, not as a recurrence interval: a return period of 250 years does not correspond to an event that will occur exactly every 250 years, but to an event that has a 0.4% chance of occurring in any given year.
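As a worked illustration of this interpretation (assuming independence between years), the probability of experiencing at least one event of return period $T$ over an $n$-year horizon is:

$$ P(\text{at least one event in } n \text{ years}) = 1 - \left(1 - \frac{1}{T}\right)^{n}, \qquad \text{e.g. } 1 - \left(1 - \frac{1}{250}\right)^{50} \approx 18\%. $$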

However, the hazard assessment alone is not sufficient to appraise the risk and design risk reduction interventions. Such interventions, including designing structures to withstand hazards or regulating land use, can be costly. To appraise the direct benefits of risk reduction, for instance in terms of return on investment, it is fundamental to quantify the likely losses if the interventions are not implemented or if structures are under-designed, and to compare these losses with those that could occur should the intervention be implemented. For this, a probabilistic hazard assessment needs to be paired with a full assessment of the risk, which includes taking into account the impact of the hazard on the exposed elements.

Exposure and vulnerability

To assess the impact of the hazard, the first step is to analyze and reconstruct the environment that can be affected. In general, exposure data identify the different types of physical entities that are on the ground, including built assets, infrastructure, agricultural land and people. The characteristics to be assessed depend on the scope of the analysis. If the risk is assessed in terms of losses in the built environment, structural types and construction characteristics are needed. If the risk assessment includes damage to agricultural land, the types of crops and their seasonality have to be considered. An analysis of mortality risk will require demographic and socioeconomic characteristics of the population.

The exposure data should contain the physical location of each asset as well as the characteristics of the asset that influence its vulnerability and enable the assessment of the damage or loss to it. These characteristics typically include (a hypothetical record structure is sketched after this list):

  • geographical location of each exposed element,
  • structural characteristics,
  • replacement values,
  • human occupation/population density/number of people in each location, and
  • socio-economic characteristics of the population at each location.
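A hypothetical exposure record might be structured as follows; every field name and value here is an illustrative assumption rather than a prescribed schema:

```python
from dataclasses import dataclass

@dataclass
class ExposedAsset:
    """Hypothetical exposure record; field names are illustrative only."""
    latitude: float
    longitude: float
    structural_class: str      # e.g. a building-taxonomy code
    replacement_value: float   # in local currency
    occupants_day: int         # people present during the day
    occupants_night: int       # people present at night
    income_level: str          # socio-economic indicator

# Example record for a single exposed element
school = ExposedAsset(latitude=-0.18, longitude=-78.47,
                      structural_class="unreinforced_masonry",
                      replacement_value=2.5e6,
                      occupants_day=400, occupants_night=5,
                      income_level="medium")
```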
      


The exposed elements are usually classified based on their typologies, for example by building taxonomy, by age group, etc. This classification is used to assign a vulnerability to each exposed element.

Once the physical characteristics of each exposed element are defined, it is possible to establish the likely damage, and subsequently the losses, to that element when subjected to a specific hazard. This is done by defining relationships between a measurement parameter of the hazard – water depth in the case of flooding or spectral acceleration in the case of earthquakes – and the likely damage to the particular element or type of elements. The damage may be expressed as a percentage of, or in terms of, the replacement value. The name given to these relationships – between hazard and loss – varies from one field to another. In earthquake engineering they are often called “vulnerability functions”; in flood and dam engineering they are often referred to as “fragility curves”; other publications use “damage functions”. For each hazard and each element typology, one vulnerability function is defined. For very detailed analyses, and for exposed elements that might not fall into a general class – such as a dam – a bespoke vulnerability curve can be developed. Each point of the curve links a characteristic of the hazard – its intensity – to the likely loss in terms of mean and variance, representing the probability distribution of the losses that are likely to occur following a hazard event of that intensity.
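As a sketch of what such a relationship might look like, the hypothetical flood vulnerability function below returns the mean and variance of the damage ratio (loss as a fraction of replacement value) as a function of water depth; the logistic shape and its coefficients are assumptions chosen for illustration, not a published curve:

```python
import math

def flood_vulnerability(depth_m: float) -> tuple[float, float]:
    """Hypothetical vulnerability function for one building class.

    Maps flood depth (metres) to the (mean, variance) of the damage
    ratio. The logistic form and its coefficients are illustrative.
    """
    mean = 1.0 / (1.0 + math.exp(-2.0 * (depth_m - 1.5)))  # saturates toward 1
    variance = 0.25 * mean * (1.0 - mean)                  # widest at mid-damage
    return mean, variance

mean_dr, var_dr = flood_vulnerability(depth_m=1.0)  # ~0.27 mean damage ratio
```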
  


Probability distribution of losses for one hazard event

Risk

Once the hazard, the exposure and the vulnerabilities of the exposed elements are defined, it is possible to calculate the losses related to each of the possible events. A probability distribution of the hazard intensity for given return periods can be associated with each point of the domain. As each point of the vulnerability curve is itself a probability distribution, a distinct probability distribution of damages is calculated at each point, for each event and for each exposed element.

Therefore, at each point of the space, for each modelled event, and for each exposed element (or class of elements), we obtain a probability distribution of losses. For each value of loss x, the area underneath the probability density curve beyond x represents the probability of exceeding that value, P(X > x).

The combination of all these distributions, over all the building classes and all the points of the exposure database, produces the probability distribution of losses in the country. This distribution is called a “loss exceedance curve.” The curve usually constitutes the key output of a fully probabilistic risk assessment.

Each point of the curve is not associated with a specific event, but rather represents the probability of experiencing a loss equal to or higher than x in any given year (the “exceedance rate”). Similarly to the hazard curve, as the events are considered independent and mutually exclusive, the resulting probability of exceeding a loss x (constituting one point of the loss exceedance curve) can be calculated as:

$$ r(x) = \sum_{i=1}^{N} R(X > x \mid e_i) \cdot f(e_i) \qquad (2) $$

Where: $r(x)$ is the probability of exceeding a loss $x$; $R(X > x \mid e_i)$ is the probability of exceeding the loss $x$ given the occurrence of the event $e_i$; $f(e_i)$ is the annual frequency of occurrence of the event $e_i$; and $N$ is the total number of events.
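Mirroring the hazard-curve sketch above, a minimal numerical version of this calculation, again assuming, for illustration only, that each event’s losses follow a lognormal distribution:

```python
import numpy as np
from scipy.stats import lognorm

def loss_exceedance_curve(x_values, events):
    """Loss exceedance rate r(x): the sum over events of
    R(X > x | e_i) * f(e_i), with lognormal event losses assumed."""
    r = np.zeros_like(x_values, dtype=float)
    for median_loss, log_std, freq in events:
        r += lognorm.sf(x_values, s=log_std, scale=median_loss) * freq
    return r

# Illustrative event losses: (median loss in $ millions, log-std, frequency)
loss_events = [(10.0, 0.8, 0.20), (80.0, 0.8, 0.05), (400.0, 0.8, 0.01)]
x = np.logspace(0, 3, 200)  # loss thresholds from $1M to $1,000M
r_exceed = loss_exceedance_curve(x, loss_events)
```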
 


Example of loss exceedance curve

The integral of the loss exceedance curve – the area underneath the curve – is the Annual Average Loss (AAL), which represents the expected losses in any given year, averaged over a long period of time. For example, if the losses are expressed as a monetary value in terms of the replacement cost of urban buildings, this result provides a picture of the extent of monetary losses the country is likely to face, on average, in one year.
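Continuing the previous sketch, the AAL can be approximated by numerically integrating the loss exceedance curve; because the integral of the exceedance function of a non-negative variable equals its expected value, the same quantity can also be computed event by event as the sum of each event’s expected loss times its annual frequency:

```python
import numpy as np
from scipy.stats import lognorm

# Area under the loss exceedance curve r(x) of the previous sketch,
# approximated with the trapezoidal rule over the sampled loss range.
aal = np.trapz(r_exceed, x)

# Equivalent event-based form: AAL = sum_i E[loss | e_i] * f(e_i)
aal_check = sum(lognorm.mean(s, scale=m) * f for m, s, f in loss_events)
```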

Each point of the curve corresponds to what is usually called the “Probable Maximum Loss”: the maximum loss that could be experienced given the occurrence of a disaster with a particular return period.

Although the probable maximum loss is not related to a single event, this metric can be used as a proxy to assess losses should the design return period be exceeded. Thus, it can provide a strong argument in cost/benefit analyses for specific return periods. In general, the probable maximum loss relates levels of loss to their return periods and is therefore used to inform how the different levels of risk can be addressed. Risks with high to medium probability of losses occurring can be addressed through interventions such as prospective and corrective risk management measures – that is, codes and norms. Risks of high losses with low probability may be addressed through risk transfer mechanisms. Risks of very high losses with very low probability of occurrence are “residual” risks that decision-makers may not be able to address or transfer. The decision on where to set the level of this “residual risk” can be economic but also political, resulting in what is sometimes called “acceptable risk”.
  


Example of probable maximum loss


Recommendations concerning hazard data for probabilistic risk assessment

To design a good risk assessment, it is necessary first to ask the right questions – that is, to define the specific scope of the risk assessment. This will inform the choice of the most appropriate resolution and scale of the analysis. These factors also depend on the time, resources and type/resolution of the data available for the analysis. The choice of the hazards to include may depend on the context of the specific assessment – the questions asked – but also on the resources available. Where resources are limited, a pre-assessment of the risks in the area is necessary to prioritize which hazards should be included in the analysis.

Carrying out a probabilistic risk assessment requires a considerable amount of input data on hazards, exposure and vulnerability. Usually, hazard information from meteorological offices is used as input for the models needed to reconstruct the intensity of the hazard, its spatial variability and its probability. Although the data requirements strongly depend on the scope and scale of the analysis, some general recommendations can be made:

  1. Guidelines and standards for probabilistic hazard and risk assessments
     
    Risk assessment is one of the key indicators of progress for the Hyogo Framework for Action. However, there are no general guidelines for assessing the quality of a probabilistic hazard and risk assessment, nor for identifying the minimum requirements for such an assessment. Without such information, resources may be spent producing sub-standard or uninformative risk assessments. Developing such guidelines would require extensive consultation with various institutions.
     
  2. Baseline data produced, updated and made available for hazard modelling
     
    Baseline data, such as topographic, land cover or bathymetric data have to be systematically produced and updated, with different spatial resolutions and information on their accuracy, and made available for hazard and risk modelling.
     
  3. Time series of hydro-meteorological data systematically collected and stored, following standardized, quality-controlled formats
     
    Time series of hydro-meteorological data (e.g. rainfall, flow discharges, wind gusts) should be systematically and continuously collected, and should cover a temporal span sufficient for the analysis. For different time series to be usable together, it is important to ensure coherence in the way the data are collected and measured: modellers require data collected in consistent formats and with consistent methods.
     
    Such data should be collected providing an appropriate spatial coverage to enable the modellers to produce a usable description of the hazard. Parameters include the magnitude, timing, location and the duration of each hazard or extreme event.
     
  4. Data quality, resolution and uncertainty provided together with the datasets
     
    The input data for risk modelling should also be provided with information on their quality. If this information is lacking or cannot be assessed, it is difficult to evaluate the uncertainty related to the input data and hence to calculate the propagation of this uncertainty into the output.
     
  5. In case of flooding, conduct post-event surveys to record water depths, and possibly velocities, at different points within the affected areas
     
    Vulnerability curves are mostly based on laboratory experiments and subsequently validated with real data. Recorded flood depths and velocities at different points within the affected area are extremely important for validating hazard models, but also for developing vulnerability curves when coupled with the damages/losses at the same points and the physical characteristics of the damaged elements.

Other issues

Additional issues regarding exposure and vulnerability data include:

  1. Exposure data should be systematically collected and updated
     
    Geo-referenced demographic and socio-economic data – population, age classes, income levels, etc. – are generally collected through censuses. Building censuses, including structural characteristics of buildings and infrastructure, are less common. Exposure data should include geo-referenced locations of buildings and infrastructure, structural characteristics, replacement values, or characteristics relevant for reconstruction such as building use. These data are fundamental for quantifying losses and prioritizing interventions. Given the potential sensitivity of such data, they may be collected by government agencies to be made available to risk modellers.
      
  2. Vulnerability curves should include uncertainty levels
     
    Model results are sensitive to the vulnerability curves adopted and their uncertainty. These curves often embed a high level of uncertainty. For instance, they may depend on the construction techniques used in the area under analysis and, therefore, be area-specific. Structural characteristics of exposed elements are also complex to assess, requiring detailed information on design, building codes, construction techniques, etc. that might be imprecise, unavailable or not assessable. Thus, the level of uncertainty in the vulnerability curve should be appropriately represented.
     
  3. Forums could be created to share and validate vulnerability functions
     
    Physical vulnerability data are often not available. It is important that practitioners share such curves and jointly contribute to their improvement and validation. A forum is a mechanism for sharing information and improving the knowledge-base.
     
  4. Further research should be devoted to developing and validating vulnerability curves, especially regarding human vulnerability
     
    The characterization of human vulnerability is an open research question. While the physical vulnerability contributing to the consequences of a building collapse might be assessed relatively easily, factors such as the contribution of early warnings to the vulnerability levels of exposed populations can be more difficult to estimate, although they may greatly affect the mortality associated with some hazards.
      

A lone house remained standing after Hurricane Ike (2008) devastated Gilchrist and Galveston, Texas. Taking advantage of lessons learned from Hurricane Rita in 2005, the house was built on elevated ground and designed to withstand winds of up to 209 km per hour. / © Jocelyn Augustino / FEMA


Conclusion

Historical hazard data is essential for assessing risks of future losses. Yet, for many hazards such data has not been systematically collected, is catalogued in different formats, is inaccessible and lacks metadata. Recording the magnitude, location, duration and timing of each hazard or extreme event is a crucial component in the process of documenting and cataloguing damage and losses. Accumulated over time, these data provide a basis for calibration and validation of the hazard models needed for (ex ante) probabilistic risk assessment.

Disaster risk reduction has risen to the top of the international agenda. WMO and its members would make a valuable contribution to this agenda both internationally and in terms of reducing losses at country level by giving this important issue the attention it deserves.
  

References

Cardona, O. D., Ordaz, M. G., Yamin, L. E., Marulanda, M. C. and Barbat, A. H., 2008, Earthquake Loss Assessment for Integrated Disaster Risk Management, Journal of Earthquake Engineering, 12:1, 48–59

CEH, 1999, Flood Estimation Handbook, 5 volumes and associated software, Institute of Hydrology, Wallingford

Dickson, E., Baker, J., Hoornweg, D. and Tiwari, A., 2012, Urban Risk Assessment: understanding disasters and climate risk in cities. The World Bank

Kumamoto, H. and Henley, E. J., 1996, Probabilistic Risk Assessment and Management for Engineers and Scientists, IEEE Press, ISBN 0-7803-1004-7

Marulanda Fraume, M. C., 2013, Modelación probabilista de pérdidas económicas por sismo para la estimación de la vulnerabilidad fiscal del estado y la gestión financiera del riesgo soberano, PhD thesis, Universitat Politècnica de Catalunya

Manuele, F. A., 2010, Acceptable Risk, Professional Safety, v. 55, 5, 30-38

Prahl, B. F., Rybski, D., Kropp, J. P., Burghoff, O. and Held, H., 2012, Applying stochastic small-scale damage functions to German winter storms, Geophysical Research Letters, 39

Rossetto, T. and Elnashai, A., 2003, Derivation of vulnerability functions for European-type RC structures based on observational data, Engineering Structures, 25, 1241–1263

Vorogushyn, S., Merz, B. and Apel, H., 2009, Development of dike fragility curves for piping and micro-instability breach mechanism, Natural Hazards and Earth System Science, 9, 1383–1401

Uddin, N. and Ang, A. H. S. (eds.), 2012, Quantitative Risk Assessment (QRA) for Natural Hazards, American Society of Civil Engineers, CDRM Monograph No. 5

Yamin, L. E., Ghesquiere, F., Cardona, O. D. and Ordaz, M. G., 2013, Modelación probabilista para la gestión del riesgo de desastre: el caso de Bogotá, Colombia, Banco Mundial, Universidad de los Andes

UNISDR, 2009, Terminology on disaster risk reduction, Geneva, Switzerland.

UNISDR, 2011. Global Assessment Report on Disaster Risk Reduction: Revealing Risk, Redefining Development. United Nations International Strategy for Disaster Reduction. Geneva, Switzerland: UNISDR.

USGS, 1982, Guidelines for determining flood flow frequency, Bulletin 17B of the Hydrology Subcommittee, Interagency Advisory Committee on Water Data

1 Manuela Di Mauro was at UNISDR when this article was drafted, but has since resigned.

2 This curve can take different names depending on the hazard and the application, e.g. “flood” or “flow” frequency curves (CEH, 1999; USGS, 1982), “intensity exceedance” curves etc.
