Improving disaster response evaluations: Supporting advances in disaster risk management through the enhancement of response evaluation usefulness

Author

Summary, in English

Future disasters or crises are difficult to predict and therefore hard to prepare for. However, while a specific event might not have happened, it can be simulated in an exercise. The evaluation of performance during such an exercise can provide important information regarding the current state of preparedness, and can be used to improve the response to future events. For this to happen, evaluation products must be perceived as useful by the end user. Unfortunately, it appears that this is not the case. Evaluations and their products are rarely used to their full extent or, in extreme cases, are regarded as paper-pushing exercises.

The first part of this research characterises current evaluation practice, both in the scientific literature and in Dutch practice, based on a scoping study, document and content analyses, and expert judgements. The findings highlight that despite a recent increase in research attention, few studies focus on disaster management exercise evaluation. It is unclear whether current evaluations achieve their purpose, or how they contribute to disaster preparedness. Both theory and practice tend to view and present evaluations in isolation. This limited focus creates a fragmented field that lacks coherence and depth. Furthermore, most evaluation documentation fails to justify or discuss the rationale underlying the selected methods, or their link to the overall purpose or context of the exercise. The process of collecting and analysing contextual, evidence-based data, and using it to reach conclusions and make recommendations, lacks methodological transparency and rigour. Consequently, professionals lack reliable guidance when designing evaluations.

Therefore, the second part of this research aimed to gain insight into what makes evaluations useful, and to suggest improvements. In particular, it highlights the values associated with the methodology used to record and present evaluation outcomes to end users. The notion of an ‘evaluation description’ is introduced to support the identification of four components that are assumed to influence the usefulness of an evaluation: its purpose, object description, analysis and conclusion. Survey experiments identified that how these elements – notably, the analysis and/or conclusions – are documented significantly influences the usefulness of the product. Furthermore, different components are more useful depending on the purpose of the report (for learning or accountability). Crisis management professionals expect the analysis to go beyond the object of the evaluation and focus on the broader context. They expect a rigorous evaluation to provide them with evidence-based judgements that deliver actionable conclusions and support future learning.

Overall, this research shows that the design and execution of evaluations should provide systematic, rigorous, evidence-based and actionable outcomes. It suggests some ways to manage both the process and the products of an evaluation to improve its usefulness. Finally, it underlines that it is not the evaluation itself that leads to improvement, but its use. Evaluation should, therefore, be seen as a means to an end.

Publishing year

2021

Language

English

Document type

Dissertation

Publisher

Division of Risk Management and Societal Safety, Faculty of Engineering, Lund University

Topic

  • Other Civil Engineering
  • Other Social Sciences not elsewhere specified

Keywords

  • crisis
  • disaster
  • emergency
  • disaster risk management (DRM)
  • preparedness
  • exercise
  • simulation
  • response
  • performance
  • evaluation
  • usefulness
  • design
  • The Netherlands

Status

Published

Supervisor

ISBN/ISSN/Other

  • ISBN: 978-91-7895-922-8
  • ISBN: 978-91-7895-923-5

Defence date

3 September 2021

Defence time

10:15

Defence place

Lecture hall V:B, building V, John Ericssons väg 1, Faculty of Engineering LTH, Lund University, Lund. Zoom: https://lu-se.zoom.us/j/65999762322?pwd=dnJ0Q1pOdlVWdVk2MndEZjg1akpyUT09

Opponent

  • Björn Ivar Kruke (Assoc. Prof.)