
The Human Factor in Flood Warnings: A Failure of Imagination

In July 2021, devastating floods struck northern Europe, including Belgium, Germany, and the Netherlands. The floods caused over 10 billion Euros worth of damage, devastating properties and communities across large areas. Tragically, nearly 200 people lost their lives.

This is despite the flooding being well forecast by the European Flood Awareness System (EFAS), which provided warnings 3 to 4 days in advance, seemingly giving organisations and individuals enough time to prepare. Even if people could not keep their homes and businesses safe, they should have had time to keep themselves safe.

Professor Hannah Cloke of the University of Reading, who specialises in flooding, wrote an article for The Conversation following the flooding, examining why the warnings were not as effective as expected. Hannah was involved in setting up EFAS, so was well positioned to comment, and you might expect her to pass the buck: to say the science was right and it was not the forecasters' fault that the warnings were not heeded. But she doesn't.

Quote by Paul Virilio: When you invent the ship, you also invent the shipwreck; when you invent the plane you also invent the plane crash; and when you invent electricity, you invent electrocution... Every technology carries its own negativity, which is invented at the same time as technical progress.

The philosopher Paul Virilio wrote about technology: “When you invent the ship, you also invent the shipwreck…“. As scientists, when we create anything we need to imagine what could go wrong, and own that. It is not enough to put together an early warning system, however world-leading, accurate, and timely, if no one acts on it.

Six mountains, joined by bridges. Each is labelled, from left to right: observation (sensor technology), weather forecast (atmospheric modelling), hazard forecast (environmental modelling), impact forecast (socio-economic modelling), warning (communication science), decision (behavioural psychology).

Golding et al. (2019) described how early warning systems are made up of steps joined together in a chain. At each step, value is built up like a mountain; between steps, that value can be lost in the ‘valleys of death’. Bridges of communication, understanding, and knowledge transfer ensure that value is retained and passed forward. The value of an early warning system only emerges when people respond to it appropriately.

Hannah described in her article how the failure lay in the way that warnings were produced, disseminated, and interpreted. EFAS warnings are not available to the public; the system relies on public agencies to respond to them, which happened in some places but not others. Professor Linda Speight, University of Oxford, who also specialises in flooding, described the difficulty of issuing warnings with the right message, especially when working with numerous different groups and organisations: a one-size-fits-all approach does not make for effective warning.

Both Hannah and Linda conclude that flood warnings are only effective if people understand the potential impacts on them. Linda described the benefits of impact-based forecasting, for example: “river levels will rise rapidly causing widespread flooding. Damage to roads and property is expected”. Hannah summarised the job of a flood warning (and science more widely) as “helping people see the invisible”: helping people imagine those potential impacts in response to the warning so they are compelled to take action. To Hannah, this failure of imagination was the missing bridge in the early warning chain, between warning and decision, where that value, tragically and literally, fell into the valley of death.

How would you help people see the invisible?

This post originally appeared in the Imagination Engines newsletter. To read content like this a few weeks earlier, subscribe to the newsletter below.