Innovative simulation model set to transform flood planning and response
Researchers from the University of Melbourne have developed a new simulation model that can accurately predict flooding during an ongoing disaster about 1000 times more quickly than current methods, promising a game-changing tool for flood planning and response.
In a study published in Nature Water, the researchers say the new model has major potential benefits for emergency responses, reducing flood forecasting time from hours or days to just seconds and enabling flood behaviour to be accurately predicted as an emergency unfolds.
The Low-Fidelity, Spatial Analysis and Gaussian Process Learning (LSG) model was developed over two years by University of Melbourne PhD student Niels Fraehr, Professor Q. J. Wang, Dr Wenyan Wu and Professor Rory Nathan.
When put to the test, the LSG model was able to predict floods with 99% accuracy on the Chowilla floodplain in South Australia in 33 seconds, instead of 11 hours. On the Burnett River in Queensland, the model produced predictions in 27 seconds, instead of 36 hours.
Research lead Professor Q.J. Wang said that while high-resolution hydrodynamic models already produce accurate predictions, the research challenge was to find a way to produce the same level of accuracy, but a lot more quickly.
“The best available technology uses high-resolution hydrodynamic models. But this approach is too slow. Before this research, we also developed machine learning approaches that could produce faster results, but they were not always accurate when tested in new situations. We wanted to consistently achieve both accuracy and speed,” he said.
“High-resolution models are like having a really good camera. Coarse hydrodynamic models are like having a low-resolution camera that’s also much cheaper. We’ve been looking for a way to match the pictures coming from both cameras.
“We want to be able to use a cheap camera, but then translate grainy images into high-quality photos.”
Fraehr said the issue with using high-resolution models alone is the amount of data that needs to be processed to produce accurate predictions.
“Hydrodynamic models work by dividing the floodplain into subareas. And, the higher the resolution, the more subareas,” he said.
“To determine how water moves between subareas, you have to solve some pretty complex differential equations. And if you start having millions of subareas, with millions of these equations to solve, it takes a long time.
“We are trying to overcome this computational overload so we can run models much more quickly and get the same result.”
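To see why the computational load grows so quickly, consider a toy grid-based water-depth update. This is not the researchers' actual hydrodynamic solver, just an illustrative sketch: every timestep must visit every subarea, so doubling the grid resolution quadruples the number of cells to process (and in practice also forces smaller timesteps). All sizes and parameters below are made up for illustration.

```python
import numpy as np

def toy_flood_step(depth, dt=0.1, k=0.25):
    """One diffusion-style update of water depth on a 2D grid.

    A stand-in for a real hydrodynamic step: each update touches
    every cell, so cost scales directly with the number of subareas.
    """
    # Average inflow/outflow with the four neighbouring cells
    # (simple diffusion stencil, edge cells padded with their own value).
    padded = np.pad(depth, 1, mode="edge")
    neighbours = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                  padded[1:-1, :-2] + padded[1:-1, 2:])
    return depth + dt * k * (neighbours - 4 * depth)

# A "coarse" floodplain vs a 10x-finer one: 100x the cells per step.
coarse = np.zeros((100, 100))     # 10,000 subareas
fine = np.zeros((1000, 1000))     # 1,000,000 subareas
coarse[50, 50] = 1.0              # a single flooded cell
fine[500, 500] = 1.0

coarse = toy_flood_step(coarse)
fine = toy_flood_step(fine)
```

Even in this toy version, the fine grid does 100 times the work of the coarse one on every step, which is the overload the researchers describe.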
Fraehr said the LSG model uses mathematical transformations and a sophisticated statistical approach to rapidly take advantage of enormous amounts of data while using commonly available computing systems.
“When using the simpler model, or cheap camera as QJ says, we know it's going to be wrong, but we also know from our research that it’s going to be consistently wrong in the same way. That’s why we are able to match the two models,” he said.
“There is a lot of mathematics involved, all of it commonly used and proven methodologies; we just found a different way of combining these approaches.
“We use Empirical Orthogonal Function analysis, which is a way of reducing the dimensionality of a spatio-temporal data set to a few key features. Then we ask how those features of the simple model compare to the corresponding key features of the high-resolution model.
“We match them through a statistical model, called a Gaussian Process Model. It’s a very fast model used to match those two sets of features. This is how we are able to reconstruct the full inundation picture.”
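A minimal sketch of that pipeline, using synthetic arrays in place of real model runs: reduce the flood maps from both models to a few EOF features (here computed via SVD), fit a Gaussian Process mapping from coarse-model features to fine-model features, then reconstruct the full high-resolution inundation map from the predicted features. Every array size, kernel setting and variable name here is an illustrative assumption, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic paired runs: the same flood events simulated by a coarse
# model (few cells) and a high-resolution model (many cells).
# Rows = events, columns = flattened grid cells (inundation maps).
n_events, n_coarse, n_fine = 200, 50, 500
low_fi = rng.normal(size=(n_events, n_coarse))
high_fi = low_fi @ rng.normal(size=(n_coarse, n_fine))  # correlated by design

def eof_features(data, n_modes):
    """EOF analysis via SVD: project each map onto its leading spatial modes."""
    mean = data.mean(axis=0)
    _, _, modes = np.linalg.svd(data - mean, full_matrices=False)
    return (data - mean) @ modes[:n_modes].T, mean, modes[:n_modes]

x_train, _, _ = eof_features(low_fi, n_modes=5)            # coarse features
y_train, y_mean, y_modes = eof_features(high_fi, n_modes=5)  # fine features

def gp_predict(x_tr, y_tr, x_new, length=3.0, noise=1e-6):
    """Gaussian Process regression with an RBF kernel (zero-mean prior)."""
    def rbf(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / length**2)
    K = rbf(x_tr, x_tr) + noise * np.eye(len(x_tr))
    return rbf(x_new, x_tr) @ np.linalg.solve(K, y_tr)

# Run the cheap model on a new event, predict the fine-model features,
# then rebuild the full high-resolution inundation map from the EOF modes.
x_new = x_train[:1]
y_pred = gp_predict(x_train, y_train, x_new)
fine_map = y_pred @ y_modes + y_mean
```

The expensive step, solving the high-resolution model, only happens when generating training data; at prediction time the GP and the EOF reconstruction are small matrix operations, which is where the seconds-instead-of-hours speedup comes from.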
Professor Wang said a wide range of forecasting, planning and response activities will be able to take advantage of the innovation.
“Primarily, we see this work being used by flood forecasters. Up until now, the Bureau of Meteorology has focused mainly on river flows coming in during flooding, and what the water height at the river might be,” he said.
“But figuring out what areas of the town, for example, might get flooded, that’s the council’s job. The best they can do is look at historical events, or maps that have been generated before. And while some councils and emergency response agencies have set up good hydrodynamic models, they are still too slow.
“We designed this model so that emergency response agencies and water authorities can run weather scenarios to generate flood inundation results very quickly.”
Wang said the tool is also likely to be very helpful for decision-making, particularly around dam asset management.
“In an emergency situation, we want to be able to make decisions about dam water releases, and investigate many different water release strategies. Previously this was not possible, but now it is,” he said.
“We see this model being utilised in high-value areas. It could be very useful to set up models in places where we know we have bad floods, particularly in places where flood damage is very expensive.”
Fraehr said that while the LSG model is ready to start providing fast and accurate results, there is also room for the tool to be utilised in other innovative ways in the future.
“A lot of councils get high-resolution models developed to do pre-flood calculations, but these models cannot be run during flood events as they are too slow. With our approach, the model can provide information for the entire duration of an event – it’s continuous,” he said.
“At some point in the future, we could use this to produce flood predictions that work similarly to the Bureau’s radar weather maps. We would be able to see spatially which areas will be flooded within the next day or two. This would be very beneficial.
“That’s still in the making. But that’s the direction we hope to go in.”