Assessing the applicability of Transformer-based architectures as rainfall-runoff models

Master Thesis (2023)
Author(s)

K. Mao (TU Delft - Civil Engineering & Geosciences)

Contributor(s)

Riccardo Taormina – Mentor (TU Delft - Sanitary Engineering)

Markus Hrachowitz – Graduation committee member (TU Delft - Water Resources)

Jacopo De Stefani – Graduation committee member (TU Delft - Information and Communication Technology)

Anaïs Couasnon – Graduation committee member (Deltares)

Ruben Dahm – Graduation committee member (Deltares)

Jonathan Nuttall – Graduation committee member (Deltares)

Faculty
Civil Engineering & Geosciences
Copyright
© 2023 Kangmin Mao
Publication Year
2023
Language
English
Graduation Date
27-01-2023
Awarding Institution
Delft University of Technology
Programme
Civil Engineering
Related content

The GitHub repository for Transformer-based rainfall-runoff modeling:

https://github.com/Numpy-Panda/neuralhydrology_Transformer
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Modeling the relationship between rainfall and runoff is a longstanding challenge in hydrology and is crucial for informed water management decisions. Recently, deep learning models, particularly the Long Short-Term Memory (LSTM) network, have shown promising results in simulating this relationship. The Transformer, a more recently proposed deep learning architecture, has outperformed the LSTM on tasks such as machine translation and text classification. However, there has been limited research on applying Transformers to rainfall-runoff modeling.

This research examined the performance of the Transformer architecture, including its time series forecasting variants, as rainfall-runoff models using the CAMELS (US) data set. These models were compared to LSTM regional rainfall-runoff models, with a particular focus on snow-driven basins, as the Transformer's attention mechanism is believed to allow it to attend to earlier precipitation events in the meteorological forcing. Additionally, the Transformer's potential as a global rainfall-runoff model was tested using the global Caravan data set, to determine whether it could learn and generalize a wide range of rainfall-runoff behaviors and thus potentially be applied in ungauged basins.
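The attention mechanism referred to above can be illustrated with a minimal sketch: each time step in the forcing window computes similarity scores against all other time steps, so a query at the end of the window can place weight directly on an earlier precipitation event (e.g. snowfall months before melt). This is a generic NumPy illustration of scaled dot-product attention, not code from the thesis repository; the function name and the toy 365-day, 4-feature forcing window are illustrative assumptions.

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Scaled dot-product attention over a time series.

    q, k, v: arrays of shape (seq_len, d_model).
    Returns the attended values and the attention weights.
    """
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)  # (seq_len, seq_len) similarities
    # Numerically stable softmax over the key axis, so each query's
    # weights across all time steps sum to 1.
    scores -= scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v, weights

# Toy example: a 365-day forcing window with 4 meteorological features
# (e.g. precipitation, temperature, radiation, humidity).
rng = np.random.default_rng(0)
forcing = rng.standard_normal((365, 4))
out, w = scaled_dot_product_attention(forcing, forcing, forcing)
```

Here `w[-1]` holds the weights the final day assigns to every preceding day; unlike an LSTM, whose access to early inputs is mediated by a recurrent state, these weights connect distant time steps in a single step.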

The results suggest that while the Transformer and its variants may not fully replace the LSTM for rainfall-runoff modeling, the Reformer variant shows promise for daily discharge forecasting in snow-driven basins, particularly for peak flow and low flow prediction. However, building a global rainfall-runoff model from the global Caravan data was not successful, due to uncertainty in the forcing data, particularly precipitation. The code for Transformer-based rainfall-runoff modeling is publicly available at https://github.com/Numpy-Panda/neuralhydrology_Transformer.

Files

MSc_Thesis_Kangmin_Mao.pdf
(pdf | 23.3 MB)
License info not available