
RR-Former: Rainfall-runoff modeling based on Transformer.

Authors :
Yin, Hanlin
Guo, Zilong
Zhang, Xiuwei
Chen, Jiaojiao
Zhang, Yanning
Source :
Journal of Hydrology, June 2022, Vol. 609.
Publication Year :
2022

Abstract

• A Transformer-based rainfall-runoff model is proposed for the first time.
• RR-Former can connect arbitrary positions directly and is more flexible than LSTM.
• RR-Former achieves excellent performance on individual rainfall-runoff modeling.
• RR-Former achieves excellent performance on regional rainfall-runoff modeling.
• RR-Former suits big datasets and provides multi-step-ahead runoff predictions.

Recently, long short-term memory (LSTM) based rainfall-runoff models have achieved good performance and have therefore received much attention. In this paper, we propose a novel rainfall-runoff model named RR-Former, based on the Transformer, which is composed entirely of attention mechanisms. Compared with an LSTM-based model, the RR-Former architecture can directly connect two arbitrary positions in a time series by using attention modules. It can strengthen or weaken the connection between two arbitrary positions and is thus more flexible than an LSTM-based model, giving the RR-Former the potential to achieve better performance. Using the Catchment Attributes and Meteorology for Large-Sample Studies (CAMELS) dataset, we test the performance of RR-Former on two tasks: individual rainfall-runoff modeling and regional rainfall-runoff modeling. In the first task, our RR-Former significantly outperforms two LSTM-based sequence-to-sequence models for 7-day-ahead runoff predictions. For example, the median and the mean Nash–Sutcliffe efficiency over the 673 basins achieved by our RR-Former are 0.8265 and 0.7904, respectively, while those of the better of the two benchmark models are 0.7448 and 0.6952, respectively. In the second task, our RR-Former also shows its strength and is better suited to a large dataset. [ABSTRACT FROM AUTHOR]
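The abstract rests on two technical ingredients: the attention mechanism, which lets any output time step attend directly to any input time step (unlike an LSTM, which must propagate information step by step), and the Nash–Sutcliffe efficiency (NSE) used to score the 7-day-ahead runoff predictions. The sketch below is not the authors' code; it is a minimal NumPy illustration of both ideas, with array shapes, variable names, and the example runoff values chosen purely for illustration.

```python
# Minimal sketch (not the authors' implementation) of the two ideas above:
# scaled dot-product attention and the Nash–Sutcliffe efficiency (NSE).
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V.

    q, k, v: arrays of shape (seq_len, d_k). Every query position is scored
    against every key position, so two arbitrary time steps are connected
    directly in a single step.
    """
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)                    # (seq_len, seq_len) pairwise scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # row-wise softmax
    return weights @ v                                 # weighted sum of value vectors

def nash_sutcliffe_efficiency(observed, simulated):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2); 1.0 is a perfect fit."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    numerator = np.sum((observed - simulated) ** 2)
    denominator = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - numerator / denominator

# Example: score a hypothetical 7-day-ahead runoff forecast for one basin.
obs = np.array([1.2, 1.5, 1.9, 2.4, 2.0, 1.7, 1.4])   # observed runoff (illustrative values)
sim = np.array([1.1, 1.6, 1.8, 2.5, 2.1, 1.6, 1.5])   # predicted runoff
print(round(nash_sutcliffe_efficiency(obs, sim), 4))
```

In the paper's evaluation, an NSE like this is computed per basin over the prediction horizon; the 0.8265 median and 0.7904 mean reported in the abstract are aggregates of such per-basin scores across the 673 CAMELS basins.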

Details

Language :
English
ISSN :
0022-1694
Volume :
609
Database :
Academic Search Index
Journal :
Journal of Hydrology
Publication Type :
Academic Journal
Accession number :
157047519
Full Text :
https://doi.org/10.1016/j.jhydrol.2022.127781