
EGFormer: An Enhanced Transformer Model with Efficient Attention Mechanism for Traffic Flow Forecasting

Authors :
Zhihui Yang
Qingyong Zhang
Wanfeng Chang
Peng Xiao
Minglong Li
Source :
Vehicles, Vol 6, Iss 1, Pp 120-139 (2024)
Publication Year :
2024
Publisher :
MDPI AG, 2024.

Abstract

Due to the regular influence of human activities, traffic flow data usually exhibit significant periodicity, which provides a foundation for further research on traffic flow data. However, the temporal dependencies in traffic flow data are often obscured by entangled temporal regularities, making it challenging for general models to accurately capture the intrinsic functional relationships within the data. In recent years, many methods based on statistics, machine learning, and deep learning have been proposed to tackle traffic flow forecasting. In this paper, the Transformer is improved in two aspects: (1) an Efficient Attention mechanism is proposed, which reduces the time and memory complexity of Scaled Dot Product Attention; (2) a Generative Decoding mechanism replaces the Dynamic Decoding operation, which accelerates the inference speed of the model. The resulting model is named EGFormer. Through extensive experiments and comparative analysis, the authors found that EGFormer performs better on the traffic flow forecasting task, achieving higher prediction accuracy and shorter running time than traditional models.
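The two mechanisms can be illustrated with short sketches. The abstract does not give the exact formulation of the proposed Efficient Attention, so the Python/PyTorch snippet below is a minimal sketch of one common linear-complexity variant (separate normalization of queries and keys, as in Shen et al.'s efficient attention), assumed here only for illustration: because queries and keys are normalized independently, the key-value product can be formed first, replacing the O(n^2) sequence-length cost of Scaled Dot Product Attention with O(n).

    import torch
    import torch.nn.functional as F

    def efficient_attention(q, k, v):
        # Hypothetical sketch, not the paper's exact mechanism.
        # q, k, v: tensors of shape (batch, seq_len, d_model)
        q = F.softmax(q, dim=-1)           # normalize each query over the feature dim
        k = F.softmax(k, dim=1)            # normalize keys over the sequence dim
        context = k.transpose(1, 2) @ v    # (batch, d_model, d_model); O(n * d^2)
        return q @ context                 # (batch, seq_len, d_model); linear in n

Similarly, a generative (one-shot) decoding scheme, as popularized by Informer-style forecasters, predicts the whole forecast horizon in a single forward pass instead of autoregressive step-by-step decoding. The decoder interface below is hypothetical; the idea is to pad known start tokens with zero placeholders for the horizon and read off all forecasts at once.

    def generative_decode(decoder, memory, start_tokens, horizon):
        # decoder: assumed callable mapping (dec_in, memory) -> (batch, L, d_model)
        # memory: encoder output; start_tokens: (batch, label_len, d_model)
        batch, _, d_model = start_tokens.shape
        placeholders = torch.zeros(batch, horizon, d_model)
        dec_in = torch.cat([start_tokens, placeholders], dim=1)
        out = decoder(dec_in, memory)      # a single forward pass for all steps
        return out[:, -horizon:, :]        # forecasts for the entire horizon

Avoiding token-by-token decoding in this way is what lets such models cut inference time on long forecast horizons.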

Details

Language :
English
ISSN :
2624-8921
Volume :
6
Issue :
1
Database :
Directory of Open Access Journals
Journal :
Vehicles
Publication Type :
Academic Journal
Accession number :
edsdoj.2d87c5d3b60a417dbf07b5cb981230d1
Document Type :
article
Full Text :
https://doi.org/10.3390/vehicles6010005