
Recurrent neural network from adder's perspective: Carry-lookahead RNN.

Authors :
Jiang, Haowei
Qin, Feiwei
Cao, Jin
Peng, Yong
Shao, Yanli
Source :
Neural Networks. Dec 2021, Vol. 144, p297-306. 10p.
Publication Year :
2021

Abstract

The recurrent network architecture is a widely used model in sequence modeling, but its serial dependency prevents the computation from being parallelized, which makes it inefficient. The serial adder faced the same problem in the early days of digital electronics. In this paper, we discuss the similarities between the recurrent neural network (RNN) and the serial adder. Inspired by the carry-lookahead adder, we introduce a carry-lookahead module into the RNN, which makes it possible for the RNN to run in parallel. We then design a method for parallel RNN computation, and finally propose the Carry-lookahead RNN (CL-RNN). CL-RNN offers both parallelism and a flexible receptive field. Through a comprehensive set of tests, we verify that CL-RNN outperforms existing typical RNNs on sequence modeling tasks specifically designed for RNNs. Code and models are available at: https://github.com/WinnieJiangHW/Carry-lookahead_RNN. [ABSTRACT FROM AUTHOR]
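The adder analogy the abstract draws on can be illustrated with textbook adder logic (this is a minimal sketch of the classic circuits, not the authors' CL-RNN code). A ripple-carry adder must compute each carry after the previous one, just as an RNN must compute each hidden state after the previous one; a carry-lookahead adder instead derives generate (g) and propagate (p) signals from the inputs alone and unrolls the recurrence c_{i+1} = g_i OR (p_i AND c_i), so every carry depends only on the inputs and the initial carry and can be computed in parallel:

```python
def ripple_carry_add(a_bits, b_bits, c0=0):
    """Serial adder: each carry must wait for the previous one (bits are
    little-endian, least significant first)."""
    carry, out = c0, []
    for a, b in zip(a_bits, b_bits):
        out.append(a ^ b ^ carry)
        carry = (a & b) | (carry & (a ^ b))
    return out, carry


def lookahead_add(a_bits, b_bits, c0=0):
    """Carry-lookahead adder: unrolling c_{i+1} = g_i | (p_i & c_i) gives
    c_{i+1} = g_i | p_i g_{i-1} | ... | p_i ... p_0 c_0, so each carry is a
    direct function of the inputs -- no serial chain between carries."""
    g = [a & b for a, b in zip(a_bits, b_bits)]  # generate signals
    p = [a ^ b for a, b in zip(a_bits, b_bits)]  # propagate signals
    carries = [c0]
    for i in range(len(a_bits)):
        # c0 term: carry-in propagates through all positions 0..i
        c = c0
        for k in range(i + 1):
            c &= p[k]
        # g_j term: a carry generated at j propagates through j+1..i
        for j in range(i + 1):
            term = g[j]
            for k in range(j + 1, i + 1):
                term &= p[k]
            c |= term
        carries.append(c)
    out = [p[i] ^ carries[i] for i in range(len(a_bits))]
    return out, carries[-1]
```

In hardware the lookahead loop bodies are independent gate networks evaluated simultaneously; the paper's CL-RNN applies the same trick to the hidden-state recurrence, trading the serial dependency for parallel computation.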

Details

Language :
English
ISSN :
0893-6080
Volume :
144
Database :
Academic Search Index
Journal :
Neural Networks
Publication Type :
Academic Journal
Accession number :
153338290
Full Text :
https://doi.org/10.1016/j.neunet.2021.08.032