
Minimal gated unit for recurrent neural networks.

Authors: Zhou, Guo-Bing; Wu, Jianxin; Zhang, Chen-Lin; Zhou, Zhi-Hua
Source: International Journal of Automation & Computing; Jun 2016, Vol. 13, Issue 3, p226-234, 9p
Publication Year: 2016

Abstract

Recurrent neural networks (RNNs) have been very successful in handling sequence data. However, understanding RNNs and finding best practices for RNN learning is a difficult task, partly because there are many competing and complex hidden units, such as the long short-term memory (LSTM) unit and the gated recurrent unit (GRU). We propose a gated unit for RNNs, named the minimal gated unit (MGU), which contains only one gate and is therefore a minimal design among all gated hidden units. The design of MGU benefits from evaluation results on LSTM and GRU in the literature. Experiments on various sequence data show that MGU achieves accuracy comparable to GRU, but with a simpler structure, fewer parameters, and faster training. Hence, MGU is suitable for RNN applications. Its simple architecture also means that it is easier to evaluate and tune, and in principle it is easier to study MGU's properties theoretically and empirically. [ABSTRACT FROM AUTHOR]
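The abstract's central point is the single-gate design: where GRU uses two gates (update and reset), MGU collapses them into one forget gate that both modulates the candidate state and interpolates between the old state and the candidate. The sketch below is illustrative NumPy, not the authors' implementation; it assumes the single-forget-gate update described in the paper, and the class name `MGUCell` and the initialization scheme are made up for the example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class MGUCell:
    """A single-gate recurrent cell, sketched after the MGU described in the paper."""

    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden_size)
        # Forget-gate parameters (the only gate in the cell).
        self.W_f = rng.uniform(-s, s, (hidden_size, input_size))
        self.U_f = rng.uniform(-s, s, (hidden_size, hidden_size))
        self.b_f = np.zeros(hidden_size)
        # Candidate-state parameters.
        self.W_h = rng.uniform(-s, s, (hidden_size, input_size))
        self.U_h = rng.uniform(-s, s, (hidden_size, hidden_size))
        self.b_h = np.zeros(hidden_size)

    def step(self, x_t, h_prev):
        # f_t = sigmoid(W_f x_t + U_f h_{t-1} + b_f): the single gate.
        f_t = sigmoid(self.W_f @ x_t + self.U_f @ h_prev + self.b_f)
        # Candidate state is computed from the gated previous state.
        h_tilde = np.tanh(self.W_h @ x_t + self.U_h @ (f_t * h_prev) + self.b_h)
        # The same gate interpolates between the old state and the candidate.
        return (1.0 - f_t) * h_prev + f_t * h_tilde


# Usage: run the cell over a short random sequence.
cell = MGUCell(input_size=4, hidden_size=8)
h = np.zeros(8)
for x in np.random.default_rng(1).normal(size=(5, 4)):
    h = cell.step(x, h)
print(h.shape)  # (8,)
```

Because the cell has only two weight blocks (gate and candidate) instead of GRU's three, its recurrent parameter count is roughly two thirds of GRU's, which is where the abstract's "fewer parameters, faster training" claim comes from.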

Details

Language: English
ISSN: 1476-8186
Volume: 13
Issue: 3
Database: Complementary Index
Journal: International Journal of Automation & Computing
Publication Type: Academic Journal
Accession Number: 132063791
Full Text: https://doi.org/10.1007/s11633-016-1006-2