Fallback Variable History NNLMs: Efficient NNLMs by precomputation and stochastic training.

Authors :
Zamora-Martínez, Francisco J.
España-Boquera, Salvador
Castro-Bleda, Maria Jose
Palacios-Corella, Adrian
Source :
PLoS ONE; 7/26/2018, Vol. 13 Issue 7, p1-13, 13p
Publication Year :
2018

Abstract

This paper presents a new method to reduce the computational cost of using Neural Networks as Language Models during recognition in some particular scenarios. It is based on a Neural Network that considers input contexts of different lengths in order to ease the use of a fallback mechanism together with the precomputation of softmax normalization constants for these inputs. The proposed approach is empirically validated, showing its capability to emulate lower-order N-grams with a single Neural Network. A machine translation task shows that the proposed model constitutes a good solution to the normalization cost of the output softmax layer of Neural Networks, for some practical cases, without a significant impact on performance while improving the system speed. [ABSTRACT FROM AUTHOR]
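The fallback idea in the abstract can be illustrated with a minimal toy sketch, assuming a hypothetical unnormalized scoring function standing in for the network's output layer: normalization constants Z(h) are precomputed for a set of short contexts, and at recognition time the model falls back to successively shorter contexts until a precomputed constant is available (emulating a lower-order N-gram). All names, the scoring function, and the context set are illustrative assumptions, not the paper's implementation.

```python
import math

VOCAB = ["a", "b", "c", "</s>"]

def score(context, word):
    # Stand-in for the NN's unnormalized output (logit) for `word`
    # given `context`; a deterministic toy function, not a real model.
    return (len(context) + 1) * (hash((context, word)) % 7) / 7.0

def precompute_norms(contexts):
    # Z(h) = sum_w exp(score(h, w)) for each precomputable context h.
    return {h: sum(math.exp(score(h, w)) for w in VOCAB) for h in contexts}

def prob(context, word, norms):
    # Fall back to shorter contexts until a precomputed Z is found,
    # so no softmax normalization is computed at recognition time.
    h = context
    while h not in norms and h:
        h = h[1:]  # drop the oldest word
    if h not in norms:
        h = ()     # empty (unigram-like) context as last resort
    return math.exp(score(h, word)) / norms[h]

# Precompute constants for the empty context and all length-1 contexts.
contexts = [()] + [(w,) for w in VOCAB]
norms = precompute_norms(contexts)

# A length-2 context falls back to its length-1 suffix; the resulting
# distribution over the vocabulary still normalizes to 1.
p = {w: prob(("b", "a"), w, norms) for w in VOCAB}
assert abs(sum(p.values()) - 1.0) < 1e-9
```

The key point the sketch captures is that consistency matters: the fallback context used for the score must be the same one whose normalization constant was precomputed, so the cached Z(h) remains a valid denominator.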

Details

Language :
English
ISSN :
19326203
Volume :
13
Issue :
7
Database :
Complementary Index
Journal :
PLoS ONE
Publication Type :
Academic Journal
Accession number :
130925280
Full Text :
https://doi.org/10.1371/journal.pone.0200884