Advanced Adaptive Nonmonotone Conjugate Gradient Training Algorithm for Recurrent Neural Networks
- Source :
- International Journal on Artificial Intelligence Tools. Oct 2008, Vol. 17, Issue 5, p963-984. 22p. 1 Diagram, 14 Charts, 10 Graphs.
- Publication Year :
- 2008
Abstract
- Recurrent networks constitute an elegant way of increasing the capacity of feedforward networks to deal with complex data in the form of sequences of vectors. They are well known for their power to model temporal dependencies and to process sequences for classification, recognition, and transduction. In this paper, we propose an advanced nonmonotone Conjugate Gradient training algorithm for recurrent neural networks, which is equipped with an adaptive tuning strategy for both the nonmonotone learning horizon and the stepsize. Simulation results in sequence processing using three different recurrent architectures demonstrate that this modification of the Conjugate Gradient method is more effective than previous attempts. [ABSTRACT FROM AUTHOR]
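The core idea in the abstract — a conjugate gradient direction combined with a nonmonotone line search over a window of recent function values — can be sketched as follows. This is an illustrative assumption of the general technique (a Grippo-style nonmonotone Armijo test with a PR+ conjugate gradient update), not the paper's exact algorithm; in particular, the paper adapts the horizon M and stepsize, while this sketch keeps M fixed for simplicity.

```python
import numpy as np

def nonmonotone_cg(f, grad, x0, max_iter=500, M=10, c=1e-4, tol=1e-8):
    """PR+ conjugate gradient with a nonmonotone backtracking line search.

    The Armijo test compares against the maximum of the last M function
    values instead of the current one, allowing occasional increases
    (the "nonmonotone learning horizon" idea). M is fixed here; the
    paper's adaptive tuning of M and the stepsize is not reproduced.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    history = [f(x)]                      # recent values for the nonmonotone test
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        f_ref = max(history[-M:])         # nonmonotone reference value
        gTd = g @ d
        if gTd >= 0:                      # safeguard: restart with steepest descent
            d = -g
            gTd = g @ d
        alpha = 1.0
        while f(x + alpha * d) > f_ref + c * alpha * gTd:
            alpha *= 0.5                  # backtracking on the stepsize
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))  # PR+ update
        d = -g_new + beta * d
        x, g = x_new, g_new
        history.append(f(x))
    return x
```

In a training context, `f` would be the network's loss over the training sequences and `grad` its gradient with respect to the weights; the sketch above works for any smooth objective.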
Details
- Language :
- English
- ISSN :
- 0218-2130
- Volume :
- 17
- Issue :
- 5
- Database :
- Academic Search Index
- Journal :
- International Journal on Artificial Intelligence Tools
- Publication Type :
- Academic Journal
- Accession number :
- 35167848
- Full Text :
- https://doi.org/10.1142/S0218213008004242