
Training Multi-layer Perceptrons Using MiniMin Approach.

Authors :
Bo, Liefeng
Wang, Ling
Jiao, Licheng
Editors :
Hao, Yue
Liu, Jiming
Wang, Yu-Ping
Cheung, Yiu-ming
Yin, Hujun
Ma, Jianfeng
Jiao, Yong-Chang
Source :
Computational Intelligence & Security; 2005, p909-914, 6p
Publication Year :
2005

Abstract

Multi-layer perceptrons (MLPs) have been widely used in classification and regression tasks. Improving the training speed of MLPs has long been an active area of research. Instead of the classical training method, we train MLPs with a MiniMin model, which ensures that the weights of the last layer are optimal at each step. Our method achieves significant improvements in training speed on several large benchmark data sets. [ABSTRACT FROM AUTHOR]
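
The abstract's key idea is keeping the last-layer weights optimal at every step. The sketch below is one plausible reading of that idea, not the paper's actual algorithm: for a one-hidden-layer MLP with squared-error loss, the output-layer weights are solved exactly by least squares given the current hidden activations (the inner "min"), while the hidden-layer weights take an ordinary gradient step (the outer "min"). All names, the single-hidden-layer setup, and the tanh/least-squares choices are illustrative assumptions.

```python
# Hedged sketch: alternating training of a one-hidden-layer MLP for regression.
# At every step the output-layer weights are set to their least-squares optimum
# given the current hidden activations, while the hidden-layer weights take a
# gradient step. Details are assumptions, not taken from the paper.
import numpy as np

def train_mlp_last_layer_optimal(X, y, n_hidden=20, n_steps=200, lr=1e-2, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W1 = rng.normal(scale=0.5, size=(d, n_hidden))   # hidden-layer weights
    b1 = np.zeros(n_hidden)                           # hidden-layer biases

    for _ in range(n_steps):
        # Forward pass through the hidden layer.
        H = np.tanh(X @ W1 + b1)                      # hidden activations, (n, n_hidden)
        Hb = np.hstack([H, np.ones((n, 1))])          # bias column for the output layer

        # Inner "min": output weights solved exactly by least squares,
        # hence optimal for the current hidden representation.
        W2 = np.linalg.lstsq(Hb, y, rcond=None)[0]    # (n_hidden + 1,)

        # Outer "min": one gradient step on the hidden-layer weights
        # for the squared error, with W2 held at its optimum.
        err = Hb @ W2 - y                             # residuals, (n,)
        dH = np.outer(err, W2[:-1])                   # gradient w.r.t. hidden activations
        dZ = dH * (1.0 - H ** 2)                      # tanh derivative
        W1 -= lr * (X.T @ dZ) / n
        b1 -= lr * dZ.mean(axis=0)

    return W1, b1, W2

# Tiny usage example on synthetic data (illustrative only).
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 3))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
    W1, b1, W2 = train_mlp_last_layer_optimal(X, y)
    H = np.tanh(X @ W1 + b1)
    mse = np.mean((np.hstack([H, np.ones((200, 1))]) @ W2 - y) ** 2)
    print(f"training MSE: {mse:.4f}")
```

Because the output layer is re-solved in closed form rather than learned by gradient descent, each outer step starts from the best possible readout of the current hidden features, which is one way such a scheme could speed up training relative to plain backpropagation.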

Details

Language :
English
ISBNs :
9783540308188
Database :
Supplemental Index
Journal :
Computational Intelligence & Security
Publication Type :
Book
Accession number :
32962231
Full Text :
https://doi.org/10.1007/11596448_135