A New Sparse Learning Machine
- Author
- Reza Monsefi, Alaleh Maskooki, and Mojtaba Nayyeri
- Subjects
- Computer science; Artificial intelligence; Pattern recognition; Computational intelligence; Computer Networks and Communications; General Neuroscience; Feedforward neural network; Sparse approximation; Approximation algorithm; Generalization; Pruning (decision trees); Gram–Schmidt process; Software
- Abstract
- Many algorithms have been proposed for pruning and sparse approximation of feedforward neural networks with random weights, in order to obtain compact networks that are fast and robust across various datasets. One drawback of the randomization process is that the resulting weight vectors may be highly correlated. It has been shown that the error of an ensemble of classifiers depends on the degree of error correlation among its members; thus, decreasing the correlation between output vectors should yield more efficient hidden nodes. In this research, a new learning algorithm for single-hidden-layer feedforward networks, called the New Sparse Learning Machine (NSLM), is proposed for regression and classification. In the first phase, the algorithm creates a hidden layer with small correlation among nodes by orthogonalizing the columns of the hidden-layer output matrix. In the second phase, by solving an $$L_1$$-norm minimization problem, NSLM sets as many components of the solution vector to zero as possible. The resulting network has a higher degree of sparsity while accuracy is maintained or improved; the method therefore yields a network with better generalization performance. Numerical comparisons on several classification and regression datasets confirm the expected improvement over the basic network.
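The two-phase idea described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: it assumes the hidden-layer output matrix `H` is already computed from random input weights, uses QR factorization as a numerically stable stand-in for Gram–Schmidt orthogonalization (phase 1), and exploits the fact that with orthonormal columns the $$L_1$$-regularized least-squares solution reduces to soft-thresholding (phase 2). The function name `nslm_sketch` and the parameter `alpha` are illustrative choices, not from the source.

```python
import numpy as np

def nslm_sketch(H, y, alpha=0.01):
    """Hypothetical two-phase sketch of the NSLM idea.

    H : (n_samples, n_hidden) hidden-layer output matrix
    y : (n_samples,) target vector
    alpha : L1 penalty strength (illustrative parameter)
    """
    # Phase 1: orthogonalize the columns of H. QR factorization is
    # mathematically a stabilized Gram-Schmidt process; Q has
    # orthonormal columns spanning the same space as H's columns.
    Q, _ = np.linalg.qr(H)

    # Phase 2: solve min_b 0.5*||y - Q b||^2 + alpha*||b||_1.
    # Because Q's columns are orthonormal, the solution is the
    # soft-thresholding of the projections z = Q^T y, which zeros
    # out every component with |z_i| <= alpha, producing sparsity.
    z = Q.T @ y
    beta = np.sign(z) * np.maximum(np.abs(z) - alpha, 0.0)
    return Q, beta
```

Raising `alpha` drives more components of the solution vector to zero, trading accuracy against sparsity, which mirrors the compactness-versus-performance balance the abstract describes.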
- Published
- 2016