
Selección eficiente de arquitecturas neuronales empleando técnicas destructivas y de regularización [Efficient selection of neural architectures using destructive and regularization techniques]

Authors :
Andrés Eduardo Gaona Barrera
Dora María Ballesteros Larrotta
Source :
Tecnura, Vol 16, Iss 33, Pp 158-172 (2012)
Publication Year :
2012
Publisher :
Universidad Distrital Francisco Jose de Caldas, 2012.

Abstract

This article presents a detailed comparison, both theoretical and practical, of ontogenetic neural networks obtained through pruning and regularization algorithms. We first introduce the concept of a regularized error function and the main ways of modifying it: weight decay (WD), soft weight sharing, and the Chauvin penalty. We then review some of the most representative pruning algorithms, in particular Optimal Brain Damage (OBD). OBD and WD are applied to the XOR problem in order to compare pruning and regularization techniques; basic back-propagation is used for WD, and the inverse Hessian matrix for OBD. According to the results, WD is faster than OBD but deletes fewer weights. OBD, in turn, reduces the complexity of the neural-network architecture, but its computational cost remains high.
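To make the two techniques concrete, here is a minimal sketch (not the authors' code) of the quantities the abstract refers to: the weight-decay regularized error, E(w) = E0(w) + (λ/2)·Σ wᵢ², and the OBD saliency, sᵢ = ½·Hᵢᵢ·wᵢ², computed from the diagonal of the error Hessian. The example weights, Hessian values, and λ are illustrative placeholders.

```python
import numpy as np

def wd_error(e0, weights, lam=0.01):
    """Weight decay (WD): data error plus a quadratic penalty that
    pushes small weights toward zero during training."""
    w = np.asarray(weights, dtype=float)
    return e0 + 0.5 * lam * np.sum(w ** 2)

def obd_saliency(weights, hessian_diag):
    """Optimal Brain Damage (OBD): saliency s_i = 0.5 * H_ii * w_i^2,
    where H_ii is the diagonal of the error Hessian; the weights with
    the lowest saliency are pruned first."""
    w = np.asarray(weights, dtype=float)
    h = np.asarray(hessian_diag, dtype=float)
    return 0.5 * h * w ** 2

# Illustrative values (placeholders, not from the article's experiments):
w = [0.5, -2.0, 0.1]
h = [1.0, 0.2, 4.0]
print(wd_error(0.3, w, lam=0.1))   # regularized error
print(obd_saliency(w, h))          # prune the lowest-saliency weight first
```

Note that WD only needs the gradient already available in back-propagation, whereas OBD requires (an approximation of) the Hessian, which is why the article finds WD faster but OBD more aggressive at removing weights.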

Details

Language :
Spanish; Castilian
ISSN :
0123-921X and 2248-7638
Volume :
16
Issue :
33
Database :
Directory of Open Access Journals
Journal :
Tecnura
Publication Type :
Academic Journal
Accession number :
edsdoj.3be4e9f911404f7bad42b30ed4554b5f
Document Type :
article