SIRe-Networks: Convolutional neural networks architectural extension for information preservation via skip/residual connections and interlaced auto-encoders.

Authors :
Avola, Danilo
Cinque, Luigi
Fagioli, Alessio
Foresti, Gian Luca
Source :
Neural Networks. Sep 2022, Vol. 153, p. 386-398. 13 p.
Publication Year :
2022

Abstract

Improving existing neural network architectures can involve several design choices, such as manipulating the loss functions, employing a diverse learning strategy, exploiting gradient evolution at training time, optimizing the network hyper-parameters, or increasing the architecture depth. The latter approach is a straightforward solution, since it directly enhances the representation capabilities of a network; however, increased depth generally incurs the well-known vanishing gradient problem. In this paper, borrowing from different methods addressing this issue, we introduce an interlaced multi-task learning strategy, termed SIRe, to reduce the vanishing gradient in relation to the object classification task. The presented methodology directly improves a convolutional neural network (CNN) by preserving information from the input image through interlaced auto-encoders (AEs), and further refines the base network architecture by means of skip and residual connections. To validate the presented methodology, a simple CNN and various implementations of well-known networks are extended via the SIRe strategy and extensively tested on five collections, i.e., MNIST, Fashion-MNIST, CIFAR-10, CIFAR-100, and Caltech-256, where the SIRe-extended architectures achieve significantly increased performance across all models and datasets, thus confirming the effectiveness of the presented approach.

• Designing an interlaced multi-task learning approach to affect the gradient.
• Further improving the proposed interlaced approach via skip/residual connections.
• Applying the proposed SIRe methodology to other well-known architectures.

[ABSTRACT FROM AUTHOR]
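The core idea described in the abstract — a backbone block refined by a residual connection, with an interlaced auto-encoder that reconstructs the block's input as an auxiliary task — can be illustrated with a minimal sketch. This is NOT the authors' implementation: it uses toy dense layers instead of convolutions, and all layer sizes and the loss weighting `LAMBDA` are assumptions made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def dense(x, w, b):
    return x @ w + b

# Hypothetical layer sizes (not from the paper).
D_IN, D_HID, N_CLASSES = 16, 8, 4

# Backbone block refined with a residual/skip connection.
W1 = rng.normal(scale=0.1, size=(D_IN, D_IN))
b1 = np.zeros(D_IN)

x = rng.normal(size=(2, D_IN))      # toy input batch of 2 samples
h = relu(dense(x, W1, b1)) + x      # residual connection helps preserve the input signal

# Interlaced auto-encoder: try to reconstruct the block's INPUT from its
# output, so the auxiliary loss encourages h to retain input information.
W_enc = rng.normal(scale=0.1, size=(D_IN, D_HID)); b_enc = np.zeros(D_HID)
W_dec = rng.normal(scale=0.1, size=(D_HID, D_IN)); b_dec = np.zeros(D_IN)
x_rec = dense(relu(dense(h, W_enc, b_enc)), W_dec, b_dec)
loss_rec = np.mean((x_rec - x) ** 2)    # auxiliary reconstruction loss (MSE)

# Classification head with softmax cross-entropy (the main task).
W_cls = rng.normal(scale=0.1, size=(D_IN, N_CLASSES)); b_cls = np.zeros(N_CLASSES)
logits = dense(h, W_cls, b_cls)
probs = np.exp(logits - logits.max(axis=1, keepdims=True))
probs /= probs.sum(axis=1, keepdims=True)
y = np.array([1, 3])                    # toy labels
loss_cls = -np.mean(np.log(probs[np.arange(len(y)), y]))

# Multi-task objective: classification plus weighted reconstruction.
LAMBDA = 0.5                            # assumed weighting, not from the paper
loss_total = loss_cls + LAMBDA * loss_rec
print(float(loss_total) > 0.0)
```

In a deeper network the same pattern would repeat per block ("interlaced"), each auxiliary reconstruction loss injecting gradient signal at an intermediate depth, which is how such auxiliary tasks are generally understood to mitigate vanishing gradients.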

Details

Language :
English
ISSN :
0893-6080
Volume :
153
Database :
Academic Search Index
Journal :
Neural Networks
Publication Type :
Academic Journal
Accession number :
158208718
Full Text :
https://doi.org/10.1016/j.neunet.2022.06.030