
Norm-Preservation: Why Residual Networks Can Become Extremely Deep?

Authors :
Zaeemzadeh, Alireza
Rahnavard, Nazanin
Shah, Mubarak
Source :
IEEE Transactions on Pattern Analysis & Machine Intelligence. Nov 2021, Vol. 43, Issue 11, p3980-3990. 11p.
Publication Year :
2021

Abstract

Augmenting neural networks with skip connections, as introduced in the so-called ResNet architecture, surprised the community by enabling the training of networks with more than 1,000 layers and significant performance gains. This paper deciphers ResNet by analyzing the effect of skip connections and puts forward new theoretical results on the advantages of identity skip connections in neural networks. We prove that the skip connections in residual blocks facilitate preserving the norm of the gradient and lead to stable back-propagation, which is desirable from an optimization perspective. We also show that, perhaps surprisingly, as more residual blocks are stacked, the norm-preservation of the network is enhanced. Our theoretical arguments are supported by extensive empirical evidence. Can we push for extra norm-preservation? We answer this question by proposing an efficient method that regularizes the singular values of the convolution operator, making ResNet's transition layers extra norm-preserving. Our numerical investigations demonstrate that the learning dynamics and the classification performance of ResNet can be improved by making it even more norm-preserving. Our results and the introduced modification of ResNet, referred to as Procrustes ResNets, can be used as a guide for training deeper networks and can also inspire new deeper architectures. [ABSTRACT FROM AUTHOR]
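The regularization idea mentioned in the abstract can be illustrated with a short sketch. The snippet below is a minimal, hypothetical PyTorch illustration, not the paper's actual Procrustes method: it estimates the largest singular value of a stride-1 convolution by power iteration, using the transposed convolution as the adjoint operator, and penalizes that value's deviation from 1 so the layer stays roughly norm-preserving. The function names (conv_top_singular_value, norm_preservation_penalty) and the squared-deviation penalty form are assumptions made for illustration.

import torch
import torch.nn.functional as F

def conv_top_singular_value(conv, x_shape, n_iters=10):
    # Power iteration on the linear map A: x -> conv(x), using
    # conv_transpose2d as the adjoint A^T. Assumes stride 1, so the
    # transposed convolution maps back to the input shape exactly.
    v = torch.randn(x_shape)
    with torch.no_grad():
        for _ in range(n_iters):
            u = F.conv2d(v, conv.weight, padding=conv.padding)
            u = u / (u.norm() + 1e-12)
            v = F.conv_transpose2d(u, conv.weight, padding=conv.padding)
            v = v / (v.norm() + 1e-12)
    # sigma_max is approximately <u, A v> for the converged unit vectors;
    # recomputing A v outside no_grad keeps the estimate differentiable
    # with respect to the convolution weights.
    Av = F.conv2d(v, conv.weight, padding=conv.padding)
    return (u * Av).sum()

def norm_preservation_penalty(conv, x_shape, n_iters=10):
    # Hypothetical penalty pushing the top singular value toward 1.
    sigma = conv_top_singular_value(conv, x_shape, n_iters)
    return (sigma - 1.0) ** 2

# Usage sketch: add the penalty, weighted by a small coefficient,
# to the task loss for the layers being regularized.
conv = torch.nn.Conv2d(64, 128, kernel_size=3, padding=1)
penalty = norm_preservation_penalty(conv, x_shape=(1, 64, 32, 32))
penalty.backward()  # gradients flow into conv.weight

In the paper, it is the transition layers (those that change dimensionality and lack identity skips) that are made extra norm-preserving; the sketch above only constrains the single largest singular value, whereas controlling the full spectrum is closer to the stated goal.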

Subjects

Subjects :
*CONVOLUTIONAL neural networks

Details

Language :
English
ISSN :
0162-8828
Volume :
43
Issue :
11
Database :
Academic Search Index
Journal :
IEEE Transactions on Pattern Analysis & Machine Intelligence
Publication Type :
Academic Journal
Accession number :
153710025
Full Text :
https://doi.org/10.1109/TPAMI.2020.2990339