
Critical Initialization of Wide and Deep Neural Networks through Partial Jacobians: General Theory and Applications

Authors :
Doshi, Darshil
He, Tianyu
Gromov, Andrey
Publication Year :
2021

Abstract

Deep neural networks are notorious for defying theoretical treatment. However, when the number of parameters in each layer tends to infinity, the network function is a Gaussian process (GP) and a quantitatively predictive description is possible. The Gaussian approximation allows one to formulate criteria for selecting hyperparameters, such as the variances of weights and biases, as well as the learning rate. These criteria rely on the notion of criticality defined for deep neural networks. In this work we describe a new practical way to diagnose criticality. We introduce \emph{partial Jacobians} of a network, defined as derivatives of preactivations in layer $l$ with respect to preactivations in layer $l_0 \leq l$. We derive recurrence relations for the norms of partial Jacobians and utilize these relations to analyze the criticality of deep fully connected neural networks with LayerNorm and/or residual connections. We derive and implement a simple and cheap numerical test that allows one to select the optimal initialization for a broad class of deep neural networks containing fully connected, convolutional, and normalization layers. Using these tools, we show quantitatively that proper stacking of LayerNorm (applied to preactivations) and residual connections leads to an architecture that is critical for any initialization. Finally, we apply our methods to analyze the ResNet and MLP-Mixer architectures, demonstrating the everywhere-critical regime.

Comment: Accepted (spotlight) at NeurIPS 2023. Additional ResNet results. 42 pages, 12 figures.
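
As an illustration of the numerical test described in the abstract, the sketch below estimates the norm of the partial Jacobian $\partial h^l / \partial h^{l_0}$ for a deep fully connected ReLU network and prints how it behaves with depth; at a critical initialization this quantity stays $O(1)$ rather than growing or decaying exponentially. This is not the authors' released code: the width, depth, activation, and weight/bias variances are illustrative assumptions.

```python
# Minimal sketch of a partial-Jacobian criticality check for a deep MLP.
# Assumptions (not from the paper's code): ReLU activation, width 256, depth 30,
# He-style weight variance 2/width (near-critical for ReLU) and zero bias variance.
import torch

torch.manual_seed(0)
width, depth, l0 = 256, 30, 2
w_var, b_var = 2.0, 0.0   # vary these to see the ordered / chaotic phases

# Random weights and biases: W_ij ~ N(0, w_var / width), b_i ~ N(0, b_var)
weights = [torch.randn(width, width) * (w_var / width) ** 0.5 for _ in range(depth)]
biases = [torch.randn(width) * b_var ** 0.5 for _ in range(depth)]
phi = torch.relu

def propagate(h, k_start, k_end):
    """Preactivation at layer k_end given the preactivation at layer k_start."""
    for k in range(k_start + 1, k_end + 1):
        h = weights[k] @ phi(h) + biases[k]
    return h

x = torch.randn(width)
h0 = weights[0] @ x + biases[0]      # preactivation of the first layer
h_l0 = propagate(h0, 0, l0)          # preactivation at layer l0

for l in range(l0 + 1, depth, 5):
    # Full partial Jacobian dh^l / dh^{l0}; its width-averaged squared Frobenius
    # norm should stay roughly constant in l at a critical initialization.
    J = torch.autograd.functional.jacobian(lambda h: propagate(h, l0, l), h_l0)
    print(f"l = {l:2d}   partial Jacobian norm estimate = {(J ** 2).sum().item() / width:.3f}")
```

Running the sketch with a larger or smaller `w_var` makes the printed norm grow or shrink with depth, which is the exponential behavior the criticality criterion is designed to rule out.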

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2111.12143
Document Type :
Working Paper