
AutoInit: Automatic Initialization via Jacobian Tuning

Authors :
He, Tianyu
Doshi, Darshil
Gromov, Andrey
Publication Year :
2022
Publisher :
arXiv, 2022.

Abstract

Good initialization is essential for training Deep Neural Networks (DNNs). Oftentimes such an initialization is found through a trial-and-error approach, which has to be applied anew every time an architecture is substantially modified, or is inherited from smaller networks, leading to sub-optimal initialization. In this work we introduce a new and cheap algorithm that finds a good initialization automatically for general feed-forward DNNs. The algorithm uses the Jacobian between adjacent network blocks to tune the network hyperparameters to criticality. We solve the dynamics of the algorithm for fully connected networks with ReLU activations and derive conditions for its convergence. We then extend the discussion to more general architectures with BatchNorm and residual connections. Finally, we apply our method to ResMLP and VGG architectures, where the automatic one-shot initialization found by our method performs well on vision tasks.

Comment: 22 pages, 5 figures
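The core idea described in the abstract, measuring the Jacobian between adjacent blocks and rescaling weights until the network sits at criticality, can be sketched in a few lines. The following is a minimal illustrative sketch, not the authors' implementation: the function names (`block_jacobian_norm`, `tune_to_criticality`), the Hutchinson-style norm estimator, and the multiplicative update rule are all assumptions made for illustration.

```python
import torch
import torch.nn as nn

# Hypothetical sketch of Jacobian-based tuning to criticality.
# Assumption: "criticality" is approximated here as each block's
# input-output Jacobian having unit mean-squared singular value.

def block_jacobian_norm(block, x):
    """Estimate ||J||_F^2 / fan_out for J = d(block(x))/dx,
    using a single random probe (Hutchinson-style estimator)."""
    x = x.clone().requires_grad_(True)
    y = block(x)
    v = torch.randn_like(y)            # random probe vector
    (g,) = torch.autograd.grad(y, x, grad_outputs=v)
    # E_v[||J^T v||^2] = ||J||_F^2; normalize by ||v||^2 ~ fan_out
    return g.pow(2).sum() / v.pow(2).sum()

def tune_to_criticality(blocks, x, steps=50, lr=0.5, target=1.0, tol=1e-3):
    """Multiplicatively rescale each block's weight matrices until its
    Jacobian norm estimate is ~target, propagating x block by block."""
    for block in blocks:
        for _ in range(steps):
            j2 = block_jacobian_norm(block, x).item()
            if abs(j2 - target) < tol:
                break
            # damped multiplicative update: shrink weights if j2 > target
            scale = (target / j2) ** (lr / 2)
            with torch.no_grad():
                for p in block.parameters():
                    if p.dim() > 1:    # rescale weights, not biases
                        p.mul_(scale)
        with torch.no_grad():
            x = block(x)               # feed tuned activations forward
    return blocks

# Usage on a toy fully connected ReLU stack:
blocks = [nn.Sequential(nn.Linear(256, 256), nn.ReLU()) for _ in range(4)]
tune_to_criticality(blocks, torch.randn(64, 256))
```

For a single linear-plus-ReLU block the Jacobian scales linearly with the weights, so the damped square-root update converges geometrically toward the target; this is consistent with the abstract's claim that convergence conditions can be derived analytically for fully connected ReLU networks.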

Details

Database :
OpenAIRE
Accession number :
edsair.doi.dedup.....b2694c9347f0f482292d1ad896257638
Full Text :
https://doi.org/10.48550/arxiv.2206.13568