
Normalization Effects on Deep Neural Networks

Authors :
Yu, Jiahui
Spiliopoulos, Konstantinos
Source :
Foundations of Data Science; Sep2023, Vol. 5 Issue 3, p389-465, 77p
Publication Year :
2023

Abstract

We study the effect of normalization on the layers of deep neural networks of feed-forward type. A given layer i with N_i hidden units is allowed to be normalized by 1/N_i^{γ_i} with γ_i ∈ [1/2, 1], and we study the effect of the choice of the γ_i on the statistical behavior of the neural network's output (such as its variance) as well as on the test accuracy on the MNIST data set. We find that, in terms of the variance of the neural network's output and of test accuracy, the best choice is to set the γ_i's equal to one, which is the mean-field scaling. We also find that this is particularly true for the outer layer, in that the neural network's behavior is more sensitive to the scaling of the outer layer than to the scaling of the inner layers. The mechanism for the mathematical analysis is an asymptotic expansion of the neural network's output. An important practical consequence of the analysis is that it provides a systematic and mathematically informed way to choose the learning-rate hyperparameters. Such a choice guarantees that the neural network behaves in a statistically robust way as the N_i grow to infinity. [ABSTRACT FROM AUTHOR]
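The scaling scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' code: the two-layer architecture, the Gaussian weight initialization, the tanh nonlinearity, and the placement of the 1/N^γ factor on the pre-activations are assumptions made for the sake of the example.

```python
import numpy as np

def forward(x, layers, gammas):
    """Feed-forward pass where the sum over each layer's units is
    normalized by 1 / N**gamma, with gamma in [1/2, 1].

    gamma = 1/2 is the standard (CLT-type) scaling;
    gamma = 1   is the mean-field scaling the paper finds preferable.
    """
    h = x
    for (W, b), gamma in zip(layers, gammas):
        N = W.shape[1]                      # number of units summed over
        h = np.tanh(W @ h / N**gamma + b)   # normalized pre-activation
    return h

# Compare the output variance across random weight draws for the two scalings.
rng = np.random.default_rng(0)
x = rng.standard_normal(10)

def output_variance(gamma, n_draws=200):
    outs = []
    for _ in range(n_draws):
        layers = [(rng.standard_normal((64, 10)), np.zeros(64)),
                  (rng.standard_normal((1, 64)), np.zeros(1))]
        outs.append(forward(x, layers, [gamma, gamma])[0])
    return np.var(outs)

var_mf = output_variance(gamma=1.0)    # mean-field scaling
var_sqrt = output_variance(gamma=0.5)  # 1/sqrt(N) scaling
```

In line with the abstract's finding, the γ = 1 (mean-field) scaling keeps the variance of the network's output markedly smaller than the 1/sqrt(N) scaling as the layer widths grow.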

Details

Language :
English
ISSN :
2639-8001
Volume :
5
Issue :
3
Database :
Complementary Index
Journal :
Foundations of Data Science
Publication Type :
Academic Journal
Accession number :
173067127
Full Text :
https://doi.org/10.3934/fods.2023004