
Implicit Bias of Gradient Descent for Non-Homogeneous Deep Networks

Authors:
Cai, Yuhang
Zhou, Kangjie
Wu, Jingfeng
Mei, Song
Lindsey, Michael
Bartlett, Peter L.
Publication Year: 2025

Abstract

We establish the asymptotic implicit bias of gradient descent (GD) for generic non-homogeneous deep networks under exponential loss. Specifically, we characterize three key properties of GD iterates starting from a sufficiently small empirical risk, where the threshold is determined by a measure of the network's non-homogeneity. First, we show that a normalized margin induced by the GD iterates increases nearly monotonically. Second, we prove that while the norm of the GD iterates diverges to infinity, the iterates themselves converge in direction. Finally, we establish that this directional limit satisfies the Karush-Kuhn-Tucker (KKT) conditions of a margin maximization problem. Prior works on implicit bias have focused exclusively on homogeneous networks; in contrast, our results apply to a broad class of non-homogeneous networks satisfying a mild near-homogeneity condition. In particular, our results apply to networks with residual connections and non-homogeneous activation functions, thereby resolving an open problem posed by Ji and Telgarsky (2020).

Comment: 96 pages
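
For context, here is a hedged sketch, in standard implicit-bias notation, of the objects the abstract refers to. The symbols (parameters \theta, network output f, data (x_i, y_i) with labels y_i in {-1, +1}, sample size n, and homogeneity degree L) are illustrative assumptions, not definitions taken from the paper. The first display is the exponential-loss empirical risk, the second is the normalized margin shown to increase nearly monotonically (the \|\theta\|^L normalization is the homogeneous-case baseline; the paper's near-homogeneity condition is what lets an analogue make sense for non-homogeneous networks), and the third is the margin-maximization problem whose KKT conditions the directional limit satisfies.

% Empirical risk under the exponential loss (illustrative notation, assumed):
\[
  \mathcal{L}(\theta) \;=\; \frac{1}{n} \sum_{i=1}^{n} \exp\bigl(-\,y_i\, f(\theta; x_i)\bigr)
\]
% Normalized margin; \|\theta\|^{L} is the normalization for an L-homogeneous network:
\[
  \bar{\gamma}(\theta) \;=\; \frac{\min_{1 \le i \le n} \, y_i\, f(\theta; x_i)}{\|\theta\|^{L}}
\]
% Margin-maximization problem whose KKT points the directional limit attains:
\[
  \min_{\theta} \; \tfrac{1}{2}\,\|\theta\|^{2}
  \quad \text{subject to} \quad
  y_i\, f(\theta; x_i) \;\ge\; 1, \qquad i = 1, \dots, n
\]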

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2502.16075
Document Type: Working Paper