
ASAM: Adaptive Sharpness-Aware Minimization for Scale-Invariant Learning of Deep Neural Networks

Authors:
Kwon, Jungmin
Kim, Jeongseop
Park, Hyunseo
Choi, In Kwon
Publication Year:
2021

Abstract

Recently, learning algorithms motivated by the sharpness of the loss surface as an effective measure of the generalization gap have shown state-of-the-art performance. Nevertheless, sharpness defined over a rigid region with a fixed radius is sensitive to parameter re-scalings that leave the loss unaffected, which weakens the connection between sharpness and the generalization gap. In this paper, we introduce the concept of adaptive sharpness, which is scale-invariant, and propose a corresponding generalization bound. Building on this bound, we propose a novel learning method, adaptive sharpness-aware minimization (ASAM). Experimental results on various benchmark datasets show that ASAM significantly improves model generalization performance.

Comment: 13 pages, 4 figures; to be published in ICML 2021
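To make the scale-invariance idea concrete, the following is a minimal NumPy sketch of one ASAM-style update on a toy quadratic loss. It is an illustration, not the paper's implementation: the toy loss, the hyperparameters, and the function names are assumptions; the element-wise normalization operator T_w = diag(|w|) follows the simplest adaptive-sharpness variant described in the abstract's framing, where the worst-case perturbation is scaled by the parameter magnitudes so that re-scaling a weight re-scales its perturbation accordingly.

```python
import numpy as np

def loss_grad(w):
    # Toy quadratic loss L(w) = 0.5 * w^T A w with fixed diagonal A (illustrative only)
    A = np.array([4.0, 1.0])
    return 0.5 * np.sum(A * w * w), A * w

def asam_step(w, lr=0.1, rho=0.05):
    """One ASAM-style update (sketch).

    The ascent perturbation is normalized element-wise by |w| (the operator
    T_w = diag(|w|)), making it invariant to parameter re-scalings that leave
    the loss unchanged; the descent step then uses the gradient at the
    perturbed point, as in sharpness-aware minimization.
    """
    _, g = loss_grad(w)
    t = np.abs(w)                      # T_w: element-wise parameter scale
    scaled = t * g                     # T_w * grad
    # Adaptive ascent direction: rho * T_w^2 * grad / ||T_w * grad||
    eps = rho * t * scaled / (np.linalg.norm(scaled) + 1e-12)
    _, g_adv = loss_grad(w + eps)      # gradient at the perturbed weights
    return w - lr * g_adv

w = np.array([1.0, -2.0])
for _ in range(100):
    w = asam_step(w)
```

Note that multiplying a weight by a constant c multiplies both its gradient (by 1/c, for a scale-invariant network) and its entry of T_w (by c), so the induced perturbation region deforms with the parameterization instead of staying a fixed-radius ball.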

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2102.11600
Document Type:
Working Paper