Unified regularity measures for sample-wise learning and generalization

Authors :
Chi Zhang
Meng Yuan
Xiaoning Ma
Yu Liu
Haoang Lu
Le Wang
Yuanqi Su
Yuehu Liu
Source :
Visual Intelligence, Vol 2, Iss 1, Pp 1-20 (2024)
Publication Year :
2024
Publisher :
Springer, 2024.

Abstract

Fundamental machine learning theory shows that different samples contribute unequally to both the learning and testing processes. Recent studies on deep neural networks (DNNs) suggest that such sample differences are rooted in the distribution of intrinsic pattern information, namely sample regularity. Motivated by recent discoveries in network memorization and generalization, we propose a pair of sample regularity measures with a formulation-consistent representation for both processes. Specifically, the cumulative binary training/generalizing loss (CBTL/CBGL), the cumulative number of correct classifications of the training/test sample within the training phase, is proposed to quantify the stability in the memorization-generalization process, while forgetting/mal-generalizing events (ForEvents/MgEvents), i.e., the misclassification of previously learned or generalized samples, are utilized to represent the uncertainty of sample regularity with respect to optimization dynamics. The effectiveness and robustness of the proposed approaches for mini-batch stochastic gradient descent (SGD) optimization are validated through sample-wise analyses. Further training/test sample selection applications show that the proposed measures, which share the unified computing procedure, could benefit both tasks.
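The two measure families described in the abstract can be computed from per-epoch classification records. As a rough sketch (not the authors' implementation), assuming correctness is logged as a boolean matrix of shape (num_epochs, num_samples), the CBTL/CBGL-style count is a per-sample sum over epochs, and a forgetting-style event is a correct-to-incorrect transition between consecutive epochs:

```python
import numpy as np

def cumulative_correct(correct):
    """CBTL/CBGL-style measure: total number of epochs in which each
    sample was classified correctly (higher = more stable/regular)."""
    return correct.sum(axis=0)

def forgetting_events(correct):
    """ForEvents/MgEvents-style measure: number of correct -> incorrect
    transitions per sample across consecutive epochs."""
    prev, nxt = correct[:-1], correct[1:]
    return np.logical_and(prev, ~nxt).sum(axis=0)

# Toy log: 2 samples tracked over 4 epochs.
correct = np.array([[True, False],
                    [True, True],
                    [False, True],
                    [True, True]])
print(cumulative_correct(correct))   # [3 3]
print(forgetting_events(correct))    # [1 0]
```

Sample 0 here is forgotten once (epoch 2), so despite an equal cumulative count it would be rated less regular under the event-based measure; applying the same two functions to training-set or test-set correctness logs mirrors the unified computing procedure the abstract describes.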

Details

Language :
English
ISSN :
27319008
Volume :
2
Issue :
1
Database :
Directory of Open Access Journals
Journal :
Visual Intelligence
Publication Type :
Academic Journal
Accession number :
edsdoj.98ba18ac0974090902ab7dff32f9d3d
Document Type :
Article
Full Text :
https://doi.org/10.1007/s44267-024-00069-4