
Rethinking Soft Label in Label Distribution Learning Perspective

Authors:
Hong, Seungbum
Yoon, Jihun
Park, Bogyu
Choi, Min-Kook
Publication Year:
2023

Abstract

The primary goal of training in early convolutional neural networks (CNNs) was higher generalization performance. However, since the introduction of the expected calibration error (ECE), which quantifies how well a model's predicted confidence matches its actual accuracy, research on training models whose predictions can be trusted has been in progress. We hypothesized that a gap in supervision criteria between training and inference leads to overconfidence, and investigated whether performing label distribution learning (LDL) would enhance model calibration in CNN training. To verify this assumption, we used a simple LDL setting with recent data augmentation techniques. Based on a series of experiments, we obtained the following results: 1) state-of-the-art knowledge distillation (KD) methods significantly impede model calibration; 2) training with LDL and recent data augmentation markedly improves model calibration and even generalization performance; 3) online LDL brings additional improvements in model calibration and accuracy with longer training, especially in larger models. Using the proposed approach, we simultaneously achieved lower ECE and higher generalization performance on the image classification datasets CIFAR-10, CIFAR-100, STL-10, and ImageNet. We also performed visualizations and analyses and observed several interesting behaviors in CNN training with LDL.

Comment: 11 pages main manuscript + references and 11 pages supplementary materials
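Since ECE is the central metric of the abstract's claims, a minimal sketch of the standard binned ECE computation may help; the function name, NumPy interface, and default bin count below are illustrative assumptions, not the paper's code.

```python
import numpy as np

def expected_calibration_error(confidences, predictions, labels, n_bins=15):
    """Binned ECE: weighted average gap between confidence and accuracy.

    confidences: (N,) max softmax probability per sample
    predictions: (N,) predicted class indices
    labels:      (N,) ground-truth class indices
    """
    bin_edges = np.linspace(0.0, 1.0, n_bins + 1)
    n = len(confidences)
    ece = 0.0
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        if in_bin.sum() == 0:
            continue  # empty bin contributes nothing
        acc = (predictions[in_bin] == labels[in_bin]).mean()
        conf = confidences[in_bin].mean()
        ece += (in_bin.sum() / n) * abs(acc - conf)
    return ece
```

A perfectly calibrated model has accuracy equal to confidence within every bin, giving an ECE of zero; overconfident models show confidence exceeding accuracy.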
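The "simple LDL setting with recent data augmentation" suggests training against a full soft-label distribution rather than a hard one-hot index, with an augmentation such as mixup supplying the distribution. A hypothetical PyTorch sketch of that idea follows; `mixup_soft_labels`, the `alpha` parameter, and the loss form are assumptions for illustration, not the authors' exact recipe.

```python
import torch
import torch.nn.functional as F

def mixup_soft_labels(x, y, num_classes, alpha=0.2):
    """Mix pairs of inputs and build soft-label targets (an LDL-style target)."""
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(x.size(0))
    x_mix = lam * x + (1 - lam) * x[perm]
    y_onehot = F.one_hot(y, num_classes).float()
    y_mix = lam * y_onehot + (1 - lam) * y_onehot[perm]
    return x_mix, y_mix

def soft_label_loss(logits, soft_targets):
    """Cross-entropy against a label distribution instead of a hard class index."""
    return -(soft_targets * F.log_softmax(logits, dim=1)).sum(dim=1).mean()
```

Training on `y_mix` with `soft_label_loss` makes the supervision signal a distribution at train time, which is the gap-closing idea the abstract attributes to LDL.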

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2301.13444
Document Type: Working Paper