
A Probabilistic Formulation for Meta-Weight-Net

Authors :
Qian Zhao
Ziming Liu
Jun Shu
Xiang Yuan
Deyu Meng
Source :
IEEE Transactions on Neural Networks and Learning Systems. 34:1194-1208
Publication Year :
2023
Publisher :
Institute of Electrical and Electronics Engineers (IEEE), 2023.

Abstract

In the last decade, deep neural networks (DNNs) have become dominant tools for various supervised learning tasks, especially classification. However, it has been demonstrated that they can easily overfit to training-set biases, such as label noise and class imbalance. Example-reweighting algorithms are simple and effective solutions to this issue, but most of them require manually specifying the weighting functions as well as additional hyperparameters. Recently, a meta-learning-based method, Meta-Weight-Net (MW-Net), has been proposed to automatically learn the weighting function, parameterized by an MLP, using additional unbiased metadata, significantly improving robustness over prior art. The method, however, is formulated in a deterministic manner and lacks intrinsic statistical support. In this work, we propose a probabilistic formulation for MW-Net, probabilistic MW-Net (PMW-Net) in short, which treats the weighting function in a probabilistic way and includes the original MW-Net as a special case. This probabilistic formulation introduces additional randomness while allowing the flexibility of the weighting function to be further controlled during learning. Our experimental results on both synthetic and real datasets show that the proposed method improves the performance of the original MW-Net. Moreover, the proposed PMW-Net can be further extended to fully Bayesian models to improve their robustness.
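To make the core idea concrete, below is a minimal NumPy sketch of the kind of weighting function the abstract describes: a small MLP that maps a per-example training loss to a weight in (0, 1), as in MW-Net, plus a crude stand-in for the probabilistic treatment in which the weighting net's parameters are perturbed by noise before each evaluation. All names, layer sizes, and the noise mechanism here are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters of a tiny one-hidden-layer MLP weighting net
# (assumed architecture; the paper's actual network may differ).
W1, b1 = rng.normal(size=(1, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def weight_net(loss, param_noise_std=0.0):
    """Map per-example losses to weights in (0, 1).

    With param_noise_std == 0 this is a deterministic MW-Net-style
    weighting function; with param_noise_std > 0 the MLP parameters are
    randomly perturbed, loosely mimicking a probabilistic treatment of
    the weighting function (an illustrative simplification).
    """
    w1 = W1 + rng.normal(scale=param_noise_std, size=W1.shape)
    w2 = W2 + rng.normal(scale=param_noise_std, size=W2.shape)
    h = np.tanh(loss @ w1 + b1)
    return sigmoid(h @ w2 + b2)

losses = np.array([[0.1], [2.5]])        # per-example training losses
det_w = weight_net(losses)                # deterministic special case
prob_w = weight_net(losses, param_noise_std=0.1)  # one stochastic sample
```

In the actual method, the weighting net's parameters are meta-learned on a small unbiased metadata set via bilevel optimization; the sketch above only illustrates the forward mapping from loss to weight.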

Details

ISSN :
2162-2388 and 2162-237X
Volume :
34
Database :
OpenAIRE
Journal :
IEEE Transactions on Neural Networks and Learning Systems
Accession number :
edsair.doi.dedup.....191649abd72c3c0a5f832b6fc8cb1834