
Multi-Domain Long-Tailed Learning: Challenges, Progress, and Prospects

Authors :
Panpan Fu
Umi Kalsom Yusof
Source :
IEEE Access, Vol 12, Pp 129528-129540 (2024)
Publication Year :
2024
Publisher :
IEEE, 2024.

Abstract

In practical applications, the issue of data imbalance inevitably arises. Most studies of long-tailed class imbalance focus on a single-domain setting, in which the training and test samples are assumed to originate from the same feature space and to share an identical data distribution. However, natural datasets can be drawn from distinct domains, in which minority classes in one domain may be majority classes in another. Multi-domain long-tailed learning is the process of acquiring knowledge from imbalanced datasets spanning multiple domains so that the learned model generalizes to all classes across all domains. This study offers a comprehensive review of existing multi-domain long-tailed learning methods, covering challenges, research advances, and prospects. We first define multi-domain long-tailed learning and its associated challenges. Then, an overall categorization of existing methods is introduced, and an overview of these research advances is provided.
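The core observation above, that a class can be a minority ("tail") class in one domain while being a majority ("head") class in another, can be illustrated with a minimal sketch. The domains, labels, and counts below are hypothetical and chosen purely for illustration; they are not drawn from the article.

```python
from collections import Counter

# Hypothetical two-domain dataset sharing one label space.
# "dog" is the tail class in domain A but the head class in domain B.
domain_a = ["cat"] * 90 + ["dog"] * 10
domain_b = ["cat"] * 10 + ["dog"] * 90

def class_frequencies(samples):
    """Return per-class relative frequencies for one domain's samples."""
    counts = Counter(samples)
    total = len(samples)
    return {label: counts[label] / total for label in sorted(counts)}

print(class_frequencies(domain_a))  # {'cat': 0.9, 'dog': 0.1}
print(class_frequencies(domain_b))  # {'cat': 0.1, 'dog': 0.9}
```

A model trained only to handle the imbalance of domain A (e.g. by re-weighting "dog" upward) would mis-calibrate on domain B, where the roles flip, which is the difficulty multi-domain long-tailed learning methods aim to address.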

Details

Language :
English
ISSN :
21693536
Volume :
12
Database :
Directory of Open Access Journals
Journal :
IEEE Access
Publication Type :
Academic Journal
Accession number :
edsdoj.86647270a71491591eaa8cb434d3b1e
Document Type :
article
Full Text :
https://doi.org/10.1109/ACCESS.2024.3413578