Multi-Domain Long-Tailed Learning: Challenges, Progress, and Prospects
- Author
- Panpan Fu and Umi Kalsom Yusof
- Subjects
Domain adaptation, domain generalization, data imbalance, domain shift, long-tailed, multi-domain long-tailed, Electrical engineering. Electronics. Nuclear engineering, TK1-9971
- Abstract
In practical applications, the issue of data imbalance inevitably arises. Most studies of long-tailed class imbalance address a single-domain setting, in which the training and test samples are presumed to originate from the same feature space and to follow identical data distributions. However, real-world datasets can be drawn from distinct domains, and minority classes in one domain can be majority classes in another. Multi-domain long-tailed learning is the task of learning from imbalanced datasets spanning multiple domains so that the learned model generalizes to all classes across all domains. This study offers a comprehensive review of existing multi-domain long-tailed learning methods, covering challenges, research advances, and prospects. We first define multi-domain long-tailed learning and its associated challenges. We then introduce an overall categorization of existing methods and provide an overview of these research advances.
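The setting described above can be illustrated with a toy sketch (not from the paper): hypothetical per-domain label counts showing how a class that sits in the tail of one domain's distribution can sit at the head of another's, which is what distinguishes multi-domain long-tailed learning from the single-domain case.

```python
# Illustrative sketch with invented numbers: two domains share the same
# three classes, but their long-tailed distributions disagree, so a class
# that is rare ("tail") in one domain is frequent ("head") in the other.
from collections import Counter

# Hypothetical label counts per domain (assumed for illustration only).
domain_counts = {
    "sketch": Counter({"cat": 500, "dog": 40, "bird": 5}),
    "photo":  Counter({"cat": 10, "dog": 30, "bird": 600}),
}

for domain, counts in domain_counts.items():
    head = max(counts, key=counts.get)   # most frequent class in this domain
    tail = min(counts, key=counts.get)   # least frequent class in this domain
    ratio = counts[head] / counts[tail]  # per-domain imbalance ratio
    print(f"{domain}: head={head}, tail={tail}, imbalance ratio={ratio:.0f}")
```

Here "bird" is the tail class of the sketch domain but the head class of the photo domain, so a model trained on either domain alone would not generalize to all classes across both.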
- Published
- 2024