Unified View Empirical Study for Large Pretrained Model on Cross-Domain Few-Shot Learning.
- Source :
- ACM Transactions on Multimedia Computing, Communications & Applications; Sep 2024, Vol. 20 Issue 9, p1-18, 18p
- Publication Year :
- 2024
Abstract
- The challenge of cross-domain few-shot learning (CD-FSL) stems from the substantial distribution disparities between target and source domain images, necessitating a model with robust generalization capabilities. In this work, we posit that large-scale pretrained models are pivotal in addressing the CD-FSL task owing to their exceptional representational and generalization prowess. To our knowledge, no existing research comprehensively investigates the utility of large-scale pretrained models in the CD-FSL context. Addressing this gap, our study presents an exhaustive empirical assessment of the Contrastive Language–Image Pre-Training model within the CD-FSL task. We undertake a comparison spanning six dimensions: base model, transfer module, classifier, loss, data augmentation, and training schedule. Furthermore, we establish a straightforward baseline model, E-base, based on our empirical analysis, underscoring the importance of our investigation. Experimental results substantiate the efficacy of our model, yielding a mean gain of 1.2% in 5-way 5-shot evaluations on the BSCD dataset. [ABSTRACT FROM AUTHOR]
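- This record does not describe the implementation of the paper's E-base baseline. For orientation, below is a minimal sketch of the 5-way 5-shot evaluation protocol the abstract refers to, using CLIP image features with a simple nearest-prototype classifier. The backbone ("ViT-B/32"), the function name, and the prototype classifier itself are illustrative assumptions, not the authors' method.

```python
# A minimal 5-way 5-shot episode evaluator: CLIP image features plus a
# nearest-prototype (nearest-centroid) classifier. This is a generic CD-FSL
# baseline sketch, NOT the paper's E-base; the transfer module, classifier,
# loss, augmentation, and schedule the authors compare are not in this record.
import torch
import clip  # OpenAI CLIP: https://github.com/openai/CLIP

device = "cuda" if torch.cuda.is_available() else "cpu"
# Backbone choice is an assumption; the paper studies CLIP variants.
model, preprocess = clip.load("ViT-B/32", device=device)

@torch.no_grad()
def episode_accuracy(support_images, support_labels, query_images, query_labels):
    """support_images: (n_way * k_shot, 3, H, W) tensors already run through
    `preprocess`; labels are integer class indices in [0, n_way)."""
    support_labels = support_labels.to(device)
    s_feat = model.encode_image(support_images.to(device)).float()
    q_feat = model.encode_image(query_images.to(device)).float()
    # L2-normalize so a dot product equals cosine similarity.
    s_feat = s_feat / s_feat.norm(dim=-1, keepdim=True)
    q_feat = q_feat / q_feat.norm(dim=-1, keepdim=True)
    # One prototype per class: the mean of its k support embeddings.
    n_way = int(support_labels.max()) + 1
    protos = torch.stack([s_feat[support_labels == c].mean(0) for c in range(n_way)])
    protos = protos / protos.norm(dim=-1, keepdim=True)
    # Assign each query to the most similar prototype.
    preds = (q_feat @ protos.T).argmax(dim=-1)
    return (preds.cpu() == query_labels).float().mean().item()
```

- Reported CD-FSL numbers such as the abstract's 5-way 5-shot score are typically the mean accuracy over many such randomly sampled episodes per target dataset.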
- Subjects :
- DATA augmentation
- GENERALIZATION
- EMPIRICAL research
Details
- Language :
- English
- ISSN :
- 1551-6857
- Volume :
- 20
- Issue :
- 9
- Database :
- Complementary Index
- Journal :
- ACM Transactions on Multimedia Computing, Communications & Applications
- Publication Type :
- Academic Journal
- Accession number :
- 179790521
- Full Text :
- https://doi.org/10.1145/3673231