
Multimodality in meta-learning: A comprehensive survey

Authors :
Yao Ma
Shilin Zhao
Weixiao Wang
Yaoman Li
Irwin King
Source :
Knowledge-Based Systems. 250:108976
Publication Year :
2022
Publisher :
Elsevier BV, 2022.

Abstract

Meta-learning has gained wide popularity as a training framework that is more data-efficient than traditional machine learning methods. However, its generalization ability in complex task distributions, such as multimodal tasks, has not been thoroughly studied. Recently, some studies on multimodality-based meta-learning have emerged. This survey provides a comprehensive overview of the multimodality-based meta-learning landscape in terms of the methodologies and applications. We first formalize the definition of meta-learning in multimodality, along with the research challenges in this growing field, such as how to enrich the input in few-shot learning (FSL) or zero-shot learning (ZSL) in multimodal scenarios and how to generalize the models to new tasks. We then propose a new taxonomy to discuss typical meta-learning algorithms in multimodal tasks systematically. We investigate the contributions of related papers and summarize them by our taxonomy. Finally, we propose potential research directions for this promising field.

Comment: Accepted by Knowledge-Based Systems; 21 pages
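As background for the episodic training setup the abstract refers to (adapting to a new task from a few examples), the following is a minimal sketch of a first-order MAML-style inner/outer loop. It is not the paper's method; the toy task distribution, the linear model, and all names (sample_task, grad_mse, w_meta) are illustrative assumptions.

```python
# Hypothetical sketch of episodic meta-learning (first-order MAML style).
# The toy task distribution and model are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)

def sample_task():
    """A toy regression task: y = a * x with a task-specific slope a."""
    a = rng.uniform(-2.0, 2.0)
    x_support = rng.uniform(-1.0, 1.0, size=5)   # few-shot support set
    x_query = rng.uniform(-1.0, 1.0, size=20)    # query set for the meta-update
    return a, x_support, x_query

def grad_mse(w, x, y):
    """Gradient of mean squared error for the linear model y_hat = w * x."""
    return 2.0 * np.mean((w * x - y) * x)

w_meta = 0.0                 # meta-learned initialization shared across tasks
inner_lr, outer_lr = 0.1, 0.01

for step in range(1000):
    a, x_s, x_q = sample_task()
    y_s, y_q = a * x_s, a * x_q
    # Inner loop: adapt to the sampled task from the shared initialization.
    w_task = w_meta - inner_lr * grad_mse(w_meta, x_s, y_s)
    # Outer loop (first-order approximation): update the initialization
    # using the query-set loss of the adapted parameters.
    w_meta -= outer_lr * grad_mse(w_task, x_q, y_q)

print("meta-learned initialization:", w_meta)
```

In this sketch each episode plays the role of one task drawn from the task distribution; multimodal variants discussed in the survey enrich the support and query sets with additional modalities rather than changing this basic loop.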

Details

ISSN :
0950-7051
Volume :
250
Database :
OpenAIRE
Journal :
Knowledge-Based Systems
Accession number :
edsair.doi.dedup.....7b4bb61791613d99944750c4fa03e9b1