
Is Fast Adaptation All You Need?

Authors:
Javed, Khurram
Yao, Hengshuai
White, Martha
Publication Year:
2019

Abstract

Gradient-based meta-learning has proven to be highly effective at learning model initializations, representations, and update rules that allow fast adaptation from a few samples. The core idea behind these approaches is to use fast adaptation and generalization -- two second-order metrics -- as training signals on a meta-training dataset. However, little attention has been given to other possible second-order metrics. In this paper, we investigate a different training signal -- robustness to catastrophic interference -- and demonstrate that representations learned by directly minimizing interference are more conducive to incremental learning than those learned by just maximizing fast adaptation.

Comment: Meta Learning Workshop, NeurIPS 2019, 2 figures, MRCL, MAML
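The abstract's central notion -- catastrophic interference between tasks -- is often quantified via the alignment of per-task gradients: when the gradients of two tasks point in opposing directions, an update on one task degrades performance on the other. The following is a minimal sketch of that idea only, assuming a toy linear model and squared-error loss; it is an illustration of gradient-alignment-based interference in general, not the specific objective or architecture used in this paper.

```python
import numpy as np

# Two hypothetical tasks sharing inputs X but with opposite targets,
# constructed so their gradients are guaranteed to conflict.
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))
y_a = rng.normal(size=8)
y_b = -y_a            # opposite targets: task B conflicts with task A
w = np.zeros(3)       # shared linear-model parameters

def loss_and_grad(w, X, y):
    """Squared-error loss and its gradient for a linear model."""
    err = X @ w - y
    return 0.5 * np.mean(err ** 2), X.T @ err / len(y)

_, g_a = loss_and_grad(w, X, y_a)
_, g_b = loss_and_grad(w, X, y_b)

# Interference signal: negative gradient alignment. To first order,
# a positive value means a gradient step on task B increases the
# loss on task A.
interference = -float(g_a @ g_b)
```

A meta-learner that uses this quantity as a training signal would adjust the shared representation so that `interference` stays small (or negative) across task pairs, rather than only rewarding fast adaptation.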

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.1910.01705
Document Type:
Working Paper