
Stateless neural meta-learning using second-order gradients

Authors :
Mike Huisman
Aske Plaat
Jan N. van Rijn
Source :
Machine Learning. 111:3227-3244
Publication Year :
2022
Publisher :
Springer Science and Business Media LLC, 2022.

Abstract

Meta-learning can be used to learn a good prior that facilitates quick learning; two popular approaches are MAML and the meta-learner LSTM. These two methods represent important and different approaches in meta-learning. In this work, we study the two and formally show that the meta-learner LSTM subsumes MAML; surprisingly, MAML, although less general in this sense, outperforms the meta-learner LSTM. We suggest that this performance gap is related to second-order gradients. To gain more insight into the importance of second-order gradients, we construct a new algorithm (named TURTLE). TURTLE is simpler than the meta-learner LSTM yet more expressive than MAML; it outperforms both techniques at few-shot sine wave regression and in 50% of the tested image classification settings (without any additional hyperparameter tuning), and is competitive otherwise, at a computational cost comparable to that of second-order MAML. We find that second-order gradients also significantly increase the accuracy of the meta-learner LSTM. When MAML was introduced, one of its remarkable features was the use of second-order gradients; subsequent work focused on cheaper first-order approximations. On the basis of our findings, we argue for more attention to second-order gradients.
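The distinction the abstract draws between second-order gradients and their first-order approximations can be illustrated on a toy one-parameter problem. The sketch below is not the paper's method or code: it uses an assumed quadratic task loss L_t(w) = (w − t)^2 and made-up task targets, chosen only because the meta-gradient can be written out by hand. Second-order MAML differentiates through the inner-loop update (picking up the Hessian term dw'/dw = 1 − 2α here), while the first-order approximation treats the adapted parameters as if they did not depend on the meta-parameters.

```python
# Toy sketch (illustrative, not from the paper): second-order vs
# first-order MAML meta-gradient for the quadratic loss L_t(w) = (w - t)^2.

alpha = 0.1                      # inner-loop learning rate (assumed value)
w = 0.0                          # meta-parameter (the learned "prior")
t_support, t_query = 1.0, 1.2    # hypothetical task targets

# Inner-loop adaptation: one gradient step on the support loss.
grad_support = 2 * (w - t_support)       # dL_s/dw
w_adapted = w - alpha * grad_support     # w' = w - alpha * dL_s/dw

# Gradient of the query loss with respect to the adapted parameter.
grad_query = 2 * (w_adapted - t_query)   # dL_q/dw'

# Second-order MAML backpropagates through the inner update:
# dw'/dw = 1 - alpha * d^2 L_s/dw^2 = 1 - 2*alpha for this loss.
second_order_meta_grad = grad_query * (1 - 2 * alpha)

# First-order approximation drops the Hessian term, i.e. sets dw'/dw = 1.
first_order_meta_grad = grad_query

print(second_order_meta_grad, first_order_meta_grad)
```

Even in this one-dimensional case the two meta-gradients differ by the factor (1 − 2α); in a neural network the corresponding factor is a full Hessian-vector product, which is what makes second-order MAML more expensive but, per the paper's findings, more accurate.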

Details

ISSN :
1573-0565 and 0885-6125
Volume :
111
Database :
OpenAIRE
Journal :
Machine Learning
Accession number :
edsair.doi.dedup.....1f09cc43635e635941939186ebc5ac16