
Meta-Learning with MAML on Trees

Authors:
Garcia, Jezabel R.
Freddi, Federica
Liao, Feng-Ting
McGowan, Jamie
Nieradzik, Tim
Shiu, Da-shan
Tian, Ye
Bernacchia, Alberto
Publication Year:
2021

Abstract

In meta-learning, knowledge learned from previous tasks is transferred to new ones, but this transfer only works if the tasks are related. Sharing information between unrelated tasks can hurt performance, and it is unclear how to transfer knowledge across tasks with a hierarchical structure. Our research extends the model-agnostic meta-learning algorithm MAML by exploiting hierarchical task relationships. Our algorithm, TreeMAML, adapts the model to each task with a few gradient steps, but the adaptation follows the hierarchical tree structure: in each step, gradients are pooled across task clusters, and subsequent steps follow down the tree. We also implement a clustering algorithm that generates the task tree without prior knowledge of the task structure, allowing us to exploit implicit relationships between the tasks. On synthetic experiments, we show that TreeMAML performs better than MAML when the task structure is hierarchical. To study the performance of the method on real-world data, we apply it to Natural Language Understanding: we use our algorithm to fine-tune language models, taking advantage of the phylogenetic tree of languages. We show that TreeMAML improves the state-of-the-art results for cross-lingual Natural Language Inference. This result is useful because most languages in the world are under-resourced, and improved cross-lingual transfer supports the internationalization of NLP models. These results open the door to using this algorithm on other real-world hierarchical datasets.

Comment: Updated version of paper in EACL workshop: Adapt-NLP 2021
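The abstract's description of the adaptation procedure maps onto a short sketch. Below is a minimal, first-order illustration of the idea in PyTorch; the names (`tree_maml_adapt`, `task_tree_levels`, `task_loss`) and the exact pooling scheme are assumptions based on the abstract, not the paper's actual implementation. At each tree level, per-task gradients are averaged within each cluster, and every task in a cluster takes the same pooled gradient step before the clusters are refined at the next level down.

```python
# Hypothetical sketch of TreeMAML's inner loop, inferred from the abstract.
import torch

def tree_maml_adapt(model_params, tasks, task_tree_levels, task_loss, lr=0.01):
    """Adapt shared parameters down a task tree.

    model_params    : list of torch.Tensor (shared initialization)
    tasks           : list of task datasets, one per leaf task
    task_tree_levels: list of partitions; each partition is a list of
                      clusters, each cluster a list of task indices.
                      Level 0 is the coarsest (root); the last level is leaves.
    task_loss       : callable(params, task) -> scalar loss
    """
    # Every task starts from the same shared initialization.
    params = {t: [p.clone().detach().requires_grad_(True) for p in model_params]
              for t in range(len(tasks))}

    for clusters in task_tree_levels:      # one gradient step per tree level
        for cluster in clusters:
            # Per-task gradients, pooled (averaged) across the cluster.
            pooled = None
            for t in cluster:
                loss = task_loss(params[t], tasks[t])
                grads = torch.autograd.grad(loss, params[t])
                pooled = (list(grads) if pooled is None
                          else [a + g for a, g in zip(pooled, grads)])
            pooled = [g / len(cluster) for g in pooled]
            # All tasks in the cluster take the same pooled step
            # (detached between steps: this sketch is first-order only).
            for t in cluster:
                params[t] = [(p - lr * g).detach().requires_grad_(True)
                             for p, g in zip(params[t], pooled)]
    return params

# Toy usage: four linear-regression tasks, pooled at the root, then split
# into two clusters, then adapted per task at the leaves.
dim = 3
shared = [torch.zeros(dim, requires_grad=True)]
tasks = [(torch.randn(20, dim), torch.randn(20)) for _ in range(4)]

def task_loss(params, task):
    X, y = task
    return ((X @ params[0] - y) ** 2).mean()

levels = [[[0, 1, 2, 3]],          # root: gradients pooled over all tasks
          [[0, 1], [2, 3]],        # intermediate level: two clusters
          [[0], [1], [2], [3]]]    # leaves: per-task steps
adapted = tree_maml_adapt(shared, tasks, levels, task_loss)
```

Note that this sketch is first-order (gradients are detached between steps) and omits the outer meta-update; the paper's method may handle second-order terms and the meta-objective differently.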

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2103.04691
Document Type:
Working Paper