
Efficiently Tuned Parameters are Task Embeddings

Authors :
Zhou, Wangchunshu
Xu, Canwen
McAuley, Julian
Publication Year :
2022

Abstract

Intermediate-task transfer can benefit a wide range of NLP tasks with properly selected source datasets. However, it is computationally infeasible to experiment with all intermediate transfer combinations, making the choice of a useful source task a challenging problem. In this paper, we hypothesize that the task-specific parameters updated in parameter-efficient tuning methods are likely to encode task-specific information and can therefore be predictive of inter-task transferability. We thus propose to exploit these efficiently tuned parameters as off-the-shelf task embeddings for the efficient selection of source datasets for intermediate-task transfer. We experiment with 11 text classification tasks and 11 question answering tasks. Experimental results show that our approach consistently outperforms existing inter-task transferability prediction methods while being conceptually simple and computationally efficient. Our analysis also reveals that the ability of efficiently tuned parameters to predict transferability is disentangled from their in-task performance, which allows us to use parameters from early checkpoints as task embeddings to further improve efficiency.

Comment: EMNLP 2022 (main conference)
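For illustration, the selection procedure the abstract describes can be sketched as follows. This is a minimal sketch, assuming the tuned parameters are a set of small weight arrays (e.g., soft-prompt or adapter weights) and that candidate source tasks are ranked by cosine similarity between flattened parameter vectors; the function names and the similarity metric are illustrative assumptions, not taken from the paper's released code.

```python
import numpy as np

def task_embedding(tuned_params):
    """Flatten a task's parameter-efficiently tuned weights
    (e.g., soft-prompt or adapter matrices) into a single vector."""
    return np.concatenate([p.ravel() for p in tuned_params])

def rank_source_tasks(target_params, source_params_by_task):
    """Rank candidate source tasks by cosine similarity between their
    task embeddings and the target task's embedding (an assumption;
    other similarity measures could be substituted)."""
    t = task_embedding(target_params)
    t = t / np.linalg.norm(t)
    scores = {}
    for name, params in source_params_by_task.items():
        s = task_embedding(params)
        scores[name] = float(t @ (s / np.linalg.norm(s)))
    # Highest-similarity source tasks first.
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

if __name__ == "__main__":
    # Toy usage with random stand-ins for tuned parameters;
    # real inputs would be the per-task tuned weights.
    rng = np.random.default_rng(0)
    sources = {name: [rng.normal(size=(8, 16)) for _ in range(2)]
               for name in ("mnli", "squad", "qqp")}
    target = [rng.normal(size=(8, 16)) for _ in range(2)]
    print(rank_source_tasks(target, sources))
```

Because the embedding is just the tuned parameters themselves, no extra training is needed to compare tasks, which is what makes the method computationally cheap relative to trying transfer combinations directly.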

Details

Database :
arXiv
Publication Type :
Report
Accession Number :
edsarx.2210.11705
Document Type :
Working Paper