
A Survey on Transfer Learning in Natural Language Processing

Authors:
Alyafeai, Zaid
AlShaibani, Maged Saeed
Ahmad, Irfan
Publication Year: 2020

Abstract

Deep learning models usually require a huge amount of data, but such large datasets are not always attainable. This is common in many challenging NLP tasks. Consider Neural Machine Translation, for instance, where curating such large datasets may not be possible, especially for low-resource languages. Another limitation of deep learning models is their demand for huge computing resources. These obstacles motivate research into the possibility of transferring knowledge from large trained models. The demand for transfer learning is increasing as many large models are emerging. In this survey, we present recent transfer learning advances in the field of NLP. We also provide a taxonomy for categorizing the different transfer learning approaches in the literature.
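
To make concrete the kind of knowledge transfer the survey covers, here is a minimal fine-tuning sketch: a large pretrained language model is adapted to a small downstream classification task. The checkpoint name, toy data, and hyperparameters are illustrative assumptions, not taken from the paper.

# Minimal fine-tuning sketch of transfer learning: reuse a pretrained model
# and adapt it to a small labeled dataset. Checkpoint, data, and settings
# below are assumptions for illustration only.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_name = "bert-base-uncased"  # assumed pretrained checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Tiny toy dataset standing in for a low-resource task.
texts = ["a delightful film", "a tedious, overlong mess"]
labels = torch.tensor([1, 0])
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

# Fine-tune: update the weights with a small learning rate so the
# pretrained knowledge is adapted rather than overwritten.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for _ in range(3):  # a few passes suffice for a toy example
    optimizer.zero_grad()
    outputs = model(**batch, labels=labels)
    outputs.loss.backward()
    optimizer.step()
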

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2007.04239
Document Type: Working Paper