
TimeLMs: Diachronic Language Models from Twitter

Authors :
Loureiro, Daniel
Barbieri, Francesco
Neves, Leonardo
Anke, Luis Espinosa
Camacho-Collados, Jose
Publication Year :
2022

Abstract

Despite its importance, the time variable has been largely neglected in the NLP and language model literature. In this paper, we present TimeLMs, a set of language models specialized on diachronic Twitter data. We show that a continual learning strategy contributes to enhancing Twitter-based language models' capacity to deal with future and out-of-distribution tweets, while making them competitive with standardized and more monolithic benchmarks. We also perform a number of qualitative analyses showing how they cope with trends and peaks in activity involving specific named entities or concept drift.

Comment: Accepted to ACL 2022 (Demo Track) - https://github.com/cardiffnlp/timelms
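
As a quick illustration of the released resource (not part of the record itself), the TimeLMs checkpoints distributed through the linked repository can be queried with the Hugging Face transformers fill-mask pipeline. The sketch below is a minimal example under the assumption that one of the quarterly cardiffnlp checkpoints, e.g. twitter-roberta-base-mar2022, is the intended snapshot; any other released snapshot can be substituted.

    # Minimal sketch: masked-token prediction with one TimeLMs snapshot.
    # The checkpoint name is an assumption; pick the snapshot matching the
    # time period of interest from the repository's model list.
    from transformers import pipeline

    fill_mask = pipeline(
        "fill-mask",
        model="cardiffnlp/twitter-roberta-base-mar2022",  # assumed checkpoint
    )

    # TimeLMs models are RoBERTa-based, so the mask token is <mask>.
    # Predictions reflect Twitter data seen up to the snapshot's cut-off date.
    for prediction in fill_mask("So glad I'm <mask> vaccinated."):
        print(f"{prediction['token_str']:>15}  {prediction['score']:.3f}")

Comparing the outputs of snapshots trained up to different quarters is one simple way to observe the temporal drift the paper analyses.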

Details

Database :
OAIster
Publication Type :
Electronic Resource
Accession number :
edsoai.on1333748503
Document Type :
Electronic Resource