
Transformer models for text-based emotion detection: a review of BERT-based approaches.

Authors :
Acheampong, Francisca Adoma
Nunoo-Mensah, Henry
Chen, Wenyu
Source :
Artificial Intelligence Review; Dec 2021, Vol. 54, Issue 8, p5789-5829, 41p
Publication Year :
2021

Abstract

Contextual information is essential to most natural language processing (NLP) applications, and its extraction yields significant improvements in many NLP tasks, including emotion recognition from text. This paper reviews transformer-based models for NLP tasks and highlights the pros and cons of each. The models discussed include the Generative Pre-Training model (GPT) and its variants, Transformer-XL, Cross-lingual Language Models (XLM), and Bidirectional Encoder Representations from Transformers (BERT). Given BERT's strength and popularity in text-based emotion detection, the paper surveys recent work in which researchers proposed various BERT-based models, presenting each work's contributions, results, limitations, and datasets used. We also provide future research directions to encourage further work on text-based emotion detection using these models. [ABSTRACT FROM AUTHOR]
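As background for the BERT-based detectors the abstract refers to, the sketch below illustrates the arithmetic of the classification head typically attached to BERT's pooled [CLS] representation: a linear layer followed by a softmax over emotion labels. The embedding dimension, weights, and label set are toy values chosen for illustration; they are not taken from the reviewed paper.

```python
import math

# Hypothetical emotion label set (illustration only).
LABELS = ["joy", "sadness", "anger", "fear"]

def softmax(scores):
    """Convert raw scores into a probability distribution."""
    m = max(scores)                          # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def classify(cls_embedding, weights, bias):
    """Apply a linear classification head to a pooled [CLS] vector.

    cls_embedding: toy stand-in for BERT's pooled output (really 768-dim)
    weights:       one weight vector per emotion label
    bias:          one bias term per emotion label
    """
    scores = [
        sum(w_i * x_i for w_i, x_i in zip(w, cls_embedding)) + b
        for w, b in zip(weights, bias)
    ]
    return dict(zip(LABELS, softmax(scores)))

# Toy 4-dimensional embedding and head parameters.
embedding = [0.2, -0.1, 0.4, 0.05]
W = [[0.5, 0.1, -0.2, 0.0],
     [-0.3, 0.2, 0.1, 0.4],
     [0.0, -0.5, 0.3, 0.1],
     [0.2, 0.0, 0.0, -0.1]]
b = [0.01, 0.0, -0.02, 0.03]

probs = classify(embedding, W, b)
print(max(probs, key=probs.get))  # most probable emotion label
```

In the fine-tuned models the paper surveys, the embedding comes from a pre-trained BERT encoder and the head parameters are learned on an emotion-labeled corpus; the head itself is just this linear-plus-softmax step.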

Details

Language :
English
ISSN :
0269-2821
Volume :
54
Issue :
8
Database :
Complementary Index
Journal :
Artificial Intelligence Review
Publication Type :
Academic Journal
Accession number :
153318478
Full Text :
https://doi.org/10.1007/s10462-021-09958-2