
Do Large Language Models Possess Sensitive to Sentiment?

Authors :
Liu, Yang
Zhu, Xichou
Shen, Zhou
Liu, Yi
Li, Min
Chen, Yujun
John, Benzi
Ma, Zhenzhen
Hu, Tao
Li, Zhi
Xu, Zhiyang
Luo, Wei
Wang, Junhui
Publication Year :
2024

Abstract

Large Language Models (LLMs) have recently demonstrated extraordinary capabilities in language understanding. However, comprehensively assessing their sentiment capabilities remains a challenge. This paper investigates the ability of LLMs to detect and react to sentiment in text. As LLMs are increasingly integrated into diverse applications, understanding their sensitivity to emotional tone becomes critical, since it can influence the user experience and the efficacy of sentiment-driven tasks. We conduct a series of experiments to evaluate how well several prominent LLMs identify and respond appropriately to positive, negative, and neutral sentiments. The models' outputs are analyzed across various sentiment benchmarks, and their responses are compared with human evaluations. Our findings indicate that although LLMs show a basic sensitivity to sentiment, there are substantial variations in their accuracy and consistency, underscoring the need for further improvements to their training processes to better capture subtle emotional cues. For example, in some cases the models misclassify a strongly positive sentiment as neutral, or fail to recognize sarcasm or irony in the text. Such misclassifications highlight the complexity of sentiment analysis and the areas where the models need refinement. Moreover, different LLMs can perform differently on the same data, depending on their architecture and training datasets. This variance calls for a more in-depth study of the factors that drive these performance differences and of how they can be optimized.

Comment: 10 pages, 2 figures
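As a rough illustration of the evaluation protocol the abstract describes (scoring a model's sentiment labels against human judgments over positive, negative, and neutral classes), the sketch below computes overall and per-class agreement. It is not from the paper: `model_predict` is a hypothetical stand-in for an actual LLM call (here a keyword heuristic), and the sample data are invented for illustration.

```python
from collections import Counter

def model_predict(text: str) -> str:
    """Hypothetical stand-in for an LLM sentiment call (keyword heuristic)."""
    lowered = text.lower()
    if any(w in lowered for w in ("love", "great", "excellent")):
        return "positive"
    if any(w in lowered for w in ("hate", "terrible", "awful")):
        return "negative"
    return "neutral"

def evaluate(samples):
    """Compare model labels with human labels; return overall and per-class accuracy."""
    per_class = Counter()   # human-labeled examples per sentiment class
    correct = Counter()     # model/human agreements per sentiment class
    for text, human_label in samples:
        per_class[human_label] += 1
        if model_predict(text) == human_label:
            correct[human_label] += 1
    overall = sum(correct.values()) / len(samples)
    by_class = {lab: correct[lab] / per_class[lab] for lab in per_class}
    return overall, by_class

samples = [
    ("I love this phone, excellent battery.", "positive"),
    ("Terrible service, I hate waiting.", "negative"),
    ("The package arrived on Tuesday.", "neutral"),
    ("Great camera!", "positive"),
    # Sarcasm the heuristic misreads as positive, mirroring the failure
    # mode the abstract notes for LLMs:
    ("Oh great, another delay.", "negative"),
]

overall, by_class = evaluate(samples)
# overall is 0.8; by_class shows negative accuracy dropping to 0.5
# because of the sarcastic example.
```

Per-class breakdowns like `by_class` are what expose the abstract's point: aggregate accuracy can look reasonable while a specific failure mode (sarcasm, irony) concentrates in one class.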

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2409.02370
Document Type :
Working Paper