
Boosting image sentiment analysis with visual attention.

Authors :
Song, Kaikai
Yao, Ting
Ling, Qiang
Mei, Tao
Source :
Neurocomputing. Oct 2018, Vol. 312, p. 218-228. 11p.
Publication Year :
2018

Abstract

Sentiment analysis, which aims to determine the attitude of a speaker or writer regarding some topic, or the overall contextual polarity of a document, plays an important role in the behavioral sciences. The problem is nevertheless not trivial, especially when inferring sentiment or emotion from visual content, such as images and videos, which is becoming pervasive on the Web. Observing that the sentiment of an image may be reflected only by some spatial regions, a natural question is how to locate the attended spatial areas to enhance image sentiment analysis. In this paper, we present Sentiment Networks with visual Attention (SentiNet-A), a novel architecture that integrates visual attention into the successful Convolutional Neural Network (CNN) sentiment classification framework and is trained in an end-to-end manner. To model visual attention, we develop multiple layers that generate an attention distribution over the regions of the image. Furthermore, the saliency map of the image is employed as prior knowledge and as a regularizer to holistically refine the attention distribution for sentiment prediction. Extensive experiments are conducted on both the Twitter and ARTphoto benchmarks, and our framework achieves superior results compared to state-of-the-art techniques. [ABSTRACT FROM AUTHOR]
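
The abstract describes the architecture only at a high level: a CNN backbone, attention layers that produce a distribution over image regions, and a saliency map acting as a prior and regularizer. The following is a minimal, hypothetical PyTorch sketch of that general idea, assuming a ResNet-18 backbone, a 1x1 convolution for attention scoring, and a KL-divergence form of the saliency regularizer; none of these choices is confirmed by the record and they are not the authors' exact SentiNet-A implementation.

```python
# Hypothetical sketch: CNN feature map -> spatial attention distribution ->
# attention-weighted pooling -> sentiment classifier, with a saliency map
# regularizing the attention. All module choices here are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision import models


class AttentionSentimentNet(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        backbone = models.resnet18(weights=None)                 # any CNN backbone
        self.features = nn.Sequential(*list(backbone.children())[:-2])  # B x 512 x 7 x 7
        self.att_conv = nn.Conv2d(512, 1, kernel_size=1)         # one score per region
        self.classifier = nn.Linear(512, num_classes)

    def forward(self, images: torch.Tensor):
        fmap = self.features(images)                             # B x 512 x H x W
        b, c, h, w = fmap.shape
        scores = self.att_conv(fmap).view(b, h * w)              # B x (H*W)
        attn = F.softmax(scores, dim=1)                          # attention distribution over regions
        # Attention-weighted pooling of region features.
        pooled = torch.bmm(fmap.view(b, c, h * w), attn.unsqueeze(2)).squeeze(2)  # B x 512
        logits = self.classifier(pooled)
        return logits, attn


def saliency_regularizer(attn: torch.Tensor, saliency: torch.Tensor) -> torch.Tensor:
    """KL divergence pushing the attention distribution toward a normalized
    saliency map (resized to the feature-map grid); one plausible reading of
    'saliency as prior knowledge and regularizer' in the abstract."""
    b = attn.size(0)
    sal = saliency.view(b, -1)
    sal = sal / (sal.sum(dim=1, keepdim=True) + 1e-8)            # normalize to a distribution
    return F.kl_div(attn.clamp_min(1e-8).log(), sal, reduction="batchmean")


# Example end-to-end training step (loss weighting is an illustrative guess).
model = AttentionSentimentNet(num_classes=2)
images = torch.randn(4, 3, 224, 224)
labels = torch.randint(0, 2, (4,))
saliency_maps = torch.rand(4, 1, 7, 7)                           # precomputed saliency, resized to grid

logits, attn = model(images)
loss = F.cross_entropy(logits, labels) + 0.1 * saliency_regularizer(attn, saliency_maps)
loss.backward()
```

Because the attention weights and the classifier share one loss, gradients flow through both, which is what the abstract means by training the attention and the CNN sentiment classifier jointly end to end.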

Details

Language :
English
ISSN :
0925-2312
Volume :
312
Database :
Academic Search Index
Journal :
Neurocomputing
Publication Type :
Academic Journal
Accession number :
130689867
Full Text :
https://doi.org/10.1016/j.neucom.2018.05.104