
Identifying Objective and Subjective Words via Topic Modeling.

Authors :
Wang, Hanqi
Wu, Fei
Lu, Weiming
Yang, Yi
Li, Xi
Li, Xuelong
Zhuang, Yueting
Source :
IEEE Transactions on Neural Networks & Learning Systems. Mar2018, Vol. 29 Issue 3, p718-730. 13p.
Publication Year :
2018

Abstract

It is observed that distinct words in a given document have either strong or weak ability to deliver facts (i.e., the objective sense) or express opinions (i.e., the subjective sense), depending on the topics they are associated with. Motivated by the intuitive assumption that different words have varying degrees of discriminative power in delivering the objective or the subjective sense with respect to their assigned topics, a model named identified objective–subjective latent Dirichlet allocation (iosLDA) is proposed in this paper. In the iosLDA model, the simple Pólya urn model adopted in traditional topic models is modified by incorporating a probabilistic generative process that yields a novel "Bag-of-Discriminative-Words" (BoDW) representation for the documents; each document has two BoDW representations, one for the objective sense and one for the subjective sense, which are employed in joint objective and subjective classification in place of the traditional Bag-of-Topics representation. The experiments reported on documents and images demonstrate that: 1) the BoDW representation is more predictive than the traditional ones; 2) iosLDA boosts the performance of topic modeling via the joint discovery of latent topics and the different objective and subjective power hidden in every word; and 3) iosLDA has lower computational complexity than supervised LDA, especially as the number of topics increases. [ABSTRACT FROM PUBLISHER]
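To make the BoDW idea concrete, the following is a minimal, hypothetical sketch (not the paper's implementation): it assumes per-word topic assignments from some LDA-like model and made-up per-(topic, word) weights scoring how strongly a word carries the objective versus the subjective sense, and builds two weighted bag-of-words vectors per document that could feed separate classifiers.

```python
# Hypothetical sketch of building objective/subjective BoDW-style vectors for one
# document. The topic assignments and weights below are illustrative placeholders,
# not outputs of the actual iosLDA model.
from collections import defaultdict


def bodw_vectors(doc_tokens, topic_assignments, obj_weight, subj_weight):
    """doc_tokens: list of word strings for one document.
    topic_assignments: list of topic ids aligned with doc_tokens.
    obj_weight / subj_weight: dict[(topic, word)] -> float, assumed to come
    from a trained model; here they are purely illustrative."""
    obj_vec, subj_vec = defaultdict(float), defaultdict(float)
    for word, topic in zip(doc_tokens, topic_assignments):
        # Accumulate each word's topic-dependent objective / subjective weight.
        obj_vec[word] += obj_weight.get((topic, word), 0.0)
        subj_vec[word] += subj_weight.get((topic, word), 0.0)
    return dict(obj_vec), dict(subj_vec)


# Toy usage with made-up tokens, topics, and weights.
tokens = ["battery", "lasts", "amazing", "battery"]
topics = [0, 0, 1, 0]
obj_w = {(0, "battery"): 0.9, (0, "lasts"): 0.6, (1, "amazing"): 0.1}
subj_w = {(0, "battery"): 0.1, (0, "lasts"): 0.4, (1, "amazing"): 0.9}
obj_bodw, subj_bodw = bodw_vectors(tokens, topics, obj_w, subj_w)
print(obj_bodw)   # {'battery': 1.8, 'lasts': 0.6, 'amazing': 0.1}
print(subj_bodw)  # {'battery': 0.2, 'lasts': 0.4, 'amazing': 0.9}
```

In this toy picture, the two vectors replace a single Bag-of-Topics representation: the objective vector emphasizes fact-bearing words and the subjective vector emphasizes opinion-bearing words, mirroring the joint classification setup described in the abstract.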

Details

Language :
English
ISSN :
2162-237X
Volume :
29
Issue :
3
Database :
Academic Search Index
Journal :
IEEE Transactions on Neural Networks & Learning Systems
Publication Type :
Periodical
Accession number :
128240955
Full Text :
https://doi.org/10.1109/TNNLS.2016.2626379