
Extending Embedding Representation by Incorporating Latent Relations

Authors :
Gao Yang
Wang Wenbo
Liu Qian
Huang Heyan
Yuefeng Li
Source :
IEEE Access, Vol 6, Pp 52682-52690 (2018)
Publication Year :
2018
Publisher :
IEEE, 2018.

Abstract

The semantic representation of words is a fundamental task in natural language processing and text mining. Learning word embeddings has proven powerful across a variety of tasks. Most studies generate the embedding of a word by encoding its context information. However, many latent relations, such as co-occurring associative patterns and semantic conceptual relations, are not well captured. In this paper, we propose an extensible model that incorporates these valuable latent relations into word embedding learning to increase the semantic relatedness of word pairs. To assess the effectiveness of our model, we conduct experiments on both information retrieval and text classification tasks. The results demonstrate the effectiveness of our model as well as its flexibility across different tasks.
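The abstract does not specify the paper's objective function, so the following is only a minimal sketch of the general idea: a skip-gram-with-negative-sampling update augmented with a penalty that pulls together the embeddings of word pairs linked by a latent relation. The vocabulary, training pairs, relation set, and all hyperparameters (`dim`, `lam`, `lr`) are invented for illustration and are not taken from the paper.

```python
import numpy as np

# Illustrative sketch (assumptions, not the paper's method): SGNS updates
# plus a relation penalty lam * ||W[a] - W[b]||^2 for related pairs (a, b).

rng = np.random.default_rng(0)
vocab = ["cat", "dog", "pet", "car", "road", "drive"]
idx = {w: i for i, w in enumerate(vocab)}
dim, lam, lr = 16, 0.5, 0.05

W = rng.normal(scale=0.1, size=(len(vocab), dim))  # word vectors
C = rng.normal(scale=0.1, size=(len(vocab), dim))  # context vectors

# (word, context) pairs, as if drawn from corpus windows.
pairs = [("cat", "pet"), ("dog", "pet"), ("car", "road"), ("drive", "car")]
# Latent relations: word pairs the model should place close together.
relations = [("cat", "dog"), ("road", "drive")]

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for epoch in range(200):
    for w, c in pairs:
        wi, ci = idx[w], idx[c]
        # Positive pair: gradient ascent on log sigmoid(w . c).
        g = 1.0 - sigmoid(W[wi] @ C[ci])
        gw, gc = g * C[ci].copy(), g * W[wi].copy()
        W[wi] += lr * gw
        C[ci] += lr * gc
        # One negative sample: gradient ascent on log sigmoid(-w . n).
        ni = rng.integers(len(vocab))
        g = -sigmoid(W[wi] @ C[ni])
        gw, gn = g * C[ni].copy(), g * W[wi].copy()
        W[wi] += lr * gw
        C[ni] += lr * gn
    for a, b in relations:
        # Relation penalty: step both vectors toward each other.
        ai, bi = idx[a], idx[b]
        diff = W[ai] - W[bi]
        W[ai] -= lr * lam * diff
        W[bi] += lr * lam * diff

def cos(a, b):
    u, v = W[idx[a]], W[idx[b]]
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print("cos(cat, dog) =", round(cos("cat", "dog"), 3))  # pulled together
print("cos(cat, car) =", round(cos("cat", "car"), 3))  # unrelated pair
```

Under this kind of formulation, the relation term acts as a regularizer on top of the context-prediction objective, which is one plausible way to make such a model extensible: new relation types simply contribute additional pair sets with their own weights.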

Details

Language :
English
ISSN :
2169-3536
Volume :
6
Database :
Directory of Open Access Journals
Journal :
IEEE Access
Publication Type :
Academic Journal
Accession number :
edsdoj.94e69fced8934c6c88bbd846218f39ee
Document Type :
article
Full Text :
https://doi.org/10.1109/ACCESS.2018.2866531