
Skipping Word: A Character-Sequential Representation based Framework for Question Answering

Authors:
Meng, Lingxun
Li, Yan
Liu, Mengyi
Shu, Peng
Publication Year:
2016

Abstract

Recent work using artificial neural networks based on distributed word representations has greatly boosted performance on a variety of natural language learning tasks, especially question answering. However, these approaches also carry attendant problems, such as corpus selection for embedding learning and dictionary transformation across learning tasks. In this paper, we propose to model sentences directly as character sequences and to use convolutional neural networks to integrate character embedding learning with point-wise answer selection training. Compared with deep models pre-trained with the word embedding (WE) strategy, our character-sequential representation (CSR) based method follows a much simpler procedure and shows more stable performance across different benchmarks. Extensive experiments on two benchmark answer selection datasets demonstrate competitive performance compared with state-of-the-art methods.

Comment: to be accepted as a CIKM 2016 short paper
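To make the described setup concrete, the sketch below shows one way a character-sequential representation could feed a CNN encoder whose outputs are scored point-wise for answer selection. It is a minimal illustration of the general idea, not the authors' reported architecture: the layer sizes, character-id preprocessing, and scoring head are illustrative assumptions.

```python
# A minimal sketch of character-sequential answer selection, assuming a
# character-level CNN encoder and point-wise (binary relevance) training.
# Layer sizes, vocabulary handling, and the scoring head are illustrative
# assumptions, not the architecture reported in the paper.
import torch
import torch.nn as nn

class CharCNNEncoder(nn.Module):
    def __init__(self, num_chars=128, emb_dim=32, num_filters=100, kernel_size=5):
        super().__init__()
        self.embed = nn.Embedding(num_chars, emb_dim, padding_idx=0)
        self.conv = nn.Conv1d(emb_dim, num_filters, kernel_size, padding=kernel_size // 2)

    def forward(self, char_ids):                      # (batch, seq_len)
        x = self.embed(char_ids).transpose(1, 2)      # (batch, emb_dim, seq_len)
        x = torch.relu(self.conv(x))                  # (batch, num_filters, seq_len)
        return x.max(dim=2).values                    # max-over-time pooling

class PointwiseQA(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = CharCNNEncoder()
        self.score = nn.Linear(2 * 100, 1)            # question + answer features

    def forward(self, q_chars, a_chars):
        q, a = self.encoder(q_chars), self.encoder(a_chars)
        return self.score(torch.cat([q, a], dim=1)).squeeze(1)  # relevance logit

def encode_text(text, max_len=200):
    """Map raw text to padded ASCII character ids (hypothetical preprocessing)."""
    ids = [min(ord(c), 127) for c in text[:max_len]]
    return torch.tensor(ids + [0] * (max_len - len(ids))).unsqueeze(0)

model = PointwiseQA()
logit = model(encode_text("what is the capital of france ?"),
              encode_text("paris is the capital of france ."))
loss = nn.BCEWithLogitsLoss()(logit, torch.tensor([1.0]))  # point-wise label: relevant
```

Because scoring is point-wise, each question-answer pair gets an independent relevance label, so candidate answers can simply be ranked by the model's logit at test time.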

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.1609.00565
Document Type:
Working Paper
Full Text:
https://doi.org/10.1145/2983323.2983861