BERT with History Answer Embedding for Conversational Question Answering
- Source : SIGIR
- Publication Year : 2019
Abstract
- Conversational search is an emerging topic in the information retrieval community. One of the major challenges to multi-turn conversational search is to model the conversation history to answer the current question. Existing methods either prepend history turns to the current question or use complicated attention mechanisms to model the history. We propose a conceptually simple yet highly effective approach referred to as history answer embedding. It enables seamless integration of conversation history into a conversational question answering (ConvQA) model built on BERT (Bidirectional Encoder Representations from Transformers). We first explain our view that ConvQA is a simplified but concrete setting of conversational search, and then we provide a general framework to solve ConvQA. We further demonstrate the effectiveness of our approach under this framework. Finally, we analyze the impact of different numbers of history turns under different settings to provide new insights into conversation history modeling in ConvQA.
- Accepted to SIGIR 2019 as a short paper.
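
The record itself contains no code, but the idea the abstract describes can be illustrated briefly. Below is a minimal PyTorch sketch, assuming the Hugging Face transformers BertModel API; the class, parameter names, and the binary is-in-history-answer marker follow the abstract's description of a history answer embedding added at BERT's embedding layer, and are illustrative assumptions rather than the authors' released implementation.

```python
# Minimal sketch (not the authors' code): each token carries a learned binary
# embedding marking whether it appeared in an answer to an earlier turn, and
# this embedding is added to BERT's input embeddings before the encoder.
import torch
import torch.nn as nn
from transformers import BertModel


class BertWithHistoryAnswerEmbedding(nn.Module):
    def __init__(self, model_name="bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        hidden = self.bert.config.hidden_size
        # Two states: 0 = token not in any history answer, 1 = token is.
        self.history_answer_embedding = nn.Embedding(2, hidden)
        # Start/end logits for extractive answer span prediction.
        self.qa_outputs = nn.Linear(hidden, 2)

    def forward(self, input_ids, attention_mask, token_type_ids,
                history_answer_marker):
        # Look up word embeddings directly so the history answer embedding
        # can be added before the encoder; position and segment embeddings
        # are added inside BertModel when inputs_embeds replaces input_ids.
        word_embeds = self.bert.embeddings.word_embeddings(input_ids)
        hae = self.history_answer_embedding(history_answer_marker)
        outputs = self.bert(
            inputs_embeds=word_embeds + hae,
            attention_mask=attention_mask,
            token_type_ids=token_type_ids,
        )
        logits = self.qa_outputs(outputs.last_hidden_state)
        start_logits, end_logits = logits.split(1, dim=-1)
        return start_logits.squeeze(-1), end_logits.squeeze(-1)


# Toy usage: history_answer_marker has the same shape as input_ids,
# with 1 wherever a passage token was part of a previous turn's answer.
model = BertWithHistoryAnswerEmbedding()
ids = torch.randint(0, model.bert.config.vocab_size, (1, 16))
mask = torch.ones_like(ids)
segments = torch.zeros_like(ids)
marker = torch.zeros_like(ids)
marker[0, 8:12] = 1  # pretend tokens 8..11 answered an earlier question
start, end = model(ids, mask, segments, marker)
```

Because the history signal enters as an extra embedding rather than extra input tokens or attention machinery, the question-passage sequence and the rest of the BERT pipeline are left unchanged, which matches the abstract's framing of the approach as conceptually simple.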
- Subjects :
- Computer science
Information Retrieval (cs.IR)
Human–computer interaction
Question answering
Embedding
Conversation
Transformer (machine learning model)
Details
- Language : English
- Database : OpenAIRE
- Journal : SIGIR
- Accession number : edsair.doi.dedup.....843719bf7c580ad52077d8b00240511f