Investigating better context representations for generative question answering.
- Source :
- Information Retrieval Journal. Dec 2023, Vol. 26, Issue 1/2, p1-35. 35p.
- Publication Year :
- 2023
Abstract
- Generating natural language answers for question-answering (QA) tasks has recently surged in popularity with the rise of task-based personalized assistants. Most QA research is on extractive QA, i.e., methods that find answer spans in text passages. However, the extracted answers are often incomplete and sound unnatural in a conversational context. In contrast, generative QA systems aim to generate well-formed natural language answers. For this type of QA, the answer generation method and the context play crucial roles in model performance. A challenge of generative QA is simultaneously incorporating all facts in the context necessary to answer the question and discarding irrelevant information. In this paper, we investigate efficient ways to utilize the context and to generate better contextual answers. We present a framework for generative QA that effectively selects relevant parts from context documents by eliminating extraneous information. We first present multiple strong generative baselines that use transformer-based encoder-decoder architectures to synthesize answers. These models perform on par with or better than the current state-of-the-art generative models. We next investigate the selection of relevant information from context. The context selector component can be a summarizer, reranker, evidence extractor, or a combination of these. Finally, we use this filtered context to provide the most pertinent cues to the generative model for synthesizing factually correct natural language answers, which significantly boosts the model's performance. The setting combining reranked context with extracted evidence performs best. We also study the impact of different training strategies on the answer generation capability. [ABSTRACT FROM AUTHOR]
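The context-selection step described in the abstract (rerank passages, keep only the relevant ones, then feed the filtered context to the generator) can be sketched as follows. This is an illustrative toy, not the authors' implementation: a simple lexical-overlap scorer stands in for their neural reranker, and the transformer encoder-decoder generator is omitted.

```python
# Illustrative sketch of context selection for generative QA.
# Assumption: a word-overlap score substitutes for a learned reranker;
# the paper's actual models are transformer-based.

def rerank(question: str, passages: list[str], top_k: int = 2) -> list[str]:
    """Score each passage by word overlap with the question; keep top_k."""
    q_words = set(question.lower().split())
    scored = sorted(
        passages,
        key=lambda p: len(q_words & set(p.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_context(question: str, passages: list[str]) -> str:
    """Concatenate reranked passages into the filtered context string
    that would be fed to an encoder-decoder answer generator."""
    return " ".join(rerank(question, passages))

passages = [
    "The Eiffel Tower is located in Paris, France.",
    "Bananas are rich in potassium.",
    "Paris is the capital of France.",
]
context = build_context("Where is the Eiffel Tower located?", passages)
# Irrelevant passages (e.g., the one about bananas) are discarded before generation.
```

In the paper's framework this filtering role may also be played by a summarizer or an evidence extractor, or a combination; the reranked-context-plus-evidence setting is reported to perform best.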
Details
- Language :
- English
- ISSN :
- 1386-4564
- Volume :
- 26
- Issue :
- 1/2
- Database :
- Academic Search Index
- Journal :
- Information Retrieval Journal
- Publication Type :
- Academic Journal
- Accession number :
- 172747710
- Full Text :
- https://doi.org/10.1007/s10791-023-09420-7