Improving generation quality of pointer networks via guided attention
- Publication Year :
- 2019
Abstract
- Pointer-generator networks have been used successfully for abstractive summarization. Along with the capability to generate novel words, the framework also allows the model to copy from the input text to handle out-of-vocabulary words. In this paper, we point out two key shortcomings of the summaries generated with this framework via manual inspection, statistical analysis, and human evaluation. The first shortcoming is the extractive nature of the generated summaries: the network eventually learns to copy from the input article most of the time, which undermines the abstractive nature of the generated summaries. The second shortcoming is the factual inaccuracy of the generated text despite its grammatical correctness. Our analysis indicates that this arises from incorrect attention transitions between different parts of the article. We propose an initial attempt towards addressing both of these shortcomings by externally appending traditional linguistic information parsed from the input text, thereby teaching the network about the structure of the underlying text. Results indicate the feasibility and potential of such additional cues for improved generation.
- Comment: In AAAI-19 Workshop on Network Interpretability for Deep Learning
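- Note: the copy mechanism the abstract refers to is the standard pointer-generator mixture, in which a generation probability blends the decoder's vocabulary distribution with a copy distribution derived from the attention weights over the source tokens. The sketch below is a minimal illustration of that mixture in plain NumPy, not the authors' implementation; the function name, argument names, and extended-vocabulary handling are illustrative assumptions.

```python
import numpy as np

def pointer_generator_distribution(p_vocab, attention, source_ids, p_gen, vocab_size):
    """Blend the vocabulary distribution with a copy distribution.

    p_vocab    : (vocab_size,) softmax over the fixed output vocabulary
    attention  : (src_len,)    attention weights over source positions
    source_ids : (src_len,)    extended-vocabulary ids of the source tokens
    p_gen      : scalar in [0, 1], probability of generating vs. copying
    """
    # Extend the vocabulary axis so out-of-vocabulary source tokens get slots.
    extra = max(int(source_ids.max()) + 1 - vocab_size, 0)
    final = p_gen * np.pad(p_vocab, (0, extra))
    # Copy part: scatter-add attention mass onto the ids of the source tokens,
    # which is what lets the model emit words it saw in the input article.
    np.add.at(final, source_ids, (1.0 - p_gen) * attention)
    return final

# Toy usage: a 5-word vocabulary, a 3-token source containing one OOV word (id 5).
p_vocab = np.array([0.5, 0.2, 0.1, 0.1, 0.1])
attention = np.array([0.7, 0.2, 0.1])
source_ids = np.array([2, 5, 0])
print(pointer_generator_distribution(p_vocab, attention, source_ids, p_gen=0.6, vocab_size=5))
```

- When p_gen stays low across decoding steps, nearly all probability mass comes from the copy term, which is the extractive behaviour the abstract identifies as the first shortcoming.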
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.1901.11492
- Document Type :
- Working Paper