Target-aware Abstractive Related Work Generation with Contrastive Learning

Authors:
Xiuying Chen
Hind Alamro
Mingzhe Li
Shen Gao
Rui Yan
Xin Gao
Xiangliang Zhang
Source:
Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval.
Publication Year:
2022
Publisher:
ACM, 2022.

Abstract

The related work section is an important component of a scientific paper: it highlights the contribution of the target paper in the context of the reference papers. Authors can save time and effort by using an automatically generated related work section as a draft and refining it into the final version. Most existing related work generation methods rely on extracting off-the-shelf sentences to build a comparative discussion of the target work and the reference papers. However, such sentences must be written in advance and are hard to obtain in practice. Hence, in this paper, we propose an abstractive target-aware related work generator (TAG), which can generate related work sections consisting of new sentences. Concretely, we first propose a target-aware graph encoder, which models the relationships between the reference papers and the target paper with target-centered attention mechanisms. In the decoding process, we propose a hierarchical decoder that attends to nodes at different levels of the graph, with keyphrases as semantic indicators. Finally, to generate a more informative related work section, we propose multi-level contrastive optimization objectives, which maximize the mutual information between the generated related work and the references while minimizing it with respect to non-references. Extensive experiments on two public scholarly datasets show that the proposed model brings substantial improvements over several strong baselines in terms of both automatic and tailored human evaluations.

11 pages, 7 figures, SIGIR 2022
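The contrastive objective described in the abstract is, in spirit, an InfoNCE-style loss: the generated related work is pulled toward representations of cited references (positives) and pushed away from non-references (negatives). The sketch below is a minimal single-level illustration of that idea, not the authors' exact multi-level formulation; the names `gen_repr`, `ref_reprs`, `nonref_reprs`, and the `temperature` value are all hypothetical.

```python
import torch
import torch.nn.functional as F

def info_nce_loss(gen_repr, ref_reprs, nonref_reprs, temperature=0.1):
    """Hypothetical single-level contrastive loss in the spirit of the paper:
    maximize similarity between the generated related-work representation and
    reference-paper representations, minimize it against non-references."""
    # Normalize so dot products become cosine similarities.
    g = F.normalize(gen_repr, dim=-1)        # (d,)
    pos = F.normalize(ref_reprs, dim=-1)     # (n_pos, d) cited references
    neg = F.normalize(nonref_reprs, dim=-1)  # (n_neg, d) non-references

    pos_sim = pos @ g / temperature          # (n_pos,)
    neg_sim = neg @ g / temperature          # (n_neg,)

    # Score each positive against the shared pool of negatives; the correct
    # "class" for every row is the positive at column 0 (standard InfoNCE).
    logits = torch.cat(
        [pos_sim.unsqueeze(1),
         neg_sim.unsqueeze(0).expand(pos_sim.size(0), -1)],
        dim=1,
    )
    labels = torch.zeros(pos_sim.size(0), dtype=torch.long)
    return F.cross_entropy(logits, labels)

# Example with random stand-in representations (d=256, 4 references,
# 16 non-references); in the paper these would come from the model.
loss = info_nce_loss(torch.randn(256), torch.randn(4, 256), torch.randn(16, 256))
```

The paper applies such objectives at multiple levels; this sketch shows only one level to make the maximize/minimize structure of the mutual-information bound concrete.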

Details

Database:
OpenAIRE
Journal:
Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval
Accession number:
edsair.doi.dedup.....c8d3e34d96e66c614bc20a86c9dfea94