
Teaching Large Language Models Number-Focused Headline Generation With Key Element Rationales

Authors:
Qian, Zhen
Zhang, Xiuzhen
Xu, Xiaofei
Xia, Feng
Publication Year:
2025

Abstract

Number-focused headline generation is a summarization task that requires both high textual quality and precise numerical accuracy, which poses a unique challenge for Large Language Models (LLMs). Existing studies focus on either textual quality or numerical reasoning alone and are thus inadequate for this challenge. In this paper, we propose a novel chain-of-thought framework that uses rationales comprising the key elements of Topic, Entities, and Numerical reasoning (TEN) in news articles to enhance the capability of LLMs to generate topic-aligned, high-quality text with precise numerical accuracy. Specifically, a teacher LLM is employed to generate TEN rationales as supervision data, which are then used to teach and fine-tune a student LLM. Our approach teaches the student LLM to automatically generate rationales, strengthening its numerical reasoning and topic-aligned numerical headline generation. Experiments show that our approach achieves superior performance in both textual quality and numerical accuracy.

Comment: Pre-print of a paper accepted to Findings of NAACL 2025
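The abstract describes a teacher-student rationale-distillation pipeline: a teacher LLM produces TEN rationales (Topic, Entities, Numerical reasoning) for each article, and the student LLM is fine-tuned to emit the rationale before the headline. The sketch below illustrates one plausible way to build such supervision data; the prompt wording, function names (teacher_generate, build_ten_example), and JSONL record format are assumptions for illustration, not the authors' actual implementation.

```python
# Hypothetical sketch of a TEN rationale-distillation data pipeline.
# All names and the prompt template are illustrative assumptions.
import json
from typing import Callable, Dict, List

TEN_PROMPT = (
    "Read the news article and write a rationale with three key elements:\n"
    "Topic: the main topic of the article.\n"
    "Entities: the salient named entities.\n"
    "Numerical reasoning: how the key number for the headline is derived.\n"
    "Article:\n{article}\n"
)

def build_ten_example(article: str, headline: str,
                      teacher_generate: Callable[[str], str]) -> Dict[str, str]:
    """Ask the teacher LLM for a TEN rationale, then pair it with the gold
    headline so the student learns to output rationale followed by headline."""
    rationale = teacher_generate(TEN_PROMPT.format(article=article))
    return {
        "input": f"Generate a number-focused headline for the article.\n{article}",
        "output": f"{rationale}\nHeadline: {headline}",
    }

def write_finetune_file(pairs: List[Dict[str, str]],
                        teacher_generate: Callable[[str], str],
                        path: str = "ten_sft.jsonl") -> None:
    """Write supervision records for standard supervised fine-tuning of the student LLM."""
    with open(path, "w", encoding="utf-8") as f:
        for p in pairs:
            example = build_ten_example(p["article"], p["headline"], teacher_generate)
            f.write(json.dumps(example, ensure_ascii=False) + "\n")
```

Any chat-completion client could stand in for teacher_generate; the resulting JSONL file is in the generic input/output form most instruction-tuning toolkits accept.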

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2502.03129
Document Type:
Working Paper