
Few-Shot Table-to-Text Generation with Prompt Planning and Knowledge Memorization

Authors :
Guo, Zhixin
Yan, Minyxuan
Qi, Jiexing
Zhou, Jianping
He, Ziwei
Lin, Zhouhan
Zheng, Guanjie
Wang, Xinbing
Publication Year :
2023

Abstract

Pre-trained language models (PLMs) have achieved remarkable progress on table-to-text generation tasks. However, the lack of labeled domain-specific knowledge and the topology gap between tabular data and text make it difficult for PLMs to yield faithful text, and low-resource generation faces additional challenges in this setting. Inspired by how humans describe tabular data using prior knowledge, we propose a new framework, PromptMize, which targets table-to-text generation under few-shot settings. Our framework consists of two components: a prompt planner and a knowledge adapter. The prompt planner generates a prompt signal that provides instance-level guidance for PLMs, bridging the topology gap between tabular data and text. The knowledge adapter memorizes domain-specific knowledge from an unlabelled corpus to supply essential information during generation. We conduct extensive experiments and analyses on three open-domain few-shot NLG datasets: Humans, Songs, and Books. Compared with previous state-of-the-art approaches, our model achieves remarkable improvements in generation quality under both human and automatic evaluations.

Comment: Accidental duplicate. Please see arXiv:2302.12468
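As a rough illustration of the two-component design the abstract describes (and not the authors' implementation, which is not reproduced here), the Python sketch below assembles a PLM input from an instance-specific prompt signal and retrieved domain knowledge. All class and function names (Record, PromptPlanner, KnowledgeAdapter, linearize, generate) are hypothetical, and the planner and adapter are simple keyword-based stubs standing in for the learned components.

```python
from dataclasses import dataclass


@dataclass
class Record:
    """A single table instance as attribute-value pairs (hypothetical type)."""
    pairs: dict


def linearize(record: Record) -> str:
    """Flatten the table into a plain-text sequence for the PLM."""
    return " ; ".join(f"{k} : {v}" for k, v in record.pairs.items())


class PromptPlanner:
    """Stub planner: builds an instance-specific prompt signal from the table.

    A trained planner would produce richer guidance; here we simply
    surface the attribute names as a generation plan.
    """

    def plan(self, record: Record) -> str:
        return "Describe: " + ", ".join(record.pairs)


class KnowledgeAdapter:
    """Stub adapter: 'memorizes' an unlabelled corpus via keyword lookup."""

    def __init__(self, corpus: list[str]):
        self.corpus = corpus

    def retrieve(self, record: Record) -> str:
        values = {str(v).lower() for v in record.pairs.values()}
        hits = [s for s in self.corpus if any(v in s.lower() for v in values)]
        return " ".join(hits[:2])  # keep the top few matching snippets


def generate(record: Record, planner: PromptPlanner, adapter: KnowledgeAdapter) -> str:
    """Assemble the combined input: prompt signal + knowledge + linearized table.

    A real system would feed this string to a fine-tuned PLM; this sketch
    just returns the assembled model input.
    """
    prompt = planner.plan(record)
    knowledge = adapter.retrieve(record)
    return f"{prompt} | {knowledge} | {linearize(record)}"


if __name__ == "__main__":
    corpus = ["Marie Curie was a physicist and chemist born in Warsaw."]
    rec = Record({"name": "Marie Curie", "birthplace": "Warsaw"})
    print(generate(rec, PromptPlanner(), KnowledgeAdapter(corpus)))
```

Under these assumptions, the planner's output conditions the PLM on what to say about the instance while the adapter supplies domain facts that the few labeled examples alone would not cover.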

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2302.04415
Document Type :
Working Paper