
Adapting Knowledge for Few-shot Table-to-Text Generation

Authors:
Guo, Zhixin
Yan, Minyuan
Qi, Jiexing
Zhou, Jianping
He, Ziwei
Zheng, Guanjie
Wang, Xinbing
Publication Year: 2023

Abstract

Pretrained language models (PLMs) have made remarkable progress on table-to-text generation tasks. However, the lack of domain-specific knowledge makes it challenging to bridge the topological gap between tabular data and text, especially in real-world applications with limited resources. To mitigate the limitation of insufficient labeled data, we propose a novel framework, Adapt-Knowledge-to-Generate (AKG). The core insight of AKG is to adapt unlabeled domain-specific knowledge into the model, which brings at least three benefits: (1) it injects representations of normal table-related descriptions to bridge the topological gap between tabular data and text; (2) it enables us to fully use large amounts of unlabeled domain-specific knowledge, alleviating the PLMs' inherent shortcoming of lacking domain knowledge; (3) it allows us to design various tasks that employ the domain-specific knowledge. Extensive experiments and analyses are conducted on three open-domain, few-shot natural language generation (NLG) datasets: Humans, Songs, and Books. Compared with previous state-of-the-art approaches, our model achieves superior performance in terms of both fluency and accuracy, as judged by human and automatic evaluations.

Comment: arXiv admin note: substantial text overlap with arXiv:2302.04415
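The abstract describes a general pattern (adapt a PLM on unlabeled domain-specific text, then fine-tune it on a handful of table-to-text pairs) without specifying the implementation. The sketch below is only a rough illustration of that pattern, not the authors' AKG method: the T5 backbone, the toy denoising task used for the adaptation stage, and the "field: value | ..." table linearization are all assumptions made here for illustration.

```python
# Minimal sketch: adapt a PLM on unlabeled domain text, then few-shot
# fine-tune on linearized-table/description pairs. Illustrative only;
# model, objective, and linearization format are assumptions, not AKG.
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

model = T5ForConditionalGeneration.from_pretrained("t5-small")
tokenizer = AutoTokenizer.from_pretrained("t5-small")
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)
model.train()

# Stage 1: knowledge adaptation on unlabeled domain-specific sentences
# (hypothetical data) via a denoising-style objective: reconstruct the
# original sentence from a corrupted copy.
domain_sentences = ["The singer released her debut album in 1998."]
for text in domain_sentences:
    corrupted = text.replace("album", "<extra_id_0>")  # toy span masking
    inputs = tokenizer(corrupted, return_tensors="pt")
    labels = tokenizer(text, return_tensors="pt").input_ids
    loss = model(**inputs, labels=labels).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

# Stage 2: few-shot fine-tuning on linearized table -> description pairs.
few_shot_pairs = [
    ("name: Jane Doe | occupation: singer | birth year: 1975",
     "Jane Doe is a singer born in 1975."),
]
for table, target in few_shot_pairs:
    inputs = tokenizer("describe table: " + table, return_tensors="pt")
    labels = tokenizer(target, return_tensors="pt").input_ids
    loss = model(**inputs, labels=labels).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```

The two-stage structure mirrors the abstract's core insight: stage 1 exposes the model to domain descriptions it would never see in the few labeled examples, so stage 2 can focus on mapping table structure to text rather than learning the domain vocabulary from scratch.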

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2302.12468
Document Type: Working Paper