
Prompt-SAW: Leveraging Relation-Aware Graphs for Textual Prompt Compression

Authors :
Ali, Muhammad Asif
Li, Zhengping
Yang, Shu
Cheng, Keyuan
Cao, Yang
Huang, Tianhao
Hu, Guimin
Lyu, Weimin
Hu, Lijie
Yu, Lu
Wang, Di
Publication Year :
2024

Abstract

Large Language Models (LLMs) have shown exceptional abilities across a wide range of natural language processing tasks. While prompting is a crucial tool for LLM inference, exceedingly lengthy prompts carry a significant cost. Existing attempts to compress lengthy prompts yield substandard results in terms of the readability/interpretability of the compressed prompt, with a detrimental impact on prompt utility. To address this, we propose Prompt-SAW (Prompt compresSion via Relation AWare graphs), an effective strategy for compressing both task-agnostic and task-aware prompts. Prompt-SAW uses the prompt's textual information to build a graph and then extracts the key information elements from the graph to form the compressed prompt. We also propose GSM8K-aug, an extended version of the existing GSM8K benchmark for task-agnostic prompts, to provide a comprehensive evaluation platform. Experimental evaluation on benchmark datasets shows that prompts compressed by Prompt-SAW are not only more readable but also outperform the best-performing baseline models by up to 10.1 and 77.1 in the task-agnostic and task-aware settings, respectively, while compressing the original prompt text by 34.9 and 56.7.

Comment: 16 pages
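The abstract describes the pipeline only at a high level: build a relation-aware graph from the prompt text, then keep the most informative elements. The minimal Python sketch below illustrates that general shape and nothing more; the toy triple extractor, the degree-based scoring, and the `budget` parameter are hypothetical stand-ins, not Prompt-SAW's actual method (the paper presumably uses a learned or LLM-based extractor and its own selection criterion).

```python
# Minimal sketch of graph-based prompt compression, illustrating the general
# idea described in the abstract. This is NOT the authors' implementation:
# the triple extractor, scoring rule, and budget parameter are hypothetical
# stand-ins for whatever Prompt-SAW actually uses.

from dataclasses import dataclass


@dataclass(frozen=True)
class Triple:
    subject: str
    relation: str
    obj: str

    def to_text(self) -> str:
        return f"{self.subject} {self.relation} {self.obj}"


def extract_triples(prompt: str) -> list[Triple]:
    """Hypothetical stand-in: a real system would use an information
    extraction model (or an LLM) to pull (subject, relation, object)
    triples out of the prompt text."""
    triples = []
    for sentence in prompt.split("."):
        tokens = sentence.split()
        # Toy heuristic: treat exactly-three-token sentences as triples.
        if len(tokens) == 3:
            triples.append(Triple(*tokens))
    return triples


def compress(prompt: str, budget: float = 0.5) -> str:
    """Keep the most 'central' triples (by endpoint degree) until the
    compressed text reaches `budget` * original length."""
    triples = extract_triples(prompt)

    # Build the graph implicitly as node-degree counts: entities are
    # nodes, each triple contributes one edge between its endpoints.
    degree: dict[str, int] = {}
    for t in triples:
        degree[t.subject] = degree.get(t.subject, 0) + 1
        degree[t.obj] = degree.get(t.obj, 0) + 1

    # Rank triples by how connected their endpoints are, then greedily
    # keep the highest-ranked ones within the length budget.
    ranked = sorted(
        triples,
        key=lambda t: degree[t.subject] + degree[t.obj],
        reverse=True,
    )
    kept, length, limit = [], 0, budget * len(prompt)
    for t in ranked:
        text = t.to_text()
        if length + len(text) > limit:
            break
        kept.append(text)
        length += len(text)
    return ". ".join(kept)


if __name__ == "__main__":
    demo = "Alice manages TeamX. TeamX builds Widgets. Bob likes coffee."
    print(compress(demo, budget=0.6))
```

Run as a script, the demo keeps the triple anchored on the best-connected entities and drops the isolated one, which is the intuition behind selecting "key information elements" from the graph rather than truncating raw text.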

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2404.00489
Document Type :
Working Paper