
MPrompt: Exploring Multi-level Prompt Tuning for Machine Reading Comprehension

Authors :
Chen, Guoxin
Qian, Yiming
Wang, Bowen
Li, Liangzhi
Publication Year :
2023

Abstract

Large language models have achieved superior performance on a variety of natural language tasks, but fine-tuning them on new datasets is resource-intensive. Soft-prompt tuning offers a resource-efficient way to adapt pre-trained language models (PLMs) while keeping their weights frozen. Existing soft-prompt methods mainly design input-independent prompts that steer the model toward the domain of the new dataset, often ignoring fine-grained information about the task and the context of the text. In this paper, we propose MPrompt, a multi-level prompt tuning method for machine reading comprehension. It uses prompts at the task-specific, domain-specific, and context-specific levels to enhance comprehension of input semantics at different granularities. We also propose an independence constraint that steers each domain-specific prompt to focus on information within its own domain and avoid redundancy. Moreover, we present a prompt generator that incorporates context-related knowledge into prompt generation to enhance contextual relevance. We conducted extensive experiments on 12 benchmarks covering various QA formats and achieved an average improvement of 1.94% over state-of-the-art methods.

Comment: 13 pages, 5 figures, accepted by EMNLP 2023 Findings
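The general mechanism the abstract describes, prepending learnable soft prompts at several granularities to the frozen input embeddings, plus an independence penalty between domain prompts, can be sketched roughly as follows. This is a minimal toy illustration of the idea, not the paper's implementation: all shapes, the cosine-similarity penalty, and the fixed `context_prompt` (which the paper instead produces with a knowledge-aware generator) are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

EMB = 16          # embedding dimension (toy size, an assumption)
PROMPT_LEN = 4    # tokens per prompt level (an assumption)

# Trainable soft prompts at three granularities (hypothetical shapes).
task_prompt = rng.normal(size=(PROMPT_LEN, EMB))
domain_prompts = [rng.normal(size=(PROMPT_LEN, EMB)) for _ in range(3)]
context_prompt = rng.normal(size=(PROMPT_LEN, EMB))  # paper: output of a prompt generator

def build_input(token_embeddings, domain_id):
    """Prepend task-, domain-, and context-level prompts to the frozen
    input embeddings before feeding them to the PLM."""
    return np.concatenate(
        [task_prompt, domain_prompts[domain_id], context_prompt, token_embeddings],
        axis=0,
    )

def independence_penalty(prompts):
    """Toy stand-in for the independence constraint: penalize the absolute
    cosine similarity between mean vectors of different domain prompts,
    pushing each prompt toward its own domain's information."""
    means = [p.mean(axis=0) for p in prompts]
    total = 0.0
    for i in range(len(means)):
        for j in range(i + 1, len(means)):
            a, b = means[i], means[j]
            total += abs(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return total

tokens = rng.normal(size=(10, EMB))     # frozen embeddings of a 10-token passage
full_input = build_input(tokens, domain_id=1)
print(full_input.shape)                 # three prompt blocks + passage: (22, 16)
print(independence_penalty(domain_prompts) >= 0.0)
```

In training, only the prompt parameters (and the generator) would receive gradients; the PLM weights stay frozen, which is what makes the approach resource-efficient.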

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2310.18167
Document Type :
Working Paper