
Adaptive Prompting for Continual Relation Extraction: A Within-Task Variance Perspective

Authors :
Le, Minh
Luu, Tien Ngoc
Nguyen The, An
Le, Thanh-Thien
Nguyen, Trang
Nguyen, Tung Thanh
Van, Linh Ngo
Nguyen, Thien Huu
Publication Year :
2024

Abstract

To address catastrophic forgetting in Continual Relation Extraction (CRE), many current approaches rely on memory buffers to rehearse previously learned knowledge while acquiring new tasks. Recently, prompt-based methods have emerged as potent alternatives to rehearsal-based strategies, demonstrating strong empirical performance. However, upon analyzing existing prompt-based approaches for CRE, we identified several critical limitations, such as inaccurate prompt selection, inadequate mechanisms for mitigating forgetting in shared parameters, and suboptimal handling of cross-task and within-task variances. To overcome these challenges, we draw inspiration from the relationship between prefix-tuning and mixture of experts, proposing a novel approach that employs a prompt pool for each task, capturing variations within each task while enhancing cross-task variances. Furthermore, we incorporate a generative model to consolidate prior knowledge within shared parameters, eliminating the need for explicit data storage. Extensive experiments validate the efficacy of our approach, demonstrating superior performance over state-of-the-art prompt-based and rehearsal-free methods in continual relation extraction.

Comment: Accepted to AAAI 2025
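The per-task prompt pool idea described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the class and function names, the cosine-similarity selection rule, and all dimensions are assumptions chosen for the example.

```python
import numpy as np

def cosine_sim(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

class TaskPromptPool:
    """A pool of prompts for one task, each paired with a key for query-based selection."""
    def __init__(self, num_prompts, prompt_len, dim, rng):
        # In practice keys and prompts would be learnable parameters.
        self.keys = rng.normal(size=(num_prompts, dim))
        self.prompts = rng.normal(size=(num_prompts, prompt_len, dim))

    def select(self, query):
        # Pick the prompt whose key best matches the query feature,
        # letting different prompts specialize to within-task variations.
        scores = [cosine_sim(k, query) for k in self.keys]
        return self.prompts[int(np.argmax(scores))]

def prepend_prompt(prompt, token_embeddings):
    # Prefix-tuning style: concatenate the selected prompt before the input tokens.
    return np.concatenate([prompt, token_embeddings], axis=0)

rng = np.random.default_rng(0)
# One separate pool per task, supporting cross-task separation.
pools = {task_id: TaskPromptPool(num_prompts=4, prompt_len=2, dim=8, rng=rng)
         for task_id in range(3)}

query = rng.normal(size=8)        # e.g. a feature vector of the current input
tokens = rng.normal(size=(5, 8))  # 5 input token embeddings of dimension 8
prompt = pools[0].select(query)
augmented = prepend_prompt(prompt, tokens)  # shape (2 + 5, 8)
```

Here each task owns its own pool, and selection within a pool is driven by the input's feature vector, mirroring the abstract's goal of modeling within-task variance while keeping tasks separated.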

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2412.08285
Document Type :
Working Paper