1. Few-Shot Relation Extraction Through Prompt With Relation Information and Multi-Level Contrastive Learning
- Authors
Ye Dong, Rong Yang, Junbao Liu, and Xizhong Qin
- Subjects
Few-shot relation extraction, prompt learning, contrastive learning, relation information, prototype network, Electrical engineering. Electronics. Nuclear engineering, TK1-9971
- Abstract
Few-shot relation extraction predicts relations between entities using only a small amount of labeled data. Recently, several studies have introduced prompts to better guide models in understanding relations between entities. Although effective, these approaches ignore the hidden interaction information between support instances and relations, which leaves the prompts without effective guidance. In addition, because labeled data is limited, the model cannot obtain enough information during training, leading to relation confusion. In this paper, we propose RelPromptCL, a few-shot relation extraction method that combines prompt learning with relation information and contrastive learning. Specifically, RelPromptCL first obtains more useful information by using prompt templates enriched with relation information, and then fuses the instance features with the relation features to obtain prototype representations. At the same time, multi-level contrastive learning helps the model better separate different classes, improving its discriminative capability. Finally, the similarity between the query instance and the prototypes is computed for relation classification. We carried out extensive experiments on two public datasets, FewRel 1.0 and FewRel 2.0. The results clearly demonstrate the effectiveness of RelPromptCL.
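The abstract describes a prototype-based pipeline: per-class support embeddings are fused with relation features to form prototypes, and a query is classified by its similarity to each prototype. Below is a minimal, hypothetical sketch of that idea in PyTorch; the fusion (simple averaging) and the similarity measure (cosine) are assumptions for illustration, not the fusion mechanism or multi-level contrastive objective defined in the paper.

```python
import torch
import torch.nn.functional as F

def compute_prototypes(support_emb, relation_emb):
    # support_emb: [N_way, K_shot, d] encoder embeddings of support instances
    # relation_emb: [N_way, d] embeddings of the relation descriptions
    # Fuse the mean instance embedding of each class with its relation
    # embedding (plain averaging here; the paper's fusion may differ).
    instance_proto = support_emb.mean(dim=1)          # [N_way, d]
    return (instance_proto + relation_emb) / 2.0      # [N_way, d]

def classify_query(query_emb, prototypes):
    # query_emb: [Q, d]; prototypes: [N_way, d]
    # Score every query against every prototype with cosine similarity
    # and predict the most similar relation class.
    sims = F.cosine_similarity(query_emb.unsqueeze(1),
                               prototypes.unsqueeze(0), dim=-1)  # [Q, N_way]
    return sims.argmax(dim=-1), sims

# Toy 5-way 1-shot episode with random embeddings
torch.manual_seed(0)
d = 64
support = torch.randn(5, 1, d)    # 5 classes, 1 support instance each
relations = torch.randn(5, d)     # one relation-description embedding per class
queries = torch.randn(3, d)       # 3 query instances

protos = compute_prototypes(support, relations)
preds, scores = classify_query(queries, protos)
print(preds)  # predicted relation index for each query
```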
- Published
2024