
Investigating Prompt Learning for Chinese Few-Shot Text Classification with Pre-Trained Language Models

Authors :
Chengyu Song
Taihua Shao
Kejing Lin
Dengfeng Liu
Siyuan Wang
Honghui Chen
Source :
Applied Sciences, Vol 12, Iss 21, p 11117 (2022)
Publication Year :
2022
Publisher :
MDPI AG, 2022.

Abstract

Text classification aims to assign predefined labels to unlabeled sentences, but classifiers tend to struggle in real-world applications when only a few annotated samples are available. Previous works generally rely on the meta-learning paradigm to overcome the difficulties caused by insufficient data, assuming that a set of auxiliary tasks is available. More recently, prompt-based approaches have been proposed to deal with the low-resource issue. However, existing prompt-based methods mainly target English tasks and generally apply English pretrained language models, which cannot be directly adapted to Chinese tasks due to structural and grammatical differences. We therefore propose a prompt-based Chinese text classification framework that uses generated natural language sequences as hints, which effectively alleviates the classification bottleneck in low-resource scenarios. Specifically, we first design a prompt-based fine-tuning approach together with a novel pipeline for automatically generating prompts in Chinese. Then, we propose a refined strategy for dynamically and selectively incorporating demonstrations into each context. We present a systematic evaluation of few-shot performance on a wide range of Chinese text classification tasks. Our method makes few assumptions about task resources and expertise and therefore constitutes a powerful, task-independent approach to few-shot learning.
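The record gives no implementation details, so the following is only a minimal sketch of the general idea described in the abstract: cloze-style (prompt-based) classification with a Chinese masked language model, where a labeled demonstration is appended to the context. The Hugging Face transformers library, the public bert-base-chinese checkpoint, and the sentiment template and label words below are illustrative assumptions, not the paper's actual prompts, pipeline, or model.

# Sketch: prompt-based classification with a Chinese masked LM and one demonstration.
# Template, label words, and demonstration are hypothetical, not from the paper.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("bert-base-chinese")
model = AutoModelForMaskedLM.from_pretrained("bert-base-chinese")
model.eval()

# Hypothetical single-character label words for a sentiment task.
label_words = {"positive": "好", "negative": "差"}

def classify(sentence: str, demonstration: str = "") -> str:
    # Wrap the input in a cloze-style template and optionally append a
    # labeled demonstration to the context.
    prompt = f"{sentence}。总之很{tokenizer.mask_token}。{demonstration}"
    inputs = tokenizer(prompt, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    # Locate the [MASK] position and score each label word at that position.
    mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero()[0].item()
    scores = {
        label: logits[0, mask_pos, tokenizer.convert_tokens_to_ids(word)].item()
        for label, word in label_words.items()
    }
    return max(scores, key=scores.get)

# Example: one demonstration rendered with the same template as the input.
demo = "酒店服务周到。总之很好。"
print(classify("房间又小又吵", demonstration=demo))

In this sketch, fine-tuning would update the masked LM on the few available labeled examples formatted with the same template; the paper's automatic prompt generation and dynamic demonstration selection would replace the hand-written template and the fixed demonstration shown here.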

Details

Language :
English
ISSN :
2076-3417
Volume :
12
Issue :
21
Database :
Directory of Open Access Journals
Journal :
Applied Sciences
Publication Type :
Academic Journal
Accession number :
edsdoj.205c604de69a4f64be76dbbdf31735b8
Document Type :
article
Full Text :
https://doi.org/10.3390/app122111117