
CDGP: Automatic Cloze Distractor Generation based on Pre-trained Language Model

Authors :
Chiang, Shang-Hsuan
Wang, Ssu-Cheng
Fan, Yao-Chung
Source :
chiang-etal-2022-cdgp
Publication Year :
2024

Abstract

Manually designing cloze tests consumes enormous time and effort. The major challenge lies in wrong-option (distractor) selection. Carefully designed distractors improve the effectiveness of learner ability assessment. This motivates the idea of automatically generating cloze distractors. In this paper, we investigate cloze distractor generation by exploring the use of pre-trained language models (PLMs) for candidate distractor generation. Experiments show that the PLM-enhanced model brings a substantial performance improvement. Our best-performing model advances the state-of-the-art result from 14.94 to 34.17 (NDCG@10 score). Our code and dataset are available at https://github.com/AndyChiangSH/CDGP.
Comment: Short paper, Findings of EMNLP 2022
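
The abstract describes using a PLM to propose candidate distractors for a cloze blank. CDGP's actual two-stage pipeline (candidate generation followed by distractor selection) is in the linked repository; the snippet below is only a minimal sketch of the PLM-based candidate-generation idea using the Hugging Face fill-mask pipeline. The model name, example sentence, and answer-filtering step are illustrative assumptions, not the authors' configuration.

```python
# Minimal sketch (assumed setup): a masked language model proposes cloze
# distractor candidates. Model, sentence, and filtering are illustrative;
# see https://github.com/AndyChiangSH/CDGP for the actual CDGP pipeline.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

sentence = "The scientist conducted an [MASK] to test her hypothesis."  # cloze stem
answer = "experiment"                                                   # correct option

# Ask the PLM for its top completions of the blank.
candidates = fill_mask(sentence, top_k=10)

# Keep plausible-but-wrong options by dropping the gold answer itself.
distractors = [c["token_str"] for c in candidates if c["token_str"] != answer]

print(distractors[:3])  # three candidate distractors for the blank
```

In the paper's setting, such candidates would then be ranked by a selection stage, which is what metrics like NDCG@10 evaluate.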

Details

Database :
arXiv
Journal :
chiang-etal-2022-cdgp
Publication Type :
Report
Accession number :
edsarx.2403.10326
Document Type :
Working Paper
Full Text :
https://doi.org/10.18653/v1/2022.findings-emnlp.429