
PAL-BERT: An Improved Question Answering Model.

Authors :
Wenfeng Zheng
Siyu Lu
Zhuohang Cai
Ruiyang Wang
Lei Wang
Lirong Yin
Source :
CMES-Computer Modeling in Engineering & Sciences; 2024, Vol. 139 Issue 3, p2729-2745, 17p
Publication Year :
2024

Abstract

In the field of natural language processing (NLP), various pre-trained language models have emerged in recent years, with question answering (QA) systems gaining significant attention. However, as algorithms, data, and computing power advance, models have grown ever larger and their parameter counts keep increasing, making training more costly and less efficient. To improve the efficiency and accuracy of training while reducing model size, this paper proposes PAL-BERT, a first-order pruning model based on the ALBERT model and designed around the characteristics of QA systems and language models. First, a first-order network pruning method based on the ALBERT model is designed, yielding the PAL-BERT model. Then, a parameter optimization strategy for PAL-BERT is formulated, and the Mish function is used as the activation function in place of ReLU to improve performance. Finally, comparison experiments with the traditional deep learning models TextCNN and BiLSTM confirm that PAL-BERT is a pruning-based model compression method that significantly reduces training time and improves training efficiency. Compared with traditional models, PAL-BERT significantly improves performance on the NLP task.
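The abstract names two concrete techniques: first-order pruning and the Mish activation. The sketch below illustrates both under stated assumptions: Mish is the published formula x * tanh(softplus(x)), and "first-order" is read as the common Taylor-expansion importance score |w * dL/dw| (weight times gradient); the paper's actual scoring and the function names here are illustrative, not the authors' implementation.

```python
import math

def mish(x: float) -> float:
    """Mish activation: x * tanh(softplus(x)). Smooth and non-monotonic,
    used in PAL-BERT in place of ReLU."""
    return x * math.tanh(math.log1p(math.exp(x)))

def first_order_importance(weights, grads):
    """Assumed first-order (Taylor) importance score: |w * dL/dw|.
    Parameters whose removal barely changes the loss score low."""
    return [abs(w * g) for w, g in zip(weights, grads)]

def prune_first_order(weights, grads, sparsity):
    """Zero out roughly the `sparsity` fraction of weights with the
    lowest first-order importance (ties at the threshold are also pruned)."""
    scores = first_order_importance(weights, grads)
    k = int(len(weights) * sparsity)
    if k == 0:
        return list(weights)
    threshold = sorted(scores)[k - 1]
    return [0.0 if s <= threshold else w for w, s in zip(weights, scores)]
```

For example, with weights [0.5, -0.2, 1.0, 0.05] and gradients [0.1, 0.9, 0.01, 2.0], the large weight 1.0 has near-zero gradient and is pruned first, while the tiny weight 0.05 survives because its gradient is large; this is the key difference from magnitude (zeroth-order) pruning.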

Details

Language :
English
ISSN :
1526-1492
Volume :
139
Issue :
3
Database :
Complementary Index
Journal :
CMES-Computer Modeling in Engineering & Sciences
Publication Type :
Academic Journal
Accession number :
176091299
Full Text :
https://doi.org/10.32604/cmes.2023.046692