BBT-Fin: Comprehensive Construction of Chinese Financial Domain Pre-trained Language Model, Corpus and Benchmark

Authors:
Lu, Dakuan
Wu, Hengkui
Liang, Jiaqing
Xu, Yipei
He, Qianyu
Geng, Yipeng
Han, Mengkun
Xin, Yingsi
Xiao, Yanghua
Publication Year:
2023

Abstract

To advance Chinese financial natural language processing (NLP), we introduce BBT-FinT5, a new Chinese financial pre-trained language model based on the T5 architecture. To support this effort, we have built BBT-FinCorpus, a large-scale financial corpus of approximately 300 GB of raw text drawn from four different sources. In general-domain NLP, comprehensive benchmarks such as GLUE and SuperGLUE have driven significant advances in language model pre-training by enabling head-to-head comparisons among models. Drawing inspiration from these benchmarks, we propose BBT-CFLEB, a Chinese Financial Language understanding and generation Evaluation Benchmark, comprising six datasets that cover both understanding and generation tasks. Our aim is to facilitate NLP research and development in the Chinese financial domain. Our model, corpus, and benchmark are released at https://github.com/ssymmetry/BBT-FinCUGE-Applications. Our work is part of the Big Bang Transformer (BBT) project, a large-scale pre-trained language model initiative.

Comment: Changed author order

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2302.09432
Document Type:
Working Paper