
Enhancing domain-specific text generation for power grid maintenance with P2FT.

Authors : Yang, Yi; Li, Chenhao; Zhu, Binghang; Zheng, Wenjie; Zhang, Fengda; Li, Zhuangzhuang
Source : Scientific Reports. 11/5/2024, Vol. 14 Issue 1, p1-12. 12p.
Publication Year : 2024

Abstract

The digitization of operation and maintenance for intelligent power grid equipment relies on a diverse array of information for smart decision-making. Proficiency in intelligent decision generation depends on extensive learning from large volumes of text, which demands both robust processing capability and a high degree of specialization. For specialized domains or tasks where authorized data is lacking, pre-trained language models (PLMs) already offer a way forward. Given the complexity of text in the power grid field, which encompasses a large body of specialized knowledge and abundant proprietary terminology, we explore specializing a pre-trained model for this domain, taking the generation of maintenance strategies as the target task. We employ a two-stage fine-tuning approach (P2FT) built on a large-scale pre-trained model for natural language processing. Its efficacy and practical value are evaluated on multiple metrics against other advanced low-parameter and parameter-free fine-tuning methods. Careful analysis and validation of the experimental results confirm the feasibility and practical value of this approach to pre-trained model specialization, and provide useful guidance for text generation in both the Chinese language domain and the power grid domain.
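The abstract describes a two-stage fine-tuning scheme: adapt a general pre-trained language model to in-domain text, then fine-tune it on the target generation task. The sketch below illustrates that two-stage shape only; the tiny model, toy token sequences, learning rates, and step counts are all illustrative assumptions, not the authors' P2FT implementation or data.

```python
# Illustrative two-stage fine-tuning sketch (assumed setup, not the paper's).
import torch
import torch.nn as nn

torch.manual_seed(0)

VOCAB = 32  # toy vocabulary size (assumption)

class TinyLM(nn.Module):
    """A tiny stand-in for a large pre-trained language model."""
    def __init__(self, vocab=VOCAB, dim=16):
        super().__init__()
        self.emb = nn.Embedding(vocab, dim)
        self.head = nn.Linear(dim, vocab)

    def forward(self, x):
        # Next-token logits for each input token.
        return self.head(self.emb(x))

def train(model, tokens, lr, steps):
    """Causal-LM training loop: predict token t+1 from token t."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    losses = []
    for _ in range(steps):
        logits = model(tokens[:-1])
        loss = loss_fn(logits, tokens[1:])
        opt.zero_grad()
        loss.backward()
        opt.step()
        losses.append(loss.item())
    return losses

model = TinyLM()

# Stage 1: continued pre-training on unlabeled in-domain (power-grid) text,
# adapting the general model to domain terminology. Here the "corpus" is a
# mock random token sequence.
domain_tokens = torch.randint(0, VOCAB, (256,))
stage1 = train(model, domain_tokens, lr=1e-2, steps=50)

# Stage 2: supervised fine-tuning on the target task (e.g. generating a
# maintenance strategy from a fault description), mocked the same way.
task_tokens = torch.randint(0, VOCAB, (128,))
stage2 = train(model, task_tokens, lr=1e-3, steps=50)

print(stage1[0], "->", stage1[-1])
print(stage2[0], "->", stage2[-1])
```

In practice each stage would use the same weights but different data and objectives (domain text with a language-modeling loss, then task pairs with a supervised loss), which is what the two calls to `train` on one shared `model` are standing in for.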

Details

Language : English
ISSN : 2045-2322
Volume : 14
Issue : 1
Database : Academic Search Index
Journal : Scientific Reports
Publication Type : Academic Journal
Accession number : 180734807
Full Text : https://doi.org/10.1038/s41598-024-78078-y