Enhancing domain-specific text generation for power grid maintenance with P2FT
- Source:
- Scientific Reports, Vol 14, Iss 1, Pp 1-12 (2024)
- Publication Year:
- 2024
- Publisher:
- Nature Portfolio, 2024.
Abstract
- The digitization of operation and maintenance for intelligent power grid equipment relies on a diverse array of information for smart decision-making. Proficiency in intelligent decision generation depends on extensive learning from large volumes of text, which requires not only robust processing capability but also a high degree of specialization. Pre-trained language models (PLMs) have already suggested viable approaches for specialized domains and tasks, even in situations where authorization is lacking. Given the complexity of textual content in the power grid field, which spans a large body of specialized knowledge and abundant proprietary terminology, we explore the specialization of pre-trained models using the power grid domain as an example, specifically for the task of generating maintenance strategies. We employ a two-stage fine-tuning approach (P2FT) built on a large-scale pre-trained model for natural language processing. The method's efficacy and practical value were evaluated on multiple metrics and compared against other advanced approaches based on low-parameter or parameter-free fine-tuning. Analysis and validation of the experimental results confirm the feasibility and practical value of this approach to pre-trained model specialization, and offer useful guidance for text generation in both the Chinese-language and power grid domains.
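The record does not detail P2FT's exact procedure, but the abstract describes two-stage fine-tuning of a pre-trained language model for maintenance strategy generation. The sketch below illustrates that general pattern under stated assumptions: a first stage of domain-adaptive training on unlabeled power grid text, followed by a second stage of task fine-tuning on fault-to-strategy pairs. The model name (uer/gpt2-chinese-cluecorpussmall), the placeholder corpus lines, and all hyperparameters are illustrative assumptions, not the paper's settings.

```python
# A minimal sketch of generic two-stage fine-tuning for a causal LM,
# assuming Hugging Face Transformers. This is NOT the paper's P2FT
# implementation; the model, data, and hyperparameters are placeholders.
import torch
from torch.utils.data import DataLoader, Dataset
from transformers import AutoModelForCausalLM, AutoTokenizer

class TextDataset(Dataset):
    """Tokenizes raw lines of text for language-model training."""
    def __init__(self, lines, tokenizer, max_len=512):
        self.enc = [tokenizer(t, truncation=True, max_length=max_len,
                              return_tensors="pt") for t in lines]
    def __len__(self):
        return len(self.enc)
    def __getitem__(self, i):
        ids = self.enc[i]["input_ids"].squeeze(0)
        # For causal-LM training the labels are the inputs themselves.
        return {"input_ids": ids, "labels": ids.clone()}

def train_stage(model, dataset, epochs, lr, device="cpu"):
    """One fine-tuning stage: standard next-token cross-entropy loss."""
    loader = DataLoader(dataset, batch_size=1, shuffle=True)
    optim = torch.optim.AdamW(model.parameters(), lr=lr)
    model.train().to(device)
    for _ in range(epochs):
        for batch in loader:
            batch = {k: v.to(device) for k, v in batch.items()}
            loss = model(**batch).loss
            loss.backward()
            optim.step()
            optim.zero_grad()
    return model

tokenizer = AutoTokenizer.from_pretrained("uer/gpt2-chinese-cluecorpussmall")
model = AutoModelForCausalLM.from_pretrained("uer/gpt2-chinese-cluecorpussmall")

# Stage 1: domain adaptation on unlabeled power grid text (hypothetical corpus).
domain_corpus = ["变压器油温异常时应检查冷却系统。"]  # placeholder sentence
model = train_stage(model, TextDataset(domain_corpus, tokenizer),
                    epochs=1, lr=5e-5)

# Stage 2: task fine-tuning on (fault description -> maintenance strategy) pairs.
task_pairs = ["故障：断路器拒动。策略：检查操作回路并测试脱扣线圈。"]  # placeholder pair
model = train_stage(model, TextDataset(task_pairs, tokenizer),
                    epochs=1, lr=2e-5)
```

The rationale for staging, under this reading, is that the first stage adapts the model to the domain's proprietary terminology, while the second stage teaches the target input-output format for strategy generation.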
Details
- Language:
- English
- ISSN:
- 2045-2322
- Volume:
- 14
- Issue:
- 1
- Database:
- Directory of Open Access Journals
- Journal:
- Scientific Reports
- Publication Type:
- Academic Journal
- Accession number:
- edsdoj.715b5e921e415e94f1c010e716540a
- Document Type:
- article
- Full Text:
- https://doi.org/10.1038/s41598-024-78078-y