
MTLFormer: Multi-Task Learning Guided Transformer Network for Business Process Prediction

Authors :
Jiaojiao Wang
Jiawei Huang
Xiaoyu Ma
Zhongjin Li
Yaqi Wang
Dingguo Yu
Source :
IEEE Access, Vol 11, Pp 76722-76738 (2023)
Publication Year :
2023
Publisher :
IEEE, 2023.

Abstract

Predictive business process monitoring focuses on forecasting the performance of business process execution: for an ongoing process instance, it predicts the next activity, the execution time of that activity, and the remaining time, based on knowledge gained from historical event logs. Although these three tasks are closely related, recent research has trained a separate prediction model for each, resulting in high cost and time complexity. Moreover, existing techniques are limited in their ability to capture long-distance dependencies within process instances, which further impedes prediction performance. To address these issues, this paper proposes MTLFormer, an approach that leverages the self-attention mechanism of the Transformer network and trains all three tasks in parallel on a shared feature representation. This reduces the time complexity of model training while simultaneously improving prediction performance. We extensively evaluate the approach on four real-life event logs, demonstrating that it supports multi-task online real-time prediction and effectively improves prediction accuracy.
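The shared-representation idea in the abstract can be illustrated with a minimal sketch: a single self-attention encoder produces one pooled representation of the event prefix, which three task-specific heads then consume (next-activity classification, next-activity execution time, remaining time). All sizes, parameter names, and the single-layer/single-head simplification here are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes -- illustrative only, not taken from the paper.
NUM_ACTIVITIES = 10   # distinct activity labels in the event log
D_MODEL = 16          # shared embedding / attention dimension

# Shared parameters: one encoder serves all three tasks.
W_embed = rng.normal(0, 0.1, (NUM_ACTIVITIES, D_MODEL))
W_q = rng.normal(0, 0.1, (D_MODEL, D_MODEL))
W_k = rng.normal(0, 0.1, (D_MODEL, D_MODEL))
W_v = rng.normal(0, 0.1, (D_MODEL, D_MODEL))

# Task-specific heads on top of the shared representation.
W_act = rng.normal(0, 0.1, (D_MODEL, NUM_ACTIVITIES))  # next-activity logits
w_exec = rng.normal(0, 0.1, (D_MODEL,))                # execution time of next activity
w_rem = rng.normal(0, 0.1, (D_MODEL,))                 # remaining time of the case

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def predict(prefix):
    """prefix: 1-D array of activity ids for an ongoing process instance."""
    x = W_embed[prefix]                          # (T, d): embed the event prefix
    q, k, v = x @ W_q, x @ W_k, x @ W_v
    attn = softmax(q @ k.T / np.sqrt(D_MODEL))   # (T, T) self-attention weights,
                                                 # able to relate distant events
    h = (attn @ v).mean(axis=0)                  # shared pooled representation
    return (softmax(h @ W_act),                  # task 1: next-activity distribution
            float(h @ w_exec),                   # task 2: execution time prediction
            float(h @ w_rem))                    # task 3: remaining time prediction

probs, t_exec, t_rem = predict(np.array([3, 1, 4, 1, 5]))
```

In multi-task training, the losses of the three heads (e.g. cross-entropy for the activity head, mean absolute error for the two time heads) would be summed so that gradients update the shared encoder jointly, which is what avoids training three separate models.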

Details

Language :
English
ISSN :
21693536
Volume :
11
Database :
Directory of Open Access Journals
Journal :
IEEE Access
Publication Type :
Academic Journal
Accession number :
edsdoj.19725c0ff74e0389716a98bb4f5aba
Document Type :
article
Full Text :
https://doi.org/10.1109/ACCESS.2023.3298305