Incremental learning for Lagrangian ε-twin support vector regression.
- Author
- Gu, Binjie, Cao, Jie, Pan, Feng, and Xiong, Weili
- Subjects
MACHINE learning, MATRIX inversion, NEWTON-Raphson method, HESSIAN matrices, NONLINEAR regression, REGULARIZATION parameter
- Abstract
This paper investigates the online learning problem of Lagrangian ε-twin support vector regression (L-ε-TSVR), with the goal of presenting incremental implementations. First, to solve the problem that the existing L-ε-TSVR cannot efficiently update the model under incremental scenarios, an incremental Lagrangian ε-twin support vector regression (IL-ε-TSVR) based on the semi-smooth Newton method is proposed. By utilizing matrix inverse theorems to update the Hessian matrices incrementally, IL-ε-TSVR lowers the time complexity and expedites the training process. However, in the nonlinear case, the training speed of IL-ε-TSVR rapidly decreases as the kernel matrix grows. Therefore, an incremental reduced Lagrangian ε-twin support vector regression (IRL-ε-TSVR) is presented. IRL-ε-TSVR employs the reduced technique to restrict the size of the inverse matrix, at the cost of slightly lower prediction accuracy. Next, to lighten the accuracy loss caused by parameter reduction, a novel regularization term is introduced to replace the original one, and an improved incremental reduced Lagrangian ε-twin support vector regression (IIRL-ε-TSVR) is designed. Results on UCI benchmark datasets show that IL-ε-TSVR can effectively address the linear regression problem under incremental scenarios and obtains almost the same generalization capability as offline learning. Moreover, IRL-ε-TSVR and IIRL-ε-TSVR reduce the training time of the nonlinear regression model and yield sparse solutions, with generalization capabilities close to those of their offline counterparts. In particular, the proposed algorithms can implement fast incremental learning on large-scale data. [ABSTRACT FROM AUTHOR]
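The incremental Hessian update the abstract alludes to rests on matrix inverse theorems such as the Sherman-Morrison identity: when one sample arrives, the new inverse can be obtained from the old one by a rank-one correction instead of a full re-inversion. A minimal sketch of that idea in Python follows; the function name, the toy matrices, and the update constant are illustrative assumptions, not the paper's actual IL-ε-TSVR update rules.

```python
import numpy as np

def rank_one_inverse_update(H_inv, x, c=1.0):
    """Given H_inv = H^{-1}, return (H + c * x x^T)^{-1} via the
    Sherman-Morrison identity, in O(n^2) instead of O(n^3).
    Illustrative sketch only; see the paper for the exact rules."""
    Hx = H_inv @ x                      # H^{-1} x, shape (n,)
    denom = 1.0 + c * (x @ Hx)          # scalar 1 + c x^T H^{-1} x
    return H_inv - c * np.outer(Hx, Hx) / denom

# Toy usage: fold one new sample's rank-one contribution into the inverse.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
H = A @ A.T + np.eye(5)                 # symmetric positive definite stand-in
H_inv = np.linalg.inv(H)
x = rng.standard_normal(5)              # new sample's contribution vector
H_new_inv = rank_one_inverse_update(H_inv, x)
```

For a stream of samples, the update is applied once per arrival, which is what makes the incremental training cheaper than refactoring the whole matrix each time.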
- Published
- 2023