An Efficient v-Minimum Absolute Deviation Distribution Regression Machine
- Authors
Yan Wang, Yao Wang, Yingying Song, Xuping Xie, Lan Huang, Wei Pang, and George M. Coghill
- Subjects
v-support vector regression, absolute regression deviation mean, absolute regression deviation variance, dual coordinate descent algorithm, Electrical engineering. Electronics. Nuclear engineering, TK1-9971
- Abstract
Support Vector Regression (SVR) and its variants are widely used regression algorithms that have demonstrated high generalization ability. This research proposes a new SVR-based regressor: the v-minimum absolute deviation distribution regression (v-MADR) machine. Instead of merely minimizing structural risk, as v-SVR does, v-MADR aims to achieve better generalization performance by minimizing both the absolute regression deviation mean and the absolute regression deviation variance, thereby taking into account both the positive and negative regression deviations of the sample points. For optimization, we propose a dual coordinate descent (DCD) algorithm for small-sample problems and an averaged stochastic gradient descent (ASGD) algorithm for large-scale problems. Furthermore, we study the statistical properties of v-MADR and derive a bound on the expected error. Experimental results on both artificial and real datasets indicate that v-MADR achieves significantly better generalization performance with less training time than the widely used v-SVR, LS-SVR, ε-TSVR, and linear ε-SVR. Finally, we have open-sourced the code of v-MADR at https://github.com/AsunaYY/v-MADR for wider dissemination.
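As a rough illustration of the ASGD optimization route mentioned in the abstract, the sketch below trains a linear model by averaged stochastic gradient descent on a loss combining the mean and the variance of the absolute regression deviations. The function name asgd_madr, the trade-off weights lam_mean and lam_var, the fixed learning rate, and the plain subgradient formulas are assumptions made for this example only; the paper and the released code at https://github.com/AsunaYY/v-MADR define the actual v-MADR objective and solvers.

```python
import numpy as np

def asgd_madr(X, y, lam_mean=1.0, lam_var=1.0, lr=0.01, epochs=50, batch=32, seed=0):
    """Averaged SGD for a linear model under a mean + variance of |deviation| loss.

    Illustrative only: lam_mean, lam_var, and the unregularized mini-batch loss
    below are assumptions, not the paper's exact v-MADR formulation.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    w_avg = np.zeros(d)   # Polyak-Ruppert average of iterates (the "averaged" in ASGD)
    steps = 0
    for _ in range(epochs):
        for idx in np.array_split(rng.permutation(n), max(1, n // batch)):
            dev = X[idx] @ w - y[idx]          # signed regression deviations
            abs_dev = np.abs(dev)
            mean_dev = abs_dev.mean()
            m = len(idx)
            # Subgradient of the mean of |dev| with respect to w
            g_mean = X[idx].T @ np.sign(dev) / m
            # Subgradient of the variance of |dev|; the centering term cancels out
            g_var = 2.0 * X[idx].T @ ((abs_dev - mean_dev) * np.sign(dev)) / m
            w -= lr * (lam_mean * g_mean + lam_var * g_var)
            steps += 1
            w_avg += (w - w_avg) / steps       # running average of iterates
    return w_avg

# Example usage on synthetic data:
# X = np.random.randn(500, 5)
# y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + 0.1 * np.random.randn(500)
# w_hat = asgd_madr(X, y)
```

Averaging the iterates rather than returning the last one is the standard way ASGD reduces the variance of the final estimate on large-scale problems; the mini-batch size and learning rate here are placeholders to be tuned per dataset.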
- Published
2020