1. Efficient Leave-m-out Cross-Validation of Support Vector Regression by Generalizing Decremental Algorithm.
- Author
- Karasuyama, Masayuki, Takeuchi, Ichiro, and Nakano, Ryohei
- Subjects
- *REGRESSION analysis, *SUPPORT vector machines, *SUPERVISED learning, *COMPUTATIONAL learning theory, *MACHINE learning, *ALGORITHMS
- Abstract
We propose a computationally efficient method for cross-validation of Support Vector Regression (SVR) by generalizing the decremental algorithm of SVR. Incremental and decremental algorithms for Support Vector Machines (SVM) efficiently update a trained SVM model when a single data point is added to or removed from the training set. The computational cost of leave-one-out cross-validation can be reduced using the decremental algorithm. However, when we perform leave-m-out cross-validation (m > 1), we have to apply the decremental algorithm repeatedly, once for each held-out data point. In this paper, we extend the decremental algorithm of SVR so that several data points can be removed more efficiently. Experimental results indicate that the proposed approach reduces the computational cost. In particular, we observed that the number of breakpoints, which dominates the computational cost of the involved path-following, was reduced from O(m) to O(√m). [ABSTRACT FROM AUTHOR]
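To make the cost structure concrete, the following is a minimal sketch (not the authors' algorithm) that only counts update operations: naive leave-m-out cross-validation retrains on n − m points for every fold, whereas a decremental scheme starts from the model trained on all n points and removes the m held-out points per fold. The function name `leave_m_out_folds` and the toy sizes n = 6, m = 2 are illustrative assumptions.

```python
from itertools import combinations

def leave_m_out_folds(n, m):
    """Enumerate every leave-m-out validation index set for n samples."""
    return list(combinations(range(n), m))

n, m = 6, 2
folds = leave_m_out_folds(n, m)

# Naive approach: train a fresh model on the n - m remaining points
# for every fold, so each fold costs n - m per-point updates.
naive_updates = len(folds) * (n - m)

# Decremental approach: start from the model trained on all n points
# and remove the m held-out points, one decremental step each.
decremental_updates = len(folds) * m

print(len(folds), naive_updates, decremental_updates)  # → 15 60 30
```

The counts only illustrate why repeated single-point removal is attractive when m is small relative to n; the paper's contribution is removing several points in one generalized decremental pass, further cutting the path-following breakpoints from O(m) to O(√m).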
- Published
- 2009