Efficient Leave-m-out Cross-Validation of Support Vector Regression by Generalizing Decremental Algorithm.
- Source :
- New Generation Computing. 2009, Vol. 27 Issue 4, p307-318. 12p.
- Publication Year :
- 2009
Abstract
- We propose a computationally efficient method for cross-validation of Support Vector Regression (SVR) by generalizing the decremental algorithm of SVR. Incremental and decremental algorithms of Support Vector Machines (SVM) efficiently update the trained SVM model when a single data point is added to or removed from the training set. The computational cost of leave-one-out cross-validation can be reduced using the decremental algorithm. However, when we perform leave-m-out cross-validation (m>1), we have to repeatedly apply the decremental algorithm for each data point. In this paper, we extend the decremental algorithm of SVR in such a way that several data points can be removed more efficiently. Experimental results indicate that the proposed approach can reduce the computational cost. In particular, we observed that the number of breakpoints, which dominates the computational cost of the involved path-following, was reduced from O(m) to O(√m). [ABSTRACT FROM AUTHOR]
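The paper's generalized decremental path-following is not available in standard libraries, but the setting it accelerates can be made concrete. A minimal stdlib-only sketch below enumerates the leave-m-out folds for a dataset of n points; every name here is illustrative and not from the paper. Naive leave-m-out CV retrains the SVR from scratch on each fold, whereas the proposed method removes the m held-out points from an already-trained model:

```python
from itertools import combinations
from math import comb

def leave_m_out_splits(n, m):
    """Yield (train_idx, test_idx) pairs for leave-m-out CV over n samples.

    Naive leave-m-out CV would fit one SVR per yielded split; the paper's
    generalized decremental algorithm instead updates a single trained
    model by removing the m held-out points at once.
    """
    all_idx = set(range(n))
    for held_out in combinations(range(n), m):
        test = list(held_out)
        train = sorted(all_idx - set(held_out))
        yield train, test

# Leave-2-out on 5 samples gives C(5, 2) = 10 folds.
splits = list(leave_m_out_splits(5, 2))
print(len(splits))        # → 10
print(comb(5, 2))         # → 10
print(splits[0])          # → ([2, 3, 4], [0, 1])
```

The number of folds grows combinatorially in m, which is why reducing the per-fold update cost (here, the number of path-following breakpoints from O(m) to O(√m)) matters in practice.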
Details
- Language :
- English
- ISSN :
- 02883635
- Volume :
- 27
- Issue :
- 4
- Database :
- Academic Search Index
- Journal :
- New Generation Computing
- Publication Type :
- Academic Journal
- Accession number :
- 47116639
- Full Text :
- https://doi.org/10.1007/s00354-008-0067-3