1. Relaxed support vector regression
- Authors
- Talayeh Razzaghi, Petros Xanthopoulos, Orestis P. Panagopoulos, and Onur Seref
- Subjects
- Operations research, Mean squared error, Regression analysis, Pattern recognition, Management Science and Operations Research, General Decision Sciences, Regression, Robust regression, Support vector machine, Robustness (computer science), Ordinary least squares, Outlier, Artificial intelligence, Mathematics
- Abstract
Datasets with outliers pose a serious challenge in regression analysis. In this paper, a new regression method called relaxed support vector regression (RSVR) is proposed for such datasets. RSVR is based on the concept of constraint relaxation, which leads to increased robustness on datasets with outliers. RSVR is formulated using both linear and quadratic loss functions. Numerical experiments on benchmark datasets and computational comparisons with other popular regression methods illustrate the behavior of the proposed method. RSVR achieves better overall performance than support vector regression (SVR) in measures such as RMSE and $$R^2_{adj}$$, while being on par with other state-of-the-art regression methods such as robust regression (RR). Additionally, RSVR provides robustness for higher-dimensional datasets, which is a limitation of RR, the robust equivalent of ordinary least squares regression. Moreover, RSVR can be used on datasets that contain varying levels of noise.
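The abstract describes RSVR only at a high level; the actual constraint-relaxed formulation is given in the paper. As a purely illustrative sketch (it does not implement the authors' RSVR and does not reproduce their experiments), the snippet below contrasts a standard ε-SVR with a Huber-loss robust baseline on synthetic data contaminated with outliers, i.e. the kind of robustness comparison the abstract discusses. The data-generating process, parameter values, and choice of Huber regression as the robust stand-in are all assumptions made for illustration.

```python
# Illustrative sketch only: standard epsilon-SVR vs. a Huber-loss robust
# baseline on data with injected outliers. This is NOT the paper's RSVR;
# it only demonstrates the outlier-sensitivity issue the paper addresses.
import numpy as np
from sklearn.svm import SVR
from sklearn.linear_model import HuberRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Synthetic linear data with a small fraction of gross outliers (assumed setup).
n = 200
X = rng.uniform(-3, 3, size=(n, 1))
y = 2.0 * X.ravel() + 1.0 + rng.normal(scale=0.3, size=n)
outlier_idx = rng.choice(n, size=10, replace=False)
y[outlier_idx] += rng.normal(scale=15.0, size=10)  # contaminate 5% of targets

# Standard linear epsilon-SVR with the usual epsilon-insensitive loss.
svr = SVR(kernel="linear", C=1.0, epsilon=0.1).fit(X, y)

# Huber regression as a generic robust baseline (not the paper's RR or RSVR).
huber = HuberRegressor().fit(X, y)

# Evaluate both fits against the clean underlying signal to expose how much
# the contaminated targets pulled each model away from the true line.
X_test = np.linspace(-3, 3, 100).reshape(-1, 1)
y_true = 2.0 * X_test.ravel() + 1.0
for name, model in [("SVR", svr), ("Huber", huber)]:
    rmse = mean_squared_error(y_true, model.predict(X_test)) ** 0.5
    print(f"{name}: RMSE vs clean signal = {rmse:.3f}")
```

RMSE against the uncontaminated signal is used here simply as a convenient proxy for the paper's RMSE-based comparison; the benchmark datasets and the $$R^2_{adj}$$ evaluation in the paper are not reproduced.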
- Published
- 2018