Bounded quantile loss for robust support vector machines-based classification and regression.
- Author
- Zhang, Jiaqi and Yang, Hu
- Subjects
- *SUPPORT vector machines, *QUANTILE regression, *LEARNING problems, *CLASSIFICATION
- Abstract
In this paper, motivated by quantile regression in statistics and the bounded linex loss function, a novel robust bounded quantile loss is proposed to improve the performance of the support vector machine (SVM) and support vector regression (SVR). The bounded quantile loss has important properties such as asymmetry, non-convexity, and boundedness, which make the SVM and SVR based on it (BQ-SVM and BQ-SVR) robust to noise. The Fisher consistency and generalization error bound of BQ-SVM guarantee its generalization capability. In addition, we derive the influence functions of BQ-SVM and BQ-SVR to show that they are less sensitive to outliers. However, the non-convexity of the proposed bounded quantile loss makes it difficult to optimize; we therefore solve the proposed models with the concave-convex procedure (CCCP). A host of numerical studies on artificial and benchmark datasets are conducted, and various evaluation criteria are considered to demonstrate the effectiveness and efficiency of the proposed method. Experimental results indicate that our proposed methods are more robust than classical and advanced methods when the dataset contains outliers, in both classification and regression applications.
• A novel bounded quantile loss (L_bq loss) is proposed for robust learning problems.
• The support vector machine-based methods with the L_bq loss (BQ-SVM and BQ-SVR) are robust.
• The designed ClipDCD-based CCCP algorithm efficiently solves BQ-SVM and BQ-SVR.
• Numerical and theoretical results demonstrate the effectiveness of BQ-SVM and BQ-SVR. [ABSTRACT FROM AUTHOR]
- Published
- 2024