Matrix-Regularized Multiple Kernel Learning via $(r, p)$ Norms.
- Source :
- IEEE Transactions on Neural Networks & Learning Systems, Oct. 2018, Vol. 29, Issue 10, pp. 4997-5007 (11 pp.)
- Publication Year :
- 2018
Abstract
- This paper examines a matrix-regularized multiple kernel learning (MKL) technique based on a notion of $(r, p)$ norms. For the problem of learning a linear combination of base kernels in the support vector machine-based framework, model complexity is typically controlled through regularization of the combined kernel weights. Recent research has developed a generalized $\ell_p$-norm MKL framework with a tunable parameter $p$ ($p \ge 1$) to support controlled intrinsic sparsity. Unfortunately, this "1-D" vector $\ell_p$-norm hardly exploits potentially useful information on how the base kernels interact. To allow for higher-order kernel-pair relationships, we extend the "1-D" vector $\ell_p$-MKL to the "2-D" matrix $(r, p)$ norms ($1 \le r, p < \infty$). We develop a new formulation and an efficient optimization strategy for $(r, p)$-MKL with guaranteed convergence. A theoretical analysis and experiments on seven UCI data sets shed light on the superiority of $(r, p)$-MKL over $\ell_p$-MKL in various scenarios. [ABSTRACT FROM AUTHOR]
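- Note: the abstract does not spell out the norm's exact definition. A common mixed $(r, p)$ matrix norm takes an inner $r$-norm over each row and an outer $p$-norm across the resulting row norms; the minimal Python sketch below computes it under that assumption (the function name mixed_rp_norm and the example weight matrix are illustrative, not taken from the paper):

import numpy as np

def mixed_rp_norm(W, r, p):
    # Inner r-norm over each row of W (e.g., one row per base kernel,
    # columns capturing pairwise kernel interactions), then an outer
    # p-norm over the vector of row norms. Assumed standard definition,
    # not necessarily the paper's exact formulation.
    row_norms = np.sum(np.abs(W) ** r, axis=1) ** (1.0 / r)
    return np.sum(row_norms ** p) ** (1.0 / p)

# Illustrative 3x3 kernel-interaction weight matrix (made up for the demo).
W = np.array([[0.5, 0.1, 0.0],
              [0.1, 0.8, 0.2],
              [0.0, 0.2, 0.3]])

print(mixed_rp_norm(W, r=2, p=1))  # (2,1)-norm: sum of row l2-norms
print(mixed_rp_norm(W, r=2, p=2))  # (2,2)-norm: the Frobenius norm

- Under this definition, a single-column weight matrix reduces the mixed norm to the vector $\ell_p$ norm, which is one sense in which the "2-D" regularizer generalizes the "1-D" $\ell_p$-MKL the abstract describes.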
- Subjects :
- *KERNEL (Mathematics)
- *MACHINE learning
- *SUPPORT vector machines
Details
- Language :
- English
- ISSN :
- 2162-237X
- Volume :
- 29
- Issue :
- 10
- Database :
- Academic Search Index
- Journal :
- IEEE Transactions on Neural Networks & Learning Systems
- Publication Type :
- Periodical
- Accession number :
- 131880294
- Full Text :
- https://doi.org/10.1109/TNNLS.2017.2785329