
Matrix-Regularized Multiple Kernel Learning via (r,p) Norms

Authors :
Han, Yina
Yang, Yixin
Li, Xuelong
Liu, Qingyu
Ma, Yuanliang
Source :
IEEE Transactions on Neural Networks and Learning Systems
Publication Year :
2018

Abstract

This paper examines a matrix-regularized multiple kernel learning (MKL) technique based on a notion of (r,p) norms. For the problem of learning a linear combination of base kernels in the support vector machine-based framework, model complexity is typically controlled through regularization of the combined kernel weights. Recent research has developed a generalized ℓp-norm MKL framework with a tunable parameter p (p ≥ 1) that supports controlled intrinsic sparsity. Unfortunately, this "1-D" vector ℓp norm hardly exploits potentially useful information on how the base kernels "interact." To allow for higher-order kernel-pair relationships, we extend the "1-D" vector ℓp-MKL to the "2-D" matrix (r,p) norms (1 ≤ r, p ≤ ∞). We develop a new formulation and an efficient optimization strategy for (r,p)-MKL with guaranteed convergence. A theoretical analysis and experiments on seven UCI data sets demonstrate the advantages of (r,p)-MKL over ℓp-MKL in various scenarios.
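
Where the abstract contrasts the vector ℓp regularizer with the matrix (r,p) regularizer, a minimal sketch may help fix notation. The Python snippet below assumes the mixed (r,p) norm ||A||_(r,p) = ( Σ_m ( Σ_n |A_mn|^r )^(p/r) )^(1/p) and, purely for illustration, a hypothetical kernel-pair interaction matrix Theta built from weighted kernel alignments; the paper's exact interaction matrix, formulation, and optimization strategy may differ and are not reproduced here.

    import numpy as np

    def matrix_rp_norm(A, r, p):
        # Mixed (r, p) norm: an l_r norm within each row, then an
        # l_p norm across the resulting row norms.
        row_norms = np.sum(np.abs(A) ** r, axis=1) ** (1.0 / r)
        return np.sum(row_norms ** p) ** (1.0 / p)

    def combined_kernel(base_kernels, weights):
        # Linear combination of base kernel matrices, as in standard MKL.
        return sum(w * K for w, K in zip(weights, base_kernels))

    # Toy usage: three random linear (hence PSD) base kernels on 10 samples.
    rng = np.random.default_rng(0)
    feats = [rng.standard_normal((10, 5)) for _ in range(3)]
    base_kernels = [x @ x.T for x in feats]
    weights = np.array([0.5, 0.3, 0.2])

    K = combined_kernel(base_kernels, weights)

    # Hypothetical kernel-pair interaction matrix (illustrative only):
    # Theta[m, n] = weights[m] * weights[n] * trace(K_m K_n).
    M = len(base_kernels)
    Theta = np.array([[weights[m] * weights[n] *
                       np.trace(base_kernels[m] @ base_kernels[n])
                       for n in range(M)] for m in range(M)])

    # Value a (r, p) matrix regularizer of this kind would penalize.
    print(matrix_rp_norm(Theta, r=2, p=1))

Setting r = p recovers an entrywise ℓp-type penalty on the interaction matrix, while r ≠ p trades off within-row versus across-row sparsity, which is the kind of higher-order flexibility the abstract attributes to the (r,p) norms over the "1-D" vector ℓp norm.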

Details

ISSN :
2162-2388
Database :
OpenAIRE
Journal :
IEEE Transactions on Neural Networks and Learning Systems
Accession number :
edsair.pmid..........e5610abb48ee0ad8adf7e7f607b15dfe