
A multi-task framework for metric learning with common subspace.

Authors :
Yang, Peipei
Huang, Kaizhu
Liu, Cheng-Lin
Source :
Neural Computing & Applications. Jun 2013, Vol. 22, Issue 7/8, p. 1337-1347. 11p. 1 Diagram, 6 Graphs.
Publication Year :
2013

Abstract

Metric learning has been widely studied in machine learning due to its capability to improve the performance of various algorithms. Meanwhile, multi-task learning usually leads to better performance by exploiting the information shared across all tasks. In this paper, we propose a novel framework that lets metric learning benefit from jointly training all tasks. Based on the assumption that the discriminative information of all tasks is retained in a common subspace, our framework can readily extend many existing metric learning methods. In particular, we apply the framework to the widely used Large Margin Component Analysis (LMCA), yielding a new model called multi-task LMCA, which performs remarkably well compared with many competitive methods. Moreover, the method learns a low-rank metric directly, which effectively performs feature reduction and enables noise compression and low storage cost. A series of experiments demonstrates the superiority of our method over three comparison algorithms on both synthetic and real data. [ABSTRACT FROM AUTHOR]
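The sketch below illustrates, in a hedged way, the kind of objective the abstract describes: an LMCA-style large-margin loss per task, coupled across tasks through a shared low-rank projection. The factorization L_t = A_t @ L0 (a common projection L0 plus a small task-specific matrix A_t), the regularization weight gamma, and the loss parameters mu and margin are illustrative assumptions, not necessarily the authors' exact formulation.

```python
# Minimal sketch of multi-task metric learning with a common subspace,
# in the spirit of multi-task LMCA. Assumed design: each task's transform
# is L_t = A_t @ L0, where L0 is shared (the common subspace) and A_t is
# task-specific. This is an illustrative reading, not the paper's exact model.
import numpy as np


def hinge(x):
    """Hinge max(0, x), applied elementwise."""
    return np.maximum(0.0, x)


def lmca_style_loss(L, X, y, mu=0.5, margin=1.0):
    """LMNN/LMCA-style loss for one task under a low-rank transform L (r x d).

    Pulls same-class pairs together and pushes differently labelled
    points ('impostors') outside a unit margin.
    """
    Z = X @ L.T                               # project into the low-rank space
    n = len(y)
    pull, push = 0.0, 0.0
    for i in range(n):
        for j in range(n):
            if i == j or y[j] != y[i]:
                continue
            d_ij = np.sum((Z[i] - Z[j]) ** 2)
            pull += d_ij                      # attract same-class neighbours
            for k in range(n):
                if y[k] == y[i]:
                    continue
                d_ik = np.sum((Z[i] - Z[k]) ** 2)
                push += hinge(margin + d_ij - d_ik)   # repel impostors
    return (1.0 - mu) * pull + mu * push


def multitask_objective(L0, A_list, tasks, gamma=1e-2):
    """Sum of per-task losses with L_t = A_t @ L0 plus a penalty on each A_t."""
    total = 0.0
    for A_t, (X_t, y_t) in zip(A_list, tasks):
        total += lmca_style_loss(A_t @ L0, X_t, y_t)
        total += gamma * np.sum(A_t ** 2)     # keep task-specific parts small
    return total


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, r, q = 10, 3, 2            # input dim, shared subspace dim, task-specific dim
    tasks = []
    for _ in range(3):            # three synthetic tasks assumed to share a subspace
        X = rng.normal(size=(20, d))
        y = rng.integers(0, 2, size=20)
        tasks.append((X, y))
    L0 = rng.normal(size=(r, d))                        # shared low-rank projection
    A_list = [rng.normal(size=(q, r)) for _ in tasks]   # task-specific maps
    print("initial joint objective:", multitask_objective(L0, A_list, tasks))
```

In practice L0 and the A_t would be optimized jointly by gradient descent, as LMCA itself is trained; the shared L0 is what couples the tasks, and because each L_t is a thin rectangular matrix, the learned metric is low-rank by construction, which is the feature-reduction effect the abstract mentions.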

Details

Language :
English
ISSN :
0941-0643
Volume :
22
Issue :
7/8
Database :
Academic Search Index
Journal :
Neural Computing & Applications
Publication Type :
Academic Journal
Accession number :
87909719
Full Text :
https://doi.org/10.1007/s00521-012-0956-8