
Cost-sensitive multi-label learning with positive and negative label pairwise correlations.

Authors :
Wu, Guoqiang
Tian, Yingjie
Liu, Dalian
Source :
Neural Networks. Dec2018, Vol. 108, p411-423. 13p.
Publication Year :
2018

Abstract

Multi-label learning is the problem in which each instance is associated with multiple labels simultaneously. Binary Relevance (BR) is a representative algorithm for multi-label learning. However, it may suffer from the class-imbalance issue, especially when the label space is large. It also ignores label correlations, which are important for improving performance. Moreover, labels may have both positive and negative correlations in real applications, yet existing methods seldom exploit the negative label correlations. In this paper, we propose a novel Cost-sensitive multi-label learning model with Positive and Negative Label pairwise correlations (CPNL), which extends BR to tackle the above issues. A kernel extension of the linear model is also provided to explore complex input–output relationships. Moreover, we adopt two accelerated gradient methods (AGM) to solve the linear and kernel models efficiently. Experimental results show that our approach, CPNL, achieves performance competitive with several state-of-the-art multi-label learning approaches.

Highlights
• We propose a novel cost-sensitive multi-label learning model with positive and negative label pairwise correlations.
• We introduce a new regularizer to exploit negative label pairwise correlations.
• A kernel extension of the linear model is provided to explore nonlinear input–output relationships.
• Two accelerated gradient methods (AGM) are adopted to solve the linear and kernel models efficiently.
• Extensive experiments verify the effectiveness of the proposed method.

[ABSTRACT FROM AUTHOR]
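To make the Binary Relevance baseline mentioned in the abstract concrete, here is a minimal, hypothetical sketch: BR trains one independent binary classifier per label and predicts each label separately, which is exactly why it ignores label correlations (the limitation CPNL targets). The `Perceptron` and `BinaryRelevance` names and the tiny per-label perceptron are illustrative assumptions, not the paper's model.

```python
# Minimal Binary Relevance (BR) sketch: one independent binary
# classifier per label. The per-label learner here is a tiny
# perceptron; any binary classifier could be substituted.

class Perceptron:
    """A simple perceptron for one binary label (targets in {0, 1})."""

    def __init__(self, n_features, lr=0.1, epochs=50):
        self.w = [0.0] * n_features
        self.b = 0.0
        self.lr = lr
        self.epochs = epochs

    def _raw(self, xi):
        return sum(w * x for w, x in zip(self.w, xi)) + self.b

    def fit(self, X, y):
        for _ in range(self.epochs):
            for xi, yi in zip(X, y):
                pred = 1 if self._raw(xi) > 0 else 0
                err = yi - pred  # perceptron update on mistakes only
                if err:
                    self.w = [w + self.lr * err * x for w, x in zip(self.w, xi)]
                    self.b += self.lr * err
        return self

    def predict(self, xi):
        return 1 if self._raw(xi) > 0 else 0


class BinaryRelevance:
    """Decompose a multi-label task into one binary task per label.

    Each label is fitted and predicted independently, so any
    correlation between labels is ignored by construction.
    """

    def fit(self, X, Y):
        # Y is a list of label vectors, one {0, 1} entry per label.
        n_labels = len(Y[0])
        self.models = [
            Perceptron(len(X[0])).fit(X, [row[j] for row in Y])
            for j in range(n_labels)
        ]
        return self

    def predict(self, xi):
        return [m.predict(xi) for m in self.models]
```

Usage on a toy dataset where label 0 depends only on feature 0 and label 1 only on feature 1: `BinaryRelevance().fit(X, Y).predict([1, 0])` recovers the per-label decisions, but nothing couples the two classifiers.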

Details

Language :
English
ISSN :
0893-6080
Volume :
108
Database :
Academic Search Index
Journal :
Neural Networks
Publication Type :
Academic Journal
Accession number :
133047568
Full Text :
https://doi.org/10.1016/j.neunet.2018.09.003