
Flexible Affinity Matrix Learning for Unsupervised and Semisupervised Classification.

Authors :
Fang, Xiaozhao
Han, Na
Wong, Wai Keung
Teng, Shaohua
Wu, Jigang
Xie, Shengli
Li, Xuelong
Source :
IEEE Transactions on Neural Networks & Learning Systems; Apr 2019, Vol. 30 Issue 4, p1133-1149, 17p
Publication Year :
2019

Abstract

In this paper, we propose a unified model called flexible affinity matrix learning (FAML) for unsupervised and semisupervised classification that exploits both the relationship among data and the clustering structure simultaneously. To capture the relationship among data, we exploit the self-expressiveness property of data to learn a structured matrix whose structure is induced by different norms. A rank constraint is imposed on the Laplacian matrix of the desired affinity matrix, so that the number of connected components in the data graph is exactly equal to the number of clusters. Thus, the clustering structure is explicit in the learned affinity matrix. By making the estimated affinity matrix approximate the structured matrix during the learning procedure, FAML adaptively adjusts the affinity matrix itself, so that the learned affinity matrix captures both the relationship among data and the clustering structure well. Thus, FAML has the potential to perform better than other related methods. We derive optimization algorithms to solve the corresponding problems. Extensive unsupervised and semisupervised classification experiments on both synthetic data and real-world benchmark data sets show that the proposed FAML consistently outperforms state-of-the-art methods. [ABSTRACT FROM AUTHOR]
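
To make the two ingredients described in the abstract concrete, the Python sketch below builds a self-expressive coefficient matrix with a Frobenius-norm regularizer (used here as one stand-in for the structure-inducing norms mentioned above), symmetrizes it into an affinity matrix, and measures how far that affinity is from satisfying the rank constraint via the sum of the smallest Laplacian eigenvalues. This is an illustrative sketch only, not the authors' FAML algorithm: the paper's actual objective, norms, and optimization steps differ, and the helper names (self_expressive_matrix, affinity_from_coefficients, rank_constraint_gap) and parameter values are hypothetical.

import numpy as np

def self_expressive_matrix(X, lam=0.1):
    # X: d x n data matrix, columns are samples.
    # Ridge-regularized self-expressiveness:
    #   min_Z ||X - X Z||_F^2 + lam * ||Z||_F^2
    # (Frobenius norm stands in for the structure-inducing norms in the paper;
    #  this closed form is for illustration only.)
    n = X.shape[1]
    G = X.T @ X
    Z = np.linalg.solve(G + lam * np.eye(n), G)
    np.fill_diagonal(Z, 0.0)  # discourage trivial self-representation
    return Z

def affinity_from_coefficients(Z):
    # Symmetric, nonnegative affinity built from the structured matrix.
    return 0.5 * (np.abs(Z) + np.abs(Z).T)

def rank_constraint_gap(S, c):
    # Sum of the c smallest eigenvalues of the graph Laplacian L = D - S.
    # This sum is zero exactly when the graph defined by S has at least
    # c connected components, which is the condition the rank constraint
    # on the Laplacian enforces so that clusters are explicit in S.
    L = np.diag(S.sum(axis=1)) - S
    eigvals = np.linalg.eigvalsh(L)
    return eigvals[:c].sum()

# Toy usage on two well-separated Gaussian clouds (columns are samples).
rng = np.random.default_rng(0)
X = np.hstack([rng.normal(0.0, 0.1, size=(5, 20)),
               rng.normal(5.0, 0.1, size=(5, 20))])
Z = self_expressive_matrix(X, lam=0.1)
S = affinity_from_coefficients(Z)
print("rank-constraint gap for c = 2:", rank_constraint_gap(S, c=2))

The gap relies on Ky Fan's theorem: the sum of the c smallest Laplacian eigenvalues vanishes precisely when the affinity graph splits into at least c connected components, which is why driving this quantity to zero makes the clustering structure directly readable from the learned affinity matrix.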

Details

Language :
English
ISSN :
2162-237X
Volume :
30
Issue :
4
Database :
Complementary Index
Journal :
IEEE Transactions on Neural Networks & Learning Systems
Publication Type :
Periodical
Accession number :
135443148
Full Text :
https://doi.org/10.1109/TNNLS.2018.2861839