
Supervised Dimensionality Reduction Methods via Recursive Regression.

Authors :
Liu, Yun
Zhang, Rui
Nie, Feiping
Li, Xuelong
Ding, Chris
Source :
IEEE Transactions on Neural Networks & Learning Systems. Sep2020, Vol. 31 Issue 9, p3269-3279. 11p.
Publication Year :
2020

Abstract

In this article, the recursive problems of both orthogonal linear discriminant analysis (OLDA) and orthogonal least squares regression (OLSR) are investigated. Unlike previous works, the associated recursive problems are addressed via a novel recursive regression method, which achieves dimensionality reduction in the orthogonal complement space heuristically. For the OLDA, an efficient method is developed to obtain the associated optimal subspace, which is closely related to the orthonormal basis of the optimal solution to ridge regression. For the OLSR, a scalable subspace is introduced to formulate an original OLSR with optimal scaling (OS). By further relaxing the proposed problem into a convex parameterized orthogonal quadratic problem, an effective approach is derived such that not only is the optimal subspace achieved but the OS is also obtained automatically. Accordingly, two supervised dimensionality reduction methods are proposed by obtaining heuristic solutions to the recursive problems of the OLDA and the OLSR. [ABSTRACT FROM AUTHOR]
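
The OLDA route described in the abstract hinges on the orthonormal basis of a ridge-regression solution fit against class-indicator targets. The snippet below is a minimal sketch of that idea only, assuming a one-hot label encoding, a ridge parameter `lam`, and QR factorization for the orthonormalization; none of these choices are specified in the abstract, so this is an illustration rather than the paper's exact algorithm.

```python
# Hypothetical sketch: project data onto the orthonormal basis of a
# ridge-regression solution, in the spirit of the OLDA heuristic the
# abstract describes. The one-hot targets, `lam`, and QR step are
# assumptions, not the paper's stated procedure.
import numpy as np

def ridge_orthonormal_subspace(X, y, lam=1e-2):
    """Return an orthonormal projection basis from a ridge-regression fit.

    X   : (n, d) centered data matrix
    y   : (n,)   integer class labels
    lam : ridge regularization strength (assumed hyperparameter)
    """
    n, d = X.shape
    classes = np.unique(y)
    # Class-indicator (one-hot) target matrix, a common choice for
    # LDA-style regression formulations.
    Y = (y[:, None] == classes[None, :]).astype(float)
    # Ridge regression solution W = (X^T X + lam * I)^{-1} X^T Y.
    W = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Y)
    # Orthonormal basis of the column space of W via thin QR.
    Q, _ = np.linalg.qr(W)
    return Q  # (d, c) matrix with orthonormal columns

# Usage: Z = X @ ridge_orthonormal_subspace(X, y) yields the reduced representation.
```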

Details

Language :
English
ISSN :
2162-237X
Volume :
31
Issue :
9
Database :
Academic Search Index
Journal :
IEEE Transactions on Neural Networks & Learning Systems
Publication Type :
Periodical
Accession number :
145476379
Full Text :
https://doi.org/10.1109/TNNLS.2019.2940088