
On the Comparisons of Decorrelation Approaches for Non-Gaussian Neutral Vector Variables.

Authors :
Ma Z
Lu X
Xie J
Yang Z
Xue JH
Tan ZH
Xiao B
Guo J
Source :
IEEE transactions on neural networks and learning systems [IEEE Trans Neural Netw Learn Syst] 2023 Apr; Vol. 34 (4), pp. 1823-1837. Date of Electronic Publication: 2023 Apr 04.
Publication Year :
2023

Abstract

As a typical non-Gaussian vector variable, a neutral vector variable contains nonnegative elements only, and its l1-norm equals one. In addition, its neutral properties make it significantly different from the commonly studied vector variables (e.g., the Gaussian vector variables). Due to the aforementioned properties, the conventionally applied linear transformation approaches [e.g., principal component analysis (PCA) and independent component analysis (ICA)] are not suitable for neutral vector variables, as PCA cannot transform a neutral vector variable, which is highly negatively correlated, into a set of mutually independent scalar variables, and ICA cannot preserve the bounded property after transformation. In recent work, we proposed an efficient nonlinear transformation approach, i.e., the parallel nonlinear transformation (PNT), for decorrelating neutral vector variables. In this article, we extensively compare PNT with PCA and ICA through both theoretical analysis and experimental evaluations. The results of our investigations demonstrate the superiority of PNT for decorrelating the neutral vector variables.
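The sketch below is an illustration of the setting described in the abstract, not the authors' PNT code. It assumes a Dirichlet-distributed vector as a standard example of a neutral vector variable, shows the strong negative correlation induced by the sum-to-one constraint, shows that a linear PCA rotation only removes correlation, and then applies a simple pairwise sum/ratio transform in the spirit of a parallel nonlinear decomposition; the pairwise grouping and all function names are illustrative assumptions, and the exact PNT recursion is defined in the paper itself.

```python
# Minimal sketch (NOT the paper's PNT implementation): why a neutral vector
# resists linear decorrelation, and how a nonlinear sum/ratio split can help.
import numpy as np

rng = np.random.default_rng(0)

# Dirichlet samples: nonnegative entries, each row sums to one (l1-norm = 1).
K = 4
alpha = np.full(K, 2.0)
X = rng.dirichlet(alpha, size=10_000)          # shape (N, K)

# The sum-to-one constraint forces negative correlation between elements.
print("raw correlations:\n", np.round(np.corrcoef(X, rowvar=False), 2))

# PCA (a linear map) decorrelates the scores, but the transformed coordinates
# remain statistically dependent because the simplex constraint is nonlinear.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:K - 1].T                          # drop the zero-variance constraint direction
print("PCA-score correlations:\n", np.round(np.corrcoef(Z, rowvar=False), 2))

# Pairwise sum/ratio transform (illustrative, in the spirit of a parallel
# nonlinear transformation): group elements in pairs, keep each pair's total
# and the within-pair proportion.
def pairwise_sum_ratio(X):
    sums = X[:, 0::2] + X[:, 1::2]                      # pair totals
    ratios = X[:, 0::2] / np.clip(sums, 1e-12, None)    # within-pair proportions
    return sums, ratios

sums, ratios = pairwise_sum_ratio(X)
# For a Dirichlet (hence neutral) vector, these ratios are Beta-distributed and
# independent of the pair totals, so the new coordinates are nearly uncorrelated.
T = np.column_stack([ratios, sums[:, :1]])     # one pair total is redundant (they sum to 1)
print("sum/ratio correlations:\n", np.round(np.corrcoef(T, rowvar=False), 2))
```

Running the sketch, the raw coordinates show clear negative correlation, the PCA scores are uncorrelated but still tied to the simplex geometry, and the sum/ratio coordinates come out close to uncorrelated, which is the behavior the abstract attributes to a nonlinear decorrelation of neutral vectors.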

Details

Language :
English
ISSN :
2162-2388
Volume :
34
Issue :
4
Database :
MEDLINE
Journal :
IEEE transactions on neural networks and learning systems
Publication Type :
Academic Journal
Accession number :
32248126
Full Text :
https://doi.org/10.1109/TNNLS.2020.2978858