A direct method to Frobenius norm-based matrix regression.
- Author
-
Yuan, Shi-Fang, Yu, Yi-Bin, Li, Ming-Zhao, and Jiang, Hua
- Subjects
-
*COMPLEX matrices, *HUMAN facial recognition software, *MATRIX norms, *LEAST squares, *MATRICES (Mathematics), *COEFFICIENTS (Statistics), *TRIANGULAR norms
- Abstract
Regression analysis has been widely used for face recognition. This paper mainly discusses the following regularized matrix regression problem: given a set of k image matrices A_1, A_2, ..., A_k ∈ R^{m×n} and an image matrix B ∈ R^{m×n}, find x = (x_1, x_2, ..., x_k)^T ∈ R^k that attains min_{x ∈ R^k} ‖x_1 A_1 + x_2 A_2 + ⋯ + x_k A_k − B‖_F^2 + λ^2 ‖x‖_2^2, where x_1, x_2, ..., x_k are the representation coefficients, λ is the model parameter, and ‖A‖_F denotes the Frobenius norm of the matrix A. Yuan and Liao [S.F. Yuan, A.P. Liao, Least squares Hermitian solution of the complex matrix equation AXB + CXD = E with the least norm, J. Frankl. Inst. 351 (2014), pp. 4978–4997] introduced a new product for matrices and vectors and used it to solve the least squares Hermitian problem of the complex matrix equation AXB + CXD = E. In this paper, we investigate this product in depth, along with its properties relating to the matrix trace, norm, and determinant. We then provide a direct method that yields the closed-form solution of the regularized image matrix regression problem in face recognition. [ABSTRACT FROM AUTHOR]
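The regression problem in the abstract reduces to ordinary Tikhonov (ridge) least squares once each image matrix is vectorized: stacking vec(A_i) as the columns of a design matrix M gives the normal-equation solution x = (M^T M + λ^2 I)^{-1} M^T vec(B). The sketch below illustrates that reduction; it is not the authors' direct method (which works with a special matrix-vector product rather than explicit vectorization), and the function name `matrix_regression` is an assumption for illustration.

```python
import numpy as np

def matrix_regression(As, B, lam):
    """Solve min_x ||x_1 A_1 + ... + x_k A_k - B||_F^2 + lam^2 ||x||_2^2.

    A minimal vectorization-based sketch (not the paper's direct method):
    each A_i is flattened into a column of the mn-by-k design matrix M,
    reducing the problem to ridge-regularized least squares.
    """
    M = np.column_stack([A.ravel() for A in As])  # mn x k design matrix
    k = M.shape[1]
    # Normal equations with Tikhonov term: (M^T M + lam^2 I) x = M^T vec(B)
    return np.linalg.solve(M.T @ M + lam**2 * np.eye(k), M.T @ B.ravel())
```

For λ = 0 and linearly independent A_i this recovers the exact coefficients whenever B lies in their span; a positive λ shrinks the coefficient vector toward zero, as usual for ridge regularization.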
- Published
- 2020