Accurate Tensor Completion via Adaptive Low-Rank Representation.
- Source :
- IEEE Transactions on Neural Networks & Learning Systems, Oct 2020, Vol. 31, Issue 10, p4170-4184. 15p.
- Publication Year :
- 2020
Abstract
- Low-rank representation-based approaches that assume low-rank tensors and exploit their low-rank structure with appropriate prior models have underpinned much of the recent progress in tensor completion. However, real tensor data only approximately comply with the low-rank requirement in most cases; that is, the tensor consists of both low-rank (e.g., the principal part) and non-low-rank (e.g., details) structures, which limits the completion accuracy of these approaches. To address this problem, we propose an adaptive low-rank representation model for tensor completion that represents the low-rank and non-low-rank structures of a latent tensor separately in a Bayesian framework. Specifically, we reformulate the CANDECOMP/PARAFAC (CP) tensor rank and develop a sparsity-induced prior for the low-rank structure that can be used to determine the tensor rank automatically. The non-low-rank structure is then modeled using a mixture-of-Gaussians prior that is shown to be sufficiently flexible and powerful to inform the completion process for a variety of real tensor data. With these two priors, we develop a Bayesian minimum mean-squared error (MMSE) estimation framework for inference. The developed framework can capture the important distinctions between low-rank and non-low-rank structures, thereby enabling more accurate modeling and, ultimately, more accurate completion. Across a variety of applications, the proposed model yields more accurate completion results than state-of-the-art methods. [ABSTRACT FROM AUTHOR]
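To make the setting concrete, below is a minimal, illustrative sketch of the general idea the abstract describes: completing a partially observed 3-way tensor through its CP factorization, with a simple column-energy pruning step standing in for the paper's sparsity-induced prior and automatic rank determination. This is not the authors' Bayesian MMSE algorithm (in particular, it omits the mixture-of-Gaussians model of the non-low-rank structure); the function names, the ridge weight `lam`, and the pruning threshold `tol` are all illustrative assumptions.

```python
# Sketch only: masked CP-ALS completion with heuristic rank pruning.
# Not the paper's Bayesian MMSE method; all parameters are illustrative.
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding of a 3-way tensor into a matrix."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def cp_complete(X, mask, rank=10, lam=1e-3, tol=1e-3, iters=100, seed=0):
    """Impute the missing entries of X (mask is True where observed)."""
    rng = np.random.default_rng(seed)
    factors = [rng.standard_normal((d, rank)) for d in X.shape]
    for _ in range(iters):
        # Fill unobserved entries with the current low-rank reconstruction.
        recon = np.einsum('ir,jr,kr->ijk', *factors)
        T = np.where(mask, X, recon)
        for mode in range(3):
            # Khatri-Rao product of the two factors not being updated.
            A, B = [factors[m] for m in range(3) if m != mode]
            kr = (A[:, None, :] * B[None, :, :]).reshape(-1, rank)
            # Ridge-regularized least-squares update of the current factor.
            G = kr.T @ kr + lam * np.eye(rank)
            factors[mode] = np.linalg.solve(G, kr.T @ unfold(T, mode).T).T
        # Crude stand-in for automatic rank determination: prune rank-1
        # components whose joint column energy has collapsed toward zero.
        energy = np.prod([np.linalg.norm(F, axis=0) for F in factors], axis=0)
        keep = energy > tol * energy.max()
        if keep.sum() < rank:
            factors = [F[:, keep] for F in factors]
            rank = int(keep.sum())
    return np.where(mask, X, np.einsum('ir,jr,kr->ijk', *factors))

# Toy usage: recover a synthetic rank-3 tensor from 40% of its entries.
rng = np.random.default_rng(1)
U, V, W = (rng.standard_normal((n, 3)) for n in (20, 20, 20))
X = np.einsum('ir,jr,kr->ijk', U, V, W)
mask = rng.random(X.shape) < 0.4
Xhat = cp_complete(X, mask, rank=10)
print(np.linalg.norm((Xhat - X)[~mask]) / np.linalg.norm(X[~mask]))
```

The overparameterize-then-prune pattern loosely mirrors the role of the sparsity-induced prior: start with a generous CP rank and let the fitting process shrink unneeded rank-1 components, rather than fixing the rank in advance.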
- Subjects :
- *ORTHOTROPIC plates
*SCIENTIFIC computing
- Language :
- English
- ISSN :
- 2162-237X
- Volume :
- 31
- Issue :
- 10
- Database :
- Academic Search Index
- Journal :
- IEEE Transactions on Neural Networks & Learning Systems
- Publication Type :
- Periodical
- Accession number :
- 146358980
- Full Text :
- https://doi.org/10.1109/TNNLS.2019.2952427