Generalization of the K-SVD algorithm for minimization of β-divergence
- Author
- Tuomas Virtanen, Pablo San Juan, Pedro Alonso, Victor M. Garcia-Molla, and Antonio M. Vidal
- Subjects
Beta-divergence, Computer science, Generalization, Matrix norm, Non-negative matrix factorization, Set (abstract data type), Artificial Intelligence, Computer Science and Artificial Intelligence, NMF, Electrical and Electronic Engineering, Divergence (statistics), Linear combination, K-SVD, Applied Mathematics, Matching pursuit algorithms, Computational Theory and Mathematics, Signal Processing, Nonnegative K-SVD, Computer Vision and Pattern Recognition, Minimization, Statistics, Probability and Uncertainty, Algorithm
- Abstract
In this paper, we propose, describe, and test a modification of the K-SVD algorithm. Given a set of training data, the proposed algorithm computes an overcomplete dictionary by minimizing the β-divergence (β ≥ 1) between the data and its representation as linear combinations of atoms of the dictionary, under strict sparsity restrictions. For the special case β = 2, the proposed algorithm minimizes the Frobenius norm and, therefore, for β = 2 the proposed algorithm is equivalent to the original K-SVD algorithm. We describe the modifications needed and discuss the possible shortcomings of the new algorithm. The algorithm is tested with random matrices and with an example based on speech separation.
This work has been partially supported by the EU together with the Spanish Government through TEC2015-67387-C4-1-R (MINECO/FEDER) and by Programa de FPU del Ministerio de Educacion, Cultura y Deporte FPU13/03828 (Spain).
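For context, the β-divergence named in the abstract is the standard family used in NMF. The following NumPy sketch is an illustration, not the authors' code: it computes the divergence between a data matrix and its reconstruction, and checks that β = 2 reduces to half the squared Frobenius norm, consistent with the abstract's claim that β = 2 recovers the original K-SVD objective.

```python
import numpy as np

def beta_divergence(X, Y, beta):
    """Sum of elementwise beta-divergences d_beta(x || y) over all entries.

    Assumes nonnegative data (strictly positive where logs appear),
    as is usual in NMF-style settings.
    beta = 2: half the squared Frobenius norm ||X - Y||_F^2 / 2
    beta = 1: generalized Kullback-Leibler divergence
    beta = 0: Itakura-Saito divergence
    """
    X = np.asarray(X, dtype=float)
    Y = np.asarray(Y, dtype=float)
    if beta == 2:  # Euclidean case: the original K-SVD objective (up to 1/2)
        return 0.5 * np.sum((X - Y) ** 2)
    if beta == 1:  # generalized Kullback-Leibler
        return np.sum(X * np.log(X / Y) - X + Y)
    if beta == 0:  # Itakura-Saito
        return np.sum(X / Y - np.log(X / Y) - 1)
    # General member of the beta-divergence family
    return np.sum(
        (X**beta + (beta - 1) * Y**beta - beta * X * Y ** (beta - 1))
        / (beta * (beta - 1))
    )

# Sanity check: beta = 2 coincides with half the squared Frobenius norm.
X = np.abs(np.random.randn(4, 6))
Y = np.abs(np.random.randn(4, 6))
assert np.isclose(beta_divergence(X, Y, 2),
                  0.5 * np.linalg.norm(X - Y, "fro") ** 2)
```

In a dictionary-learning setting, Y would be the reconstruction D @ W from dictionary atoms D and a sparse coefficient matrix W; the paper's contribution is minimizing this divergence (rather than only the Frobenius norm) within the K-SVD update scheme.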
- Published
- 2019