
Dictionary Optimization for Block-Sparse Representations.

Authors :
Zelnik-Manor, Lihi
Rosenblum, Kevin
Eldar, Yonina C.
Source :
IEEE Transactions on Signal Processing. May 2012, Vol. 60, Issue 5, p2386-2395. 10p.
Publication Year :
2012

Abstract

Recent work has demonstrated that using a carefully designed dictionary instead of a predefined one can improve the sparsity in jointly representing a class of signals. This has motivated the derivation of learning methods for designing a dictionary that leads to the sparsest representation for a given set of signals. In some applications, the signals of interest can have further structure, so that they can be well approximated by a union of a small number of subspaces (e.g., face recognition and motion segmentation). This implies the existence of a dictionary which enables block-sparse representations of the input signals once its atoms are properly sorted into blocks. In this paper, we propose an algorithm for learning a block-sparsifying dictionary for a given set of signals. We do not require prior knowledge of the association of signals into groups (subspaces). Instead, we develop a method that automatically detects the underlying block structure given the maximal size of those groups. This is achieved by iteratively alternating between updating the block structure of the dictionary and updating the dictionary atoms to better fit the data. Our experiments show that, for block-sparse data, the proposed algorithm significantly improves the dictionary recovery ability and lowers the representation error compared to dictionary learning methods that do not employ block structure.
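The abstract describes an alternating scheme: code the signals against the current dictionary, regroup atoms into blocks of bounded size, then refit the atoms. The Python sketch below illustrates one possible reading of that loop; the greedy block-wise coding step, the agglomerative co-usage heuristic for detecting blocks, and the least-squares atom update are illustrative assumptions, not the paper's exact algorithm, and all function names are hypothetical.

```python
# Minimal sketch of an alternating block-sparse dictionary learning loop,
# assuming: greedy block coding, co-usage-based block detection, and a
# simple least-squares atom refit. Illustrative only.
import numpy as np


def block_sparse_code(Y, D, blocks, k_blocks):
    """Greedily select k_blocks blocks per signal and solve least squares."""
    n, m = Y.shape[1], D.shape[1]
    X = np.zeros((m, n))
    for i in range(n):
        y = Y[:, i]
        residual = y.copy()
        chosen = []
        for _ in range(k_blocks):
            # score each unused block by the energy of its correlation with the residual
            scores = [np.linalg.norm(D[:, b].T @ residual) if j not in chosen else -1.0
                      for j, b in enumerate(blocks)]
            chosen.append(int(np.argmax(scores)))
            idx = np.concatenate([blocks[j] for j in chosen])
            coef, *_ = np.linalg.lstsq(D[:, idx], y, rcond=None)
            residual = y - D[:, idx] @ coef
        X[idx, i] = coef
    return X


def update_blocks(X, max_block_size):
    """Agglomeratively merge atoms that are often used together (heuristic)."""
    used = np.abs(X) > 1e-8
    blocks = [[j] for j in range(X.shape[0])]
    while True:
        best, pair = 0, None
        for a in range(len(blocks)):
            for b in range(a + 1, len(blocks)):
                if len(blocks[a]) + len(blocks[b]) > max_block_size:
                    continue
                score = np.sum(used[blocks[a]].any(0) & used[blocks[b]].any(0))
                if score > best:
                    best, pair = score, (a, b)
        if pair is None:
            break
        a, b = pair
        blocks[a] += blocks[b]
        del blocks[b]
    return [np.array(b) for b in blocks]


def update_dictionary(Y, X, D):
    """Refit each atom by least squares on the signals that use it."""
    for j in range(D.shape[1]):
        users = np.nonzero(np.abs(X[j]) > 1e-8)[0]
        if users.size == 0:
            continue
        # residual of the users' signals with atom j removed
        E = Y[:, users] - D @ X[:, users] + np.outer(D[:, j], X[j, users])
        d = E @ X[j, users]
        D[:, j] = d / (np.linalg.norm(d) + 1e-12)
    return D


def learn_block_dictionary(Y, m, max_block_size, k_blocks=2, iters=10, seed=0):
    """Alternate between block-structure detection and atom updates."""
    rng = np.random.default_rng(seed)
    D = rng.standard_normal((Y.shape[0], m))
    D /= np.linalg.norm(D, axis=0)
    blocks = [np.array([j]) for j in range(m)]   # start with singleton blocks
    for _ in range(iters):
        X = block_sparse_code(Y, D, blocks, k_blocks)
        blocks = update_blocks(X, max_block_size)  # detect the block structure
        D = update_dictionary(Y, X, D)             # refit atoms to the data
    return D, blocks
```

The key point the sketch tries to convey is that the block structure is not given in advance: only the maximal block size is fixed, and the grouping of atoms is re-estimated from the coefficients at every iteration.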

Details

Language :
English
ISSN :
1053-587X
Volume :
60
Issue :
5
Database :
Academic Search Index
Journal :
IEEE Transactions on Signal Processing
Publication Type :
Academic Journal
Accession number :
74305447
Full Text :
https://doi.org/10.1109/TSP.2012.2187642