Sparse decompositions are similar to low-rank decompositions, except that the first matrix is constrained to be sparse rather than low-rank. Sparse decompositions arise in the following problem formulations: sparse dictionary learning, sparse linear approximation, and compressive sensing.
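As a concrete illustration of one of these formulations, sparse linear approximation seeks the sparsest coefficient vector that reconstructs a signal over a given dictionary. The sketch below is a minimal example using ISTA (iterative soft-thresholding); the dictionary D, the signal y, and the parameters lam and n_iter are illustrative choices, not taken from this page.

```python
# Minimal sketch of sparse linear approximation via ISTA
# (iterative soft-thresholding); names and parameters are illustrative.
import numpy as np

def ista(D, y, lam=0.1, n_iter=200):
    """Approximately solve min_x 0.5*||y - D x||_2^2 + lam*||x||_1 by ISTA."""
    # Step size from the Lipschitz constant of the gradient (largest eigenvalue of D^T D).
    L = np.linalg.norm(D, 2) ** 2
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        # Gradient step on the smooth data-fidelity term.
        grad = D.T @ (D @ x - y)
        z = x - grad / L
        # Soft-thresholding enforces sparsity (proximal operator of the l1 norm).
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return x

# Toy usage: a random dictionary and a signal built from 3 of its atoms.
rng = np.random.default_rng(0)
D = rng.standard_normal((50, 100))
x_true = np.zeros(100)
x_true[[5, 27, 80]] = [1.5, -2.0, 0.7]
y = D @ x_true
x_hat = ista(D, y, lam=0.05, n_iter=500)
print("non-zero coefficients recovered:", np.flatnonzero(np.abs(x_hat) > 0.1))
```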
Robust Decomposition in Sparse + Additive Matrices
All the decompositions in these different problem formulations of robust subspace learning/tracking can be considered in a unified view that we call Decomposition into Sparse plus Additive Matrices (DSAM). Thus, all the decompositions can be written in a general formulation as follows:
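The general formulation itself does not appear on this page; a plausible sketch, written by analogy with the companion decomposition into low-rank plus additive matrices (DLAM) from the cited reviews, is:

\[
A \;=\; \sum_{k=1}^{K} M_k \;=\; M_1 + M_2 + M_3, \qquad K \le 3,
\]

where the observation matrix \(A\) is split into at most three terms: a first matrix \(M_1\) constrained to be sparse, and additive matrices \(M_2\) and \(M_3\) (typically a low-rank or smooth component and a noise term), whose presence and constraints depend on the particular problem formulation.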
Key Characteristics
Applications to Statistical Modeling and Computer Vision
The different robust problem formulations based on the decomposition into sparse plus additive matrices are fundamental in several applications. Indeed, as this decomposition is nonparametric and makes few assumptions, it is widely applicable to a broad range of problems in statistical modeling and computer vision.
Author: Thierry BOUWMANS, Associate Professor, Lab. MIA, Univ. La Rochelle, France.
Fair Use Policy
As this website provides a lot of information that comes from my research, please cite the following survey papers:
T. Bouwmans, A. Sobral, S. Javed, S. Jung, E. Zahzah, "Decomposition into Low-rank plus Additive Matrices for Background/Foreground Separation: A Review for a Comparative Evaluation with a Large-Scale Dataset", Computer Science Review, November 2016. [pdf]
T. Bouwmans, E. Zahzah, "Robust PCA via Principal Component Pursuit: A Review for a Comparative Evaluation in Video Surveillance", Special Issue on Background Models Challenge, Computer Vision and Image Understanding, CVIU 2014, Volume 122, pages 22-34, May 2014. [pdf]