A Unified View of Robust Decomposition in Low-rank + Additive Matrices 

Low-rank and sparse matrix decomposition from noisy matrix observations (Image from Lu et al. 2019)

1. Introduction

Problem formulations of robust subspace learning/tracking by decomposition into low-rank plus additive matrices provide a suitable framework for image and video analysis. The representative problem formulation is Robust Principal Component Analysis (RPCA), solved via Principal Component Pursuit, which decomposes a data matrix into a low-rank matrix and a sparse matrix. However, similar implicit or explicit decompositions can be achieved in the following related problem formulations: Robust Non-negative Matrix Factorization (RNMF), Robust Subspace Recovery (RSR), Robust Subspace Tracking (RST), Robust Matrix Completion (RMC) and Robust Low-Rank Minimization (RLRM).
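As a point of reference, the convex Principal Component Pursuit program commonly used to solve RPCA can be sketched as

\[
\min_{L,\,S} \ \|L\|_{*} + \lambda \|S\|_{1} \quad \text{subject to} \quad A = L + S,
\]

where \(\|L\|_{*}\) denotes the nuclear norm (sum of singular values) of the low-rank component and \(\|S\|_{1}\) the elementwise \(\ell_1\)-norm of the sparse component; the weight \(\lambda\) is a tuning parameter, a standard choice being \(1/\sqrt{\max(m,n)}\) for an \(m \times n\) matrix \(A\).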

2. Robust Decomposition in Low-rank + Additive Matrices: Principle

All the decompositions in these different problem formulations of robust subspace learning/tracking can be considered in a unified view that we call Decomposition into Low-rank plus Additive Matrices (DLAM). Thus, all the decompositions can be written in a general formulation as follows:
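A general form consistent with the cases K = 1, 2 and 3 discussed below can be sketched as

\[
A = \sum_{k=1}^{K} M_k,
\]

where the number of terms K and the constraints imposed on each matrix \(M_k\) depend on the problem formulation; in the cases below, \(M_1 = L\) (low-rank), \(M_2 = S\) (sparse outliers) and \(M_3 = E\) (noise).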

Practically, the decomposition is implicit when K = 1. This is the degenerate case for problem formulations in their basic form (LRM, MC, etc.), which are not robust because no constraints are placed on the matrix S = A - L.

The decomposition is explicit when K = 2, and we have A = L + S. This is the case for problem formulations in their robust form (RLRM, RMC, RPCA, RNMF, RSR, RST).
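For the explicit K = 2 case, the decomposition A = L + S can be computed numerically with a standard alternating-directions (ADMM) solver for Principal Component Pursuit. The sketch below is illustrative only and is not tied to this page: the function names (pcp, shrink, svd_shrink), the default weight λ = 1/√max(m, n) and the penalty heuristic for μ are assumptions of this example.

```python
import numpy as np

def shrink(X, tau):
    """Elementwise soft-thresholding (shrinkage) operator."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def svd_shrink(X, tau):
    """Singular value thresholding: soft-threshold the singular values of X."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(shrink(s, tau)) @ Vt

def pcp(A, lam=None, mu=None, tol=1e-7, max_iter=500):
    """Split A into low-rank L and sparse S via Principal Component Pursuit (ADMM)."""
    m, n = A.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))               # standard weight
    mu = mu if mu is not None else 0.25 * m * n / (np.abs(A).sum() + 1e-12)  # penalty heuristic
    Y = np.zeros_like(A)   # dual variable (Lagrange multiplier)
    S = np.zeros_like(A)
    norm_A = np.linalg.norm(A, "fro")
    for _ in range(max_iter):
        L = svd_shrink(A - S + Y / mu, 1.0 / mu)   # low-rank update
        S = shrink(A - L + Y / mu, lam / mu)       # sparse update
        R = A - L - S                              # primal residual
        Y = Y + mu * R                             # dual ascent step
        if np.linalg.norm(R, "fro") <= tol * norm_A:
            break
    return L, S

if __name__ == "__main__":
    # Toy check: a rank-1 matrix corrupted by a few large outliers.
    rng = np.random.default_rng(0)
    L_true = np.outer(rng.standard_normal(60), rng.standard_normal(40))
    S_true = np.zeros_like(L_true)
    S_true[rng.random(L_true.shape) < 0.05] = 10.0
    L_hat, S_hat = pcp(L_true + S_true)
    print(np.linalg.matrix_rank(L_hat, tol=1e-3), np.count_nonzero(np.abs(S_hat) > 1.0))
```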

In the case of K = 3, the decomposition is A = L + S + E. This explicit decomposition is called a "stable" decomposition as it separates the outliers in S from the noise in E.
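To make the stable case concrete, a commonly used formulation (often referred to as stable PCP) relaxes the equality constraint so that a small dense residual is tolerated; as a sketch, with a noise level δ assumed here for illustration:

\[
\min_{L,\,S} \ \|L\|_{*} + \lambda \|S\|_{1} \quad \text{subject to} \quad \|A - L - S\|_{F} \leq \delta,
\]

so that E = A - L - S absorbs the small dense noise while S keeps the gross outliers.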

Figure 1 shows an overview of the proposed unified view.

Figure 1: A Unified View of Problem Formulations via Decomposition into Low-rank plus Additive Matrices (DLAM)

3. Key Characteristics

4. Applications

The different robust problem formulations based on the decomposition into low-rank plus additive matrices are fundamental in several applications. Indeed, as this decomposition is nonparametric and makes few assumptions, it is widely applicable to a broad range of problems, including:

4.1 Applications in Statistics

4.2 Applications in Signal Processing (one-dimensional signals)

4.3 Applications in Computer Vision (two- and three-dimensional signals)

4.4 Applications in Computer Graphics

4.5 Applications in Computer Science

4.6 Applications in Astronomy

4.7 Applications in Industrial Process

Author: Thierry BOUWMANS, Associate Professor, Lab. MIA, Univ. La Rochelle, France.

Fair Use Policy

As this website provides a large amount of information coming from my research, please cite the following survey papers:

T. Bouwmans, A. Sobral, S. Javed, S. Jung, E. Zahzah, "Decomposition into Low-rank plus Additive Matrices for Background/Foreground Separation: A Review for a Comparative Evaluation with a Large-Scale Dataset", Computer Science Review, Volume 23, pages 1-71, February 2017. [pdf]

T. Bouwmans, E. Zahzah, “Robust PCA via Principal Component Pursuit: A Review for a Comparative Evaluation in Video Surveillance”, Special Issue on Background Models Challenge, Computer Vision and Image Understanding, CVIU 2014, Volume 122, pages 22–34, May 2014. [pdf]

Note: My publications are available on Academia, ResearchGate, Researchr, ORCID and Publication List.