#### Fast Algorithm for Visual Analysis, Matrix Completion and Multi-label Learning

Tianyi Zhou, Dacheng Tao, Centre for Quantum Computation and Intelligent Systems, University of Technology, Sydney

#### Abstract:

Low-rank and sparse structures have been extensively studied in matrix completion and compressed sensing. In this paper, we develop "Go Decomposition" (GoDec) to efficiently and robustly estimate the low-rank part $L$ and the sparse part $S$ of a matrix $X=L+S+G$ with noise $G$. GoDec alternately assigns the low-rank approximation of $X-S$ to $L$ and the sparse approximation of $X-L$ to $S$. The algorithm can be significantly accelerated by bilateral random projections (BRP). We also propose GoDec for matrix completion as an important variant. We prove that the objective value $\|X-L-S\|_F^2$ converges to a local minimum, while $L$ and $S$ linearly converge to local optima. Theoretically, we analyze the influence of $L$, $S$ and $G$ on the asymptotic/convergence speeds in order to characterize the robustness of GoDec. Empirical studies demonstrate the efficiency, robustness and effectiveness of GoDec compared with representative matrix decomposition and completion tools, e.g., Robust PCA and OptSpace.
GoDec can be extended to solve the multi-label learning problem by decomposing the multi-label data into the sum of several low-rank parts and a sparse residual, where each low-rank part corresponds to the mapping of a particular label in the feature space. Predictions are then obtained by finding the group-sparse representations of a new instance on the subspaces defined by the low-rank parts.
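The alternating scheme in the abstract can be sketched in a few lines of NumPy. This is a minimal, unaccelerated illustration (the plain SVD here is what BRP speeds up in the actual algorithm); the function name and parameter names are ours, not from the released code:

```python
import numpy as np

def godec(X, rank, card, iters=50):
    """Naive GoDec sketch: alternate low-rank and sparse projections.

    rank bounds rank(L); card bounds card(S), the number of nonzeros in S.
    """
    S = np.zeros_like(X)
    for _ in range(iters):
        # L-step: best rank-`rank` approximation of X - S (plain SVD here;
        # the paper accelerates this step with bilateral random projections)
        U, s, Vt = np.linalg.svd(X - S, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        # S-step: keep the `card` largest-magnitude entries of X - L
        R = X - L
        S = np.zeros_like(X)
        idx = np.argsort(np.abs(R), axis=None)[-card:]
        S.flat[idx] = R.flat[idx]
    return L, S
```

On a noise-free low-rank-plus-sparse matrix with `rank` and `card` chosen no smaller than the true values, the iterates typically recover $L$ and $S$ to high accuracy.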

2. Directly control the rank of the low-rank part and the sparsity of the sparse part, thus bringing great savings in time and space costs (via a trade-off between efficiency and accuracy);

3. Develop a randomized low-rank approximation method, "bilateral random projection" (BRP), to further accelerate the updates in the algorithm;
4. Linear convergence and robustness to noise can be theoretically proved via the "alternating projections on manifolds" scheme of Lewis and Malick;
5. Effective extensions to other problems such as matrix completion and multi-label learning.
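Contribution 3 above, bilateral random projection, can be sketched as follows. We follow the combination $L = Y_1 (A_2^\top Y_1)^{-1} Y_2^\top$ with $Y_1 = XA_1$ and $Y_2 = X^\top A_2$; re-using $Y_1$ as the second projection matrix and omitting the power scheme for noisy data are simplifying assumptions of this sketch:

```python
import numpy as np

def brp_lowrank(X, r, seed=0):
    """Rank-r approximation via bilateral random projections (BRP).

    Projects X from both sides with an r-column random matrix and combines
    the two sketches as L = Y1 (A2^T Y1)^{-1} Y2^T. When rank(X) <= r this
    recovers X exactly; only r-dimensional sketches and an r x r solve are
    needed, instead of a full SVD.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    A1 = rng.standard_normal((n, r))
    Y1 = X @ A1          # m x r right sketch
    A2 = Y1              # re-use the right sketch as the left projection
    Y2 = X.T @ A2        # n x r left sketch
    return Y1 @ np.linalg.solve(A2.T @ Y1, Y2.T)
```

The cost is dominated by the two matrix products, which is why replacing the SVD in the L-step of GoDec with BRP gives a large speedup for big matrices.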

NEWS

GreBsmo Code: You can now try the greedy version of GoDec, Greedy Bilateral Smoothing, from our AISTATS 2013 paper, Greedy Bilateral Sketch, Completion and Smoothing.

How can GoDec be extended to the nonlinear low-rank case? How can a simple matrix factorization method be developed for complex computer vision tasks such as detection, tracking and motion segmentation? Please refer to our IJCAI 2013 paper: Shifted Subspace Tracking on Sparse Outliers for Motion Segmentation.

How can GoDec be extended to solve the multi-label learning problem by leveraging the geometry of the label vector space? Please refer to our ICMR 2012 paper: Labelset Anchored Subspace Ensemble (LASE) for Multi-label Annotation.

How can GoDec be extended to solve the multi-label learning problem? Please refer to our AISTATS 2012 paper: Multi-label Subspace Ensemble.

Semi-Soft GoDec is released. Unlike the ordinary GoDec, which imposes hard thresholding on both the singular values of the low-rank part L and the entries of the sparse part S, Semi-Soft GoDec adopts soft thresholding for the entries of S. This change brings two key advantages: 1) the parameter k in the constraint "card(S)<=k" can now be automatically determined by the soft threshold \tau, avoiding the situation where k is chosen too large and part of the noise G leaks into S; 2) the time cost is substantially smaller than that of ordinary GoDec; for example, the background modeling experiments can be accomplished about 4 times faster than with ordinary GoDec, which means almost all the videos shown in our ICML paper can be processed in less than 10 seconds, while the error stays the same or even becomes smaller. This GoDec variant will be included in our journal version; the MATLAB code of Semi-Soft GoDec is now available for download. Enjoy it!
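The soft-thresholding S-step that distinguishes Semi-Soft GoDec from the hard cardinality projection can be sketched as a one-liner; the function name is ours, not from the released MATLAB code:

```python
import numpy as np

def soft_threshold(Z, tau):
    """Entrywise soft thresholding, the S-step of Semi-Soft GoDec:
    magnitudes are shrunk by tau and entries with |z| <= tau become zero,
    so the cardinality of S adapts to tau instead of a fixed budget k."""
    return np.sign(Z) * np.maximum(np.abs(Z) - tau, 0.0)
```

Because entries smaller than \tau are zeroed automatically, noise entries of G below the threshold never enter S, which is exactly the leakage problem described above for a too-large k.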