Tianyi Zhou, Dacheng Tao, Centre for Quantum Computation and Intelligent Systems, University of Technology, Sydney
GoDec can be extended to solve the multi-label learning problem by decomposing the multi-label data into the sum of several low-rank parts and a sparse residual, where each low-rank part corresponds to the mapping of a particular label in the feature space. Prediction is then obtained by finding the group-sparse representation of a new instance on the subspaces defined by the low-rank parts.
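As a rough illustration of the prediction step (the released code is MATLAB; this NumPy sketch uses a simple nearest-subspace residual rule as a proxy for the actual group-sparse coding solver, and all names are illustrative):

```python
import numpy as np

def label_scores(x, bases):
    """Score a new instance against per-label subspaces.

    `bases` is a list of matrices with orthonormal columns, each spanning
    the low-rank part learned for one label. A smaller reconstruction
    residual means the instance is better explained by that label's
    subspace. (Illustrative proxy for the group-sparse coding step.)
    """
    scores = []
    for U in bases:
        residual = x - U @ (U.T @ x)   # residual of projecting x onto span(U)
        scores.append(np.linalg.norm(residual))
    return np.array(scores)
```

Labels whose subspaces leave a small residual would then be assigned to the instance.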
2. Directly controls the rank of the low-rank part and the cardinality of the sparse part, which yields large savings in time and space (a trade-off between efficiency and accuracy);
3. Develops a randomized low-rank approximation method, "bilateral random projection (BRP)", to further accelerate the updates in the algorithm;
4. Linear convergence and robustness to noise can be proved theoretically via the "alternating projections on manifolds" framework of Lewis and Malick;
5. Effective extensions to other problems such as matrix completion and multi-label learning.
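To give a feel for point 3 above: the BRP update replaces a full SVD with two random sketches, L = Y1 (A2' Y1)^{-1} Y2', where Y1 = X A1 and Y2 = X' A2 for random matrices A1, A2. A minimal NumPy sketch of this basic formula (the released code is MATLAB, and the paper additionally uses a power scheme for numerical stability, omitted here):

```python
import numpy as np

def brp_low_rank(X, r, seed=0):
    """Rank-r approximation of X via bilateral random projection (BRP).

    Computes L = Y1 (A2^T Y1)^{-1} Y2^T with Y1 = X A1, Y2 = X^T A2,
    where A1 (n x r) and A2 (m x r) are random Gaussian matrices.
    The formula is exact whenever rank(X) <= r.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    A1 = rng.standard_normal((n, r))
    A2 = rng.standard_normal((m, r))
    Y1 = X @ A1          # m x r sketch of the column space
    Y2 = X.T @ A2        # n x r sketch of the row space
    # solve() applies (A2^T Y1)^{-1} to Y2^T without forming the inverse
    middle = np.linalg.solve(A2.T @ Y1, Y2.T)   # r x n
    return Y1 @ middle
```

Only matrix multiplications and one small r x r solve are needed, which is what makes the per-iteration cost of GoDec low.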
GreBsmo Code: You can now try the greedy version of GoDec, Greedy Bilateral Smoothing, from our AISTATS 2013 paper, Greedy Bilateral Sketch, Completion and Smoothing.
How to extend GoDec to the nonlinear low-rank case? How to develop a simple matrix factorization method for complex computer vision tasks such as detection, tracking, and motion segmentation? Please refer to our IJCAI 2013 paper: Shifted Subspace Tracking on Sparse Outliers for Motion Segmentation.
How to extend GoDec to solve the multi-label learning problem by leveraging the geometry of the label vector space? Please refer to our ICMR 2012 paper: Labelset Anchored Subspace Ensemble (LASE) for Multi-label Annotation.
How to extend GoDec to solve the multi-label learning problem? Please refer to our AISTATS 2012 paper: Multi-label Subspace Ensemble.
Semi-Soft GoDec is released. Unlike the ordinary GoDec, which imposes hard thresholding on both the singular values of the low-rank part L and the entries of the sparse part S, Semi-Soft GoDec applies soft thresholding to the entries of S. This change brings two key advantages: 1) the parameter k in the constraint "card(S)<=k" can now be determined automatically by the soft threshold \tau, which avoids the situation where k is chosen too large and some of the noise G leaks into S; 2) the time cost is substantially smaller than that of the ordinary GoDec; for example, the background modeling experiments run about 4 times faster than with ordinary GoDec, which means almost all the videos shown in our ICML paper can be processed in less than 10 seconds, while the error stays the same or even decreases. This GoDec variant will be included in our journal version; the MatLab code of Semi-Soft GoDec is now available for download. Enjoy it!
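The Semi-Soft variant described above alternates a hard rank constraint on L with an entrywise soft threshold on S. A minimal NumPy sketch, assuming a truncated SVD for the L-update for clarity (the released MATLAB code accelerates this step with BRP):

```python
import numpy as np

def soft_threshold(A, tau):
    """Entrywise soft thresholding: shrink magnitudes by tau toward zero."""
    return np.sign(A) * np.maximum(np.abs(A) - tau, 0.0)

def semisoft_godec(X, rank, tau, n_iter=20):
    """Sketch of Semi-Soft GoDec: X ~ L + S with rank(L) <= rank and
    S obtained by soft thresholding, so card(S) follows from tau
    automatically instead of being fixed in advance."""
    L = np.zeros_like(X)
    S = np.zeros_like(X)
    for _ in range(n_iter):
        # L-update: best rank-r approximation of X - S (truncated SVD here)
        U, s, Vt = np.linalg.svd(X - S, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        # S-update: soft threshold the residual; entries below tau go to zero
        S = soft_threshold(X - L, tau)
    return L, S
```

By construction, every entry of the final residual X - L - S has magnitude at most tau, which is what makes the noise/sparse-error split automatic.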
|Conference/Journal|Paper|Author|Keywords|Year|
|---|---|---|---|---|
|AISTATS 2012|Multilabel Subspace Ensemble (PDF)|Tianyi Zhou, Dacheng Tao|Low-rank, sparse, multi-label learning, sparse coding, structured learning|2012|
|ICML 2011|GoDec: Randomized Low-rank & Sparse Matrix Decomposition in Noisy Case (PDF, BIBTEX, DISCUSS, TALK, VIDEO, CODE)|Tianyi Zhou, Dacheng Tao|Low-rank, sparse, structured learning, bilateral random projection, matrix decomposition, visual analysis, matrix completion|2011|
|arXiv|Multi-label Learning via Structured Decomposition and Group Sparsity (PDF, BIBTEX, CODE)|Tianyi Zhou, Dacheng Tao|Low-rank, sparse, structured learning, bilateral random projection, matrix decomposition, multi-label learning|2011|