2015 Spring

Mar. 11 | Ju Sun

Complete Dictionary Recovery over the Sphere

We consider the problem of recovering a complete (i.e., square and invertible) matrix A from observations Y = AX, provided X is sufficiently sparse. This recovery problem is central to the theoretical understanding of dictionary learning, which seeks a sparse representation for a collection of input signals and finds numerous applications in modern signal processing and machine learning.
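To make the setup concrete, here is a minimal synthetic instance of the model. The parameter names (`n`, `p`, `theta`) and the Bernoulli-Gaussian choice for X are our own illustrative assumptions, not details taken from the talk:

```python
import numpy as np

# Illustrative sketch (our assumptions, not the talk's exact model):
# build Y = A X with a random square dictionary A and a sparse X whose
# entries are nonzero independently with probability theta, so each
# column has about theta * n nonzeros, i.e., O(n) nonzeros per column.
rng = np.random.default_rng(0)
n, p = 20, 500                          # dictionary size, number of samples
theta = 0.3                             # per-entry nonzero probability

A = rng.standard_normal((n, n))         # a generic square A is invertible
mask = rng.random((n, p)) < theta       # Bernoulli support pattern
X = mask * rng.standard_normal((n, p))  # Bernoulli-Gaussian coefficients
Y = A @ X                               # observed data

print(np.linalg.matrix_rank(A))         # n: A is square and invertible
print(mask.sum() / p)                   # average nonzeros per column, ~ theta * n
```

The recovery task is then to reconstruct A (and hence X) given only Y.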

We give the first efficient algorithm that provably recovers A when X has O(n) nonzeros per column, under a suitable probability model for X. In contrast, prior results based on efficient algorithms guarantee recovery only when X has O(n^{1/2}) nonzeros per column, severely limiting the model capacity of dictionary learning.

Our algorithmic pipeline centers on solving a certain nonconvex optimization problem with a spherical constraint, and hence is naturally phrased in the language of manifold optimization. To show that this apparently hard problem is tractable, we first provide a geometric characterization of the high-dimensional objective landscape, which shows that with high probability there are no spurious local minima. This particular geometric structure allows us to design a Riemannian trust-region algorithm over the sphere that provably converges to a global minimizer from an arbitrary initialization, despite the presence of saddle points.
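As a rough illustration of sphere-constrained optimization, the sketch below runs Riemannian gradient descent on a smooth sparsity surrogate f(q) = (1/p) Σᵢ μ log cosh(qᵀyᵢ / μ) over the unit sphere. Note the hedges: the talk describes a Riemannian trust-region method, whereas plain gradient steps with a normalization retraction are used here only to show the mechanics of optimizing on the sphere, and all data and parameter choices are ours:

```python
import numpy as np

# Simplified sketch (not the talk's algorithm): minimize a log-cosh
# sparsity surrogate over the unit sphere by Riemannian gradient descent.
rng = np.random.default_rng(1)
n, p, mu, step = 10, 200, 0.5, 0.1
Y = rng.standard_normal((n, p))         # stand-in data matrix

def f(q):
    """Smooth surrogate for the sparsity of q^T Y."""
    return np.mean(mu * np.log(np.cosh(Y.T @ q / mu)))

def riemannian_grad(q):
    g = Y @ np.tanh(Y.T @ q / mu) / p   # Euclidean gradient of f
    return g - (q @ g) * q              # project onto the tangent space at q

q = rng.standard_normal(n)
q /= np.linalg.norm(q)                  # random initialization on the sphere
f0 = f(q)
for _ in range(100):
    q = q - step * riemannian_grad(q)   # tangent-space step
    q /= np.linalg.norm(q)              # retraction back to the sphere

print(round(np.linalg.norm(q), 6))      # 1.0: the iterate stays on the sphere
```

The projection-plus-renormalization pattern is the generic recipe for sphere-constrained first-order methods; the trust-region machinery in the talk additionally exploits second-order information to escape saddle points with guarantees.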

Besides the new theoretical insight into the dictionary learning problem, the geometric approach we develop here may shed light on other problems arising from nonconvex recovery of structured signals.

This is joint work with Prof. John Wright and Mr. Qing Qu. Related papers:


* Ju Sun, Qing Qu, John Wright. Complete Dictionary Recovery over the Sphere: A Summary. Online: http://sunju.org/docs/DL_LSE_15.pdf

* Qing Qu, Ju Sun, John Wright. Finding a Sparse Vector in a Subspace: Linear Sparsity Using Alternating Direction. NIPS'14. Online: http://arxiv.org/abs/1412.4659