Thursday, June 23rd, 2016


Non-convex optimization lies at the heart of some exciting recent developments in machine learning, optimization, statistics and signal processing. Deep networks, Bayesian inference, matrix and tensor factorization, and dynamical systems are representative examples where non-convex methods constitute efficient -- and, in many cases, even more accurate -- alternatives to convex ones. However, unlike their convex counterparts, these non-convex approaches often lack theoretical justification.


These observations have triggered attempts to answer when and why non-convex methods perform well in practice, in the hope that such an understanding might lead to a new algorithmic paradigm for designing faster and better algorithms. This workshop will present some of the most recent developments in non-convex analysis and optimization, as reported across diverse research fields: from machine learning and mathematical programming to statistics and theoretical computer science. We believe the workshop can bring these communities closer together and facilitate a discussion of why tackling non-convexity is important, where it arises, why non-convex schemes work well in practice, and how we can make further progress on interesting research directions and open problems.