The workshop focuses on the use of low-rank models in signal processing, data mining, and machine learning. This includes, but is not restricted to, all aspects of low-rank matrix/tensor approximations, such as nonnegative matrix factorization, (subspace) clustering, independent component analysis, low-rank matrix completion, sparse component analysis, dictionary learning, and tensor decompositions. The workshop is also interested in the optimization algorithms that are central to computing such decompositions. Many aspects of this class of problems will be discussed, including theory (complexity, identifiability, etc.), algorithms, and applications.
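As a small illustration of one of the decompositions named above, the following is a minimal sketch of nonnegative matrix factorization; the random data matrix, the chosen rank, and the use of Lee-Seung multiplicative updates for the Frobenius loss are illustrative assumptions, not material from the workshop itself.

```python
# Illustrative sketch only: NMF via Lee-Seung multiplicative updates.
# The data X, rank r, and iteration count are arbitrary placeholders.
import numpy as np

def nmf(X, r, n_iter=200, eps=1e-10):
    """Approximate a nonnegative matrix X by W @ H with W, H >= 0."""
    m, n = X.shape
    rng = np.random.default_rng(0)
    W = rng.random((m, r))
    H = rng.random((r, n))
    for _ in range(n_iter):
        # Multiplicative updates keep W and H elementwise nonnegative
        # while decreasing the Frobenius-norm reconstruction error.
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

if __name__ == "__main__":
    X = np.abs(np.random.default_rng(1).random((20, 30)))
    W, H = nmf(X, r=5)
    print("relative error:", np.linalg.norm(X - W @ H) / np.linalg.norm(X))
```

Multiplicative updates are only one of many possible optimization schemes for such factorizations, which is precisely the algorithmic question the workshop addresses.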