Syllabus [pdf]

Grades

Final project, 50%: The final project should involve reading one or more research papers related to the course, identifying an open research problem, and making progress toward resolving it. Around the middle of the semester, a one-page project proposal describing the problem, along with all references, will be due.

Weekly reading reports and participation, 35%: There will be a research paper reading assignment every week. Together with a partner, you are expected to submit a written report summarizing the paper, pointing out its main contributions and techniques, and giving your opinion of the quality of the work. We will discuss these papers in lecture every two weeks. Everyone is expected to contribute to each discussion and to lead one or two discussions during the term.

Homework, 15%: There will be a few light homework exercises assigned throughout the semester. You are encouraged to work on the problems with others, but you must write up your solutions on your own.

Tentative List of Topics

  1. Course Overview and Motivating Examples

  2. Examples of CP decomposition - latent variable models

  3. Examples of CP decomposition - tensor-shaped data

  4. Examples of CP decomposition - blind source separation, ICA, the method of moments

  5. Examples of CP decomposition - Gaussian mixture models

  6. CP decomposition - properties

  7. CP decomposition - properties continued

  8. CP decomposition - properties continued

  9. CP decomposition - properties continued

  10. CP decomposition - properties continued

  11. Algorithms for CP decomposition - alternating least squares (a code sketch appears after this list)

  12. Algorithms for CP decomposition - Jennrich’s algorithm, tensor power method and eigenvectors of tensors

  13. Eigenvectors of tensors, the tensor power method, orthogonally decomposable tensors (see the power method sketch after this list)

  14. Overcomplete symmetric CP decompositions - the subspace power method

  15. Atomic norm minimization and the tensor nuclear norm

  16. Tensor network decompositions - motivation from quantum physics

  17. Tensor network decompositions - Tucker, MPS (aka Tensor Train), PEPS, MERA

  18. Graphical models - undirected, Markov properties

  19. Correspondence between tensor networks and graphical models

  20. Nonnegative matrix decompositions - nonnegative rank, properties, examples

  21. Nonnegative matrix decompositions - alternating least squares, EM, other algorithms (see the multiplicative updates sketch after this list)

  22. Nonnegative matrix decompositions - geometric description

  23. Nonnegative tensor decompositions - properties, examples, algorithms

  24. Total positivity - properties and relationship to nonnegative tensor decomposition

  25. Graphical models - directed acyclic, Markov properties, equivalence classes

  26. Linear structural equation models - Gaussian vs. non-Gaussian; learning non-Gaussian LSEMs via ICA

  27. Presentations

  28. Presentations

  29. Presentations
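
Code Sketches

To give a concrete feel for item 11, here is a minimal NumPy sketch of alternating least squares (ALS) for a rank-R CP decomposition of a third-order tensor. The function names (cp_als, khatri_rao), the unfolding conventions, and the fixed iteration count are illustrative choices, not part of the course materials.

    import numpy as np

    def khatri_rao(U, V):
        """Column-wise Kronecker product of U (m, R) and V (n, R) -> (m*n, R)."""
        m, R = U.shape
        n = V.shape[0]
        return (U[:, None, :] * V[None, :, :]).reshape(m * n, R)

    def cp_als(T, R, n_iter=200, seed=0):
        """Rank-R CP approximation of a 3-way array T via alternating least squares."""
        I, J, K = T.shape
        rng = np.random.default_rng(seed)
        A = rng.standard_normal((I, R))
        B = rng.standard_normal((J, R))
        C = rng.standard_normal((K, R))
        T1 = T.reshape(I, J * K)                     # mode-1 unfolding
        T2 = T.transpose(1, 0, 2).reshape(J, I * K)  # mode-2 unfolding
        T3 = T.transpose(2, 0, 1).reshape(K, I * J)  # mode-3 unfolding
        for _ in range(n_iter):
            # With two factors fixed, each update is an ordinary least-squares solve.
            A = np.linalg.lstsq(khatri_rao(B, C), T1.T, rcond=None)[0].T
            B = np.linalg.lstsq(khatri_rao(A, C), T2.T, rcond=None)[0].T
            C = np.linalg.lstsq(khatri_rao(A, B), T3.T, rcond=None)[0].T
        return A, B, C

Each subproblem is linear least squares, which is what makes ALS easy to implement; the overall objective is nonconvex, so convergence to a global optimum is not guaranteed.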
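
For the tensor power method of items 12-13, the sketch below runs a single power iteration on a symmetric 3-way array; for an orthogonally decomposable tensor, a generic initialization converges to one of the components. The helper name tensor_power_iteration is again an illustrative choice.

    import numpy as np

    def tensor_power_iteration(T, n_iter=100, seed=0):
        """One run of the symmetric tensor power method on a 3-way array T."""
        n = T.shape[0]
        rng = np.random.default_rng(seed)
        x = rng.standard_normal(n)
        x /= np.linalg.norm(x)
        for _ in range(n_iter):
            # Contract T against x in two modes: y_i = sum_{j,k} T[i,j,k] x_j x_k.
            y = np.einsum('ijk,j,k->i', T, x, x)
            x = y / np.linalg.norm(y)
        eigval = np.einsum('ijk,i,j,k->', T, x, x, x)  # Rayleigh-type quotient
        return eigval, x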
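
Finally, for the "other algorithms" of item 21, here is a sketch of the Lee-Seung multiplicative updates for nonnegative matrix factorization under the squared Frobenius objective; the small constant eps, which guards against division by zero, is an implementation choice.

    import numpy as np

    def nmf_multiplicative(X, R, n_iter=500, seed=0, eps=1e-12):
        """Approximate X (entrywise >= 0) by W @ H with W, H entrywise >= 0."""
        m, n = X.shape
        rng = np.random.default_rng(seed)
        W = rng.random((m, R)) + eps
        H = rng.random((R, n)) + eps
        for _ in range(n_iter):
            # Multiplicative updates keep W and H nonnegative and do not
            # increase the Frobenius error ||X - W @ H||.
            H *= (W.T @ X) / (W.T @ W @ H + eps)
            W *= (X @ H.T) / (W @ H @ H.T + eps)
        return W, H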