Grades
Final project, 50%: The final project should involve reading one or more research papers related to the course, identifying an open research problem, and making progress towards resolving it. Around the middle of the semester, a one-page project proposal describing the problem, together with all references, will be due.
Weekly reading reports and participation, 35%: There will be a research paper reading assignment every week. Together with a partner, you are expected to submit a written report summarizing the paper, pointing out its main contributions and techniques, and adding your opinion on the quality of the work. We will discuss these papers in lecture every two weeks. Everyone will be expected to contribute to each discussion and to lead one or two discussions during the term.
Homework, 15%: There will be a few light homework exercises assigned throughout the semester. You are encouraged to work on the problems with others, but you must write up your solutions on your own.
Tentative List of Topics
Course Overview and Motivating Examples
Examples of CP decomposition - latent variable models
Examples of CP decomposition - tensor-shaped data
Examples of CP decomposition - blind source separation, ICA, the method of moments
Examples of CP decomposition - Gaussian mixture models
CP decomposition - properties
CP decomposition - properties continued
CP decomposition - properties continued
CP decomposition - properties continued
CP decomposition - properties continued
Algorithms for CP decomposition - alternating least squares
Algorithms for CP decomposition - Jennrich’s algorithm, tensor power method and eigenvectors of tensors
Eigenvectors of tensors, the tensor power method, orthogonally decomposable tensors
Overcomplete symmetric CP decompositions - the subspace power method
Atomic norm minimization and the tensor nuclear norm
Tensor network decompositions - motivation from quantum physics
Tensor network decompositions - Tucker, MPS (aka Tensor Train), PEPS, MERA
Graphical models - undirected, Markov properties
Correspondence between tensor networks and graphical models
Nonnegative matrix decompositions - nonnegative rank, properties, examples
Nonnegative matrix decompositions - alternating least squares, EM, other algorithms
Nonnegative matrix decompositions - geometric description
Nonnegative tensor decompositions - properties, examples, algorithms
Total positivity - properties and relationship to nonnegative tensor decomposition
Graphical models - directed acyclic, Markov properties, equivalence classes
Linear structural equation models - Gaussian vs. non-Gaussian; learning non-Gaussian LSEMs via ICA
Presentations
Presentations
Presentations