References

The following references are relevant to the topics covered in this course. More materials will be added throughout the course.

  1. E. Allman, J. Rhodes, B. Sturmfels, and P. Zwiernik. Tensors of Nonnegative Rank Two. Linear Algebra and its Applications, 473:37-53, 2015.

  2. A. Anandkumar, R. Ge, D. Hsu, S. Kakade, and M. Telgarsky. Tensor Decompositions for Learning Latent Variable Models. Journal of Machine Learning Research, 15(80):2773-2832, 2014.

  3. A. Anandkumar, R. Ge, and M. Janzamin. Learning Overcomplete Latent Variable Models through Tensor Methods. JMLR: Workshop and Conference Proceedings, 40:1-77, 2015.

  4. S. Boyd and L. Vandenberghe. Convex Optimization. Cambridge University Press, 2004.

  5. J. Bridgeman and C. Chubb. Hand-waving and Interpretive Dance: An Introductory Course on Tensor Networks. Journal of Physics A: Mathematical and Theoretical, 50(22):223001, 2017.

  6. D. Cartwright and B. Sturmfels. The Number of Eigenvectors of a Tensor. Linear Algebra and its Applications, 438(2):942-952, 2013.

  7. P. Comon and C. Jutten. Handbook of Blind Source Separation: Independent Component Analysis and Applications. Academic Press, 2010.

  8. V. de Silva and L.-H. Lim. Tensor rank and the ill-posedness of the best low-rank approximation problem. SIAM Journal on Matrix Analysis and Applications, 30(3):1084-1127, 2008.

  9. M. Drton. Algebraic problems in structural equation modeling. In The 50th Anniversary of Gröbner Bases (T. Hibi, ed.), Advanced Studies in Pure Mathematics, Mathematical Society of Japan, Tokyo, 35-86, 2018.

  10. S. Friedland. Best rank one approximation of real symmetric tensors can be chosen symmetric. Frontiers of Mathematics in China, 8(1):19-40, 2013.

  11. S. Friedland and L.-H. Lim. Nuclear norm of higher-order tensors. Mathematics of Computation, 87:1255-1281, 2018.

  12. C. Hillar and L.-H. Lim. Most Tensor Problems are NP-Hard. Journal of the ACM, 60(6), Article 45, 2013.

  13. F. L. Hitchcock. The expression of a tensor or a polyadic as a sum of products. Journal of Mathematics and Physics, 6(1):164-189, 1927.

  14. J. Kileel and J. Pereira. Subspace power method for symmetric tensor decomposition and generalized PCA. Preprint: arXiv:1912.04007, 2019.

  15. T. Kolda and B. Bader. Tensor Decompositions and Applications. SIAM Review, 51(3):455-500, 2009.

  16. J. M. Landsberg. Tensors: Geometry and Applications. Graduate Studies in Mathematics, Vol. 128, American Mathematical Society, 2012.

  17. S. L. Lauritzen. Graphical Models. Clarendon Press, 1996.

  18. L.-H. Lim. Singular Values and Eigenvalues of Tensors: A Variational Approach. Proceedings of the 1st IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing (CAMSAP), 129-132, 2005.

  19. A. Moitra. Algorithmic Aspects of Machine Learning. Cambridge University Press, 2018. Draft available at http://people.csail.mit.edu/moitra/docs/bookex.pdf

  20. D. Mond, J. Smith, and D. van Straten. Stochastic factorizations, sandwiched simplices and the topology of the space of explanations. Proceedings of the Royal Society A, 459(2039), 2003.

  21. I. Oseledets. Tensor-Train Decomposition. SIAM Journal on Scientific Computing, 33(5):2295-2317, 2011.

  22. L. Qi. Eigenvalues of a Real Symmetric Tensor. Journal of Symbolic Computation, 40(6):1302-1324, 2005.

  23. B. Recht, M. Fazel, and P. Parrilo. Guaranteed minimum-rank solutions of linear matrix equations via nuclear norm minimization. SIAM Review, 52(3):471-501, 2010.

  24. E. Robeva and A. Seigal. Duality of Graphical Models and Tensor Networks. Information and Inference: A Journal of the IMA, 8(2):273-288, 2019.

  25. K. Sadeghi and S. Lauritzen. Markov Properties of Mixed Graphs. Bernoulli, 20(2):676-696, 2014.

  26. S. Shimizu, P. O. Hoyer, A. Hyvärinen, and A. Kerminen. A linear non-Gaussian acyclic model for causal discovery. Journal of Machine Learning Research, 7:2003-2030, 2006.

  27. M. J. Wainwright and M. I. Jordan. Graphical Models, Exponential Families, and Variational Inference. Foundations and Trends in Machine Learning, 1(1-2):1-305, 2008.