In this lecture, we discuss modern convex optimization techniques for solving large-scale machine learning problems involving big data. Most machine learning problems are, at their core, formulated as convex optimization problems, so an in-depth understanding of convex optimization is essential for solving large-scale machine learning problems efficiently. Topics include recent developments in stochastic gradient descent (SGD), proximal gradient descent, Nesterov-type acceleration (FISTA and smoothing), block coordinate descent, and ADMM (the alternating direction method of multipliers).
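As a taste of two of the listed topics (proximal gradient descent and Nesterov-type acceleration), here is a minimal FISTA sketch for the lasso problem min_x 0.5‖Ax − b‖² + λ‖x‖₁. The problem instance, function names, and parameter values are illustrative choices, not part of the course material.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def fista(A, b, lam, n_iter=200):
    """FISTA for the lasso: min_x 0.5*||Ax - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    y = x.copy()                         # extrapolated point
    t = 1.0                              # momentum parameter
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)         # gradient of the smooth part at y
        x_new = soft_threshold(y - grad / L, lam / L)   # proximal gradient step
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2        # Nesterov momentum update
        y = x_new + ((t - 1) / t_new) * (x_new - x)     # extrapolation step
        x, t = x_new, t_new
    return x

# Toy usage: recover a sparse vector from noisy linear measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 50))
x_true = np.zeros(50)
x_true[:5] = rng.standard_normal(5)
b = A @ x_true + 0.01 * rng.standard_normal(100)
x_hat = fista(A, b, lam=0.1)
```

Dropping the extrapolation step (setting y = x_new each iteration) recovers plain proximal gradient descent (ISTA); FISTA's momentum improves the worst-case rate from O(1/k) to O(1/k²).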
Time: Wed 13:00-16:00
Location: Cluster Bldg. R509 (Academia-Research-Industry Cooperation Cluster, Room 509)
References:
Grading: This course will follow the IC-PBL+ lecture format.
TA email: nomar0107@gmail.com
Lecture Notes
ICCV 2019 Paper Review