Dogyoon Song
Email: dogyoons [at] umich [dot] edu
Office: 4234B EECS, 1301 Beal Avenue
About me
I am currently a postdoctoral research fellow in the Electrical Engineering and Computer Science department at the University of Michigan, hosted by Alfred O. Hero and Qing Qu. I recently received my Ph.D. in EECS from MIT, where I was very fortunate to be advised by Pablo A. Parrilo and Devavrat Shah. My thesis research was supported by a Samsung Scholarship and a Siebel Scholarship. Prior to MIT, I studied ECE, mathematics, and physics as an undergraduate at Seoul National University in South Korea.
My research interests span the broad field of mathematical optimization and its intersection with data science, with the aim of developing efficient algorithms and theory for data-driven decision making. I am particularly interested in geometric properties that can explain the outstanding performance and inductive bias of high-dimensional, overparameterized models. My objective is to design efficient and reliable algorithms that yield practical and theoretically sound predictive models. Currently, my research focuses on four key topics:
Geometry of regularized optimization problems and induced low-complexity structures in the solutions.
Emergent phenomena in overparameterized models, especially for mixture models and feature representations.
Simple and robust methodologies for regression and causal inference.
Uncertainty quantification and calibration techniques for predictive inference.
CV (last updated 12/01/2023) | Google Scholar
I am on the academic job market this 2023-2024 cycle.
News
Paper Updates:
Two papers, Errors-in-variables Fréchet regression with low-rank covariate approximation and Minimum-risk recalibration of classifiers, have been accepted at NeurIPS 2023.
EIV Fréchet regression: Wednesday, December 13, 5 pm - 7 pm CST, Great Hall & Hall B1+B2 (level 1) #1726
Minimum-risk recalibration: Thursday, December 14, 10:45 am - 12:45 pm CST, Great Hall & Hall B1+B2 (level 1) #1700
A revised preprint, On separability of covariance in multiway data analysis, is now available on arXiv. Comments and feedback are greatly appreciated!
A preprint, Algebraic and statistical properties of the ordinary least squares interpolator, is now available on arXiv. Comments and feedback are greatly appreciated!
Publications
* Note: (α-β) denotes alphabetical ordering of the authors by last name, and "⋆" denotes equal contribution as co-first authors.
Journal Articles
On approximations of the PSD cone by a polynomial number of smaller-sized PSD cones
Dogyoon Song and Pablo A. Parrilo
Mathematical Programming (MAPR), 2023.
Kronecker-structured covariance models for multiway data
Yu Wang, Zeyu Sun, Dogyoon Song and Alfred Hero
Statistics Surveys, 2022.
On robustness of principal component regression
(α-β) Anish Agarwal⋆, Devavrat Shah, Dennis Shen⋆ and Dogyoon Song⋆
Journal of the American Statistical Association (JASA), 2021. (Preliminary conference version in NeurIPS 2019)
Nearest neighbors for matrix estimation interpreted as blind regression for latent variable model
(α-β) Yihua Li, Devavrat Shah, Dogyoon Song⋆ and Christina Lee Yu⋆
IEEE Transactions on Information Theory (TransIT), 2019. (Preliminary conference version in NeurIPS 2016)
Conference Proceedings
Errors-in-variables Fréchet regression with low-rank covariate approximation
Dogyoon Song⋆ and Kyunghee Han⋆
Conference on Neural Information Processing Systems (NeurIPS), 2023, to appear.
Minimum-risk recalibration of classifiers
Zeyu Sun⋆, Dogyoon Song⋆ and Alfred Hero
Conference on Neural Information Processing Systems (NeurIPS), 2023, to appear. (Spotlight)
Robustness-preserving lifelong learning via dataset condensation
Jinghan Jia, Yihua Zhang, Dogyoon Song, Sijia Liu and Alfred Hero
IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2023.
Sample-efficient reinforcement learning via low-rank matrix estimation
(α-β) Devavrat Shah, Dogyoon Song⋆, Zhi Xu⋆ and Yuzhe Yang
Conference on Neural Information Processing Systems (NeurIPS), 2020.
On robustness of principal component regression
(α-β) Anish Agarwal⋆, Devavrat Shah, Dennis Shen⋆ and Dogyoon Song⋆
Conference on Neural Information Processing Systems (NeurIPS), 2019. (Selected for Oral presentation)
Blind regression: Nonparametric regression for latent variable models via collaborative filtering
(α-β) Christina E. Lee⋆, Yihua Li, Devavrat Shah and Dogyoon Song⋆
Conference on Neural Information Processing Systems (NeurIPS), 2016.
Verifying start-up failures in coupled ring oscillators in presence of variability using predictive global optimization
Taehwan Kim, Dogyoon Song, Sangho Youn, Jaejin Park, Hojin Park and Jaeha Kim
IEEE/ACM International Conference on Computer-Aided Design (ICCAD), 2013.
Discretization and discrimination methods for design, verification, and testing of analog/mixed-signal circuits
Jaeha Kim, Jiho Lee, Dogyoon Song, Taehwan Kim, Kyunghoon Kim, Seobin Jung and Sangho Youn
IEEE Custom Integrated Circuits Conference (CICC), 2013.
A low-power high-radix switch fabric based on low-swing signaling and partially-activated input lines
Dogyoon Song and Jaeha Kim
International Symposium on VLSI Design, Automation, and Test (VLSI-DAT), 2013.
Preprints under Review & Manuscripts in Preparation
On separability of covariance in multiway data analysis
Dogyoon Song and Alfred Hero
Preprint available online (v2, October 2023).
Algebraic and statistical properties of the ordinary least squares interpolator
Dennis Shen⋆, Dogyoon Song⋆, Peng Ding and Jasjeet Sekhon
Preprint available online (v1, September 2023).
Local minima structures in Gaussian mixture models
(α-β) Yudong Chen⋆, Dogyoon Song⋆, Xumei Xi and Yuqian Zhang
Under review. Preprint available online (v2, April 2023).
Deconvolution with unknown error distribution interpreted as blind isotonic regression
(α-β) Devavrat Shah and Dogyoon Song
Presented at the Allerton Conference 2018. Preprint available online (v3, April 2020).
Learning RUMs: Reducing mixture to single component via PCA
(α-β) Devavrat Shah and Dogyoon Song
Preprint available online (v3, March 2020).
Ph.D. Thesis
Addressing missing data and scalable optimization for data-driven decision making
Ph.D. Thesis, Massachusetts Institute of Technology, June 2021.
Supervised by Pablo A. Parrilo and Devavrat Shah.
Teaching
At MIT, I was a TA for the following graduate-level courses:
6.256 Algebraic Techniques and Semidefinite Optimization -- Spring 2019
6.437 Inference and Information -- Spring 2018
6.252 Nonlinear Programming -- Spring 2017