Email: mlou30 [at] gatech [dot] edu
About me
I am a final-year PhD student in the interdisciplinary Algorithms, Combinatorics and Optimization program at the School of Industrial and Systems Engineering, Georgia Tech, where I am fortunate to be advised by Ashwin Pananjady. Previously, I obtained a Master's degree from Johns Hopkins University and a Bachelor's degree from Zhejiang University. During my PhD, I have also had the pleasure of working with Guy Bresler, Sara Fridovich-Keil, Kabir Aladin Verchand, and Guanyi Wang.
I am on the 2025-2026 Academic Job Market. My CV is here.
Research Interests
My research has focused on the computational and statistical aspects of learning in high-dimensional settings. On the computational side, I have studied the efficiency of iterative algorithms for nonconvex model-fitting, particularly in the settings of random initialization, model misspecification, and hyperparameter tuning. On the statistical side, my work has developed average-case reductions between various statistical models, establishing fundamental statistical-computational tradeoffs. My work is broadly motivated by applications in machine learning, data science, and medical imaging.
Papers
(* denotes equal contribution)
Efficient reductions from a Gaussian source with applications to statistical-computational tradeoffs (in preparation)
Mengqi Lou, Guy Bresler, Ashwin Pananjady
Accurate, provable, and fast polychromatic tomographic reconstruction: A variational inequality approach
Mengqi Lou, Kabir Aladin Verchand, Sara Fridovich-Keil, Ashwin Pananjady
SIAM Journal on Imaging Sciences
Extended abstract in the 18th International Meeting on Fully Three-Dimensional Image Reconstruction in Radiology and Nuclear Medicine (Fully3D 2025)
Computationally efficient reductions between some statistical models
Mengqi Lou, Guy Bresler, Ashwin Pananjady
IEEE Transactions on Information Theory
Outstanding Paper Award in the Conference on Algorithmic Learning Theory 2025
Hyperparameter tuning via trajectory predictions: Stochastic prox-linear methods in matrix sensing
Mengqi Lou, Kabir Aladin Verchand, Ashwin Pananjady
Mathematical Programming: Series B
Spotlight in the ICML 2023 workshop on High-dimensional Learning Dynamics
Alternating minimization for generalized rank-1 matrix sensing: Sharp predictions from a random initialization
Kabir Aladin Verchand*, Mengqi Lou*, Ashwin Pananjady
Information and Inference: A Journal of the IMA
Extended abstract in the Conference on Algorithmic Learning Theory 2024
Spotlight in the NeurIPS 2022 workshop on The Benefits of Higher-Order Optimization in Machine Learning
Do algorithms and barriers for sparse principal component analysis extend to other structured settings?
Guanyi Wang, Mengqi Lou, Ashwin Pananjady
IEEE Transactions on Signal Processing
On seeded subgraph-to-subgraph matching: The ssSGM Algorithm and matchability information theory
Lingyao Meng, Mengqi Lou, Jianyu Lin, Vince Lyzinski, Donniell E. Fishkind
Journal of Computational and Graphical Statistics
Honors and Awards
Cornell ORIE Young Researcher, Oral Presentation, 2025
Outstanding Paper Award in the Conference on Algorithmic Learning Theory, 2025
ARC-ACO Research Fellowship from the Algorithms and Randomness Center, Georgia Tech, 2025
Robert Goodell Brown Award – Research Excellence in Data Science and Statistics, Georgia Tech, 2024
Stewart Fellowship, William Green Fellowship, ACO Student Fellowship, Georgia Tech, 2020-2021