Peng Zhang's homepage
I am an assistant professor in Computer Science at Rutgers University. I am also a graduate faculty member in Statistics at Rutgers.
From 2018 to 2021, I was a postdoc at Yale University under the supervision of Prof. Daniel Spielman. I obtained my Ph.D. from Georgia Tech in 2018, advised by Prof. Richard Peng. Before that, I received my M.S. from Purdue University in 2015 and my B.S. from Zhejiang University in 2013.
I am actively looking for self-motivated students. Please send me an email if you are interested in my research!
The Rutgers Ph.D. application deadline for Fall 2023 is January 1st, 2023 (link).
Email: pz149@rutgers.edu
Hill Center, Room 444
My research lies broadly in theoretical computer science and data science. Specifically, I am interested in
fast algorithms for solving structured linear equations and linear programs,
discrepancy theory and its applications (e.g., in the design of randomized experiments).
Guanyang Wang (Statistics, Rutgers) and I received the Adobe Data Science Research Award for the project "Improving the Design of Randomized Controlled Trials via Discrepancy Theory" (September 2022).
I received an NSF CAREER Award for the project "CAREER: Fine-Grained Complexity and Algorithms for Structured Linear Equations and Linear Programs" (January 2023).
Teaching
At Rutgers:
CS 514 Advanced Algorithms II (Spectral Graph Algorithms): Spring 2023
CS 323 Numerical Analysis and Computing: Fall 2022
CS 673 Discrepancy Theory and Its Applications: Fall 2021
At Yale:
CPSC 662 Combinatorial Discrepancy and Its Applications: Fall 2020
Preprints
Hardness Results for Minimizing the Covariance of Randomly Signed Sum of Vectors
Peng Zhang
arXiv: https://arxiv.org/abs/2211.14658
Balancing covariates in randomized experiments using the Gram-Schmidt walk
with Christopher Harshaw, Fredrik Sävje, and Daniel Spielman.
arXiv: https://arxiv.org/abs/1911.03071
Publications
Hardness Results for Weaver's Discrepancy Problem
with Daniel Spielman.
APPROX 2022.
Hardness Results for Laplacians of Simplicial Complexes via Sparse-Linear Equation Complete Gadgets
with Ming Ding, Rasmus Kyng, and Maximilian Probst Gutenberg.
ICALP 2022.
Two-Commodity Flow is as Hard as Linear Programming under Nearly Linear-Time Reductions
with Ming Ding and Rasmus Kyng.
ICALP 2022.
Positive LPs are Hard to Solve Accurately, Assuming Linear Equations are Hard
with Rasmus Kyng and Di Wang.
SODA 2020.
Incomplete Nested Dissection
with Rasmus Kyng, Richard Peng, and Robert Schwieterman.
STOC 2018.
Hardness Results for Structured Linear Systems
with Rasmus Kyng.
FOCS 2017 (Best Student Paper Award).
SIAM Journal on Computing (2020).
On Approximate Pattern Matching with Thresholds
with Mikhail J. Atallah.
Information Processing Letters 123 (2017): 21-26.
Approximating the Solution to Mixed Packing and Covering LPs in Parallel Õ(ε^{-3}) Time
with Michael W. Mahoney, Satish Rao, and Di Wang.
ICALP 2016.
Optimal Query Complexity for Estimating the Trace of a Matrix
with Karl Wimmer and Yi Wu.
ICALP 2014.
Minimizing Seed Set Selection with Probabilistic Coverage Guarantee in a Social Network
with Wei Chen, Xiaoming Sun, Yajun Wang, and Jialin Zhang.
SIGKDD 2014.
My Ph.D. dissertation:
Hardness and Tractability for Structured Numerical Problems, 2018. Winner of the Georgia Tech College of Computing Dissertation Award.