Background
I received my bachelor's degree in Mathematics and Applied Mathematics from Beijing Normal University in 2016. I then received my PhD in Applied Mathematics from the Hong Kong Polytechnic University in 2021, under the supervision of my chief supervisor, Professor Ting Kei Pong, and my co-supervisor, Professor Xiaojun Chen. From 2021 to 2024, I was a Postdoctoral Associate in the Department of Electrical and Computer Engineering at the University of Pittsburgh and the Department of Computer Science at the University of Maryland, under the mentorship of Professor Heng Huang. I am now an assistant professor in the Department of Computer Science and Engineering at the University of Texas at Arlington. I serve as a reviewer for ICML, ICLR, MP, MOR, SIOPT, IEEE Signal Processing, and OJMO.
Here is my Google Scholar profile.
Here is my CV.
I am currently recruiting PhD students with a background in Computer Science, Statistics, Applied Mathematics, Data Science, or a related field. If you are interested, please contact me by email with your CV, transcripts, and details of your research experience.
Research interests
Machine Learning, Optimization, and Foundation Models: Bilevel Optimization, Federated Learning, LLMs, Diffusion Models, Adversarial Training, Meta-Learning, Hyper-Representation Learning, Data Hyper-Cleaning, Continuous Optimization.
Publications
Qi He, Peiran Yu, Ziyi Chen, Heng Huang. Revisiting Convergence: A Study on Shuffling-Type Gradient Methods. Forty-second International Conference on Machine Learning (ICML) 2025.
Reza Shirkavand, Peiran Yu, Shangqian Gao, Gowthami Somepalli, Tom Goldstein, Heng Huang. Efficient Fine-Tuning and Concept Suppression for Pruned Diffusion Models. Conference on Computer Vision and Pattern Recognition (CVPR) 2025.
Peiran Yu, Junyi Li and Heng Huang. Hessian Free Efficient Single Loop Iterative Differentiation Methods for Bilevel Optimization Problems. Transactions on Machine Learning Research (TMLR) 2024 (Featured Certification).
Peiran Yu, Junyi Li and Heng Huang. Dropout Enhances Bilevel Training Tasks. The Twelfth International Conference on Learning Representations (ICLR) 2024 (Spotlight).
Peiran Yu, Guoyin Li and Ting Kei Pong. Kurdyka-Łojasiewicz exponent via inf-projection. Found. Comput. Math. 22:1171--1217, 2022.
Peiran Yu, Ting Kei Pong and Zhaosong Lu. Convergence rate analysis of a sequential convex programming method with line search for a class of constrained difference-of-convex optimization problems. SIAM J. Optim. 31:2024--2054, 2021.
Liaoyuan Zeng, Peiran Yu and Ting Kei Pong. Analysis and algorithms for some compressed sensing models based on L1/L2 minimization. SIAM J. Optim. 31:1576--1603, 2021.
Peiran Yu and Ting Kei Pong. Iteratively reweighted ℓ1 algorithms with extrapolation. Comput. Optim. Appl. 73:353--386, 2019. code
Peiran Yu, Liaoyuan Zeng and Ting Kei Pong. Convergence analysis for a variant of the manifold proximal point algorithm based on the Kurdyka-Łojasiewicz property.
Reza Shirkavand, Qi He, Peiran Yu, Heng Huang. Bilevel ZOFO: Bridging Parameter-Efficient and Zeroth-Order Techniques for Efficient LLM Fine-Tuning and Meta-Training. Under review.
Peiran Yu, Wenhan Xian, Heng Huang. A Fast Federated Method for Minimax Problems with Sequential Convergence Guarantees. Under review.
Ziyi Chen, Junyi Li, Peiran Yu, Heng Huang. Provably Mitigating Corruption, Overoptimization, and Verbosity Simultaneously in Offline and Online RLHF/DPO Alignment. Under review.
Shaocong Ma, Peiran Yu, Heng Huang. Hybrid Fine-Tuning of LLMs: Theoretical Insights on Generalized Smoothness and Convergence. Under review.
Peiran Yu, Lichang Chen and Heng Huang. Federated learning using an inexact ADMM with relative errors. Under review.
Peiran Yu, Reza Shirkavand, Tong Zheng, Junyi Li and Heng Huang. Enhancing Prompt Tuning for Classification via Task-Related Hard Prompts. Under review.
Talks and posters
MOPTA 2024: A Fast Federated Method for Minimax Problems with Sequential Convergence Guarantees.
ICLR 2024: Dropout Enhances Bilevel Training Tasks.
SIAM Conference on Optimization (May 31 - June 3, 2023). Talk: Fully single loop methods for bilevel optimization problems.
ICCOPT (August 3-8, 2019) in Berlin. Talk: Convergence analysis and Kurdyka-Łojasiewicz property.
A workshop on Applied Mathematics organized by the CAS AMSS-PolyU Joint Laboratory of Applied Mathematics (August 22-23, 2019) at the PolyU Shenzhen Base. Poster: Deducing Kurdyka-Łojasiewicz exponent via inf-projection.