Naichen Shi
About me
I am a PhD student in the Department of Industrial and Operations Engineering (IOE) at the University of Michigan, advised by Prof. Raed Al Kontar. My research interests include optimization, machine learning, and their applications, especially in manufacturing systems.
News
June 2024: I will present at ICQSR 2024 on the topic of physics simulation guided diffusion models!
October 2023: Our paper, Personalized Tucker Decomposition: Modeling Commonality and Peculiarity on Tensor Data, was selected as a finalist of the INFORMS 2023 QSR best refereed paper competition!
October 2023: Our paper, Heterogeneous Matrix Factorization: When features differ by datasets, was selected as a finalist of the INFORMS 2023 best student paper competition!
July 2023: I was selected as the instructor of the IOE 202 Operations Engineering and Analytics course!
June 2023: I presented at ICQSR 2023 on the topic of heterogeneous matrix factorization!
June 2023: Jiuyun presented our joint work on personalized Tucker decomposition at ICQSR 2023!
Research Topics
I am interested in a wide range of topics that bring AI and statistical techniques into advanced manufacturing.
1. Learning with heterogeneity
When data are collected from different but related sources, can we identify their common features and source-specific ones?
We use a series of matrix and tensor factorization techniques to recover the shared and source-specific low-rank features. These features support several interesting applications. For example, in laser-based metal powder additive manufacturing, the layer-wise shared features characterize the process signature, while the unique features indicate anomalies.
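The idea of separating shared from source-specific structure can be illustrated with a toy sketch. This is not the actual personalized PCA/factorization algorithm from the papers, just a plain-SVD stand-in: estimate a shared subspace from pooled data, then fit a unique subspace per source on the residual the shared part cannot explain.

```python
# Illustrative sketch only (not the paper's algorithm): shared vs. unique
# low-rank features via plain SVD on pooled data and per-source residuals.
import numpy as np

rng = np.random.default_rng(0)

def shared_unique_features(datasets, shared_rank, unique_rank):
    """Estimate a shared subspace from pooled data, then a unique
    subspace per source from the residual left by the shared part."""
    pooled = np.vstack(datasets)                      # stack all sources
    _, _, Vt = np.linalg.svd(pooled, full_matrices=False)
    V_shared = Vt[:shared_rank].T                     # shared loading matrix
    uniques = []
    for X in datasets:
        resid = X - X @ V_shared @ V_shared.T         # remove shared component
        _, _, Vt_r = np.linalg.svd(resid, full_matrices=False)
        uniques.append(Vt_r[:unique_rank].T)          # source-specific loadings
    return V_shared, uniques

# Two sources sharing one latent direction, each with its own noise.
d = 10
v_common = rng.standard_normal(d)
datasets = [rng.standard_normal((50, 1)) @ v_common[None, :]
            + 0.5 * rng.standard_normal((50, d)) for _ in range(2)]
V_shared, uniques = shared_unique_features(datasets, shared_rank=1, unique_rank=1)
print(V_shared.shape, uniques[0].shape)  # (10, 1) (10, 1)
```

In the anomaly-detection application, large projections of a new layer onto a source's unique subspace (rather than the shared one) would flag source-specific deviations.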
2. Physics-informed generative model
Can we use physics knowledge to guide the generative models so that the generated samples conform with the physics principles?
We leverage conditional diffusion models to integrate information from physics simulations into the generative process. The resulting model helps predict the temperature distribution in additive manufacturing.
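A minimal sketch of how a physics simulation can condition a diffusion model: the simulation output is fed to the denoiser alongside the noisy sample. The denoiser below is a placeholder (a real model would be a neural network), and the conditioning-by-concatenation scheme is an assumption for illustration, not the papers' exact architecture.

```python
# Toy sketch of physics-conditioned denoising diffusion (assumptions:
# placeholder denoiser, conditioning by channel concatenation).
import numpy as np

rng = np.random.default_rng(1)

def q_sample(x0, t, alphas_cumprod):
    """Standard DDPM forward process: noise x0 to step t."""
    noise = rng.standard_normal(x0.shape)
    a_bar = alphas_cumprod[t]
    return np.sqrt(a_bar) * x0 + np.sqrt(1.0 - a_bar) * noise, noise

def toy_denoiser(x_t, sim_field, t):
    """Placeholder for a neural net that takes the noisy field, the
    physics-simulation field, and the timestep as inputs."""
    inp = np.concatenate([x_t, sim_field], axis=-1)   # condition on simulation
    return inp.mean(axis=-1, keepdims=True) * np.ones_like(x_t)

T = 100
betas = np.linspace(1e-4, 0.02, T)
alphas_cumprod = np.cumprod(1.0 - betas)

x0 = rng.standard_normal((8, 8, 1))    # e.g. a measured temperature field
sim = rng.standard_normal((8, 8, 1))   # physics-simulation prediction
x_t, noise = q_sample(x0, t=50, alphas_cumprod=alphas_cumprod)
eps_hat = toy_denoiser(x_t, sim, t=50)
print(x_t.shape, eps_hat.shape)  # (8, 8, 1) (8, 8, 1)
```

Because the simulation enters as an input at every denoising step, generated samples are steered toward fields consistent with the physics prediction.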
3. Adaptive stepsize optimization
Almost every ML/AI practitioner uses adaptive stepsize optimization algorithms (e.g., Adam). I am interested in the theoretical properties of these optimizers: Can they converge? Why have they become so popular? Can they be improved?
As a first step, we show, both theoretically and numerically, that the good performance of RMSprop and Adam is contingent on an appropriate choice of the exponential averaging parameter.
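The role of the exponential averaging parameter is easy to see in a bare-bones RMSprop on a 1-D quadratic. The hyper-parameter names below are generic, not tied to any paper's notation; varying `beta` changes how much history the second-moment estimate retains.

```python
# Minimal RMSprop on f(x) = x^2, exposing the exponential averaging
# parameter beta of the second-moment estimate.
import numpy as np

def rmsprop(grad, x0, lr=0.1, beta=0.99, eps=1e-8, steps=500):
    x, v = x0, 0.0
    for _ in range(steps):
        g = grad(x)
        v = beta * v + (1.0 - beta) * g * g   # exponential average of g^2
        x = x - lr * g / (np.sqrt(v) + eps)   # adaptive stepsize update
    return x

grad = lambda x: 2.0 * x                      # gradient of f(x) = x^2

x_large_beta = rmsprop(grad, x0=5.0, beta=0.999)  # long memory
x_small_beta = rmsprop(grad, x0=5.0, beta=0.5)    # short memory
print(abs(x_large_beta), abs(x_small_beta))
```

With a long-memory `beta`, the denominator varies slowly and the iterates contract smoothly toward the minimizer; with a short-memory `beta`, the update behaves more like a fixed-size signed step and the iterates can stall in an oscillation around it, which is the kind of beta-dependence the convergence analysis makes precise.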
Publications:
Journal
[JMLR] Naichen Shi, Raed Kontar, Personalized PCA: Decoupling Shared and Unique Features. Journal of Machine Learning Research 2023. link
[Technometrics] Naichen Shi, Raed Kontar, Personalized Federated Learning via Domain Adaptation with Application to Distributed Manufacturing. Technometrics 2022. link
[TASE] Naichen Shi, Raed Al Kontar, Fed-ensemble: Ensemble Models in Federated Learning for Improved Generalization and Uncertainty Quantification. IEEE Transactions on Automation Science and Engineering, 2022. link
[IEEE Access] Raed Kontar, Naichen Shi, Xubo Yue, Seokhyun Chung, Eunshin Byon, Mosharaf Chowdhury, Judy Jin, Wissam Kontar, Neda Masoud, Maher Noueihed, Chinedum E. Okwudire, Garvesh Raskutti, Romesh Saigal, Karandeep Singh, and Zhisheng Ye, The Internet of Federated Things. IEEE Access, 2021.
Conference
[NeurIPS Spotlight] Geyu Liang, Naichen Shi, Raed Al Kontar, Salar Fattahi. Personalized Dictionary Learning for Heterogeneous Datasets. Thirty-seventh Conference on Neural Information Processing Systems, 2023. link
[MSEC] Naichen Shi, Raed Kontar, Shenghan Guo, Process Signature Characterization and Anomaly Detection with Personalized PCA in Laser-Based Metal Additive Manufacturing. Proceedings of the ASME 2023 18th International Manufacturing Science and Engineering Conference, 2023.
[NeurIPS Spotlight] Yushun Zhang, Congliang Chen, Naichen Shi, Ruoyu Sun, Zhiquan Luo. Adam Can Converge Without Any Modification On Update Rules. Thirty-sixth Conference on Neural Information Processing Systems, 2022. link
[ICLR spotlight] Naichen Shi, Dawei Li, Mingyi Hong, and Ruoyu Sun, RMSprop converges with proper hyper-parameter. International Conference on Learning Representations, 2021. link
Teaching:
In fall 2023, I was the main instructor of the course IOE 202 Operations Engineering and Analytics.
In winter 2023 and winter 2022, I was the graduate student instructor for the course Modern Bayesian Data Analytics at the University of Michigan. Instructor: Raed Al Kontar.
In summer 2021, I was a teaching assistant for the course Optimization in Deep Learning at the Chinese University of Hong Kong, Shenzhen. Instructor: Ruoyu Sun.
In spring 2020, I was an undergraduate teaching assistant for the course Game Theory at Peking University. Instructor: Weiying Zhang.
Activities:
I served as a student representative and fundraiser for the 2023 Michigan Student Symposium for Interdisciplinary Statistical Sciences (MSSISS), an annual event organized by graduate students.
I am a member of the INFORMS Pro Bono group. Our group applies data visualization and data analytics to manufactured housing communities in Michigan, and we are working with community leaders to support their legislative efforts.