Taehwan Lee (이태환)
Ph.D. Student, Graduate School of Artificial Intelligence, UNIST
Address: 409-1, Building 106, 50 UNIST-gil, Ulju-gun, Ulsan 44919, Republic of Korea
Email: taehwan@unist.ac.kr
Lab: MIIT Lab
Research Interests
Federated & decentralized learning
Model generalization
Multimodal learning
Generative models
Education
Ph.D. in Artificial Intelligence, Graduate School of Artificial Intelligence, Ulsan National Institute of Science and Technology (UNIST), Ulsan, South Korea.
Advisor: Sung Whan Yoon (Lab: MIIT)
Mar. 2022 ~ Present
M.S. in Electrical Engineering, Ulsan National Institute of Science and Technology (UNIST), Ulsan, South Korea.
Advisor: Jinho Chung
Mar. 2017 ~ Aug. 2019
B.S. in Electronic Engineering, Hanbat National University, Daejeon, South Korea.
Advisor: Kyoungjae Lee (Lab: WiSL)
Mar. 2011 ~ Feb. 2017
Journals & Conferences
*Equal contribution.
Taehwan Lee*, Kyeongkook Seo*, Jaejun Yoo, Sung Whan Yoon, "Understanding Flatness in Generative Models: Its Role and Benefits," in International Conference on Computer Vision (ICCV), Honolulu, HI, 2025.
SeungBum Ha*, Taehwan Lee*, Jiyoun Lim, Sung Whan Yoon, "Benchmarking Federated Learning for Semantic Datasets: Federated Scene Graph Generation," Pattern Recognition Letters, vol. 197, Nov. 2025.
Taehwan Lee, Sung Whan Yoon, "Rethinking the Flat Minima Searching in Federated Learning," in the 41st International Conference on Machine Learning (ICML), Vienna, Austria, 2024.
Taehwan Lee*, Hee-Heon Jung*, Jin-Ho Chung, "A New One-Coincidence Frequency-Hopping Sequence Set of Length p²−p," in IEEE Information Theory Workshop (ITW), 2018.
Projects
* All projects were supervised by the PI (Prof. Sung Whan Yoon).
FLAME-ARK: Federated Learning Multimodal Foundation Models for Medical Institutes
Korean Ministry of Health and Welfare (Korea) | Apr. 2025 – Present
Develop privacy-preserving foundation-model federated learning techniques for multimodal medical data.
Investigate parameter-efficient adaptation in FL (e.g., LoRA-based tuning) and multimodal learning methods.
Academic outcome: Understanding Flatness in Generative Models: Its Role and Benefits (ICCV 2025).
Korea–U.S. (NSF) International Collaborative Research Project
IITP (Korea) | Jan. 2023 – Nov. 2024
Address non-IID data challenges in federated learning arising from heterogeneous data distributions across edge devices.
Academic outcome: Rethinking the Flat Minima Searching in Federated Learning (ICML 2024).
Development of a Hierarchical FL Simulator and Contribution Measurement Toolkit
ETRI (Korea) | Jun. 2024 – Nov. 2024
Experimental Testbed for Asynchronous Consensus in Federated Learning (Multi-metric inference-driven)
ETRI (Korea) | Jun. 2023 – Nov. 2023
Feasibility Validation for Resource-Obfuscated Deep Learning Models
ETRI (Korea) | Apr. 2023 – Nov. 2023
Academic outcome: Benchmarking Federated Learning for Semantic Datasets: Federated Scene Graph Generation (PRL).
Experiences
"Rethinking the Flat Minima Searching in Federated Learning" was selected as a finalist for the Qualcomm Innovation Fellowship Korea (QIFK) 2024 [Link].
Teaching Assistant: Electrical Engineering Programming
Course: EE233, 2025 spring @ UNIST.
Description: Basic programming tools for electrical engineering (C++).
Teaching Assistant: Electrical Engineering Programming
Course: EE233, 2024 spring @ UNIST.
Description: Basic programming tools for electrical engineering (C++).