Sangmin Bae
Research Scientist
Please visit my new website.
Welcome to my website! I am a Ph.D. candidate at OSI Lab, KAIST AI, with a strong desire to become a versatile T-shaped expert in AI.
While I have primarily focused on Computer Vision, I have also explored other AI domains, including NLP, Audio, and Video, to broaden my knowledge and expertise.
My research interests lie in Efficient AI, which entails exploring training- or data-efficient approaches to make AI more accessible and sustainable.
Some of my research areas include Self-Supervised Learning, Federated Learning, Efficient Generative AI, and Multimodal Learning.
Contacts: bsmn0223@kaist.ac.kr
Google Scholar, Github, Linkedin, Twitter
Research Advisors:
Se-Young Yun (Professor, KAIST AI)
Hwanjun Song (AWS AI Lab), 2022-2023
ㅡ
Education
KAIST / Ph.D. in Kim Jaechul Graduate School of AI 03. 2021 - Present
Under the supervision of Prof. Se-Young Yun
KAIST / M.S. in Industrial and Systems Engineering 03. 2019 - 02. 2021
Under the supervision of Prof. Se-Young Yun
KAIST / B.S. in Industrial and Systems Engineering 03. 2014 - 02. 2019
ㅡ
Publications
*: co-first authors, _: corresponding authors, C: conference, W: workshop, J: journal, P: preprint
[C5] Patch-Mix Contrastive Learning with Audio Spectrogram Transformer on Respiratory Sound Classification
Sangmin Bae*, J. Kim*, W. Cho, H. Baek, S. Son, B. Lee, C. Ha, K. Tae, S. Kim and S-Y. Yun
Conference of the International Speech Communication Association (INTERSPEECH), 2023 [paper, github]
[W4] Coreset Sampling from Open-Set for Fine-Grained Self-Supervised Learning
S. Kim*, Sangmin Bae* and S-Y. Yun
Neural Information Processing Systems (NeurIPS) Workshop SSLTheoryPractice, 2022 [paper, github]
[W3] LG-FAL: Federated Active Learning Strategy using Local and Global Models
S. Kim*, Sangmin Bae*, H. Song and S-Y. Yun
International Conference on Machine Learning (ICML) ReALML Workshop, 2022 [paper, github]
[W2] MixCo: Mix-up Contrastive Learning for Visual Representation
S. Kim*, G. Lee*, Sangmin Bae* and S-Y. Yun
Neural Information Processing Systems (NeurIPS) Workshop SSLTheoryPractice, 2020 [paper, github]
[P1] Accurate and Fast Federated Learning via Combinatorial Multi-Armed Bandits
T. Kim*, Sangmin Bae*, J. Lee and S-Y. Yun
Preprint, 2020 [paper, github]
ㅡ
Awards
Two Best Presentation Awards, Korea Computing Congress (KCC) 2022
Best Paper Award (5th place), Korean AI Association and LG AI Research (JKAIA) 2021
MicroNet Challenge 3rd place, Neural Information Processing Systems (NeurIPS) Workshop 2019
Dean’s List (Top 3%), Faculty of Engineering, KAIST 2017
ㅡ
Patents
Toward Enhanced Representation for Federated Re-Identification by Not-True Self Knowledge Distillation
S-Y. Yun, S. Kim, W. Chung, Sangmin Bae
Korea Patent Application No. 10-2022-0125654
Federated Learning System for Performing Individual Data Customized Federated Learning, Method for Federated Learning, and Client Apparatus for Performing Same
J. Oh, S. Kim, S-Y. Yun, Sangmin Bae, J. Shin, S. Kim, W. Chung
US Patent Application No. 17/975,664
Korea Patent Application No. 10-2022-0075186
System, Method, Computer-Readable Storage Medium and Computer Program for Federated Learning of Local Model based on Learning Direction of Global Model
G. Lee, M. Jeong, S-Y. Yun, Sangmin Bae, J. Ahn, S. Kim, W. Chung
US Patent Application No. 17/974,545
Korea Patent Application No. 10-2022-0075187
ㅡ
Projects
National Institute of Environmental Research - Present (Project Manager)
Short-term prediction of particulate matter via U-Net or ViT-based models
KT 04. 2022 - 02. 2023 (Project Manager)
Neural Architecture Search (NAS) for detecting communication network failure (NAS survey materials)
Deep neural networks on tabular datasets (Tabular DL survey materials)
ETRI 03. 2019 - 09. 2022 (Project Manager 03. 2021 - 04. 2022)
Model compression for big data edge analysis (06. 2019 - 11. 2019)
AI Grand Challenge: object detection challenge (10. 2020 - 12. 2020)
Lightweight edge device technology via federated learning (03. 2021 - 09. 2022)
SK Hynix 03. 2020 - 09. 2021
Unsupervised representation learning (03. 2020 - 12. 2020)
Unsupervised semantic segmentation prediction (02. 2021 - 09. 2021)
Hankook Tire and Technology 03. 2019 - 02. 2020
Compound prediction with artificial intelligence and Auto-ML
ㅡ
Services
NAVER AI / Research Collaboration 01. 2022 - 01. 2023
Developed an algorithm for Federated Active Learning (paper accepted at CVPR 2023)
Kakao / Research Internship in Recommendation Team 09. 2018 - 02. 2019
Daum article recommendation using a multi-armed bandit (MAB) algorithm
OSI Lab / Research Internship 08. 2018 - 09. 2018
Research on multi-armed bandit algorithms
HFEL Lab / Research Internship 12. 2017 - 06. 2018
Research on cross-country skiing technique classification using machine learning
Samsung DS, Deep Learning Course (07. 2020)
MetaCode, Machine Learning Course (06. 2021 - 12. 2022)
ForumM, Recommendation System Seminar (11. 2022)
Korea Blockchain Institute, Machine Learning Course (12. 2020)
Optimization for AI, AI505 (Fall 2021, Fall 2022)
Machine Learning Theory, AI603 (Spring 2021)
Deep Reinforcement Learning, AI611 (Spring 2022)