Welcome to the Signal Learning Lab (SLL)!
We are a passionate team of researchers at Korea University, Republic of Korea, dedicated to advancing the frontiers of signal learning.
Our current focus is on foundation model-based deep learning methods, keeping us at the forefront of emerging technological innovations.
Join us as we explore and contribute to the latest breakthroughs in this dynamic and evolving field!
One paper (CTTA) accepted at ICML 2025
(May 01, 2025)
Won the President's Award of the Institute of Information & Communications Technology Planning & Evaluation (IITP) for an outstanding graduate-student research presentation at the 2024 Digital Innovation Talent Symposium
(Aug. 23, 2024)
Won the 34th Science and Technology Excellent Paper Award from the Korean Federation of Science and Technology Societies (KOFST)
(Jul. 11, 2024)
Selected as a 2024 Research Excellence Professor
(FWCI, international category)
(Jun. 05, 2024)
Recruiting undergraduate short-term interns for the summer break -> inquire by email to the professor (wjhwang@korea.ac.kr)
[May/2025] One paper accepted at ICML 2025
- [CTTA] Ph.D. student Mr. Han's paper,
"Ranked Entropy Minimization for Continual Test-Time Adaptation,"
International Conference on Machine Learning (ICML), Vancouver, Canada, July 2025
Selected for the IITP Human-Centered Next-Generation Challenge-Type AI Technology Development project as part of the Kyungpook National University consortium (lead: Kyungpook National University; sub-lead: Korea University)!
[Feb/2025] One paper accepted at the Pattern Recognition journal
- [Domain Adaptation] Dr. Na's paper,
"Bridging Domain Spaces for Unsupervised Domain Adaptation,"
Pattern Recognition (PR), Feb. 2025 (JIF Rank=93.1%, Q1)
We are now settling in as the Signal Learning Lab. at Korea University, having moved from the CVPR Lab. at Ajou University.
Test-Time Adaptation (TTA) adapts a pre-trained model to new data distributions during inference, without accessing the original training data; a minimal code sketch follows the paper links below.
(New) "When Test-Time Adaptation Meets Self-Supervised Models"
https://www.arxiv.org/abs/2506.23529
(New)[ICML'25]
"Continual TTA for Transformer"
https://arxiv.org/abs/2505.16441
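As a rough illustration of the idea (not the methods of the papers above), here is a minimal entropy-minimization TTA loop in the style of TENT. It assumes a PyTorch classifier with BatchNorm layers; the model, optimizer settings, and test_loader are hypothetical placeholders.

```python
# A minimal TENT-style test-time adaptation sketch (assumption: a PyTorch
# classifier with BatchNorm2d layers; not the lab's published method).
import torch
import torch.nn as nn

def configure_model(model: nn.Module):
    """Freeze all weights except BatchNorm affine parameters, and make
    BatchNorm use the statistics of the current test batch."""
    model.train()
    params = []
    for m in model.modules():
        if isinstance(m, nn.BatchNorm2d):
            m.track_running_stats = False
            m.running_mean, m.running_var = None, None
            for p in m.parameters():
                p.requires_grad_(True)
                params.append(p)
        else:
            for p in m.parameters(recurse=False):
                p.requires_grad_(False)
    return params

def adapt_step(model, x, optimizer):
    """One adaptation step: minimize the entropy of predictions on an
    unlabeled test batch, then return the (adapted) predictions."""
    logits = model(x)
    probs = logits.softmax(dim=1)
    entropy = -(probs * probs.clamp_min(1e-8).log()).sum(dim=1).mean()
    optimizer.zero_grad()
    entropy.backward()
    optimizer.step()
    return logits.detach()

# Hypothetical usage: model and test_loader are placeholders.
# params = configure_model(model)
# optimizer = torch.optim.SGD(params, lr=1e-3, momentum=0.9)
# for x, _ in test_loader:   # labels are never used
#     preds = adapt_step(model, x, optimizer)
```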
Semantic Segmentation is the task of clustering together the parts of an image that belong to the same object class; a minimal inference sketch follows the paper links below.
(New) "Human parsing using multi-foundation models"
http://arxiv.org/abs/2503.22237
[NeurIPS'23]
https://arxiv.org/abs/2310.18640
[CVPR'22] https://arxiv.org/abs/2111.14173
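For a quick feel of the task (a generic example, not the methods of the papers above), the sketch below runs per-pixel classification with torchvision's pretrained DeepLabV3; the input file name is a hypothetical placeholder.

```python
# Minimal semantic segmentation inference sketch using torchvision's
# DeepLabV3 (generic illustration, not the papers' methods).
import torch
from torchvision.models.segmentation import (
    deeplabv3_resnet50, DeepLabV3_ResNet50_Weights,
)
from PIL import Image

weights = DeepLabV3_ResNet50_Weights.DEFAULT
model = deeplabv3_resnet50(weights=weights).eval()
preprocess = weights.transforms()     # resize + normalize as the weights expect

img = Image.open("street.jpg")        # hypothetical input image
x = preprocess(img).unsqueeze(0)      # shape (1, 3, H, W)

with torch.no_grad():
    out = model(x)["out"]             # (1, num_classes, H, W) per-pixel logits
pred = out.argmax(dim=1)              # (1, H, W): one class index per pixel
```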
Continual Learning trains a model on a large number of tasks sequentially without forgetting the knowledge obtained from preceding tasks; a minimal replay-based sketch follows the paper links below.
(New)[WACV'25]
https://arxiv.org/abs/2403.11537
[ACCV'24] https://arxiv.org/abs/2305.05175
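One classic baseline for reducing forgetting is experience replay; the sketch below keeps a reservoir-sampled memory of past examples and rehearses them alongside each new task. The model and task_loaders are hypothetical placeholders, and this is not the methods of the papers above.

```python
# Minimal continual-learning sketch: experience replay over a task sequence.
import random
import torch
import torch.nn as nn

class ReplayBuffer:
    """Reservoir-sampled memory of past (x, y) examples."""
    def __init__(self, capacity=1000):
        self.capacity, self.data, self.seen = capacity, [], 0

    def add(self, x, y):
        for xi, yi in zip(x, y):
            self.seen += 1
            if len(self.data) < self.capacity:
                self.data.append((xi, yi))
            else:
                j = random.randrange(self.seen)   # reservoir sampling
                if j < self.capacity:
                    self.data[j] = (xi, yi)

    def sample(self, k):
        batch = random.sample(self.data, min(k, len(self.data)))
        xs, ys = zip(*batch)
        return torch.stack(xs), torch.stack(ys)

def train_sequence(model, task_loaders, epochs=1, lr=1e-3):
    """Train on tasks one after another, rehearsing buffered examples."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    ce = nn.CrossEntropyLoss()
    buffer = ReplayBuffer()
    for loader in task_loaders:        # tasks arrive sequentially
        for _ in range(epochs):
            for x, y in loader:
                loss = ce(model(x), y)
                if buffer.data:        # rehearse old tasks against forgetting
                    rx, ry = buffer.sample(len(y))
                    loss = loss + ce(model(rx), ry)
                opt.zero_grad(); loss.backward(); opt.step()
                buffer.add(x, y)
```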
Domain Adaptation aims to adapt models from a labeled source domain to a different but related target domain without labels.
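A minimal sketch of one standard approach (DANN-style adversarial alignment via a gradient-reversal layer, not Dr. Na's method): the feature extractor F, classifier C, and domain discriminator D below are hypothetical placeholders.

```python
# Minimal unsupervised domain adaptation sketch in the DANN style:
# a gradient-reversal layer pushes features to be domain-invariant.
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad):
        return -ctx.lam * grad, None   # flip the gradient sign

def grad_reverse(x, lam=1.0):
    return GradReverse.apply(x, lam)

def da_step(F, C, D, xs, ys, xt, opt, lam=1.0):
    """One step: supervised loss on source + adversarial domain loss.
    F: feature extractor, C: label classifier, D: domain discriminator."""
    ce = nn.CrossEntropyLoss()
    fs, ft = F(xs), F(xt)
    task_loss = ce(C(fs), ys)                        # source labels only
    dom_in = grad_reverse(torch.cat([fs, ft]), lam)
    dom_lbl = torch.cat([torch.zeros(len(xs)),       # source = 0
                         torch.ones(len(xt))]).long()  # target = 1
    dom_loss = ce(D(dom_in), dom_lbl)  # D separates domains; F fools it
    loss = task_loss + dom_loss
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()
```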
Knowledge Distillation extracts pivotal knowledge from a teacher network to guide the learning of a student network; a minimal distillation-loss sketch follows the paper links below.
[ICCV'23]
https://arxiv.org/abs/2206.01186
[CVPR'23] https://arxiv.org/abs/2205.15531
[ICCV'21] https://arxiv.org/abs/2009.08825
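As a generic illustration (not the specific methods of the papers above), the sketch below implements the classic soft-target distillation loss of Hinton et al.; the teacher and student networks in the usage note are hypothetical.

```python
# Minimal knowledge distillation sketch: soft-target KL + hard-label CE.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Blend temperature-softened KL to the teacher with the usual CE."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)                      # rescale gradients per Hinton et al.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Hypothetical usage with placeholder teacher/student networks:
# with torch.no_grad():
#     t_logits = teacher(x)
# loss = kd_loss(student(x), t_logits, y)
# loss.backward()
```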
Self-supervised Learning leverages training data without supervision signals for classification and detection.
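A minimal sketch of one common pretext objective (the SimCLR-style NT-Xent contrastive loss, a generic illustration): the encoder and augment functions in the usage note are hypothetical placeholders.

```python
# Minimal self-supervised sketch: NT-Xent contrastive loss over two
# augmented views of the same batch (positives are matching indices).
import torch
import torch.nn.functional as F

def nt_xent(z1, z2, tau=0.5):
    """z1, z2: (N, d) embeddings of two views; row i pairs with row i+N."""
    z = F.normalize(torch.cat([z1, z2]), dim=1)   # (2N, d), unit norm
    sim = z @ z.t() / tau                         # scaled cosine similarity
    n = z1.size(0)
    mask = torch.eye(2 * n, dtype=torch.bool)
    sim = sim.masked_fill(mask, float("-inf"))    # exclude self-similarity
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(n)])
    return F.cross_entropy(sim, targets)          # pull positives together

# Hypothetical usage with a placeholder encoder and augmentation:
# z1, z2 = encoder(augment(x)), encoder(augment(x))
# loss = nt_xent(z1, z2)
# loss.backward()
```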