4D Radar Technology and Advanced Sensor Fusion AI:
From Hardware and Signal Processing of 4D Radar to Camera and LiDAR Integration in AI
This tutorial provides a comprehensive overview and in-depth insights into advanced 4D Radar technology and its integration with other sensors through fusion AI. 4D Radar, which measures range, azimuth, and elevation along with Doppler velocity, has garnered significant attention in both academia and industry due to its robustness in adverse conditions, including inclement weather. The tutorial is structured in two parts to facilitate understanding of both 4D Radar systems and their integration in multi-sensor environments.
In the first part, presented by Junfeng Guan, we delve into the fundamentals of 4D Radar. This section covers the importance of 4D Radar in robotics, the overall structure of 4D Radar systems, hardware components, waveforms, MIMO techniques, radar equations, and the signal processing pipeline.
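As a taste of the signal-processing pipeline covered in this part, the sketch below simulates a single de-chirped FMCW receive channel and recovers the target range with a range FFT. All parameter values (bandwidth, chirp duration, sampling rate) are hypothetical and chosen only for illustration; they are not taken from the tutorial.

```python
import numpy as np

# Illustrative FMCW parameters (hypothetical values, not from the tutorial)
c = 3e8            # speed of light (m/s)
B = 150e6          # chirp bandwidth (Hz)
T = 40e-6          # chirp duration (s)
S = B / T          # chirp slope (Hz/s)
fs = 5e6           # ADC sampling rate (Hz)
N = int(fs * T)    # samples per chirp

target_range = 30.0            # true target range (m)
tau = 2 * target_range / c     # round-trip delay (s)
f_beat = S * tau               # beat frequency after mixing (Hz)

t = np.arange(N) / fs
# De-chirped (mixed) receive signal: a complex tone at the beat frequency
beat = np.exp(2j * np.pi * f_beat * t)

# Range FFT: the peak bin maps back to range via R = c * f / (2 * S)
spectrum = np.abs(np.fft.fft(beat))
peak_bin = int(np.argmax(spectrum[: N // 2]))
f_est = peak_bin * fs / N
range_est = c * f_est / (2 * S)
print(range_est)  # close to 30.0 m
```

In a full 4D Radar pipeline, the same idea extends to a Doppler FFT across chirps and angle estimation across the MIMO virtual array, yielding the range-azimuth-elevation-velocity measurements discussed in this part.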
The second part, presented by Dong-Hee Paek, introduces AI applications specific to 4D Radar, addressing its challenges and pre-processing techniques for neural networks. It then focuses on sensor fusion AI, with particular emphasis on integrating 4D Radar with other sensors such as cameras and LiDAR. This section introduces the concept of sensor fusion, explores multi-modal datasets, and discusses the characteristics and backbones of different sensors. It then covers advances in sensor fusion AI across various tasks and fusion levels, along with emerging technologies such as auto-labeling and auto-calibration. The tutorial concludes with a look at future directions in sensor fusion AI.
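To make the notion of "fusion levels" concrete, the toy sketch below shows feature-level fusion: per-sensor feature maps projected into a shared bird's-eye-view grid are concatenated along the channel axis before a shared detection head. The shapes and channel counts are hypothetical and for illustration only.

```python
import numpy as np

# Toy per-sensor BEV feature maps on a shared 32x32 grid
# (channel counts are hypothetical, not from the tutorial)
H, W = 32, 32
radar_feat = np.random.rand(16, H, W)   # 16-channel 4D Radar features
camera_feat = np.random.rand(64, H, W)  # 64-channel camera features
lidar_feat = np.random.rand(32, H, W)   # 32-channel LiDAR features

# Feature-level fusion: concatenate along the channel axis; a shared
# detection head would then consume the fused tensor
fused = np.concatenate([radar_feat, camera_feat, lidar_feat], axis=0)
print(fused.shape)  # (112, 32, 32)
```

Early fusion would instead combine raw sensor data before any backbone, and late fusion would merge per-sensor detections; feature-level fusion sits between the two and is a common design point in multi-modal 3D object detection.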
This tutorial aims to provide valuable insights for researchers and practitioners at all levels of familiarity with 4D Radar and sensor fusion technologies. The content is designed to offer both theoretical understanding and practical applications in the rapidly evolving field of autonomous systems and robotics.
If you found the materials helpful, please cite:
D.-H. Paek and J. Guan, "4D Radar Technology and Advanced Sensor Fusion AI: From Hardware and Signal Processing of 4D Radar to Camera and LiDAR Integration in AI," in 2025 IEEE International Conference on Robotics and Automation (ICRA), Atlanta, GA, USA, 2025.
Dong-Hee Paek received the B.S. degree in robotics from Kwangwoon University, South Korea, in 2019, and the M.S. degree from The Robotics Program at the Korea Advanced Institute of Science and Technology (KAIST) in 2021. He is currently pursuing the Ph.D. degree at the CCS Graduate School of Mobility, KAIST. He also works as a Senior Research Engineer at Zeta Mobility, a venture company specializing in sensor fusion and embedded AI. He developed K-Radar and K-Lane, the world's first 4D Radar-based 3D object detection and LiDAR-based lane detection datasets, respectively. His work has led to publications in top conferences such as NeurIPS and CVPR. K-Radar and K-Lane are being distributed to universities and companies worldwide, including GM, Motional, EPFL, and other major groups.
Junfeng Guan is a researcher in the wireless connectivity and sensing group at Bosch Research in Sunnyvale, USA. His research focuses on radar perception and wireless networking and sensing systems. He completed his postdoctoral research in the School of Computer and Communication Sciences at EPFL, Switzerland. Junfeng received his Ph.D. from the Department of Electrical and Computer Engineering at the University of Illinois Urbana-Champaign in 2022. He is a recipient of the 2020 Qualcomm Innovation Fellowship. His research has received an ACM SIGMOBILE Research Highlight and was selected as an RFIC 2020 Industry Best Paper Finalist. He has published in top conferences and journals including NSDI, SIGCOMM, CVPR, ECCV, ICASSP, and IEEE TMTT. Junfeng also served on the shadow Program Committee for SenSys 2022.