Visual Perception for Autonomous Driving


  • [CODE] The code will be released.
  • [PAPER] A new paper on an adaptive fusion approach has been submitted to ECCV2018.
  • [AWARD] Our project was selected as one of the ten most outstanding projects of 2017 at KAIST.
  • [NOTICE] The Pedestrian Detection Benchmark (CVPR2015) will be integrated into this webpage. [link]

Welcome to the Multispectral All-day Vision Benchmark Suite!


We introduce the KAIST multi-spectral dataset, which covers a wide range of drivable regions, from urban to residential, for autonomous systems. Our dataset provides different perspectives of the world captured in coarse time slots (day and night) as well as fine time slots (sunrise, morning, afternoon, sunset, night, and dawn). For all-day perception of autonomous systems, we propose the use of an additional spectral sensor, i.e., a thermal imaging camera. Toward this goal, we developed a multi-sensor platform that supports a co-aligned RGB/thermal camera, an RGB stereo rig, a 3D LiDAR, and inertial sensors (GPS/IMU), along with a related calibration technique. We designed a wide range of visual perception tasks, including object detection, drivable region detection, localization, image enhancement, depth estimation, and colorization, using single- and multi-spectral approaches. In this paper, we describe our benchmark, including the recording platform, data format, development toolkits, and lessons learned while capturing the dataset.
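Because the RGB and thermal cameras are co-aligned, the two modalities can be combined pixel-wise once they share the same resolution. The following is a minimal sketch of such an overlay in NumPy; the synthetic arrays stand in for actual dataset frames, since the file layout and loading code are not specified here:

```python
import numpy as np

def overlay_rgb_thermal(rgb, thermal, alpha=0.5):
    """Blend a co-aligned RGB frame with a single-channel thermal frame.

    rgb:     (H, W, 3) uint8 array
    thermal: (H, W) uint8 array, already registered to the RGB view
    alpha:   weight of the RGB image in the blend
    """
    # Replicate the thermal channel to 3 channels so shapes match.
    thermal_rgb = np.repeat(thermal[..., None], 3, axis=2)
    blended = (alpha * rgb.astype(np.float32)
               + (1.0 - alpha) * thermal_rgb.astype(np.float32))
    return blended.astype(np.uint8)

# Synthetic stand-ins for one co-aligned frame pair (not real dataset frames).
rgb = np.zeros((480, 640, 3), dtype=np.uint8)
thermal = np.full((480, 640), 200, dtype=np.uint8)
out = overlay_rgb_thermal(rgb, thermal, alpha=0.5)
print(out.shape)  # (480, 640, 3)
```

A fixed `alpha` is used only for illustration; learned, content-dependent weighting is what an adaptive fusion approach would replace it with.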

[Data & Tasks]

(1st and 2nd rows) The KAIST multi-spectral dataset for visual perception of autonomous driving in day and night. The dataset was collected repeatedly by the KAIST multi-spectral all-day platform traversing campus, urban, and residential areas over several days.

(3rd and 4th rows) The collected RGB stereo, thermal image, LiDAR, and GPS data enable the study of all-day vision problems such as image enhancement (red rectangles from top-left to top-right), pedestrian/vehicle detection, colorization, drivable region detection, driving path prediction with localization, dense depth estimation, and 3D reconstruction.


All datasets and benchmarks on this page are copyrighted by us and published under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License. This means that you must attribute the work in the manner specified by the authors, you may not use this work for commercial purposes, and if you alter, transform, or build upon this work, you may distribute the resulting work only under the same license.


If you use this dataset in your research, please cite us:

author = {Yukyung Choi and Namil Kim and Soonmin Hwang and Kibaek Park and Jae Shin Yoon and Kyounghwan An and In So Kweon},
title  = {KAIST Multi-spectral Day/Night Dataset for Autonomous and Assisted Driving},


  • 12.2017: We will release the full dataset and toolkits to encourage further research in this area.