Probabilistic End-to-end Vehicle Navigation in Complex Dynamic Environments with Multimodal Sensor Fusion

Peide Cai, Sukai Wang, Yuxiang Sun, Ming Liu

Hong Kong University of Science and Technology, Robotics and Multi-Perception (RAM) Lab

IEEE Robotics and Automation Letters (RA-L) with IROS 2020

[Paper 2.6MB] [Code]

All-day and all-weather navigation is a critical capability for autonomous driving, requiring proper reaction to varied environmental conditions and complex agent behaviors. Recently, with the rise of deep learning, end-to-end control for autonomous vehicles has been well studied. However, most existing works rely solely on visual information, which can degrade under challenging illumination conditions such as dim light or total darkness. In addition, they usually generate and apply deterministic control commands without considering future uncertainties.

In this work, we propose a mixed sensor setup combining camera, lidar and radar. The multimodal information is processed by uniform alignment and projection onto the image plane, and a ResNet is then used for feature extraction. Based on this setup, we introduce a probabilistic motion planning network (PMP-net) to learn a deep probabilistic driving policy from expert-provided data. Finally, we evaluate its driving performance online on a new benchmark, DeepTest, which includes various environments (e.g., urban and rural areas, different traffic densities, weather conditions and times of day) and dynamic obstacles (e.g., vehicles, pedestrians, motorcyclists and bicyclists).
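The alignment step above maps 3-D lidar/radar returns into the camera's pixel grid so all modalities share one image-plane representation. A minimal sketch of such a projection with a standard pinhole model is shown below; the intrinsic matrix `K`, image size, and the helper name `project_points_to_image` are illustrative assumptions, not the paper's actual calibration or code.

```python
import numpy as np

# Hypothetical pinhole intrinsics (focal lengths 400 px, principal point
# at the image center) -- illustrative values, not the paper's calibration.
K = np.array([[400.0,   0.0, 320.0],
              [  0.0, 400.0, 240.0],
              [  0.0,   0.0,   1.0]])

def project_points_to_image(points_xyz, K, img_w=640, img_h=480):
    """Project 3-D points given in the camera frame (z pointing forward)
    onto the image plane. Points behind the camera or outside the image
    bounds are discarded. Returns (pixel_coords, depths)."""
    pts = np.asarray(points_xyz, dtype=float)
    pts = pts[pts[:, 2] > 0.0]            # keep points in front of the camera
    uvw = (K @ pts.T).T                   # homogeneous pixel coordinates
    uv = uvw[:, :2] / uvw[:, 2:3]         # perspective division
    valid = ((uv[:, 0] >= 0) & (uv[:, 0] < img_w) &
             (uv[:, 1] >= 0) & (uv[:, 1] < img_h))
    return uv[valid], pts[valid, 2]

# Example: one return 10 m straight ahead, one behind the camera (dropped).
points = np.array([[0.0, 0.0, 10.0],
                   [1.0, 0.5, -5.0]])
uv, depth = project_points_to_image(points, K)
print(uv)     # the forward point lands at the principal point (320, 240)
print(depth)  # its depth, 10 m, can then fill a sparse depth channel
```

In a fusion pipeline, the surviving pixel coordinates and depths would be rasterized into extra image channels (e.g., a sparse depth map) and stacked with the RGB frame before feature extraction.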

The results suggest that our proposed model enables autonomous driving in different environments without hand-crafted rules.

Various weather and illumination conditions are considered in this work.

Three levels of traffic density (Empty, Regular, Dense) are considered in our DeepTest benchmark.