Slasher is an open 1/10-scale robot car platform for aggressive (up to 70 km/h with stock gearing) deep neural network driving experiments using a neuromorphic silicon-retina dynamic vision sensor (DVS) event camera. End-to-end (E2E) dataset collection is enabled by the race controller and first-person-view camera, and localization by the Decawave radios of the Loco Positioning System. A Bluetooth joystick remote control switches the car between human and computer control and provides a safety stop.
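To illustrate what such an E2E recording contains, the minimal sketch below logs each camera frame together with the driver's steering and throttle commands. The file layout, field names, and normalization are hypothetical and not the actual Slasher recording format.

```python
import csv
import time

class E2ELogger:
    """Hypothetical end-to-end data logger: pairs each camera frame
    with the current human steering and throttle command."""

    def __init__(self, csv_path):
        self.file = open(csv_path, "w", newline="")
        self.writer = csv.writer(self.file)
        self.writer.writerow(["timestamp_s", "frame_file", "steering", "throttle"])

    def log(self, frame_file, steering, throttle):
        # steering and throttle assumed normalized to [-1, 1]
        self.writer.writerow([f"{time.time():.6f}", frame_file, steering, throttle])

    def close(self):
        self.file.close()

# Usage sketch:
# logger = E2ELogger("foyer_run01.csv")
# logger.log("frame_000001.png", steering=-0.12, throttle=0.35)
# logger.close()
```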
Slasher was developed by the Sensors Group of the Inst. of Neuroinformatics, Univ. of Zurich and ETH Zurich. See more datasets and tools from the Sensors Group.
If you use Slasher, please cite the accompanying AICAS 2019 paper:
Y. Hu, H.M. Chen, and T. Delbruck, “Slasher: Stadium racer car for event camera end-to-end learning autonomous driving experiments,” AICAS 2019, Mar. 2019, Hsinchu City, Taiwan.
Snapshot of DAVIS camera output with recorded steering and throttle on the foyer dataset
Snapshot of DAVIS camera output on the jogging dataset
Steering ground truth and prediction from foyer data.
Steering ground truth and prediction from jogging data.
Trajectories for training and autonomous testing recorded with the positioning system
Slasher CNN running in jAER. Left: AEViewer display showing the DAVIS output with overlaid ground-truth steering (blue) and prediction (red). Right: the last DVS input frame to the CNN.
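As a rough illustration of how DVS events can be accumulated into the 2D frame shown on the right (the exact accumulation performed in jAER is not reproduced here), the sketch below builds an event-count histogram from a batch of event coordinates; the resolution and normalization are assumptions.

```python
import numpy as np

def events_to_frame(xs, ys, width=240, height=180):
    """Accumulate DVS events at pixel coordinates (xs, ys) into a 2D
    count frame (DAVIS240C resolution assumed), normalized to [0, 1]."""
    frame = np.zeros((height, width), dtype=np.float32)
    np.add.at(frame, (ys, xs), 1.0)   # count events per pixel
    if frame.max() > 0:
        frame /= frame.max()          # simple per-frame normalization
    return frame
```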
Human-controlled fast drive on the previous Traxxas E-Maxx platform, showing its potential speed
Recorded DAVIS240C frame+event data with human steering and throttle
Autonomous driving on the foyer track
Autonomous drive on the outdoor winter jogging trail
This reference CNN was reported in the paper for steering control on the foyer and jogging trail tracks.
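The published network is described in the paper; purely as an illustration of this kind of steering regressor, the sketch below defines a small convolutional network that maps an accumulated DVS frame to a single steering value. The layer counts and sizes are arbitrary placeholders, not the architecture reported in the paper.

```python
from tensorflow.keras import layers, models

def build_steering_cnn(height=180, width=240):
    """Small convolutional regressor: one DVS frame in, one steering value out.
    Illustrative architecture only, not the published Slasher network."""
    model = models.Sequential([
        layers.Input(shape=(height, width, 1)),
        layers.Conv2D(16, 5, strides=2, activation="relu"),
        layers.Conv2D(32, 5, strides=2, activation="relu"),
        layers.Conv2D(64, 3, strides=2, activation="relu"),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(1, activation="tanh"),   # steering normalized to [-1, 1]
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

# Usage sketch:
# model = build_steering_cnn()
# model.fit(frames, steering_labels, epochs=10)  # frames: (N, 180, 240, 1)
```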