Odometry Challenge using Omnidirectional Drone Data
In association with Spleenlab.AI, we will host a challenge on odometry from multi-sensor drone data provided by Spleenlab. The data consists of omnidirectional video captured with two 190° fisheye lenses, along with LiDAR and RTK GPS data. The challenge focuses mainly on odometry from omnidirectional 360° input in off-road (forest) scenarios.
Spleenlab.AI is a specialized AI software company founded with the idea of redefining safety and AI. The company is primarily engaged in the development and distribution of safe AI algorithms for semi- and fully autonomous mobility, especially UAVs, helicopters, air taxis, and on-road motor vehicles. The fusion of different sensors, such as camera, LiDAR, and radar, by means of AI is the core business of the company.
Dataset
The Hainich forest dataset comprises images from two fisheye cameras, 360° LiDAR point clouds, and RTK GPS ground truth. The dataset was recorded at two forest locations during summer, winter, and spring. A prototype UAV with a 3D perception suite was used, with ROS Melodic as middleware, to record data from all the sensors. The dataset contains raw images from 5 MP fisheye cameras with a 190° FoV and no IR filter. Color-corrected images are also provided. Download the dataset from here.
Summer
Winter
Spring
Spring with color correction
The dataset contains six ROS bag files in total, two of which serve as test sets.
"h2f1r1" and "h3f2r2" are the two evaluation bags for submission. Ground truth odometry is provided for the other four bag files.
Each bag file is over 10 minutes long. The cameras run at 25 Hz and the LiDAR at 10 Hz.
The intrinsics for the two fisheye cameras and the extrinsics of the sensor setup are available.
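Since the data is distributed as ROS bags, a natural starting point is to enumerate the recorded topics and iterate over the messages. Below is a minimal sketch using the rosbag Python API that ships with ROS Melodic; the bag filename is one of the evaluation bags, but the topic names in the loop are placeholders, since the actual names are defined inside the bags.

```python
import rosbag

bag = rosbag.Bag("h2f1r1.bag")  # one of the two evaluation bags

# List the topics actually recorded in the bag before hard-coding any names.
info = bag.get_type_and_topic_info()
for topic, topic_info in info.topics.items():
    print(topic, topic_info.msg_type, topic_info.message_count)

# Iterate over messages; the topic names below are placeholders.
for topic, msg, t in bag.read_messages(
        topics=["/cam0/image_raw", "/lidar/points"]):
    print(t.to_sec(), topic)

bag.close()
```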
Forest Coverage
LiDAR Data
Hardware Setup
Challenge
The objective of this challenge is to develop an odometry algorithm that can accurately track the motion of a UAV through a forest environment using fisheye camera images and LiDAR point clouds captured in different seasons of the year. The solution can be visual-based, LiDAR-based, or a fusion of both sensors. The algorithm should be able to handle the challenges of a forest environment, such as occlusions and changing lighting conditions.
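For fusion approaches, note that the sensors run at different rates (cameras at 25 Hz, LiDAR at 10 Hz), so camera frames and LiDAR scans must be associated by timestamp. The sketch below shows one simple way to do this with nearest-timestamp matching; the timestamps in the example are synthetic, not taken from the dataset.

```python
import bisect

def nearest_timestamp(sorted_times, query):
    """Return the value in sorted_times closest to query (binary search)."""
    i = bisect.bisect_left(sorted_times, query)
    candidates = sorted_times[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda t: abs(t - query))

# Example: pair each 10 Hz LiDAR timestamp with the closest 25 Hz camera frame.
camera_times = [k / 25.0 for k in range(250)]  # 10 s of camera timestamps
lidar_times = [k / 10.0 for k in range(100)]   # 10 s of LiDAR timestamps
pairs = [(t, nearest_timestamp(camera_times, t)) for t in lidar_times]
print(pairs[:3])
```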
Submission
You can use the "h2f1r1" and "h3f2r2" bag files from the dataset to test your algorithm. The final submission is expected to be in the same format as the ground truth odometry: either a .txt or .csv file containing the ROS timestamp from the bag file, position (x y z), and orientation quaternion (x y z w).
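As an illustration of the expected output, the sketch below writes one pose per line to a space-separated .txt file with the ROS timestamp, position (x y z), and orientation quaternion (x y z w). The exact delimiter and precision should be checked against the provided ground truth files; the pose values here are dummies.

```python
def write_submission(poses, path="h2f1r1_odometry.txt"):
    """poses: iterable of (timestamp, x, y, z, qx, qy, qz, qw) tuples."""
    with open(path, "w") as f:
        for ts, x, y, z, qx, qy, qz, qw in poses:
            f.write("%.9f %.6f %.6f %.6f %.6f %.6f %.6f %.6f\n"
                    % (ts, x, y, z, qx, qy, qz, qw))

# Example usage with a single dummy pose (identity orientation at the origin):
write_submission([(1650000000.123456789, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0)])
```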
Rules
Individuals and teams (two or more members) can participate in the challenge.
Publicly available datasets and pre-trained models are allowed.
Limit of 3 submissions per day and 50 submissions in total per team / individual.
A brief report of your technical solution must be submitted along with the results. The report does not need to be resubmitted with every submission unless your solution has changed.
Individuals and institutions that contributed to the creation of the Hainich dataset may not take part in the challenge.
Timeline
Competition Release: January 2023
Submission Deadline: June 15, 2023
Winner Announcement: June 16, 2023
OmniCV 2023 Winner Presentation: June 19, 2023
All deadlines are at 11:59 PM UTC on the corresponding day unless otherwise noted.
Reward
$1000 for the winning team / individual.
The prize will be paid directly to the Team Lead / Individual.
The winning team / individual is expected to present their solution at the OmniCV workshop. A paper or poster is not required but is encouraged.
The final decision remains at the discretion of the organizing committee.
Cite
@dataset{milz_stefan_2022_6891131,
author = {Milz, Stefan and
Wäldchen, Jana and
Abouee, Amin and
Ravichandran, Ashwanth A and
Schall, Peter and
Hagen, Chris and
Borer, John and
Lewandowski, Benjamin and
Wittich, Hans-Christian and
Maeder, Patrick},
title = {{The HAInich: A multidisciplinary vision data-set
for a better understanding of the forest ecosystem}},
month = sep,
year = 2022,
publisher = {Zenodo},
version = {1.0},
doi = {10.5281/zenodo.6891131},
url = {https://doi.org/10.5281/zenodo.6891131}
}