The Extreme Weather Tracking Dataset (ExtremeTrack) is introduced to provide a dedicated platform for evaluating and advancing object tracking algorithms under challenging weather conditions. ExtremeTrack features:
The dataset contains 199 video sequences, comprising approximately 95,000 frames, captured under diverse real-world adverse conditions.
Among these, 100 videos depict hazy environments and 99 depict rainy conditions, covering varying intensity levels and degrees of visibility degradation.
The dataset is divided into 159 videos for training and 40 videos for testing, enabling standardized benchmarking of tracking performance.
All annotations are performed manually, frame by frame, and cross-verified to ensure high accuracy and consistency across the dataset.
The dataset incorporates realistic environmental challenges, including motion blur, occlusion, background clutter, camera instability, and dynamic lighting variations, to simulate authentic adverse-weather scenarios.
The tracked targets span a wide range of object types, from humans and vehicles to small handheld objects, providing comprehensive diversity for evaluating algorithmic robustness.
ExtremeTrack thus provides a challenging yet realistic benchmark for developing weather-resilient visual tracking systems, facilitating progress in applications such as surveillance, autonomous navigation, and intelligent transportation.
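As a concrete illustration, the minimal sketch below shows how a single sequence from such a benchmark might be loaded. The directory layout, file names, and the per-frame `x,y,w,h` annotation format are assumptions borrowed from common tracking benchmarks, not details specified by the dataset description above.

```python
# Minimal sketch of loading one ExtremeTrack-style sequence.
# NOTE: the layout below is an assumption for illustration: numbered JPEG
# frames under <seq_dir>/img/ and one "groundtruth.txt" holding a single
# "x,y,w,h" box per frame (top-left corner, width, height). The actual
# release may organize sequences and annotations differently.
import os
from typing import List, Tuple

Box = Tuple[float, float, float, float]

def load_sequence(seq_dir: str) -> Tuple[List[str], List[Box]]:
    """Return (frame_paths, boxes) for a single video sequence."""
    # Collect the frame images in temporal order.
    img_dir = os.path.join(seq_dir, "img")
    frame_paths = sorted(
        os.path.join(img_dir, f) for f in os.listdir(img_dir) if f.endswith(".jpg")
    )
    # Read one axis-aligned bounding box per frame.
    boxes: List[Box] = []
    with open(os.path.join(seq_dir, "groundtruth.txt")) as f:
        for line in f:
            x, y, w, h = (float(v) for v in line.strip().split(","))
            boxes.append((x, y, w, h))
    # The dataset is annotated frame by frame, so counts should match.
    assert len(frame_paths) == len(boxes), "expected one box per frame"
    return frame_paths, boxes
```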
Figure: Dataset examples from the ExtremeTrack dataset.