ICPR 2026 Competition on VISual Tracking in Adverse Conditions
(VISTAC-2)
28th International Conference on Pattern Recognition (ICPR), December 8-17, 2026, France
Introduction
Advances in video analysis are crucial for understanding complex spatiotemporal dynamics and extracting semantic content from video data. Understanding spatiotemporal semantic content means deciphering how objects and actions within a scene evolve in meaning and significance across time and space. This is vital in video surveillance, where interpreting physical movements and contextual behaviors, such as flagging suspicious activities based on temporal patterns, is paramount.
Adverse weather conditions in video analysis present unique challenges due to reduced visibility from haze, rain, and other degradations that obscure key details. Sophisticated algorithms are essential for extracting actionable insights in these difficult scenarios. These advancements directly benefit surveillance, security, and autonomous navigation systems. Despite the progress fueled by deep learning and large datasets, a critical gap exists: the lack of specialized, publicly accessible datasets focused on adverse weather conditions. To address this need, we introduce the ExtremeTrack dataset, comprising 199 real-world videos (100 hazy and 99 rainy) with detailed annotations, providing accurate ground truth data for object tracking in degraded environments.
Building on the success of the first VISTAC (VISual Tracking in Adverse Conditions) challenge at ICPR 2024, which focused on nighttime infrared video tracking, VISTAC-2 extends the scope to object tracking under adverse weather conditions. Although trackers perform well in clear, well-lit settings, their accuracy often degrades sharply in scenes affected by haze and rain. Using the ExtremeTrack dataset, the challenge aims to benchmark and promote the development of robust tracking algorithms capable of maintaining accuracy and temporal consistency in degraded environments, supporting progress in surveillance, intelligent transportation, and autonomous vision systems.
The ExtremeTrack dataset can significantly impact the field of video analysis under adverse conditions. By providing a high-quality, specialized dataset, researchers can develop and refine algorithms tailored to the unique challenges of hazy and rainy scenarios, directly improving surveillance and navigation systems in real-world degraded environments. We also present the Qualitative Precision (QP) metric as a new benchmark for evaluating machine learning-based object-tracking algorithms, designed to assess their accuracy and reliability in the demanding context of adverse weather video analysis. This initiative will drive progress by giving researchers a robust tool to evaluate and enhance their methods.
Competition Schedule
2026/03/22: Registration Opens
2026/04/22: Training Data Release
2026/05/20: Test Data Release
2026/06/20: Deadline for test results and method descriptions report submission
2026/07/10: Announcement of the final decision
Registration Link for VISTAC-2
Available soon.
Awards of the VISTAC-2 Challenge
We will award the top 3 participating teams with certificates from the ICPR 2026 committee.
The top 10 teams will be invited to contribute to the competition summary paper, which will be included in the proceedings of ICPR 2026.
Competition Outline
The VISTAC (VISual Tracking in Adverse Conditions) challenge returns for its second edition at the 28th International Conference on Pattern Recognition (ICPR) 2026. The primary goal of the competition is to advance the state of the art in object tracking, particularly in environments where traditional tracking algorithms struggle. While significant progress has been made in object tracking within controlled or well-lit settings, much less research has been conducted on tracking performance under challenging weather conditions such as haze and rain. VISTAC-2 seeks to address this gap by providing a platform for the development of robust tracking algorithms that can handle degraded video quality due to environmental factors like haze, rain, and other adverse conditions.
This challenge introduces a new dataset, ExtremeTrack, designed specifically to help improve tracking under these harsh conditions. The dataset contains 199 real-world videos: 100 videos in hazy conditions and 99 in rainy conditions. These videos will serve as the foundation for evaluating new tracking algorithms and assessing their ability to handle the visual degradation commonly associated with adverse weather.
● Training Data: A total of 139 videos (70 in hazy conditions and 69 in rainy conditions) will be released for training purposes on April 22nd, 2026. These videos come with ground truth annotations for the tracking task.
● Validation Data: 20 videos (10 in hazy conditions and 10 in rainy conditions) will be provided with ground truth annotations to assist participants in fine-tuning their algorithms. These videos will be available on April 22nd, 2026.
● Test Data: 40 additional videos (20 in hazy and 20 in rainy conditions) will be released without ground truth on May 20th, 2026. These videos will be used to evaluate the tracking performance of participants in real-world scenarios.
The ground truth annotations for the dataset will be in the form of bounding boxes for each object, where each bounding box is defined by its top-left (X1, Y1) and bottom-right (X2, Y2) coordinates. This provides the exact location of the tracked object in each frame, allowing for precise evaluation of the tracking results.
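Since both predictions and ground truth use this (X1, Y1, X2, Y2) format, overlap between a predicted and an annotated box can be computed per frame. The organizers define the official QP metric; the snippet below is only an illustrative sketch using intersection-over-union (IoU), the standard overlap measure underlying most tracking evaluations.

```python
# Illustrative sketch: IoU of two boxes in (X1, Y1, X2, Y2) format.
# This is NOT the official QP metric, which is defined by the organizers.

def iou(box_a, box_b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Corners of the overlap rectangle
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# Identical boxes overlap fully:
print(iou((10, 10, 50, 50), (10, 10, 50, 50)))  # 1.0
```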
The VISTAC-2 competition is organized into three distinct tasks, each focusing on different adverse weather conditions to evaluate the robustness of tracking algorithms using the Qualitative Precision (QP) metric:
Task 1 – Hazy-Condition Tracking:
In this task, participants will train their models exclusively on the hazy subset of the ExtremeTrack dataset. The QP metric will be evaluated only on the hazy test videos, measuring the algorithm’s performance under hazy conditions.
Task 2 – Rainy-Condition Tracking:
Participants will train their models using only the rainy subset of the ExtremeTrack dataset. The QP metric will then be computed exclusively on the rainy test videos, assessing performance under rainy conditions.
Task 3 – Combined-Condition Tracking:
This task evaluates algorithms trained on the entire combined ExtremeTrack dataset (both hazy and rainy videos). The QP metric will be calculated on the full test set, testing the algorithm’s ability to generalize across multiple adverse weather conditions.
Participants must submit their tracking results for each test video in a unified CSV file. The results must be formatted as follows:
● For each frame of the video, the bounding box coordinates (X1, Y1, X2, Y2) of the tracked object must be provided.
● Each video will have a corresponding CSV file following this naming convention:
{video_name}.csv
For example, if a video is named "rainy_video_01", the corresponding result file should be "rainy_video_01.csv".
● All CSV files for the test videos should be bundled into three ZIP files, one per task, named according to the participant’s group number: {GroupNumber}_haze.zip, {GroupNumber}_rain.zip, and {GroupNumber}_combined.zip.
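As a sketch of the required submission layout, results can be written and bundled as shown below. The exact CSV column order, the absence of a header row, and the group-number format are assumptions based on the description above.

```python
# Sketch of packaging tracking results in the required layout.
# Assumptions: one header-less row of (X1, Y1, X2, Y2) per frame,
# and a group number such as "07".
import csv
import os
import zipfile

def write_result_csv(video_name, boxes, out_dir="."):
    """Write one (X1, Y1, X2, Y2) row per frame to {video_name}.csv."""
    path = os.path.join(out_dir, f"{video_name}.csv")
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        for x1, y1, x2, y2 in boxes:
            writer.writerow([x1, y1, x2, y2])
    return path

def bundle(group_number, condition, csv_paths, out_dir="."):
    """Bundle per-video CSVs into a {GroupNumber}_{condition}.zip archive."""
    zip_path = os.path.join(out_dir, f"{group_number}_{condition}.zip")
    with zipfile.ZipFile(zip_path, "w") as zf:
        for p in csv_paths:
            zf.write(p, arcname=os.path.basename(p))
    return zip_path

# Example: two rainy test videos for a hypothetical group "07"
paths = [write_result_csv(f"rainy_video_{i:02d}", [(10, 10, 50, 50)])
         for i in (1, 2)]
bundle("07", "rain", paths)
```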
Please note that submitted models must be trained solely on the training set of the ExtremeTrack dataset; fine-tuning on external data is not permitted.
Fig 1: Task Chart of the competition
Competition Objectives
The ICPR 2026 Competition on Visual Tracking in Adverse Conditions (VISTAC-2) aims to advance the state of the art in robust object tracking by addressing the challenges posed by visually degraded environments such as haze and rain. Building on the success of the first edition of VISTAC, this second version focuses on expanding the dataset, improving evaluation standards, and fostering greater collaboration within the research community.
● Present and release the newly developed ExtremeTrack dataset, specifically designed for single-object tracking in adverse weather conditions. The dataset contains 199 videos (100 hazy and 99 rainy), accompanied by carefully annotated ground truths for training and validation. This dataset provides a unique opportunity for researchers to develop and test tracking algorithms in real-world, visually degraded environments.
● Encourage the development of robust and adaptive tracking algorithms capable of handling visibility degradation caused by environmental factors such as haze, rain, and motion blur. The competition aims to highlight the limitations of current approaches and inspire new strategies that can generalize better across diverse and challenging conditions.
● Define a standardized benchmarking framework for evaluating and comparing deep learning–based tracking algorithms using the provided dataset. By introducing consistent evaluation protocols, VISTAC-2 aims to promote transparency, reproducibility, and meaningful comparison of algorithmic performance across different methods.
For queries and suggestions, contact us at: nvisot.ju.etce@gmail.com