Attentional Correlation Filter Network for Adaptive Visual Tracking
Abstract
We propose a new tracking framework with an attentional mechanism that chooses a subset of the associated correlation filters for increased robustness and computational efficiency. The subset of filters is adaptively selected by a deep attentional network according to the dynamic properties of the tracking target. Our contributions are manifold, and are summarised as follows: (i) introducing the Attentional Correlation Filter Network, which allows adaptive tracking of dynamic targets; (ii) utilising an attentional network which shifts attention to the best candidate modules, as well as predicting the estimated accuracy of currently inactive modules; (iii) enlarging the variety of correlation filters to cover target drift, blurriness, occlusion, scale changes, and flexible aspect ratios; (iv) validating the robustness and efficiency of the attentional mechanism for visual tracking through a number of experiments. Our method achieves performance similar to non-real-time trackers, and state-of-the-art performance amongst real-time trackers.
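The selection idea in the abstract — run only the most promising correlation-filter modules per frame while keeping score estimates for the inactive ones — can be illustrated with a toy sketch. This is a hypothetical simplification for intuition only, not the authors' code: `AttentionalSelector`, its moving-average update, and the `evaluate_module` callback are all invented here, and a simple exponential average stands in for the paper's deep attentional network.

```python
import numpy as np

def select_active_modules(predicted_scores, k):
    """Return indices of the k modules with the highest predicted
    scores (the 'attended' subset)."""
    return np.argsort(predicted_scores)[::-1][:k]

class AttentionalSelector:
    """Toy stand-in for attentional module selection (hypothetical API).
    Keeps a running score estimate per correlation-filter module; each
    frame, only the top-k modules are evaluated and their estimates are
    refreshed, while inactive modules retain their last estimate."""

    def __init__(self, num_modules, k, decay=0.9):
        self.scores = np.full(num_modules, 0.5)  # uniform prior estimate
        self.k = k
        self.decay = decay

    def step(self, evaluate_module):
        # Attend to the k currently most promising modules.
        active = select_active_modules(self.scores, self.k)
        for i in active:
            measured = evaluate_module(i)  # e.g. a validation score on the frame
            # Exponential moving average; inactive modules keep old estimates,
            # which the paper instead predicts with a learned network.
            self.scores[i] = self.decay * self.scores[i] + (1 - self.decay) * measured
        return active
```

After a few frames, a consistently well-performing module dominates the estimates and stays in the attended subset, so the per-frame cost is bounded by `k` module evaluations rather than the full filter bank.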
Fig 1. Framework of ACFN
Fig 2. Tracking performance on benchmark datasets
News
11/29, 2017 GitHub repository opened.
05/24, 2017 Training code was uploaded with training data.
05/23, 2017 All benchmark results were uploaded.
05/05, 2017 Test code for ACFN was uploaded.
04/10, 2017 Project page was built.
03/03, 2017 The paper was accepted to CVPR 2017. (Poster)
Publication
Attentional Correlation Filter Network for Adaptive Visual Tracking
Jongwon Choi, Hyung Jin Chang, Sangdoo Yun, Tobias Fischer, Yiannis Demiris, and Jin Young Choi
IEEE Conference on Computer Vision and Pattern Recognition 2017 (CVPR 2017). [Poster]
[pdf] [supplementary] [poster] [test code] [training code] [training data (OOTB) (VOT)] [results] [bibtex]
If you have any questions, please contact jwchoi.pil@gmail.com.