Multi-scale correlation filter tracking algorithm based on occlusion discrimination
2018, Vol. 23, No. 12: 1789-1800
Received: 2018-04-09; revised: 2018-07-17; published in print: 2018-12-16
DOI: 10.11834/jig.180237
Objective
In complex environments, moving targets are affected by scale variation and occlusion during tracking, which lowers tracking accuracy. To address this problem, a multi-scale correlation filter tracking method based on occlusion discrimination is proposed.
Method
First, the foreground region of the first frame is selected to train the target's position filter, scale filter, and GMS (grid-based motion statistics) detector. Then the position filter estimates the target position and the scale filter computes the target scale, yielding a preliminary target region. Finally, the preliminary target region is evaluated by the correlation filter response: the peak value and the peak fluctuation of the response determine whether the occlusion and update conditions are met. If the target is occluded, the detector is started to locate the target, and the target model is updated once the target is found; if the update condition is met, the position filter, scale filter, and GMS detector are updated, completing the tracking.
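The per-frame control flow described above can be sketched as follows. All components here are toy stand-ins (my assumptions, not the paper's code): the "response" is a single lookup, the stored peak stands in for the trained model, and an exhaustive search stands in for GMS re-detection. The 0.5 occlusion threshold is likewise assumed for illustration.

```python
# Control-flow sketch of the pipeline: track -> test for occlusion ->
# re-detect if occluded, otherwise update the model when the response improves.

class Tracker:
    def __init__(self, first_frame, box):
        # In the real algorithm, the first frame trains the position filter,
        # scale filter, and GMS detector; here training is implicit.
        self.box = box
        self.hist_peak = 1.0            # historical response peak (the "model")

    def respond(self, frame):
        # Toy "response peak": high when the object is visible at the
        # predicted location, near zero when it is not. A frame is a dict
        # mapping pixel coordinates to object likelihood.
        return frame.get(self.box, 0.0)

    def step(self, frame):
        peak = self.respond(frame)
        if peak < 0.5 * self.hist_peak:         # occlusion test (assumed threshold)
            # Re-detect the object anywhere in the frame (stands in for GMS).
            found = [p for p, v in frame.items() if v >= 0.5]
            if found:
                self.box = found[0]
        elif peak > self.hist_peak:             # update test
            self.hist_peak = peak               # refresh the model
        return self.box

t = Tracker({(3, 3): 1.0}, (3, 3))
print(t.step({(3, 3): 1.0}))   # target visible: stays at (3, 3)
print(t.step({(5, 5): 1.0}))   # occluded at (3, 3): re-detected at (5, 5)
```

The point of the structure is that re-detection and model update are mutually exclusive per frame: a frame that fails the occlusion test never updates the model, which is what prevents drift.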
Result
The multi-scale correlation filter framework used as the basis of the algorithm adapts well to targets undergoing scale changes. At the same time, the target model update mechanism and the GMS detector retrieve lost targets, effectively solving the target loss problem under occlusion. Tests on public datasets show an average center error of 5.58 pixels, an average tracking accuracy of 94.2%, and an average tracking speed of 27.5 frames/s. Compared with state-of-the-art trackers, the proposed algorithm balances tracking speed and accuracy and achieves better tracking performance.
Conclusion
A new multi-scale correlation filter tracking algorithm based on occlusion discrimination is proposed. Experimental results show that the algorithm tracks targets quickly and accurately under different scale transformations and occlusion conditions, with good tracking accuracy and robustness.
Objective
Visual target tracking has become a popular research topic in artificial intelligence worldwide and is widely used in national defense, industry, and daily life, with applications such as military reconnaissance, security monitoring, driverless cars, and human-computer interaction. Although great progress has been made over the past decade, model-free tracking remains a tough problem due to illumination changes, geometric deformation, partial occlusion, fast motion, and background clutter. Traditional methods generally track the target through visual features; in simple environments, these trackers perform well for specific targets. Recently, correlation filters have been widely applied to object tracking thanks to their efficiency and robustness, and a series of advances in target tracking have attracted much attention. To overcome the low accuracy caused by occlusion and scale changes during tracking in complex environments, a multi-scale correlation filter tracking algorithm based on occlusion discrimination is proposed.
Method
On the basis of the DSST (discriminative scale space tracker) framework, a multi-scale correlation filter tracking algorithm is proposed. Reliability discrimination of the correlation filter response, which contributes to long-term stable tracking, refers to occlusion and update discrimination based on the peak value and the multi-peak fluctuation of the response map. The proposed algorithm has two main points. 1) Two calculation models are designed for the maximum response and the multi-peak fluctuation; by evaluating the tracking result with these two models, we can determine whether the target is occluded and whether the model should be updated. 2) The missing target is re-detected by a detector based on GMS (grid-based motion statistics): when the target is occluded, the previously trained GMS detector starts detecting and locates the target again. Concrete tracking proceeds as follows. First, the foreground area of the first frame is selected, and the position filter, scale filter, and GMS detector are trained. Then the target location is estimated by the translation filter and the target scale by the scale filter: candidate samples extracted at different scales centered on the new position are correlated with the scale filter to derive the primary target area, and the scale with the maximum response becomes the scale of the current frame. Finally, the primary target area is evaluated by the correlation filter response, and the occlusion and update conditions are determined from the peak value and the multi-peak fluctuation of the response values. A mutation of the peak and of the multi-peak fluctuation indicates that the target is occluded at that moment; the greater the mutation, the greater the degree of occlusion. In this case, the update should be suspended to prevent tracking drift: the detector detects the target position and updates the target model once the target is found. When the peak of the correlation filter response is greater than the historical value and the peak fluctuation shows no mutation, the target information at the current moment is more complete than that at time t-1, and the correlation filter should be updated; the update covers the position and scale filters and the GMS detector, completing the tracking.
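The peak-and-fluctuation reliability test can be illustrated on a synthetic response map. The formulas and the 0.5 ratio below are assumptions for illustration, not the paper's exact definitions; the fluctuation score is the APCE-style measure commonly paired with correlation filters, which drops when the map has many secondary peaks.

```python
import numpy as np

def peak_and_fluctuation(response):
    """Return the response peak and a multi-peak fluctuation score.

    A high score means one dominant peak (reliable result); a low score
    means the map fluctuates with many secondary peaks (possible occlusion).
    """
    peak = response.max()
    trough = response.min()
    fluctuation = (peak - trough) ** 2 / np.mean((response - trough) ** 2)
    return peak, fluctuation

def is_reliable(response, hist_peak, hist_fluct, ratio=0.5):
    """Occlusion test: both scores must stay near their historical values."""
    peak, fluct = peak_and_fluctuation(response)
    return bool(peak >= ratio * hist_peak and fluct >= ratio * hist_fluct)

# A clean single-peak map vs. a noisy multi-peak map with the same peak height.
xx, yy = np.meshgrid(np.arange(64), np.arange(64))
clean = np.exp(-((xx - 32) ** 2 + (yy - 32) ** 2) / 50.0)
rng = np.random.default_rng(0)
noisy = 0.6 * rng.random((64, 64))
noisy[32, 32] = 1.0

_, f_clean = peak_and_fluctuation(clean)
_, f_noisy = peak_and_fluctuation(noisy)
print(f_clean > f_noisy)   # the clean map fluctuates far less -> True
```

With this split, the tracker can treat "peak still high but fluctuation collapsed" as the occlusion signature described above, and only update the model when both quantities hold up.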
Result
The multi-scale correlation filtering method is used as the basic framework of our algorithm and adapts well to tracking targets under scale transformation. At the same time, the target model updating mechanism and the GMS detector are used to retrieve the target, effectively solving the target loss problem under occlusion. Nine challenging video sequences, namely Box, Bird1, Lemming, Panda, Basketball, DragonBaby, CarScale, Bird2, and Girl2, were selected from the public datasets OTB-2013 and OTB-2015, together with the video sequence car_Xvid, to conduct the experiments. The test results on these datasets show that the algorithm achieves a low average center error of 5.58 pixels, a better tracking accuracy of 0.942, and a tracking speed of 27.5 frames per second, compared with the state-of-the-art tracking algorithms DSST, KCF (kernelized correlation filter), LCT (long-term correlation tracking), Staple, GOTURN (generic object tracking using regression networks), and FCNT (fully convolutional network tracking). Thus, the algorithm shows improved tracking performance with higher tracking speed and accuracy.
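For context, the reported numbers follow the usual OTB-style evaluation: center location error (CLE) per frame, and precision as the fraction of frames whose CLE falls below a threshold (20 px is the conventional OTB choice). The boxes below are made-up illustrations, not the paper's data.

```python
import numpy as np

def center(box):
    """Center of an (x, y, w, h) box in pixel coordinates."""
    x, y, w, h = box
    return np.array([x + w / 2.0, y + h / 2.0])

def center_errors(pred_boxes, gt_boxes):
    """Per-frame Euclidean distance between predicted and ground-truth centers."""
    return np.array([np.linalg.norm(center(p) - center(g))
                     for p, g in zip(pred_boxes, gt_boxes)])

def precision(errors, threshold=20.0):
    """Fraction of frames whose center error is within the threshold."""
    return float(np.mean(errors <= threshold))

# Hypothetical three-frame sequence: two good predictions, one drifted frame.
gt   = [(10, 10, 40, 40), (12, 11, 40, 40), (15, 13, 40, 40)]
pred = [(11, 10, 40, 40), (14, 12, 40, 40), (60, 55, 40, 40)]

err = center_errors(pred, gt)
print(err.mean())        # average center error over the sequence
print(precision(err))    # fraction of frames within 20 px
```

Averaging these two quantities over all sequences yields numbers comparable to the 5.58-pixel center error and 0.942 precision reported above.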
Conclusion
On the basis of DSST correlation filter tracking, a multi-scale correlation filtering method based on occlusion discrimination is proposed. The results show that the algorithm solves the problem of losing the target under occlusion and avoids error accumulation through its selective update strategy, achieving stable tracking under occlusion and multi-scale changes. Compared with current popular tracking algorithms, this algorithm has the following remarkable advantages: it solves the target loss problem caused by occlusion in the DSST algorithm, and detecting occlusion before deciding whether to update mitigates the tracking drift caused by per-frame updating. Doing so not only reduces unnecessary update time but also substantially improves tracking speed and accuracy. This paper presents a new multi-scale correlation filter tracking algorithm based on occlusion discrimination. Experiments show that the proposed algorithm tracks the target rapidly and accurately under varying scale transformation and occlusion, with enhanced tracking accuracy and robustness.
Zhang T Z, Xu C S, Yang M H. Multi-task correlation particle filter for robust object tracking[C]//Proceedings of 2017 IEEE Conference on Computer Vision and Pattern Recognition. Honolulu, Hawaii, USA: IEEE Computer Society, 2017: 4819-4827. [DOI: 10.1109/CVPR.2017.512]
Tian P, Lü J H, Ma S L, et al. Robust object tracking based on local discriminative analysis[J]. Journal of Electronics & Information Technology, 2017, 39(11): 2635-2643. [DOI: 10.11999/JEIT170045]
Xie Y, Zhang W S, Li C H, et al. Discriminative object tracking via sparse representation and online dictionary learning[J]. IEEE Transactions on Cybernetics, 2014, 44(4): 539-553. [DOI: 10.1109/TCYB.2013.2259230]
Liu Y T, Wang K F, Wang F Y. Tracklet association-based visual object tracking: the state of the art and beyond[J]. Acta Automatica Sinica, 2017, 43(11): 1869-1885.
Danelljan M, Bhat G, Khan F S, et al. ECO: efficient convolution operators for tracking[C]//Proceedings of 2017 IEEE Conference on Computer Vision and Pattern Recognition. Honolulu, Hawaii, USA: IEEE Computer Society, 2017: 6931-6939. [DOI: 10.1109/CVPR.2017.733]
Jiang W T, Liu W J, Yuan H. Research of object tracking based on soft feature theory[J]. Chinese Journal of Computers, 2016, 39(7): 1334-1355. [DOI: 10.11897/SP.J.1016.2016.01334]
Xu Y L, Wang J B, Li Y, et al. One-step backtracking for occlusion detection in real-time visual tracking[J]. Electronics Letters, 2017, 53(5): 318-320. [DOI: 10.1049/el.2016.4183]
Zhu S G, Du J P, Ren N. A novel simple visual tracking algorithm based on hashing and deep learning[J]. Chinese Journal of Electronics, 2017, 26(5): 1073-1078. [DOI: 10.1049/cje.2016.06.026]
Guo W, You S S, Gao J Y, et al. Deep relative metric learning for visual tracking[J]. Scientia Sinica (Informationis), 2018, 48(1): 60-78. [DOI: 10.1360/N112017-00124]
Fan H, Ling H B. SANet: structure-aware network for visual tracking[C]//Proceedings of 2017 IEEE Conference on Computer Vision and Pattern Recognition Workshops. Honolulu, HI, USA: IEEE, 2017: 2217-2224. [DOI: 10.1109/CVPRW.2017.275]
Danelljan M, Robinson A, Khan F S, et al. Beyond correlation filters: learning continuous convolution operators for visual tracking[C]//Proceedings of the 14th European Conference on Computer Vision. Amsterdam, The Netherlands: Springer, 2016: 472-488.
Zhang W, Kang B S. Recent advances in correlation filter-based object tracking: a review[J]. Journal of Image and Graphics, 2017, 22(8): 1017-1033. [DOI: 10.11834/jig.170092]
Lukezic A, Vojir T, Zajc L C, et al. Discriminative correlation filter with channel and spatial reliability[C]//Proceedings of 2017 IEEE Conference on Computer Vision and Pattern Recognition. Honolulu, Hawaii, USA: IEEE Computer Society, 2017: 4847-4856. [DOI: 10.1109/CVPR.2017.515]
Held D, Thrun S, Savarese S, et al. Learning to track at 100 FPS with deep regression networks[C]//Proceedings of the 14th European Conference on Computer Vision. Amsterdam, The Netherlands: Springer, 2016: 749-765. [DOI: 10.1007/978-3-319-46448-0_45]
Wang L J, Ouyang W L, Wang X G, et al. Visual tracking with fully convolutional networks[C]//Proceedings of 2015 IEEE International Conference on Computer Vision. Santiago, Chile: IEEE, 2015: 3119-3127. [DOI: 10.1109/ICCV.2015.357]
Bolme D S, Beveridge J R, Draper B A, et al. Visual object tracking using adaptive correlation filters[C]//Proceedings of 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition. San Francisco, CA, USA: IEEE, 2010: 2544-2550. [DOI: 10.1109/CVPR.2010.5539960]
Henriques J F, Caseiro R, Martins P, et al. Exploiting the circulant structure of tracking-by-detection with kernels[C]//Proceedings of the 12th European Conference on Computer Vision. Florence, Italy: Springer, 2012: 702-715. [DOI: 10.1007/978-3-642-33765-9_50]
Henriques J F, Caseiro R, Martins P, et al. High-speed tracking with kernelized correlation filters[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2015, 37(3): 583-596. [DOI: 10.1109/TPAMI.2014.2345390]
Danelljan M, Häger G, Khan F S, et al. Discriminative scale space tracking[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2017, 39(8): 1561-1575. [DOI: 10.1109/TPAMI.2016.2609928]
Bertinetto L, Valmadre J, Golodetz S, et al. Staple: complementary learners for real-time tracking[C]//Proceedings of 2016 IEEE Conference on Computer Vision and Pattern Recognition. Las Vegas, NV, USA: IEEE, 2016: 1401-1409. [DOI: 10.1109/CVPR.2016.156]
Ma C, Yang X K, Zhang C Y, et al. Long-term correlation tracking[C]//Proceedings of 2015 IEEE Conference on Computer Vision and Pattern Recognition. Boston, MA, USA: IEEE, 2015: 5388-5396. [DOI: 10.1109/CVPR.2015.7299177]
Bian J W, Lin W Y, Matsushita Y, et al. GMS: grid-based motion statistics for fast, ultra-robust feature correspondence[C]//Proceedings of 2017 IEEE Conference on Computer Vision and Pattern Recognition. Honolulu, Hawaii, USA: IEEE Computer Society, 2017: 2828-2837. [DOI: 10.1109/CVPR.2017.302]