泡泡一分钟 (One-Minute Paper): Deep-LK for Efficient Adaptive Object Tracking

陶福
2023-12-01

 Deep-LK for Efficient Adaptive Object Tracking

 "链接:https://pan.baidu.com/s/1Hn-CVgiR7WV0jvaYBv5G_A 提取码:mp97"


In this paper, we present a new approach for efficient regression-based object tracking. Our approach is closely related to the Generic Object Tracking Using Regression Networks (GOTURN) framework [1]. We make the following contributions. First, we demonstrate that there is a theoretical relationship between Siamese regression networks like GOTURN and the classical Inverse Compositional Lucas & Kanade (IC-LK) algorithm. Further, we demonstrate that, unlike GOTURN, IC-LK adapts its regressor to the appearance of the currently tracked frame. We argue that the lack of this property in GOTURN accounts for its poor performance on unseen objects and/or viewpoints. Second, we propose a novel framework for object tracking inspired by the IC-LK framework, which we refer to as Deep-LK. Finally, we show impressive results demonstrating that Deep-LK substantially outperforms GOTURN and achieves tracking performance comparable to current state-of-the-art deep trackers on high frame-rate sequences, whilst being an order of magnitude (100 FPS) more computationally efficient.

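The key distinction the abstract draws, that IC-LK recomputes its regressor from the template of the frame currently being tracked while GOTURN applies one fixed learned regressor to every object, can be made concrete with a small sketch. The NumPy snippet below is only an illustration under assumed shapes: the feature extractor, the 4-parameter warp, and all dimensions are stand-ins, not the authors' implementation.

```python
import numpy as np

def ic_lk_regressor(J):
    """IC-LK-style regressor: pseudo-inverse of the template's feature
    Jacobian J (d features x p warp parameters). Because J comes from the
    template cropped out of the current frame, the regressor adapts to the
    tracked object's current appearance."""
    return np.linalg.pinv(J)                         # shape (p, d)

def ic_lk_update(R, feat_candidate, feat_template):
    """One warp update: delta_p = R * (phi(candidate) - phi(template))."""
    return R @ (feat_candidate - feat_template)

def goturn_style_update(W_fixed, b_fixed, feat_candidate, feat_template):
    """GOTURN-style update: one learned, fixed linear mapping applied to
    every object and frame, so it cannot adapt at test time."""
    return W_fixed @ np.concatenate([feat_template, feat_candidate]) + b_fixed

# Illustrative dimensions: d deep-feature channels, p warp parameters.
d, p = 512, 4
rng = np.random.default_rng(0)
J = rng.standard_normal((d, p))            # template feature Jacobian (assumed given)
feat_T = rng.standard_normal(d)            # template features
feat_I = rng.standard_normal(d)            # candidate-region features

R = ic_lk_regressor(J)                     # recomputed whenever the template changes
delta_p = ic_lk_update(R, feat_I, feat_T)  # adaptive warp update, shape (4,)

W_fixed = rng.standard_normal((p, 2 * d))  # stand-in for a fixed learned layer
b_fixed = rng.standard_normal(p)
delta_p_fixed = goturn_style_update(W_fixed, b_fixed, feat_I, feat_T)
```

Because `R` is a deterministic function of the current template's feature Jacobian, it is refreshed whenever the template changes; this is the adaptivity Deep-LK inherits from IC-LK and that a fixed Siamese regressor lacks.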

Reposted from: https://www.cnblogs.com/feifanrensheng/p/10685670.html
