First published: 2016/04/06

Abstract: Machine learning techniques are often used in computer vision due to their
ability to leverage large amounts of training data to improve performance.
Unfortunately, most generic object trackers are still trained from scratch
online and do not benefit from the large number of videos that are readily
available for offline training. We propose a method for offline training of
neural networks that can track novel objects at test time at 100 fps. Our
tracker is significantly faster than previous methods that use neural networks
for tracking, which are typically very slow to run and not practical for
real-time applications. Our tracker uses a simple feed-forward network with no
online training required. The tracker learns a generic relationship between
object motion and appearance and can be used to track novel objects that do not
appear in the training set. We test our network on a standard tracking
benchmark to demonstrate our tracker's state-of-the-art performance. Further,
our performance improves as we add more videos to our offline training set. To
the best of our knowledge, our tracker is the first neural-network tracker that
learns to track generic objects at 100 fps.
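For intuition only, below is a minimal sketch of the kind of feed-forward regression tracker the abstract describes: two image crops (one centered on the previous frame's target, one on the current-frame search region) pass through convolutional branches, their features are concatenated, and fully connected layers regress the target's bounding box in a single forward pass, with no online fine-tuning. This is an illustrative assumption, not the paper's actual architecture or code; the class name `FeedForwardTracker`, the layer sizes, and the crop resolution are all made up for the example.

```python
import torch
import torch.nn as nn


class FeedForwardTracker(nn.Module):
    """Illustrative GOTURN-style regression tracker (not the paper's network).

    Input: a crop of the previous frame around the target and a search crop
    from the current frame. Output: 4 numbers giving the target's bounding
    box inside the search crop.
    """

    def __init__(self):
        super().__init__()

        # Two convolutional branches with the same (toy) architecture but
        # separate weights: one for the previous-frame crop, one for the
        # current-frame search crop.
        def branch():
            return nn.Sequential(
                nn.Conv2d(3, 32, kernel_size=5, stride=2), nn.ReLU(),
                nn.Conv2d(32, 64, kernel_size=3, stride=2), nn.ReLU(),
                nn.AdaptiveAvgPool2d((6, 6)),
            )

        self.prev_branch = branch()
        self.curr_branch = branch()

        # Fully connected layers regress box coordinates from the
        # concatenated features of both crops.
        self.regressor = nn.Sequential(
            nn.Flatten(),
            nn.Linear(2 * 64 * 6 * 6, 256), nn.ReLU(),
            nn.Linear(256, 4),  # (x1, y1, x2, y2) in search-crop coordinates
        )

    def forward(self, prev_crop, curr_crop):
        f_prev = self.prev_branch(prev_crop)
        f_curr = self.curr_branch(curr_crop)
        feats = torch.cat([f_prev, f_curr], dim=1)
        return self.regressor(feats)


if __name__ == "__main__":
    net = FeedForwardTracker().eval()
    # At test time, tracking one frame is a single forward pass over a pair
    # of crops; there is no online training or per-video adaptation.
    prev_crop = torch.rand(1, 3, 128, 128)
    curr_crop = torch.rand(1, 3, 128, 128)
    with torch.no_grad():
        box = net(prev_crop, curr_crop)
    print(box.shape)  # torch.Size([1, 4])
```

Because all the learning happens offline on training videos, the per-frame cost at test time is just this one forward pass, which is what makes the 100 fps speed plausible compared to trackers that fine-tune a network online for each new video.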