EE Seminar: Handling drift induced by background clutter in on-line visual tracking

Speaker: Shaul Oron

Ph.D. student under the supervision of Prof. Shai Avidan

 

Sunday, March 19th, 2017 at 15:00
Room 011, Kitot Bldg., Faculty of Engineering


Abstract

 

Visual tracking is a challenging task that attracts considerable interest in the field of computer vision. Despite recent advances, many open questions remain. One of the most basic challenges is handling tracking drift and, in particular, drift induced by background clutter. We address this challenge from several different angles.

 

We first consider a constrained setup of real-time tracking of objects from a known class (e.g., vehicles). In such a setup, one can train, offline, a non-real-time object-class detector. Such a detector can help reduce the effects of background clutter. We discuss how this can be done, focusing on integrating the detector into a real-time tracking system. Our research leads to several novel fusion strategies that outperform both the individual system components and naive fusion techniques. We demonstrate the advantages of our method in a wide range of experiments.
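To make the idea of fusing a slow, offline-trained detector with a fast tracker concrete, here is a minimal illustrative sketch. The specific fusion rule (blending boxes by detection staleness and overlap), the function names, and the parameters are assumptions for illustration only, not the actual strategies developed in the talk.

```python
# Hypothetical sketch: score-level fusion of a non-real-time detector
# with a real-time tracker. A stale detection (one computed several
# frames ago) is trusted less, and a detection that barely overlaps
# the tracker's box is trusted less as well.

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union > 0 else 0.0

def fuse(track_box, det_box, det_age, max_age=10):
    """Blend the tracker's box toward the last detection; the blend
    weight decays with the detection's age (in frames) and with its
    disagreement (low IoU) with the tracker."""
    w = max(0.0, 1.0 - det_age / max_age) * iou(track_box, det_box)
    return tuple((1 - w) * t + w * d for t, d in zip(track_box, det_box))
```

A fresh, well-aligned detection fully corrects the tracker, while a detection older than `max_age` frames is ignored; any monotone weighting with these two limits would serve the same illustrative purpose.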

 

We then turn our attention to model-free visual tracking, in which the object class is unknown. To combat clutter-induced drift in this setup, we propose a novel, unified tracking framework. In this framework, a parametric rigid transformation is assumed, and the likelihood of pixels belonging to either the foreground or the background is explicitly modeled. The resulting tracking algorithm extends a well-known tracking method. Competitive performance is demonstrated when evaluated on a standard tracking benchmark.
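The benefit of explicit foreground/background modeling can be illustrated with a toy scoring function for candidate warps. This sketch uses a pure translation in place of a general parametric rigid transformation, a Gaussian match likelihood, and a flat background likelihood; all of these are simplifying assumptions, not the framework presented in the talk.

```python
import numpy as np

# Hedged sketch: score a candidate warp of the template into the frame,
# treating each template pixel as foreground or background. Pixels with
# high foreground probability must match the template appearance, while
# likely-background pixels fall back to a flat model, so surrounding
# clutter does not drag the alignment off the target.

def warp_score(template, frame, dx, dy, p_fg, sigma=10.0):
    """Likelihood-style score of placing `template` at offset (dx, dy).
    `p_fg` holds a per-pixel foreground probability for the template."""
    h, w = template.shape
    patch = frame[dy:dy + h, dx:dx + w].astype(float)
    diff2 = (patch - template) ** 2
    fg_lik = np.exp(-diff2 / (2 * sigma ** 2))   # Gaussian match likelihood
    bg_lik = np.full_like(fg_lik, 0.1)           # flat background likelihood
    return float(np.sum(p_fg * fg_lik + (1 - p_fg) * bg_lik))
```

Maximizing such a score over the warp parameters, rather than a plain sum of squared differences, is what lets background pixels inside the template window be discounted instead of matched.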

 

Finally, we propose a novel similarity measure between point sets that can be used for template matching. This similarity measure alleviates the need for explicit background modeling and the constraints of a rigid geometric transformation. We investigate key properties of this similarity measure and show that it can be applied successfully to visual tracking. The resulting tracking algorithm is able to cope with background clutter and non-rigid deformations in a fully unsupervised manner. Tracking performance is demonstrated on several benchmarks and compared to many leading tracking methods, with promising results.
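One natural way to build a point-set similarity that tolerates background points is to count mutual nearest-neighbor pairs, since an outlier point rarely has a reciprocated nearest neighbor in the other set. The sketch below illustrates that idea; it is an assumption for exposition and not necessarily the exact measure introduced in the talk.

```python
import numpy as np

def mutual_nn_similarity(P, Q):
    """Fraction of points forming mutual nearest-neighbor pairs between
    point sets P (shape (n, d)) and Q (shape (m, d)). A pair counts only
    if each point is the other's nearest neighbor, which makes the score
    robust to unmatched (e.g., background) points in either set."""
    d = np.linalg.norm(P[:, None, :] - Q[None, :, :], axis=2)  # (n, m)
    nn_pq = d.argmin(axis=1)   # nearest Q index for each P point
    nn_qp = d.argmin(axis=0)   # nearest P index for each Q point
    buddies = sum(1 for i, j in enumerate(nn_pq) if nn_qp[j] == i)
    return buddies / min(len(P), len(Q))
```

Because the count only rewards reciprocated matches, adding clutter points to one set leaves existing mutual pairs intact, whereas a one-directional nearest-neighbor score would be pulled down by them.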

 
