Trajectories as Topics: Multi-Object Tracking by Topic Discovery

Detailed Description

Bibliographic Details
Published in: IEEE Transactions on Image Processing: a publication of the IEEE Signal Processing Society. - 1992. - Vol. 28 (2019), No. 1, 22 Jan., pp. 240-252
Main Author: Luo, Wenhan (Author)
Other Authors: Stenger, Bjorn; Zhao, Xiaowei; Kim, Tae-Kyun
Format: Online Article
Language: English
Published: 2019
Parent Work: IEEE Transactions on Image Processing: a publication of the IEEE Signal Processing Society
Subjects: Journal Article
Description
Abstract: This paper proposes a new approach to multi-object tracking by semantic topic discovery. We dynamically cluster frame-by-frame detections and treat objects as topics, allowing the application of the Dirichlet process mixture model. The tracking problem is cast as a topic-discovery task, where the video sequence is treated analogously to a document. This formulation addresses tracking issues such as object exclusivity constraints as well as track management without the need for heuristic thresholds. Variation in object appearance is modeled as the dynamics of word co-occurrence and handled by updating the cluster parameters across the sequence in the dynamic clustering procedure. We develop two kinds of visual representation, based on super-pixels and the deformable part model, and integrate them into the automatic topic discovery model for tracking rigid and non-rigid objects, respectively. In experiments on public data sets, we demonstrate the effectiveness of the proposed algorithm.
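Note: the abstract describes clustering frame-by-frame detections with a Dirichlet process mixture model so that each discovered "topic" corresponds to one object trajectory. The record contains no code; the following is only a minimal sketch of that clustering idea, assuming scikit-learn's BayesianGaussianMixture with a Dirichlet-process prior as a stand-in for the paper's model and synthetic 2-D detection positions as features (the paper itself uses super-pixel and deformable-part-model representations).

# Minimal sketch (not the authors' implementation): group detection features
# from all frames with a Dirichlet-process mixture; each mixture component
# ("topic") then collects the detections of one candidate object trajectory.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)

# Hypothetical detections: rows are per-frame detections, columns are simple
# features (here just bounding-box centre x, y for two simulated objects).
detections = np.vstack([
    rng.normal(loc=[50.0, 100.0], scale=3.0, size=(40, 2)),   # object A over frames
    rng.normal(loc=[200.0, 80.0], scale=3.0, size=(40, 2)),   # object B over frames
])

# Dirichlet-process mixture: n_components is only a truncation level (upper
# bound); unused components receive negligible weight, so the number of
# objects does not have to be fixed in advance.
dpmm = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_process",
    weight_concentration_prior=0.1,
    random_state=0,
)
labels = dpmm.fit_predict(detections)

# Each label plays the role of a discovered topic, i.e. a candidate trajectory.
print("components actually used:", len(np.unique(labels)))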
Description: Date Completed 24.09.2018
Date Revised 24.09.2018
Published: Print-Electronic
Citation Status PubMed-not-MEDLINE
ISSN:1941-0042
DOI:10.1109/TIP.2018.2866955