Activity recognition using a mixture of vector fields

Detailed description

Bibliographic details
Published in: IEEE Transactions on Image Processing : a publication of the IEEE Signal Processing Society. - 1992. - Vol. 22 (2013), No. 5, 15 May, pages 1712-1725
First author: Nascimento, Jacinto C (author)
Other authors: Figueiredo, Mário A T; Marques, Jorge S
Format: Online article
Language: English
Published: 2013
Access to the parent work: IEEE Transactions on Image Processing : a publication of the IEEE Signal Processing Society
Subjects: Journal Article; Research Support, Non-U.S. Gov't
Description
Abstract: The analysis of moving objects in image sequences (video) has been one of the major themes in computer vision. In this paper, we focus on video-surveillance tasks; more specifically, we consider pedestrian trajectories and propose modeling them through a small set of motion/vector fields together with a space-varying switching mechanism. Despite the diversity of motion patterns that can occur in a given scene, we show that it is often possible to find a relatively small number of typical behaviors, and model each of these behaviors by a "simple" motion field. We increase the expressiveness of the formulation by allowing the trajectories to switch from one motion field to another, in a space-dependent manner. We present an expectation-maximization algorithm to learn all the parameters of the model, and apply it to trajectory classification tasks. Experiments with both synthetic and real data support the claims about the performance of the proposed approach.
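The generative idea in the abstract — a trajectory driven by one of a few motion fields, with a space-dependent chance of switching to another field — can be illustrated with a minimal sketch. The two fields, the switching rule, and all names below are illustrative assumptions, not the paper's learned parameterization or its EM procedure.

```python
import random

# Two toy motion fields (assumptions for illustration):
# field 0 drifts right, field 1 drifts upward.
def field_drift_right(x, y):
    return (1.0, 0.0)

def field_drift_up(x, y):
    return (0.0, 1.0)

FIELDS = [field_drift_right, field_drift_up]

def switch_prob(x, y):
    """Space-dependent probability of switching to the other field.
    Here switching is likely only near the line x = 5 (an assumption)."""
    return 0.9 if abs(x - 5.0) < 0.5 else 0.05

def simulate_trajectory(steps=20, noise=0.0, seed=0):
    """Sample one trajectory from the switching mixture of vector fields."""
    rng = random.Random(seed)
    x, y = 0.0, 0.0
    k = 0  # index of the currently active motion field
    traj = [(x, y)]
    for _ in range(steps):
        if rng.random() < switch_prob(x, y):
            k = 1 - k  # switch to the other field
        dx, dy = FIELDS[k](x, y)
        x += dx + rng.gauss(0.0, noise)
        y += dy + rng.gauss(0.0, noise)
        traj.append((x, y))
    return traj

traj = simulate_trajectory()
```

In the paper the fields and the switching probabilities are learned from observed trajectories via expectation-maximization; this sketch only shows the forward (generative) direction of such a model.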
Description: Date Completed 09.09.2013
Date Revised 20.03.2013
published: Print-Electronic
Citation Status PubMed-not-MEDLINE
ISSN:1941-0042
DOI:10.1109/TIP.2012.2226899