Model-Agnostic Temporal Regularizer for Object Localization Using Motion Fields


Detailed Description

Bibliographic Details
Published in: IEEE transactions on image processing : a publication of the IEEE Signal Processing Society. - 1992. - 31 (2022), dated: 08., pp. 2478-2487
First author: Santiago, Carlos (Author)
Other authors: Medley, Daniela O, Marques, Jorge S, Nascimento, Jacinto C
Format: Online article
Language: English
Published: 2022
Access to the parent work: IEEE transactions on image processing : a publication of the IEEE Signal Processing Society
Keywords: Journal Article
Description
Abstract: Video analysis often requires locating and tracking target objects. In some applications, the localization system has access to the full video, which allows fine-grained motion information to be estimated. This paper proposes capturing this information through motion fields and using it to improve the localization results. The learned motion fields act as a model-agnostic temporal regularizer that can be used with any localization system based on keypoints. Unlike optical flow-based strategies, our motion fields are estimated from the model domain, based on the trajectories described by the object keypoints. Therefore, they are not affected by poor imaging conditions. The benefits of the proposed strategy are shown on three applications: 1) segmentation of cardiac magnetic resonance images; 2) facial model alignment; and 3) vehicle tracking. In each case, combining popular localization methods with the proposed regularizer improves overall accuracy and reduces gross errors.
Description: Date Revised 21.03.2022
Published: Print-Electronic
Citation Status: PubMed-not-MEDLINE
ISSN: 1941-0042
DOI: 10.1109/TIP.2022.3155947
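To make the abstract's idea concrete, here is a minimal sketch of how motion fields estimated from keypoint trajectories (rather than image intensities, as in optical flow) could temporally regularize per-frame localizations. This is not the authors' implementation: the affine motion model, the blending weight, and the names estimate_motion_field and regularize_sequence are illustrative assumptions, standing in for any keypoint-based localizer's output.

```python
# Illustrative sketch only, not the method from the paper. It assumes an
# affine motion model fitted to keypoint trajectories in the model domain.
import numpy as np

def estimate_motion_field(prev_kpts, next_kpts):
    """Fit a simple affine motion model between consecutive keypoint sets.

    prev_kpts, next_kpts: (K, 2) arrays of keypoint coordinates.
    Returns a function mapping (K, 2) points to their predicted positions.
    """
    # Solve next ~= [prev, 1] @ A for A (3x2) in the least-squares sense.
    ones = np.ones((prev_kpts.shape[0], 1))
    X = np.hstack([prev_kpts, ones])  # (K, 3) design matrix
    A, *_ = np.linalg.lstsq(X, next_kpts, rcond=None)
    return lambda pts: np.hstack([pts, np.ones((pts.shape[0], 1))]) @ A

def regularize_sequence(raw_kpts, weight=0.5):
    """Blend each frame's raw detections with motion-field predictions.

    raw_kpts: (T, K, 2) keypoints produced by any per-frame localizer.
    weight:   trade-off between detector output and temporal coherence.
    """
    smoothed = raw_kpts.astype(float).copy()
    for t in range(1, raw_kpts.shape[0]):
        # Estimate the motion field from the keypoint trajectories alone;
        # no image data is used, so poor imaging conditions do not matter.
        flow = estimate_motion_field(smoothed[t - 1], raw_kpts[t])
        predicted = flow(smoothed[t - 1])
        smoothed[t] = weight * raw_kpts[t] + (1 - weight) * predicted
    return smoothed

# Toy usage: 10 frames of 5 keypoints translating with detection noise.
rng = np.random.default_rng(0)
base = rng.uniform(0.0, 10.0, size=(5, 2))
truth = np.stack([base + t * np.array([1.0, 0.5]) for t in range(10)])
noisy = truth + rng.normal(scale=0.5, size=truth.shape)
smoothed = regularize_sequence(noisy)
print("mean abs error, raw detections:", np.abs(noisy - truth).mean())
print("mean abs error, regularized:   ", np.abs(smoothed - truth).mean())
```

The sketch reflects the abstract's key design point: because the motion field is fitted to the keypoint trajectories themselves, the regularizer is agnostic to the underlying localization model and does not depend on image quality, unlike optical flow.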