Efficient Non-Consecutive Feature Tracking for Robust Structure-From-Motion

Detailed Description

Bibliographic Details
Published in: IEEE Transactions on Image Processing : a publication of the IEEE Signal Processing Society. - 1992. - Vol. 25 (2016), No. 12, dated 24 Dec., pages 5957-5970
First Author: Guofeng Zhang (Author)
Other Authors: Haomin Liu, Zilong Dong, Jiaya Jia, Tien-Tsin Wong, Hujun Bao
Format: Online Article
Language: English
Published: 2016
Access to Parent Work: IEEE Transactions on Image Processing : a publication of the IEEE Signal Processing Society
Subjects: Journal Article
Description
Summary: Structure-from-motion (SfM) largely relies on feature tracking. In image sequences, if disjointed tracks caused by objects moving in and out of the field of view, occasional occlusion, or image noise are not handled well, the corresponding SfM can be affected. This problem becomes more severe for large-scale scenes, which typically require capturing multiple sequences to cover the whole scene. In this paper, we propose an efficient non-consecutive feature tracking framework to match interrupted tracks distributed across different subsequences or even different videos. Our framework consists of steps for solving the feature "dropout" problem when indistinctive structures, noise, or large image distortion exist, and for rapidly recognizing and joining common features located in different subsequences. In addition, we contribute an effective segment-based coarse-to-fine SfM algorithm for robustly handling large data sets. Experimental results on challenging video data demonstrate the effectiveness of the proposed system.
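The core idea of joining interrupted tracks, as described in the summary, can be illustrated with a minimal sketch: tracks from different subsequences are merged whenever their feature descriptors are sufficiently similar. This is not the paper's actual algorithm; the function names, the greedy merging strategy, and the Hamming-distance threshold are all illustrative assumptions.

```python
# Hypothetical sketch of track joining: interrupted feature tracks from
# different subsequences are merged when their binary descriptors are close.
# Greedy matching and the threshold are illustrative, not from the paper.

def hamming(a: int, b: int) -> int:
    """Hamming distance between two binary descriptors packed as ints."""
    return bin(a ^ b).count("1")

def join_tracks(tracks, max_dist=2):
    """Greedily merge tracks whose representative descriptors are close.

    Each track is (descriptor, [(frame_index, (x, y)), ...]).
    """
    merged = []
    for desc, observations in tracks:
        for m in merged:
            if hamming(m[0], desc) <= max_dist:
                m[1].extend(observations)  # join the interrupted track
                break
        else:
            merged.append([desc, list(observations)])
    return merged

# Example: the first and third tracks observe the same scene feature
# (descriptors 0b1011 and 0b1010 differ by one bit), so they are joined.
tracks = [
    (0b1011, [(0, (10, 20)), (1, (11, 21))]),  # subsequence A
    (0b0100, [(5, (50, 60))]),                 # unrelated feature
    (0b1010, [(7, (12, 22))]),                 # subsequence B, same feature
]
print(len(join_tracks(tracks)))  # 2 merged tracks
```

A real system would match high-dimensional descriptors (e.g., with approximate nearest-neighbor search) and verify candidate joins geometrically before merging.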
Description: Date Revised 20.11.2019
published: Print-Electronic
Citation Status PubMed-not-MEDLINE
ISSN:1941-0042
DOI:10.1109/TIP.2016.2607425