Templateless Non-Rigid Reconstruction and Motion Tracking With a Single RGB-D Camera

Detailed Description

Bibliographic Details
Published in: IEEE transactions on image processing : a publication of the IEEE Signal Processing Society. - 1992. - 26(2017), 12, dated 17 Dec., pages 5966-5979
First author: Kangkan Wang (author)
Other authors: Guofeng Zhang, Shihong Xia
Format: Online article
Language: English
Published: 2017
Access to the parent work: IEEE transactions on image processing : a publication of the IEEE Signal Processing Society
Keywords: Journal Article
Description
Abstract: We present a novel templateless approach for nonrigid reconstruction and motion tracking using a single RGB-D camera. Without any template prior, our system achieves accurate reconstruction and tracking for considerably deformable objects. To robustly register the input sequence of partial depth scans with dynamic motion, we propose an efficient local-to-global hierarchical optimization framework inspired by the idea of traditional structure-from-motion. Our proposed framework mainly consists of two stages: local nonrigid bundle adjustment and global optimization. To eliminate error accumulation during the nonrigid registration of loop motion sequences, we split the full sequence into several segments and apply local nonrigid bundle adjustment to align each segment locally. Global optimization is then adopted to combine all segments and handle the drift problem through a loop-closure constraint. By fitting to the input partial data, a deforming 3D model sequence of dynamic objects is finally generated. Experiments on both synthetic and real test data sets, and comparisons with the state of the art, demonstrate that our approach can handle considerable motions robustly and efficiently, and reconstruct high-quality 3D model sequences without drift.
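The local-to-global structure the abstract outlines (splitting the sequence into segments, aligning each segment locally, then removing accumulated drift through a loop-closure constraint) can be sketched in simplified form. The function names and the 1-D pose model below are illustrative assumptions for exposition only, not the paper's implementation, which operates on nonrigid deformation fields:

```python
# Illustrative sketch of the local-to-global registration structure
# described in the abstract. Poses are modeled as 1-D translations
# purely for illustration; the paper's method uses nonrigid bundle
# adjustment over deformation parameters instead.

def split_into_segments(n_frames, seg_len):
    """Partition frame indices 0..n_frames-1 into consecutive segments,
    as the abstract's local stage does before per-segment alignment."""
    return [list(range(i, min(i + seg_len, n_frames)))
            for i in range(0, n_frames, seg_len)]

def close_loop(relative_motions):
    """Toy global-optimization step: for a closed loop, the relative
    motions must compose to identity (sum to zero in this 1-D model),
    so the accumulated residual is distributed evenly over all frames."""
    residual = sum(relative_motions)
    n = len(relative_motions)
    return [m - residual / n for m in relative_motions]

segments = split_into_segments(10, 4)    # three segments: 4 + 4 + 2 frames
drifting = [1.0, 1.1, 0.9, -2.6]         # loop residual before correction: 0.4
corrected = close_loop(drifting)         # residual after correction: 0.0
```

Distributing the residual uniformly is the simplest possible loop-closure scheme; the paper's global optimization instead solves jointly over all segments, but the role of the constraint, forcing the composed motion around the loop back to identity, is the same.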
Description: Date Completed 11.12.2018
Date Revised 11.12.2018
published: Print-Electronic
Citation Status PubMed-not-MEDLINE
ISSN:1941-0042
DOI:10.1109/TIP.2017.2740624