Local Deformable 3D Reconstruction with Cartan's Connections
Published in: IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 42, no. 12 (1 Dec. 2020), pp. 3011-3026
Author:
Other authors:
Format: Online article
Language: English
Published: 2020
Access to parent work: IEEE Transactions on Pattern Analysis and Machine Intelligence
Keywords: Journal Article
Abstract: 3D reconstruction of deformable objects using inter-image visual motion from monocular images has been studied under Shape-from-Template (SfT) and Non-Rigid Structure-from-Motion (NRSfM). Most methods have been developed for simple deformation models, primarily isometry. They may treat a surface as a discrete set of points and draw constraints from the points only, or they may use a non-parametric representation and use both points and differentials to express constraints. We propose a differential framework based on Cartan's theory of connections and moving frames. It is applicable to SfT and NRSfM, and to deformation models other than isometry. It utilises infinitesimal-level assumptions on the surface's geometry and mappings. It has the following properties. 1) It allows one to derive existing solutions in a simpler way. 2) It models SfT and NRSfM in a unified way. 3) It allows us to introduce a new skewless deformation model and solve SfT and NRSfM for it. 4) It facilitates a generic solution to SfT which does not require deformation modelling. Our framework is complete: it solves deformable 3D reconstruction for a whole class of algebraic deformation models, including isometry. We compared our solutions with the state-of-the-art methods and show that ours outperform them in both accuracy and computation time.
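For orientation, the following is the standard first-order (infinitesimal) statement of the isometric constraint the abstract refers to, together with one plausible reading of the skewless model; it is illustrative background, not notation quoted from the paper. Writing $\psi$ for the deformation between two surfaces and $J_\psi$ for its Jacobian in local surface coordinates, the pulled-back metric is $J_\psi^{\top} J_\psi$, and:

\[
J_\psi^{\top} J_\psi = I_2 \quad\text{(isometry: lengths and angles preserved)},
\qquad
\bigl(J_\psi^{\top} J_\psi\bigr)_{12} = 0 \quad\text{(skewless: tangent directions stay orthogonal, stretches free)}.
\]

On this reading, isometry is the special case of the skewless model in which the diagonal stretches are additionally pinned to one, consistent with the abstract's claim that the framework covers a whole class of algebraic deformation models including isometry.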
Description: Date Revised: 4 Nov. 2020; Published: Print-Electronic; Citation Status: PubMed-not-MEDLINE
ISSN: 1939-3539
DOI: 10.1109/TPAMI.2019.2920821