Articulated Motion-Aware NeRF for 3D Dynamic Appearance and Geometry Reconstruction by Implicit Motion States


Detailed Description

Bibliographic Details
Published in: IEEE Transactions on Visualization and Computer Graphics. - 1996. - PP(2024), 14 May
First Author: Shi, Yahao (author)
Other Authors: Tao, Ye; Yang, Mingjia; Liu, Yun; Yi, Li; Zhou, Bin
Format: Online Article
Language: English
Published: 2024
Access to Parent Work: IEEE Transactions on Visualization and Computer Graphics
Subjects: Journal Article
Description
Abstract: We propose a self-supervised approach for 3D dynamic reconstruction of articulated motions based on Generative Adversarial Networks and Neural Radiance Fields. Our method reconstructs articulated objects and recovers their continuous motions and attributes from an unordered, discontinuous image set. Notably, we treat motion states as time-independent, recognizing that articulated objects can exhibit identical motions at different times. The key insight of our approach is to utilize generative adversarial networks to create a continuous implicit motion state space. Initially, we employ a motion network to extract discrete motion states from images as anchors. These anchors are then expanded across the latent space using generative adversarial networks. Subsequently, motion state latent codes are input into motion-aware neural radiance fields for dynamic appearance and geometry reconstruction. To deduce motion attributes from the continuously generated motions, we adopt a cluster-based strategy. We thoroughly evaluate and validate our method on both synthesized and real data, demonstrating superior fidelity in appearances, geometries, and motion attributes of articulated objects compared to state-of-the-art methods.
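To make the conditioning step concrete, the sketch below shows one common way a radiance field can be made "motion-aware": the field is queried with a 3D position together with a motion-state latent code z, so that different codes yield different densities and colors for the same point. This is a minimal illustrative toy, not the authors' implementation; the layer sizes, the concatenation scheme, and the random stand-in weights are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_layer(dim_in, dim_out):
    # Random weights stand in for trained network parameters.
    return rng.normal(0.0, 0.1, (dim_in, dim_out)), np.zeros(dim_out)

class MotionAwareField:
    """Toy radiance field conditioned on a motion-state latent code z.

    Hypothetical sketch: a real motion-aware NeRF would use positional
    encoding, view directions, and many more layers; here a tiny MLP
    maps (position, latent code) to a density and an RGB color.
    """
    def __init__(self, pos_dim=3, latent_dim=8, hidden=32):
        self.w1, self.b1 = mlp_layer(pos_dim + latent_dim, hidden)
        self.w2, self.b2 = mlp_layer(hidden, 4)  # 1 density + 3 RGB channels

    def query(self, x, z):
        # Concatenating the latent code with the position is what makes
        # the field depend on the motion state.
        h = np.maximum(np.concatenate([x, z]) @ self.w1 + self.b1, 0.0)  # ReLU
        out = h @ self.w2 + self.b2
        sigma = np.log1p(np.exp(out[0]))       # softplus keeps density >= 0
        rgb = 1.0 / (1.0 + np.exp(-out[1:]))   # sigmoid keeps color in (0, 1)
        return sigma, rgb

field = MotionAwareField()
sigma, rgb = field.query(np.zeros(3), np.zeros(8))
```

In the paper's pipeline, the codes z fed into such a field would come from the GAN-expanded motion state space rather than being sampled arbitrarily, so interpolating between codes corresponds to continuous articulated motion.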
Description: Date Revised 23 May 2024
published: Print-Electronic
Citation Status Publisher
ISSN:1941-0506
DOI:10.1109/TVCG.2024.3400830