Textureless Deformable Object Tracking with Invisible Markers
Published in: | IEEE Transactions on Pattern Analysis and Machine Intelligence (1979-), vol. PP (2024), 18 Sept. |
Format: | Online article |
Language: | English |
Published: | 2024 |
Parent work: | IEEE Transactions on Pattern Analysis and Machine Intelligence |
Keywords: | Journal Article |
Abstract: | Tracking and reconstructing deformable objects with little texture is challenging due to the lack of features. Here we introduce "invisible markers" for accurate and robust correspondence matching and tracking. Our markers are visible only under ultraviolet (UV) light. We build a novel imaging system for capturing videos of deformed objects under their original untouched appearance (which may have little texture) and, simultaneously, with our markers. We develop an algorithm that first establishes accurate correspondences using video frames with markers, and then transfers them to the untouched views as ground-truth labels. In this way, we are able to generate high-quality labeled data for training learning-based algorithms. We contribute a large real-world dataset, DOT, for tracking deformable objects with little or no texture. Our dataset has about one million video frames of various types of deformable objects. We provide ground-truth tracked correspondences in both 2D and 3D. We benchmark state-of-the-art methods on optical flow and deformable object reconstruction using our dataset, which poses great challenges. By training on DOT, their performance significantly improves, not only on our dataset, but also on other unseen data. (A schematic sketch of this correspondence-transfer step follows the record below.) |
Description: | Date Revised: 18.09.2024; Published: Print-Electronic; Citation Status: Publisher |
ISSN: | 1939-3539 |
DOI: | 10.1109/TPAMI.2024.3463422 |
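
The abstract describes a two-stage labeling pipeline: correspondences are first tracked on the UV-lit frames where the markers are visible, then transferred to the simultaneously captured untouched frames as ground-truth labels. The sketch below illustrates that idea in Python with OpenCV. It is a minimal illustration under stated assumptions, not the authors' implementation: it assumes each UV-lit frame is pixel-aligned with its untouched twin, and uses pyramidal Lucas-Kanade flow as a stand-in for the paper's matching algorithm; all function names are hypothetical.

```python
# Minimal sketch of the correspondence-transfer labeling described above.
# Assumptions (not from the paper): each UV-lit frame is pixel-aligned with
# its untouched twin, and Lucas-Kanade tracking stands in for the paper's
# matching algorithm. All names here are hypothetical.
import cv2
import numpy as np

def track_marker_points(uv_frames):
    """Track marker features across the UV-lit frames; returns (T, M, 2)."""
    prev = cv2.cvtColor(uv_frames[0], cv2.COLOR_BGR2GRAY)
    pts = cv2.goodFeaturesToTrack(prev, maxCorners=500,
                                  qualityLevel=0.01, minDistance=7)
    tracks = [pts.reshape(-1, 2)]
    alive = np.ones(len(pts), dtype=bool)
    for frame in uv_frames[1:]:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev, gray, pts, None)
        alive &= status.ravel() == 1       # flag points the tracker lost
        tracks.append(nxt.reshape(-1, 2))
        pts, prev = nxt, gray
    return np.stack(tracks)[:, alive]      # keep only reliably tracked points

def labeled_pairs(uv_frames, plain_frames):
    """Attach UV-tracked correspondences to the untouched frames as labels."""
    tracks = track_marker_points(uv_frames)
    return [(plain, tracks[t]) for t, plain in enumerate(plain_frames)]
```

The 3D ground truth mentioned in the abstract would require an additional reconstruction step on top of this 2D pairing, for example multi-view triangulation of the tracked points, which is beyond the scope of this sketch.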