Real-Time View Planning for Unstructured Lumigraph Modeling

Bibliographic Details
Published in: IEEE Transactions on Visualization and Computer Graphics, vol. 25 (2019), no. 11, 12 Nov., pp. 3063-3072
First author: Erat, Okan (author)
Other authors: Hoell, Markus; Haubenwallner, Karl; Pirchheim, Christian; Schmalstieg, Dieter
Format: Online article
Language: English
Published: 2019
Parent work: IEEE Transactions on Visualization and Computer Graphics
Subjects: Journal Article
Description
Summary: We propose an algorithm for generating an unstructured lumigraph in real time from an image stream. This problem has important applications in mixed reality, such as telepresence, interior design, or as-built documentation. Unlike conventional texture optimization in structure from motion, our method must choose views from the input stream in a strictly incremental manner, since only a small number of views can be stored or transmitted. This requires formulating an online variant of the well-known view-planning problem, which must take into account what parts of the scene have already been seen and how the lumigraph sample distribution could improve in the future. We address this highly unconstrained problem by regularizing the scene structure using a regular grid. Upon the grid structure, we define a coverage metric describing how well the lumigraph samples cover the grid in terms of spatial and angular resolution, and we greedily keep incoming views if they improve the coverage. We evaluate the performance of our algorithm quantitatively and qualitatively on a variety of synthetic and real scenes, and demonstrate visually appealing results obtained at real-time frame rates (in the range of 3 Hz to 100 Hz per incoming image, depending on configuration).
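The summary describes an online, greedy selection scheme: the scene is regularized with a regular grid, a coverage metric scores how well the kept views sample each cell spatially and angularly, and an incoming view is stored only if it improves that coverage. The following Python sketch illustrates this idea under assumptions of our own (a voxel counter grid, one discrete viewing-direction bin per view, and an arbitrary weighting of spatial versus angular gain); the names CoverageGrid, select_views, and all parameters are hypothetical and are not taken from the paper.

```python
# Hypothetical sketch of greedy online view selection over a regular grid.
# The coverage score and its weighting are illustrative assumptions, not
# the authors' implementation.
import numpy as np

class CoverageGrid:
    def __init__(self, resolution=16, angular_bins=8):
        # Per-cell counters: how many kept views observe a cell (spatial),
        # and which viewing-direction bins have been covered (angular).
        self.spatial = np.zeros((resolution,) * 3, dtype=np.int32)
        self.angular = np.zeros((resolution,) * 3 + (angular_bins,), dtype=bool)

    def score_gain(self, cells, dir_bin):
        # Coverage gain of a candidate view: newly observed cells plus
        # newly covered viewing-direction bins (weighting is an assumption).
        new_spatial = sum(1 for c in cells if self.spatial[c] == 0)
        new_angular = sum(1 for c in cells if not self.angular[c + (dir_bin,)])
        return new_spatial + 0.5 * new_angular

    def commit(self, cells, dir_bin):
        # Record that the kept view covers these cells from this direction bin.
        for c in cells:
            self.spatial[c] += 1
            self.angular[c + (dir_bin,)] = True


def select_views(view_stream, visible_cells_fn, dir_bin_fn, gain_threshold=2.0):
    """Greedily keep an incoming view only if it improves grid coverage.

    view_stream      -- iterable of (image, camera_pose) pairs
    visible_cells_fn -- maps a camera pose to the grid cells (i, j, k) it observes
    dir_bin_fn       -- maps a camera pose to a discrete viewing-direction bin
    """
    grid = CoverageGrid()
    kept = []
    for image, pose in view_stream:
        cells = visible_cells_fn(pose)
        dir_bin = dir_bin_fn(pose)
        if grid.score_gain(cells, dir_bin) >= gain_threshold:
            grid.commit(cells, dir_bin)
            kept.append((image, pose))  # store or transmit this view
    return kept
```

In a real pipeline, visible_cells_fn would rasterize the view frustum against the grid and dir_bin_fn would quantize the camera's viewing direction; here both are left as caller-supplied callables to keep the sketch self-contained.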
Description: Date Revised 04.03.2020
published: Print-Electronic
Citation Status PubMed-not-MEDLINE
ISSN: 1941-0506
DOI: 10.1109/TVCG.2019.2932237