Visual-Preserving Mesh Repair


Detailed Description

Bibliographic Details
Published in: IEEE Transactions on Visualization and Computer Graphics. - 1996. - Vol. 30 (2024), No. 9, 15 Aug., pages 6586-6597
First author: Zheng, Zhongtian (Author)
Other authors: Gao, Xifeng, Pan, Zherong, Li, Wei, Wang, Peng-Shuai, Wang, Guoping, Wu, Kui
Format: Online article
Language: English
Published: 2024
Access to parent work: IEEE Transactions on Visualization and Computer Graphics
Subjects: Journal Article
Description
Abstract: Mesh repair is a long-standing challenge in computer graphics and related fields. Converting defective meshes into watertight manifold meshes can greatly benefit downstream applications such as geometric processing, simulation, fabrication, learning, and synthesis. In this work, by assuming the model is visually correct, we first introduce three visual measures for visibility, orientation, and openness, based on ray tracing. We then present a novel mesh repair framework incorporating these visual measures with several critical steps, i.e., open surface closing, face reorientation, and global optimization, to effectively repair meshes with defects (e.g., gaps, holes, self-intersections, degenerate elements, and inconsistent orientations) while preserving visual appearance. Our method reduces unnecessary mesh complexity without compromising geometric accuracy or visual quality, and preserves input attributes such as UV coordinates for rendering. We evaluate our approach on hundreds of models randomly selected from ShapeNet and Thingi10K, demonstrating its effectiveness and robustness compared to existing approaches.
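
Note: The abstract mentions ray-tracing-based visual measures (visibility, orientation, openness) but the record gives no formulas. The Python sketch below is a rough, assumption-laden illustration of one such idea only: a Monte-Carlo per-face visibility score obtained by casting rays from viewpoints on a bounding sphere and counting nearest hits. The function names (ray_hits_first, face_visibility), the sampling scheme, and all parameters are hypothetical and are not taken from the paper.

    # Illustrative sketch only -- not the paper's formulation.
    import numpy as np

    def ray_hits_first(origin, direction, tri):
        """Moeller-Trumbore intersection of one ray against all triangles
        tri (F, 3, 3); returns the index of the nearest hit face, or -1."""
        v0, v1, v2 = tri[:, 0], tri[:, 1], tri[:, 2]
        e1, e2 = v1 - v0, v2 - v0
        p = np.cross(direction, e2)
        det = np.einsum('ij,ij->i', e1, p)
        ok = np.abs(det) > 1e-12
        inv = np.zeros_like(det)
        inv[ok] = 1.0 / det[ok]
        s = origin - v0
        u = np.einsum('ij,ij->i', s, p) * inv
        q = np.cross(s, e1)
        v = (q @ direction) * inv
        t = np.einsum('ij,ij->i', q, e2) * inv
        hit = ok & (u >= 0) & (v >= 0) & (u + v <= 1) & (t > 1e-9)
        if not hit.any():
            return -1
        cand = np.flatnonzero(hit)
        return cand[np.argmin(t[cand])]

    def face_visibility(vertices, faces, n_views=256, seed=0):
        """Fraction of sampled view rays for which each face is the nearest
        hit. Viewpoints lie on a bounding sphere; each ray is aimed at a
        randomly chosen face center."""
        rng = np.random.default_rng(seed)
        tri = vertices[faces]              # (F, 3, 3) triangle corners
        centers = tri.mean(axis=1)         # (F, 3) face centers
        centroid = vertices.mean(axis=0)
        radius = 2.0 * np.linalg.norm(vertices - centroid, axis=1).max()
        counts = np.zeros(len(faces))
        for _ in range(n_views):
            d = rng.normal(size=3)
            d /= np.linalg.norm(d)
            eye = centroid + radius * d    # viewpoint on the sphere
            target = centers[rng.integers(len(faces))]
            ray = target - eye
            ray /= np.linalg.norm(ray)
            hit = ray_hits_first(eye, ray, tri)
            if hit >= 0:
                counts[hit] += 1.0
        return counts / n_views

    # Usage on a hypothetical unit tetrahedron:
    V = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
    F = np.array([[0, 1, 2], [0, 1, 3], [0, 2, 3], [1, 2, 3]])
    print(face_visibility(V, F))

Faces that are never the nearest hit from any sampled viewpoint receive a score of zero, which is the kind of signal such a visibility measure could feed into later repair decisions.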
Description: Date Revised 01.08.2024
Published: Print-Electronic
Citation Status: PubMed-not-MEDLINE
ISSN: 1941-0506
DOI: 10.1109/TVCG.2023.3348829