Semi-Supervised 3D Shape Segmentation via Self Refining


Full Description

Bibliographic Details
Published in: IEEE Transactions on Image Processing : a publication of the IEEE Signal Processing Society. - 1992. - Vol. 33 (2024), of the 18th, pages 2044-2057
First Author: Shu, Zhenyu (Author)
Other Authors: Wu, Teng, Shen, Jiajun, Xin, Shiqing, Liu, Ligang
Format: Online article
Language: English
Published: 2024
Access to the parent work: IEEE Transactions on Image Processing : a publication of the IEEE Signal Processing Society
Subjects: Journal Article
Description
Abstract: 3D shape segmentation is a fundamental and crucial task in the field of image processing and 3D shape analysis. To segment 3D shapes using data-driven methods, a fully labeled dataset is usually required. However, obtaining such a dataset can be daunting, as manual face-level labeling is both time-consuming and labor-intensive. In this paper, we present a semi-supervised framework for 3D shape segmentation that uses a small, fully labeled set of 3D shapes together with a weakly labeled set of 3D shapes carrying only sparse scribble labels. Our framework first employs an auxiliary network to generate initial fully labeled segmentations for the sparsely labeled dataset, which are then used to train the primary network. During training, a self-refine module uses the increasingly accurate predictions of the primary network to improve the labels generated by the auxiliary network. Extensive benchmark tests demonstrate that our method achieves better segmentation performance than previous semi-supervised methods while performing comparably to supervised methods.
Description: Date Revised 18.03.2024
Published: Print-Electronic
Citation Status: PubMed-not-MEDLINE
ISSN: 1941-0042
DOI: 10.1109/TIP.2024.3374200
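
The self-refining scheme summarized in the abstract lends itself to a short illustration. Below is a minimal, hypothetical PyTorch-style sketch of one training step: the primary network is supervised by both the fully labeled shapes and the auxiliary network's pseudo-labels, and its confident predictions are then fed back to refine those pseudo-labels. All names (refine_pseudo_labels, train_step, primary_net) and the confidence-threshold heuristic are illustrative assumptions, not the authors' actual implementation.

import torch
import torch.nn.functional as F

def refine_pseudo_labels(pseudo_labels: torch.Tensor,
                         logits: torch.Tensor,
                         threshold: float = 0.9) -> torch.Tensor:
    # Self-refine step (assumed heuristic): keep each face's pseudo-label
    # from the auxiliary network unless the primary network predicts a
    # class with high confidence, in which case adopt that prediction.
    probs = logits.softmax(dim=-1)       # (num_faces, num_classes)
    conf, preds = probs.max(dim=-1)      # per-face confidence and class
    return torch.where(conf >= threshold, preds, pseudo_labels)

def train_step(primary_net, optimizer,
               feats_labeled, labels,        # one fully labeled shape
               feats_weak, pseudo_labels):   # one scribble-labeled shape
    # Train the primary network on ground-truth labels and on the
    # auxiliary network's (progressively refined) pseudo-labels.
    loss = (F.cross_entropy(primary_net(feats_labeled), labels)
            + F.cross_entropy(primary_net(feats_weak), pseudo_labels))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    # Feed the primary network's improving predictions back into the
    # pseudo-labels, so later epochs train on cleaner supervision.
    with torch.no_grad():
        return refine_pseudo_labels(pseudo_labels, primary_net(feats_weak))

In practice one would iterate train_step over all shapes each epoch, with the auxiliary network providing the initial pseudo_labels before training begins; the returned tensor replaces the stored pseudo-labels for the next epoch.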