Point-NAS : A Novel Neural Architecture Search Framework for Point Cloud Analysis
Published in: IEEE transactions on image processing : a publication of the IEEE Signal Processing Society, Vol. 32 (2023), pp. 6526-6542
Author:
Other Authors:
Format: Online Article
Language: English
Published: 2023
Parent Work: IEEE transactions on image processing : a publication of the IEEE Signal Processing Society
Keywords: Journal Article
Abstract: Recently, point-based networks have exhibited extraordinary potential for 3D point cloud processing. However, owing to the meticulous design of both parameters and hyperparameters inside the network, constructing a promising network for each point cloud task can be an expensive endeavor. In this work, we develop a novel one-shot search framework called Point-NAS to automatically determine optimum architectures for various point cloud tasks. Specifically, we design an elastic feature extraction (EFE) module that serves as a basic unit for architecture search, which expands seamlessly along both the width and depth of the network for efficient feature extraction. Based on the EFE module, we devise a search space, which is encoded into a supernet to provide a wide range of latent network structures for a particular point cloud task. To fully optimize the weights of the supernet, we propose a weight coupling sandwich rule that samples the largest, smallest, and multiple medium models at each iteration and fuses their gradients to update the supernet. Furthermore, we present a united gradient adjustment algorithm that mitigates gradient conflict induced by distinct gradient directions of the sampled models and supernet, thus expediting the convergence of the supernet and assuring that it can be comprehensively trained. Pursuant to the provided techniques, the trained supernet enables a multitude of subnets to be incredibly well-optimized. Finally, we conduct an evolutionary search over the supernet under resource constraints to find promising architectures for different tasks. Experimentally, the searched Point-NAS with weights inherited from the supernet realizes outstanding results across a variety of benchmarks, e.g., 94.2% and 88.9% overall accuracy on ModelNet40 and ScanObjectNN, 68.6% mIoU on S3DIS, and 63.6% and 69.3% mAP@0.25 on the SUN RGB-D and ScanNet V2 datasets.
Description: Date Revised 04.12.2023; Published: Print-Electronic; Citation Status: PubMed-not-MEDLINE
ISSN: 1941-0042
DOI: 10.1109/TIP.2023.3331223
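The weight coupling sandwich rule described in the abstract can be illustrated with a minimal sketch: at each iteration the largest, smallest, and a few randomly sampled medium-width subnets are drawn from shared supernet weights, their gradients are accumulated, and the shared weights receive a single fused update. The one-layer elastic model, the toy MSE objective, and all names below are illustrative assumptions, not the authors' EFE-based implementation.

```python
import numpy as np

# Toy sketch of the "weight coupling sandwich rule": per iteration, sample
# the largest, smallest, and a few medium-width subnets, accumulate their
# gradients, and apply one fused update to the shared supernet weights.
# The single elastic layer and MSE objective are illustrative assumptions.

rng = np.random.default_rng(0)
IN, MAX_OUT = 4, 8                              # input dim, maximum layer width
W = rng.normal(scale=0.1, size=(MAX_OUT, IN))   # shared supernet weights

def forward(x, width):
    """A width-`width` subnet reuses the first `width` rows of W (weight coupling)."""
    return x @ W[:width].T                      # (batch, width)

def grad_for_width(x, y, width):
    """Gradient of MSE(channel-mean prediction, y) w.r.t. the shared W."""
    pred = forward(x, width).mean(axis=1)       # (batch,)
    err = pred - y
    row = (2.0 / (len(y) * width)) * (err @ x)  # (IN,), identical for each active row
    g = np.zeros_like(W)
    g[:width] = row                             # only the active rows receive gradient
    return g

def sandwich_step(x, y, widths, lr=0.1, n_medium=2):
    """One supernet update fusing the largest, smallest, and medium subnets."""
    global W
    lo, hi = min(widths), max(widths)
    sampled = [hi, lo] + [int(w) for w in rng.integers(lo, hi + 1, n_medium)]
    fused = sum(grad_for_width(x, y, w) for w in sampled)   # gradient fusion
    W -= lr * fused
```

In the paper this fused update is further combined with a united gradient adjustment step that resolves conflicts between subnet gradient directions; plain accumulation stands in for that step here.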