Parameter-Efficient Fine-Tuning in Spectral Domain for Point Cloud Learning

Recently, leveraging pre-training techniques to enhance point cloud models has become a prominent research topic. However, existing approaches typically require full fine-tuning of pre-trained models to achieve satisfactory performance on downstream tasks, which is storage-intensive and computationally demanding.
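The abstract's core mechanism, projecting frozen point tokens into an orthogonal spectral basis derived from the point cloud's own graph, then tuning only a lightweight adapter there, can be illustrated with a minimal sketch. This is not the paper's actual PCSA; the Gaussian-kernel graph, the combinatorial Laplacian, and the low-rank down/up projection are assumptions chosen for illustration.

```python
import numpy as np

def graph_spectral_adapter(points, tokens, rank=4, seed=0):
    """Illustrative sketch (hypothetical, not the paper's exact PCSA):
    build a graph over the point cloud, use the Laplacian eigenbasis as an
    orthogonal spectral basis, and apply a small trainable map in that basis."""
    # Adjacency from pairwise squared distances via a Gaussian kernel
    # (a common, assumed choice; the paper may construct the graph differently).
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (d2.mean() + 1e-8))
    np.fill_diagonal(w, 0.0)
    lap = np.diag(w.sum(1)) - w            # combinatorial graph Laplacian
    _, u = np.linalg.eigh(lap)             # columns of u: orthogonal spectral basis
    spec = u.T @ tokens                    # spatial -> spectral domain (de-correlates tokens)
    rng = np.random.default_rng(seed)
    a = rng.normal(scale=0.01, size=(tokens.shape[1], rank))  # trainable down-projection
    b = np.zeros((rank, tokens.shape[1])) # up-projection, zero-init as in adapter methods
    delta = spec @ a @ b                   # lightweight low-rank update in spectral domain
    return tokens + u @ delta              # back to spatial domain, residual add
```

With the up-projection zero-initialized, the adapter starts as an identity map, so fine-tuning begins from the frozen model's behavior; only `a` and `b` (a tiny fraction of the backbone's parameters) would be trained.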

Detailed Description

Bibliographic Details
Published in: IEEE transactions on pattern analysis and machine intelligence. - 1979. - PP(2025), 01 Aug.
Lead Author: Liang, Dingkang (Author)
Other Authors: Feng, Tianrui, Zhou, Xin, Zhang, Yumeng, Zou, Zhikang, Bai, Xiang
Format: Online article
Language: English
Published: 2025
Access to parent work: IEEE transactions on pattern analysis and machine intelligence
Subjects: Journal Article
LEADER 01000naa a22002652c 4500
001 NLM390493112
003 DE-627
005 20250802232612.0
007 cr uuu---uuuuu
008 250802s2025 xx |||||o 00| ||eng c
024 7 |a 10.1109/TPAMI.2025.3594749  |2 doi 
028 5 2 |a pubmed25n1518.xml 
035 |a (DE-627)NLM390493112 
035 |a (NLM)40748789 
040 |a DE-627  |b ger  |c DE-627  |e rakwb 
041 |a eng 
100 1 |a Liang, Dingkang  |e verfasserin  |4 aut 
245 1 0 |a Parameter-Efficient Fine-Tuning in Spectral Domain for Point Cloud Learning 
264 1 |c 2025 
336 |a Text  |b txt  |2 rdacontent 
337 |a Computermedien  |b c  |2 rdamedia 
338 |a Online-Ressource  |b cr  |2 rdacarrier 
500 |a Date Revised 01.08.2025 
500 |a published: Print-Electronic 
500 |a Citation Status Publisher 
520 |a Recently, leveraging pre-training techniques to enhance point cloud models has become a prominent research topic. However, existing approaches typically require full fine-tuning of pre-trained models to achieve satisfactory performance on downstream tasks, which is storage-intensive and computationally demanding. To address this issue, we propose a novel Parameter-Efficient Fine-Tuning (PEFT) method for point clouds, called PointGST (Point cloud Graph Spectral Tuning). PointGST freezes the pre-trained model and introduces a lightweight, trainable Point Cloud Spectral Adapter (PCSA) for fine-tuning parameters in the spectral domain. The core idea is built on two observations: 1) The inner tokens from frozen models might present confusion in the spatial domain; 2) Task-specific intrinsic information is important for transferring the general knowledge to the downstream task. Specifically, PointGST transfers the point tokens from the spatial domain to the spectral domain, effectively de-correlating confusion among tokens by using orthogonal components for separation. Moreover, the generated spectral basis involves intrinsic information about the downstream point clouds, enabling more targeted tuning. As a result, PointGST facilitates the efficient transfer of general knowledge to downstream tasks while significantly reducing training costs. Extensive experiments on challenging point cloud datasets across various tasks demonstrate that PointGST not only outperforms its full fine-tuning counterpart but also significantly reduces trainable parameters, making it a promising solution for efficient point cloud learning. Moreover, it achieves superior accuracies of 99.48%, 97.76%, and 96.18% on the ScanObjectNN OBJ_BG, OBJ_ONLY, and PB_T50_RS datasets, respectively, establishing a new state-of-the-art, while using only 0.67% of the trainable parameters. 
650 4 |a Journal Article 
700 1 |a Feng, Tianrui  |e verfasserin  |4 aut 
700 1 |a Zhou, Xin  |e verfasserin  |4 aut 
700 1 |a Zhang, Yumeng  |e verfasserin  |4 aut 
700 1 |a Zou, Zhikang  |e verfasserin  |4 aut 
700 1 |a Bai, Xiang  |e verfasserin  |4 aut 
773 0 8 |i Enthalten in  |t IEEE transactions on pattern analysis and machine intelligence  |d 1979  |g PP(2025) vom: 01. Aug.  |w (DE-627)NLM098212257  |x 1939-3539  |7 nnas 
773 1 8 |g volume:PP  |g year:2025  |g day:01  |g month:08 
856 4 0 |u http://dx.doi.org/10.1109/TPAMI.2025.3594749  |3 Volltext 
912 |a GBV_USEFLAG_A 
912 |a SYSFLAG_A 
912 |a GBV_NLM 
912 |a GBV_ILN_350 
951 |a AR 
952 |d PP  |j 2025  |b 01  |c 08