Are Graph Convolutional Networks With Random Weights Feasible?

Bibliographic Details
Published in: IEEE transactions on pattern analysis and machine intelligence. - 1979. - 45(2023), 3, 14 March, pages 2751-2768
Main Author: Huang, Changqin (Author)
Other Authors: Li, Ming, Cao, Feilong, Fujita, Hamido, Li, Zhao, Wu, Xindong
Format: Online Article
Language: English
Published: 2023
Access to parent work: IEEE transactions on pattern analysis and machine intelligence
Subjects: Journal Article
LEADER 01000naa a22002652 4500
001 NLM342259539
003 DE-627
005 20231226013707.0
007 cr uuu---uuuuu
008 231226s2023 xx |||||o 00| ||eng c
024 7 |a 10.1109/TPAMI.2022.3183143  |2 doi 
028 5 2 |a pubmed24n1140.xml 
035 |a (DE-627)NLM342259539 
035 |a (NLM)35704541 
040 |a DE-627  |b ger  |c DE-627  |e rakwb 
041 |a eng 
100 1 |a Huang, Changqin  |e verfasserin  |4 aut 
245 1 0 |a Are Graph Convolutional Networks With Random Weights Feasible? 
264 1 |c 2023 
336 |a Text  |b txt  |2 rdacontent 
337 |a Computermedien  |b c  |2 rdamedia 
338 |a Online-Ressource  |b cr  |2 rdacarrier 
500 |a Date Completed 07.04.2023 
500 |a Date Revised 11.04.2023 
500 |a published: Print-Electronic 
500 |a Citation Status PubMed-not-MEDLINE 
520 |a Graph Convolutional Networks (GCNs), as a prominent example of graph neural networks, are receiving extensive attention for their powerful capability in learning node representations on graphs. Various extensions, in sampling and/or node feature aggregation, further improve GCNs' performance, scalability, and applicability across domains. Still, there is room for improvement in learning efficiency, because performing batch gradient descent over the full dataset at every training iteration, as is unavoidable when training (vanilla) GCNs, is not a viable option for large graphs. The potential of random features for speeding up the training phase in large-scale problems motivates us to consider carefully whether GCNs with random weights are feasible. To investigate this issue theoretically and empirically, we propose a novel model termed Graph Convolutional Networks with Random Weights (GCN-RW), which revises the convolutional layer with random filters and simultaneously adjusts the learning objective with a regularized least squares loss. Theoretical analyses of the model's approximation upper bound, structure complexity, stability, and generalization are provided with rigorous mathematical proofs. The effectiveness and efficiency of GCN-RW are verified on semi-supervised node classification tasks with several benchmark datasets. Experimental results demonstrate that, in comparison with some state-of-the-art approaches, GCN-RW can achieve better or comparable accuracies with less training time. 
650 4 |a Journal Article 
700 1 |a Li, Ming  |e verfasserin  |4 aut 
700 1 |a Cao, Feilong  |e verfasserin  |4 aut 
700 1 |a Fujita, Hamido  |e verfasserin  |4 aut 
700 1 |a Li, Zhao  |e verfasserin  |4 aut 
700 1 |a Wu, Xindong  |e verfasserin  |4 aut 
773 0 8 |i Enthalten in  |t IEEE transactions on pattern analysis and machine intelligence  |d 1979  |g 45(2023), 3 vom: 14. März, Seite 2751-2768  |w (DE-627)NLM098212257  |x 1939-3539  |7 nnns 
773 1 8 |g volume:45  |g year:2023  |g number:3  |g day:14  |g month:03  |g pages:2751-2768 
856 4 0 |u http://dx.doi.org/10.1109/TPAMI.2022.3183143  |3 Volltext 
912 |a GBV_USEFLAG_A 
912 |a SYSFLAG_A 
912 |a GBV_NLM 
912 |a GBV_ILN_350 
951 |a AR 
952 |d 45  |j 2023  |e 3  |b 14  |c 03  |h 2751-2768
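
The abstract in the 520 field above describes the GCN-RW construction only at a high level: graph convolutional layers with fixed random filters, plus an output layer trained with a regularized least squares loss. The following is a minimal NumPy sketch of that general idea, not the paper's implementation; the function names, layer sizes, ReLU activation, ridge parameter, and toy graph are all illustrative assumptions.

# Sketch: random-weight graph convolutions + regularized least-squares readout.
# All identifiers and hyperparameters are illustrative, not taken from the paper.
import numpy as np

def normalize_adjacency(A):
    """Symmetrically normalize A with self-loops: D^-1/2 (A + I) D^-1/2."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def gcn_rw_embed(A, X, hidden_dims=(64, 64), seed=0):
    """Propagate node features through graph convolutions with fixed random filters."""
    rng = np.random.default_rng(seed)
    A_norm = normalize_adjacency(A)
    H = X
    for d_out in hidden_dims:
        W = rng.normal(scale=1.0 / np.sqrt(H.shape[1]), size=(H.shape[1], d_out))
        H = np.maximum(A_norm @ H @ W, 0.0)   # ReLU(A_norm H W); W is never trained
    return H

def ridge_readout(H_train, Y_train, lam=1e-2):
    """Closed-form regularized least-squares output weights (the only trained part)."""
    d = H_train.shape[1]
    return np.linalg.solve(H_train.T @ H_train + lam * np.eye(d), H_train.T @ Y_train)

# Toy usage: 6 nodes, 4 labelled, 2-class semi-supervised node classification.
A = np.array([[0, 1, 1, 0, 0, 0],
              [1, 0, 1, 0, 0, 0],
              [1, 1, 0, 1, 0, 0],
              [0, 0, 1, 0, 1, 1],
              [0, 0, 0, 1, 0, 1],
              [0, 0, 0, 1, 1, 0]], dtype=float)
X = np.random.default_rng(1).normal(size=(6, 8))
Y = np.eye(2)[[0, 0, 0, 1]]                    # one-hot labels for nodes 0..3
H = gcn_rw_embed(A, X)
W_out = ridge_readout(H[:4], Y)                # fit readout on labelled nodes only
pred = (H @ W_out).argmax(axis=1)              # predicted classes for all nodes

Because the random filters stay fixed, the only optimization is the closed-form ridge solve, which is the mechanism by which such models avoid full-batch gradient descent at every iteration.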