Graph Spiking Attention Network: Sparsity, Efficiency and Robustness

Existing Graph Attention Networks (GATs) generally adopt the self-attention mechanism to learn graph edge attention, which usually returns dense attention coefficients over all neighbors and is therefore sensitive to edge noise in the graph. To overcome this problem, sparse GATs are desirable and h...
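The abstract's contrast between dense and sparse attention can be illustrated with a minimal sketch. The dense case below is the standard GAT-style softmax over a node's neighbors; the sparse case uses a generic top-k cutoff purely for illustration, and is not the spiking mechanism proposed in the paper:

```python
import numpy as np

def dense_attention(scores):
    # Standard GAT-style softmax: every neighbor receives a nonzero
    # coefficient, so noisy edges always contribute to aggregation.
    e = np.exp(scores - scores.max())
    return e / e.sum()

def topk_sparse_attention(scores, k=2):
    # Illustrative sparsification (an assumption for this sketch, not the
    # paper's method): keep only the k highest-scoring neighbors, mask the
    # rest to -inf, and renormalize; masked entries get exactly zero weight.
    keep = np.argsort(scores)[-k:]
    masked = np.full_like(scores, -np.inf)
    masked[keep] = scores[keep]
    e = np.exp(masked - scores[keep].max())
    return e / e.sum()

scores = np.array([2.0, 1.0, 0.1, -1.0])  # attention logits for 4 neighbors
print(dense_attention(scores))        # all four coefficients nonzero
print(topk_sparse_attention(scores))  # only two coefficients nonzero
```

With the dense softmax, even a low-scoring (potentially noisy) edge retains a small positive weight; the sparse variant zeroes it out entirely, which is the robustness argument the abstract makes.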

Detailed description

Bibliographic details
Published in: IEEE transactions on pattern analysis and machine intelligence. - 1979. - 47(2025), no. 11, 01 Oct., pages 10862-10869
First author: Wang, Beibei (author)
Other authors: Jiang, Bo, Tang, Jin, Bai, Lu, Luo, Bin
Format: Online article
Language: English
Published: 2025
Parent work: IEEE transactions on pattern analysis and machine intelligence
Keywords: Journal Article