Graph Spiking Attention Network: Sparsity, Efficiency and Robustness
Existing Graph Attention Networks (GATs) generally adopt the self-attention mechanism to learn graph edge attention, which usually returns dense attention coefficients over all neighbors and is thus sensitive to graph edge noise. To overcome this problem, sparse GATs are desirable and h...
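To make the "dense coefficients" point concrete, the sketch below computes GAT-style attention with a softmax over each node's neighborhood: every neighbor, including a noisy edge, receives a strictly positive weight. This is an illustrative NumPy sketch only; the function name, the projected features `H`, and the attention vector `a` are assumptions for exposition, not the paper's spiking mechanism.

```python
import numpy as np

def gat_attention_coefficients(H, adj, a):
    """Dense GAT-style edge attention (softmax over each node's neighbors).

    H   : (N, F) node features (assumed already projected by W)
    adj : (N, N) 0/1 adjacency matrix with self-loops
    a   : (2F,)  attention vector; score e_ij = LeakyReLU(a . [h_i || h_j])

    Illustrative sketch, not the paper's method.
    """
    N = H.shape[0]
    e = np.full((N, N), -np.inf)          # -inf masks non-edges in the softmax
    for i in range(N):
        for j in range(N):
            if adj[i, j]:
                s = a @ np.concatenate([H[i], H[j]])
                e[i, j] = s if s > 0 else 0.2 * s   # LeakyReLU, slope 0.2
    # Row-wise softmax restricted to neighbors: all neighbor weights are > 0.
    ex = np.exp(e - e.max(axis=1, keepdims=True))
    ex[~adj.astype(bool)] = 0.0
    return ex / ex.sum(axis=1, keepdims=True)
```

Because the softmax is strictly positive on its support, a spurious (noisy) edge always contributes to the aggregation; this is the sensitivity the abstract attributes to dense attention, and the motivation for sparse alternatives.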
Bibliographic details
| Published in: | IEEE Transactions on Pattern Analysis and Machine Intelligence. - 1979. - 47(2025), 11, 01 Oct., pages 10862-10869 |
| Main author: | Wang, Beibei (Author) |
| Other authors: | Jiang, Bo; Tang, Jin; Bai, Lu; Luo, Bin |
| Format: | Online article |
| Language: | English |
| Published: | 2025 |
| Collection access: | IEEE Transactions on Pattern Analysis and Machine Intelligence |
| Subjects: | Journal Article |