Beyond Self-Attention: External Attention Using Two Linear Layers for Visual Tasks

Attention mechanisms, especially self-attention, have played an increasingly important role in deep feature representation for visual tasks. Self-attention updates the feature at each position by computing a weighted sum of features, using pairwise affinities across all positions to capture the long...
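The abstract summarizes external attention, which replaces the pairwise affinity computation of self-attention with two small learnable external memory units applied through linear layers. A minimal NumPy sketch of this idea, assuming memory matrices `Mk` and `Mv` and a softmax-then-l1 double normalization; the names, shapes, and exact normalization order here are illustrative assumptions, not the authors' reference implementation:

```python
import numpy as np

def external_attention(F, Mk, Mv):
    """Sketch of external attention.

    F  : (n, d) input features at n positions.
    Mk : (S, d) learnable key memory (acts as a linear layer).
    Mv : (S, d) learnable value memory (acts as a linear layer).
    Returns updated features of shape (n, d).
    """
    # Affinities between each position and each of the S memory units:
    # O(n * S) instead of the O(n^2) pairwise affinities of self-attention.
    attn = F @ Mk.T                                   # (n, S)

    # Double normalization (assumed order): softmax over the memory axis...
    attn = np.exp(attn - attn.max(axis=1, keepdims=True))
    attn = attn / attn.sum(axis=1, keepdims=True)
    # ...followed by l1 normalization over the position axis.
    attn = attn / (attn.sum(axis=0, keepdims=True) + 1e-9)

    # Weighted sum of value-memory rows gives the updated features.
    return attn @ Mv                                  # (n, d)
```

Because `Mk` and `Mv` are shared across all inputs rather than derived from the input itself, the cost is linear in the number of positions and the memories can, in principle, capture correlations across the whole dataset.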

Detailed Description

Bibliographic Details
Published in: IEEE transactions on pattern analysis and machine intelligence. - 1979. - 45(2023), no. 5, 05 May, pages 5436-5447
First author: Guo, Meng-Hao (author)
Other authors: Liu, Zheng-Ning, Mu, Tai-Jiang, Hu, Shi-Min
Format: Online article
Language: English
Published: 2023
Access to the parent work: IEEE transactions on pattern analysis and machine intelligence
Subject headings: Journal Article