HPformer: Low-Parameter Transformer With Temporal Dependency Hierarchical Propagation for Health Informatics

Transformers based on the Self-Attention (SA) mechanism have demonstrated unrivaled superiority in numerous areas. Compared with RNN-based networks, Transformers learn the temporal dependency representation of an entire sequence in parallel while handling long-range dependencies efficiently. Howe...
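The abstract is truncated in this record, but its central claim is easy to illustrate: a minimal NumPy sketch of generic scaled dot-product self-attention (this is not the paper's HPformer architecture; the function, weights, and shapes below are illustrative assumptions), showing how every token attends to the whole sequence in one matrix product rather than step by step as in an RNN.

import numpy as np

def self_attention(x, w_q, w_k, w_v):
    # Generic scaled dot-product self-attention (illustrative, not HPformer).
    q, k, v = x @ w_q, x @ w_k, x @ w_v          # project tokens to queries/keys/values
    d_k = k.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)              # all pairwise token affinities, computed in parallel
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over each query's key scores
    return weights @ v                           # each output mixes information from every input token

rng = np.random.default_rng(0)
seq_len, d_model = 8, 16                         # hypothetical sizes
x = rng.normal(size=(seq_len, d_model))          # one embedded input sequence
w_q, w_k, w_v = (rng.normal(size=(d_model, d_model)) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)                                 # (8, 16): one context-aware vector per token

Because the attention scores for all positions are produced by a single matrix multiplication, no hidden state has to be unrolled across time steps, which is the parallelism advantage over RNNs the abstract refers to.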


Bibliographic Details
Published in: IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 47, no. 11, 28 Oct. 2025, pp. 10770-10786
First Author: Lee, Wu
Other Authors: Shi, Yuliang; Yu, Han; Cheng, Lin; Wang, Xinjun; Yan, Zhongmin; Kong, Fanyu
Format: Online Article
Language: English
Published: 2025
Contained in: IEEE Transactions on Pattern Analysis and Machine Intelligence
Subjects: Journal Article