High-Similarity-Pass Attention for Single Image Super-Resolution

Recent developments in the field of non-local attention (NLA) have led to a renewed interest in self-similarity-based single image super-resolution (SISR). Researchers usually use the NLA to explore non-local self-similarity (NSS) in SISR and achieve satisfactory reconstruction results. However, a surprising phenomenon that the reconstruction performance of the standard NLA is similar to that of the NLA with randomly selected regions prompted us to revisit NLA. In this paper, we first analyzed the attention map of the standard NLA from different perspectives and discovered that the resulting probability distribution always has full support for every local feature, which implies a statistical waste of assigning values to irrelevant non-local features, especially for SISR which needs to model long-range dependence with a large number of redundant non-local features. Based on these findings, we introduced a concise yet effective soft thresholding operation to obtain high-similarity-pass attention (HSPA), which is beneficial for generating a more compact and interpretable distribution. Furthermore, we derived some key properties of the soft thresholding operation that enable training our HSPA in an end-to-end manner. The HSPA can be integrated into existing deep SISR models as an efficient general building block. In addition, to demonstrate the effectiveness of the HSPA, we constructed a deep high-similarity-pass attention network (HSPAN) by integrating a few HSPAs in a simple backbone. Extensive experimental results demonstrate that HSPAN outperforms state-of-the-art approaches on both quantitative and qualitative evaluations. Our code and a pre-trained model were uploaded to GitHub (https://github.com/laoyangui/HSPAN) for validation.
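The core idea, replacing softmax's full-support attention distribution with a soft-thresholded one so that low-similarity non-local features receive exactly zero weight, can be sketched as follows. This is a minimal illustration only: the threshold value and the softmax fallback below are assumptions for the sketch, not the authors' exact HSPA formulation.

```python
import math

def soft_threshold_attention(scores, tau):
    """Soft-thresholded attention sketch: zero out similarity scores below
    the threshold tau, then renormalize the surviving scores to sum to 1.
    Unlike softmax, irrelevant (low-similarity) entries get weight exactly 0,
    yielding the compact, sparse distribution described in the abstract."""
    clipped = [max(s - tau, 0.0) for s in scores]  # soft thresholding
    total = sum(clipped)
    if total == 0.0:
        # All scores fell below tau; fall back to a plain softmax so the
        # attention weights remain well-defined (an assumption of this sketch).
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        z = sum(exps)
        return [e / z for e in exps]
    return [c / total for c in clipped]

# Two high-similarity entries survive; the two low ones are zeroed out.
weights = soft_threshold_attention([0.9, 0.1, 0.05, 0.8], tau=0.5)
```

In contrast, a softmax over the same scores would assign every entry a nonzero weight, which is the "full support" waste the paper identifies.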

Full description

Bibliographic details
Published in: IEEE transactions on image processing : a publication of the IEEE Signal Processing Society. - 1992. - 33 (2024), pages 610-624
Main author: Su, Jian-Nan (Author)
Other authors: Gan, Min, Chen, Guang-Yong, Guo, Wenzhong, Chen, C L Philip
Format: Online article
Language: English
Published: 2024
In collection: IEEE transactions on image processing : a publication of the IEEE Signal Processing Society
Subjects: Journal Article
LEADER 01000caa a22002652c 4500
001 NLM366812696
003 DE-627
005 20250305155525.0
007 cr uuu---uuuuu
008 240114s2024 xx |||||o 00| ||eng c
024 7 |a 10.1109/TIP.2023.3348293  |2 doi 
028 5 2 |a pubmed25n1222.xml 
035 |a (DE-627)NLM366812696 
035 |a (NLM)38190673 
040 |a DE-627  |b ger  |c DE-627  |e rakwb 
041 |a eng 
100 1 |a Su, Jian-Nan  |e verfasserin  |4 aut 
245 1 0 |a High-Similarity-Pass Attention for Single Image Super-Resolution 
264 1 |c 2024 
336 |a Text  |b txt  |2 rdacontent 
337 |a Computermedien  |b c  |2 rdamedia 
338 |a Online-Ressource  |b cr  |2 rdacarrier 
500 |a Date Revised 11.01.2024 
500 |a published: Print-Electronic 
500 |a Citation Status PubMed-not-MEDLINE 
520 |a Recent developments in the field of non-local attention (NLA) have led to a renewed interest in self-similarity-based single image super-resolution (SISR). Researchers usually use the NLA to explore non-local self-similarity (NSS) in SISR and achieve satisfactory reconstruction results. However, a surprising phenomenon that the reconstruction performance of the standard NLA is similar to that of the NLA with randomly selected regions prompted us to revisit NLA. In this paper, we first analyzed the attention map of the standard NLA from different perspectives and discovered that the resulting probability distribution always has full support for every local feature, which implies a statistical waste of assigning values to irrelevant non-local features, especially for SISR which needs to model long-range dependence with a large number of redundant non-local features. Based on these findings, we introduced a concise yet effective soft thresholding operation to obtain high-similarity-pass attention (HSPA), which is beneficial for generating a more compact and interpretable distribution. Furthermore, we derived some key properties of the soft thresholding operation that enable training our HSPA in an end-to-end manner. The HSPA can be integrated into existing deep SISR models as an efficient general building block. In addition, to demonstrate the effectiveness of the HSPA, we constructed a deep high-similarity-pass attention network (HSPAN) by integrating a few HSPAs in a simple backbone. Extensive experimental results demonstrate that HSPAN outperforms state-of-the-art approaches on both quantitative and qualitative evaluations. Our code and a pre-trained model were uploaded to GitHub (https://github.com/laoyangui/HSPAN) for validation 
650 4 |a Journal Article 
700 1 |a Gan, Min  |e verfasserin  |4 aut 
700 1 |a Chen, Guang-Yong  |e verfasserin  |4 aut 
700 1 |a Guo, Wenzhong  |e verfasserin  |4 aut 
700 1 |a Chen, C L Philip  |e verfasserin  |4 aut 
773 0 8 |i Enthalten in  |t IEEE transactions on image processing : a publication of the IEEE Signal Processing Society  |d 1992  |g 33(2024) vom: 04., Seite 610-624  |w (DE-627)NLM09821456X  |x 1941-0042  |7 nnas 
773 1 8 |g volume:33  |g year:2024  |g day:04  |g pages:610-624 
856 4 0 |u http://dx.doi.org/10.1109/TIP.2023.3348293  |3 Volltext 
912 |a GBV_USEFLAG_A 
912 |a SYSFLAG_A 
912 |a GBV_NLM 
912 |a GBV_ILN_350 
951 |a AR 
952 |d 33  |j 2024  |b 04  |h 610-624