Deep Shearlet Residual Learning Network for Single Image Super-Resolution


Detailed Description

Bibliographic Details
Published in: IEEE transactions on image processing : a publication of the IEEE Signal Processing Society. - 1992. - 30 (2021), from: 04., pages 4129-4142
First author: Geng, Tianyu (author)
Other authors: Liu, Xiao-Yang, Wang, Xiaodong, Sun, Guiling
Format: Online article
Language: English
Published: 2021
Access to the parent work: IEEE transactions on image processing : a publication of the IEEE Signal Processing Society
Subjects: Journal Article
Description
Summary: Recently, the residual learning strategy has been integrated into the convolutional neural network (CNN) for single image super-resolution (SISR), where the CNN is trained to estimate the residual images. Recognizing that a residual image usually consists of high-frequency details and exhibits cartoon-like characteristics, in this paper we propose a deep shearlet residual learning network (DSRLN) to estimate the residual images based on the shearlet transform. The proposed network is trained in the shearlet transform domain, which provides an optimal sparse approximation of the cartoon-like image. Specifically, to address the large statistical variation among the shearlet coefficients, a dual-path training strategy and a data weighting technique are proposed. Extensive evaluations on general natural image datasets as well as remote sensing image datasets show that the proposed DSRLN scheme achieves PSNR close to that of state-of-the-art deep learning methods while using far fewer network parameters.
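The core idea named in the abstract, training the network to estimate the residual (the difference between the high-resolution target and an upscaled low-resolution input) rather than the target itself, can be sketched as follows. This is a minimal toy illustration in pure Python on 1-D "images" with nearest-neighbor upscaling; the actual DSRLN operates on shearlet coefficients with a CNN predictor, and the function names here are illustrative assumptions, not the paper's implementation.

```python
# Residual-learning sketch for super-resolution (toy 1-D signals).
# The network's training target in this formulation is r = HR - upscale(LR);
# at inference, the reconstruction is upscale(LR) + predicted residual.

def upscale_nn(lr, factor=2):
    """Nearest-neighbor upscaling: a crude stand-in for bicubic interpolation."""
    return [v for v in lr for _ in range(factor)]

def residual_target(hr, lr, factor=2):
    """What the residual network is trained to estimate: HR minus upscaled LR."""
    up = upscale_nn(lr, factor)
    return [h - u for h, u in zip(hr, up)]

def reconstruct(lr, predicted_residual, factor=2):
    """Add the predicted residual back onto the upscaled low-resolution input."""
    up = upscale_nn(lr, factor)
    return [u + r for u, r in zip(up, predicted_residual)]

hr = [1.0, 3.0, 2.0, 6.0]       # high-resolution "image" (toy values)
lr = [2.0, 4.0]                 # its 2x-downsampled version (toy values)
r = residual_target(hr, lr)     # the high-frequency detail the CNN would learn
print(reconstruct(lr, r))       # with a perfect residual, HR is recovered exactly
```

The residual `r` carries only the high-frequency detail missing from the interpolated input, which is why it exhibits the sparse, cartoon-like structure that motivates the shearlet-domain formulation.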
Description: Date Revised 12.04.2021
published: Print-Electronic
Citation Status PubMed-not-MEDLINE
ISSN:1941-0042
DOI:10.1109/TIP.2021.3069317