Global Attention Super-Resolution Algorithm for Nature Image Edge Enhancement
Zhihao Zhang,
Zhitong Su,
Wei Song and
Keqing Ning
Additional contact information
Zhihao Zhang: Information Institute, North China University of Technology, Beijing 100144, China
Zhitong Su: Information Institute, North China University of Technology, Beijing 100144, China
Wei Song: Information Institute, North China University of Technology, Beijing 100144, China
Keqing Ning: Information Institute, North China University of Technology, Beijing 100144, China
Sustainability, 2022, vol. 14, issue 21, 1-16
Abstract:
Single-image super-resolution (SR) has long been a research hotspot in computer vision, playing a crucial role in practical applications such as medical imaging, public security and remote sensing imagery. However, most existing methods focus on reconstructing texture details, which leaves the reconstructed images with blurred edges and incomplete structures. To address this problem, an edge-enhancement-based global attention image super-resolution network (EGAN) combining channel- and self-attention mechanisms is proposed to model hierarchical and intra-layer features across multiple dimensions. Specifically, the channel contrast-aware attention (CCA) module learns correlations between intra-layer feature channels and enhances the contrast of the feature maps, yielding richer features along edge structures. The cyclic shift window multi-head self-attention (CS-MSA) module captures long-range dependencies between hierarchical features and extracts the most informative features from the global context. Experiments are conducted on five benchmark datasets for ×2, ×3 and ×4 SR. The results show that for ×4 SR, the proposed network improves the average PSNR by 0.12 dB, 0.19 dB and 0.12 dB over RCAN, HAN and NLSN, respectively, and reconstructs clearer and more complete edge structures.
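The following is a minimal illustrative sketch (PyTorch, assumed) of a contrast-aware channel attention block in the spirit of the CCA module described above: per-channel mean and standard deviation statistics feed a small bottleneck gate that rescales the feature channels. The class name, reduction ratio and tensor sizes are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn

def contrast_pool(x: torch.Tensor) -> torch.Tensor:
    # Per-channel contrast statistic: spatial standard deviation plus spatial mean, shape (B, C, 1, 1).
    mean = x.mean(dim=(2, 3), keepdim=True)
    std = (x - mean).pow(2).mean(dim=(2, 3), keepdim=True).sqrt()
    return std + mean

class ChannelContrastAttention(nn.Module):
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # Bottleneck that models inter-channel correlations from the contrast statistic.
        self.gate = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Rescale each feature channel by its learned attention weight.
        return x * self.gate(contrast_pool(x))

if __name__ == "__main__":
    feats = torch.randn(1, 64, 48, 48)   # a batch of intermediate feature maps (assumed size)
    cca = ChannelContrastAttention(channels=64)
    print(cca(feats).shape)              # torch.Size([1, 64, 48, 48])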
Keywords: single-image super-resolution; deep learning; global attention; channel contrast-aware attention; cyclic shift window multi-head self-attention
JEL-codes: O13 Q Q0 Q2 Q3 Q5 Q56
Date: 2022
Citations: View citations in EconPapers (1)
Downloads: (external link)
https://www.mdpi.com/2071-1050/14/21/13865/pdf (application/pdf)
https://www.mdpi.com/2071-1050/14/21/13865/ (text/html)
Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.
Persistent link: https://EconPapers.repec.org/RePEc:gam:jsusta:v:14:y:2022:i:21:p:13865-:d:952893
Sustainability is currently edited by Ms. Alexandra Wu
Bibliographic data for series maintained by MDPI Indexing Manager.