
Incorporating Transformer Designs into Convolutions for Lightweight Image Super-Resolution

Authors:
Wu, Gang
Jiang, Junjun
Bai, Yuanchao
Liu, Xianming
Publication Year:
2023

Abstract

In recent years, the use of large convolutional kernels has become popular in designing convolutional neural networks due to their ability to capture long-range dependencies and provide large receptive fields. However, the increase in kernel size also leads to a quadratic growth in the number of parameters, resulting in heavy computation and memory requirements. To address this challenge, we propose a neighborhood attention (NA) module that upgrades the standard convolution with a self-attention mechanism. The NA module efficiently extracts long-range dependencies in a sliding window pattern, thereby achieving similar performance to large convolutional kernels but with fewer parameters. Building upon the NA module, we propose a lightweight single image super-resolution (SISR) network named TCSR. Additionally, we introduce an enhanced feed-forward network (EFFN) in TCSR to improve the SISR performance. EFFN employs a parameter-free spatial-shift operation for efficient feature aggregation. Our extensive experiments and ablation studies demonstrate that TCSR outperforms existing lightweight SISR methods and achieves state-of-the-art performance. Our codes are available at https://github.com/Aitical/TCSR.

Comment: 9 pages, 9 figures
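The two mechanisms the abstract names can be illustrated with minimal NumPy sketches. These are illustrative only, not the paper's implementation: the neighborhood-attention sketch omits the learned query/key/value projections a real NA module would have, and the four-way channel grouping in the spatial shift is an assumed convention, not taken from the paper.

```python
import numpy as np

def neighborhood_attention(x, k=3):
    """Simplified sliding-window self-attention (illustrative sketch).

    x: feature map of shape (H, W, C). Each position attends over its
    k x k neighborhood (zero-padded at borders). Learned projections
    are omitted for brevity; a real NA module would include them.
    """
    h, w, c = x.shape
    pad = k // 2
    xp = np.pad(x, ((pad, pad), (pad, pad), (0, 0)))
    out = np.empty_like(x)
    for i in range(h):
        for j in range(w):
            q = x[i, j]                              # query vector: (C,)
            keys = xp[i:i + k, j:j + k].reshape(-1, c)  # k*k neighbors
            logits = keys @ q / np.sqrt(c)           # scaled dot products
            wgt = np.exp(logits - logits.max())      # stable softmax
            wgt /= wgt.sum()
            out[i, j] = wgt @ keys                   # weighted aggregation
    return out

def spatial_shift(x):
    """Parameter-free spatial shift (assumed four-way grouping).

    x: feature map of shape (C, H, W). Channels are split into four
    groups, each shifted one pixel in a different direction; vacated
    positions are zero-filled. No learnable parameters are involved.
    """
    c, h, w = x.shape
    out = np.zeros_like(x)
    g = c // 4
    out[:g, :, :-1] = x[:g, :, 1:]                  # shift left
    out[g:2 * g, :, 1:] = x[g:2 * g, :, :-1]        # shift right
    out[2 * g:3 * g, :-1, :] = x[2 * g:3 * g, 1:, :]  # shift up
    out[3 * g:, 1:, :] = x[3 * g:, :-1, :]          # shift down
    return out
```

Both operations preserve the spatial resolution of the input, which is why they can drop into a convolutional block as a substitute for a large-kernel convolution without changing the surrounding layer shapes.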

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2303.14324
Document Type:
Working Paper