Dual branch attention network for image super-resolution.

Journal: Scientific Reports
Published Date:

Abstract

The advancement of deep convolutional neural networks (CNNs) has led to remarkable progress in CNN-based image super-resolution. However, these methods are limited by a small receptive field and often require a large number of parameters and high computational cost, making them unsuitable for resource-constrained devices. Recently, the Transformer architecture has shown significant potential in image super-resolution owing to its ability to capture global features. Yet the quadratic computational complexity of its self-attention mechanism incurs substantial computation and parameter overhead, limiting practical application. To address these challenges, we introduce the Dual Branch Attention Network (DBAN), a novel Transformer model that integrates prior knowledge from traditional dictionary learning with the global feature perception of Transformers for image super-resolution. Our model features a "token dictionary" mechanism that uses auxiliary labels to provide external prior information, enriching the cross-attention and self-attention computations while keeping computational complexity linear in image size. We also propose a Feature Aggregation Module (FAM) that efficiently extracts local contextual information and performs channel feature fusion, substantially improving the model's performance and efficiency. By carefully balancing the number of modules and the network depth, we further reduce model complexity. Extensive experiments demonstrate that DBAN achieves excellent performance.
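
To illustrate the linear-complexity idea described in the abstract, the following is a minimal sketch (not the authors' implementation; all module and parameter names are hypothetical) of cross-attention between image tokens and a small learned token dictionary. Because the dictionary holds a fixed number M of tokens, the attention map is N x M rather than N x N, so cost grows linearly with the number of image tokens N.

```python
import torch
import torch.nn as nn

class TokenDictionaryCrossAttention(nn.Module):
    """Hypothetical sketch: cross-attention between N image tokens and a
    small learned token dictionary of M entries (M fixed, M << N), so the
    attention map is N x M and cost scales linearly with image size."""

    def __init__(self, dim: int, dict_size: int = 64):
        super().__init__()
        # Learned external "prior": a fixed-size dictionary of tokens.
        self.token_dict = nn.Parameter(torch.randn(dict_size, dim))
        self.to_q = nn.Linear(dim, dim)   # queries from image tokens
        self.to_k = nn.Linear(dim, dim)   # keys from dictionary tokens
        self.to_v = nn.Linear(dim, dim)   # values from dictionary tokens
        self.proj = nn.Linear(dim, dim)
        self.scale = dim ** -0.5

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, N, C) flattened image tokens; dictionary: (M, C)
        q = self.to_q(x)                              # (B, N, C)
        k = self.to_k(self.token_dict)                # (M, C)
        v = self.to_v(self.token_dict)                # (M, C)
        attn = (q @ k.transpose(0, 1)) * self.scale   # (B, N, M): linear in N
        attn = attn.softmax(dim=-1)
        out = attn @ v                                # (B, N, C)
        return self.proj(out)

if __name__ == "__main__":
    tokens = torch.randn(2, 4096, 180)  # e.g. a 64x64 feature map, flattened
    layer = TokenDictionaryCrossAttention(dim=180, dict_size=64)
    print(layer(tokens).shape)  # torch.Size([2, 4096, 180])
```

Under these assumptions, doubling the spatial resolution doubles N but leaves M unchanged, so attention cost grows proportionally with image size instead of quadratically, which is the property the abstract highlights.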

Authors

  • Yiwei Hu
    Key Laboratory for Space Bioscience and Biotechnology, School of Life Sciences, Northwestern Polytechnical University, Xi'an 710072, China.
  • Yisu Ge
    College of Computer Science and Technology, Zhejiang University of Technology, Hangzhou 310023, China.
  • Mingming Qi
    School of Data Science and Artificial Intelligence, Wenzhou University of Technology, Wenzhou, 325000, China. Webqmm1974@163.com.
  • Shuhua Xu
    College of Computer Science and Artificial Intelligence, Wenzhou University, Wenzhou, 325000, China. xu_shuhua2001@163.com.

Keywords

No keywords available for this article.