AANet: Adaptive Attention Network for COVID-19 Detection From Chest X-Ray Images.

Journal: IEEE Transactions on Neural Networks and Learning Systems
Published Date:

Abstract

Accurate and rapid diagnosis of COVID-19 using chest X-ray (CXR) plays an important role in large-scale screening and epidemic prevention. Unfortunately, identifying COVID-19 from CXR images is challenging because its radiographic features take on a variety of complex appearances, such as widespread ground-glass opacities and diffuse reticular-nodular opacities. To address this problem, we propose an adaptive attention network (AANet), which adaptively extracts the characteristic radiographic findings of COVID-19 from infected regions of various scales and appearances. It contains two main components: an adaptive deformable ResNet and an attention-based encoder. First, the adaptive deformable ResNet, which adjusts its receptive fields to learn feature representations according to the shape and scale of infected regions, is designed to handle the diversity of COVID-19 radiographic features. Then, the attention-based encoder models nonlocal interactions via a self-attention mechanism, learning rich context information to detect lesion regions with complex shapes. Extensive experiments on several public datasets show that the proposed AANet outperforms state-of-the-art methods.
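The abstract describes two architectural components: a ResNet backbone whose convolutions sample at learned offsets (adaptive receptive fields), and a self-attention encoder over the resulting feature map. The following is a minimal sketch of that idea in PyTorch, not the authors' implementation; the layer sizes, block counts, classifier head, and class names (AdaptiveDeformableBlock, AttentionEncoder, AANetSketch) are illustrative assumptions.

```python
import torch
import torch.nn as nn
from torchvision.ops import DeformConv2d


class AdaptiveDeformableBlock(nn.Module):
    """Residual block whose 3x3 convolution samples at learned offsets,
    so the receptive field can adapt to the shape/scale of infected regions."""

    def __init__(self, channels: int):
        super().__init__()
        # Offsets: 2 values (dx, dy) per position of the 3x3 kernel.
        self.offset_conv = nn.Conv2d(channels, 2 * 3 * 3, kernel_size=3, padding=1)
        self.deform_conv = DeformConv2d(channels, channels, kernel_size=3, padding=1)
        self.bn = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        offsets = self.offset_conv(x)
        out = self.relu(self.bn(self.deform_conv(x, offsets)))
        return out + x  # residual connection


class AttentionEncoder(nn.Module):
    """Self-attention over flattened spatial positions to model
    nonlocal interactions between lesion regions."""

    def __init__(self, channels: int, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(channels, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        tokens = x.flatten(2).transpose(1, 2)            # (B, H*W, C)
        attended, _ = self.attn(tokens, tokens, tokens)  # nonlocal context
        tokens = self.norm(tokens + attended)
        return tokens.transpose(1, 2).reshape(b, c, h, w)


class AANetSketch(nn.Module):
    """Hypothetical end-to-end model: stem -> deformable blocks -> attention -> classifier."""

    def __init__(self, num_classes: int = 3, channels: int = 64):
        super().__init__()
        self.stem = nn.Sequential(
            nn.Conv2d(1, channels, kernel_size=7, stride=2, padding=3),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )
        self.blocks = nn.Sequential(*[AdaptiveDeformableBlock(channels) for _ in range(2)])
        self.encoder = AttentionEncoder(channels)
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(channels, num_classes)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.encoder(self.blocks(self.stem(x))))


if __name__ == "__main__":
    model = AANetSketch()
    logits = model(torch.randn(2, 1, 224, 224))  # batch of grayscale CXR images
    print(logits.shape)  # torch.Size([2, 3])
```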

Authors

  • Zhijie Lin
  • Zhaoshui He
  • Shengli Xie
    School of Automation, Guangdong Key Laboratory of IoT Information Technology, Guangdong University of Technology, Guangzhou, 510006, China. Electronic address: shlxie@gdut.edu.cn.
  • Xu Wang
    Weldon School of Biomedical Engineering, Purdue University, West Lafayette, IN 47907.
  • Ji Tan
  • Jun Lu
    School of Acupuncture-moxibustion and Tuina, Beijing University of Chinese Medicine, Beijing 100029, China.
  • Beihai Tan