Endoscopic Image Classification Based on Explainable Deep Learning.

Journal: Sensors (Basel, Switzerland)
PMID:

Abstract

Deep learning has achieved remarkably positive results in medical diagnostics in recent years. In many proposed applications it has reached accuracy sufficient for deployment; however, the models are black boxes that are hard to understand, and their decisions are often made without explanation. To close this gap, explainable artificial intelligence (XAI) offers the opportunity to obtain informed decision support from deep learning models by opening the black box of the method. We developed an explainable deep learning method based on ResNet152 combined with Grad-CAM for endoscopy image classification. We used the open-source KVASIR dataset, which consists of a total of 8000 wireless capsule images. Together with heat-map visualization of the classification results and an efficient augmentation method, the approach achieved a high result of 98.28% training accuracy and 93.46% validation accuracy for medical image classification.
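The abstract names ResNet152 with Grad-CAM but gives no implementation details. The following is a minimal sketch, not the authors' code, of how Grad-CAM heat maps are typically computed for such a classifier; it assumes PyTorch/torchvision, an 8-class output head matching the eight Kvasir classes, and a user-supplied preprocessed image tensor.

```python
# Minimal Grad-CAM sketch for a ResNet152 classifier (illustrative assumptions, not the paper's code).
import torch
import torch.nn.functional as F
from torchvision import models

model = models.resnet152(weights=None)                 # assumption: load your fine-tuned weights here
model.fc = torch.nn.Linear(model.fc.in_features, 8)    # assumption: 8 Kvasir classes
model.eval()

activations, gradients = {}, {}

def fwd_hook(module, inputs, output):
    activations["value"] = output                      # feature maps of the last conv block

def bwd_hook(module, grad_input, grad_output):
    gradients["value"] = grad_output[0]                # gradients w.r.t. those feature maps

# Grad-CAM uses the activations and gradients of the final convolutional block.
model.layer4.register_forward_hook(fwd_hook)
model.layer4.register_full_backward_hook(bwd_hook)

def grad_cam(image):
    """image: (1, 3, H, W) preprocessed tensor. Returns a [0, 1] heat map and the predicted class."""
    logits = model(image)
    cls = logits.argmax(dim=1).item()
    model.zero_grad()
    logits[0, cls].backward()                          # backprop the score of the predicted class
    acts, grads = activations["value"], gradients["value"]
    weights = grads.mean(dim=(2, 3), keepdim=True)     # global-average-pool the gradients
    cam = F.relu((weights * acts).sum(dim=1, keepdim=True))   # weighted sum of feature maps
    cam = F.interpolate(cam, size=image.shape[2:], mode="bilinear", align_corners=False)
    cam = (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)  # normalize for heat-map overlay
    return cam, cls
```

The resulting heat map is usually overlaid on the endoscopic image to show which regions drove the classification, which is the kind of visual explanation the abstract refers to.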

Authors

  • Doniyorjon Mukhtorov
    Department of IT Convergence Engineering, Gachon University, Sujeong-Gu, Seongnam-Si 461-701, Republic of Korea.
  • Madinakhon Rakhmonova
    Department of IT Convergence Engineering, Gachon University, Sujeong-Gu, Seongnam-Si 461-701, Republic of Korea.
  • Shakhnoza Muksimova
    Department of IT Convergence Engineering, Gachon University, Sujeong-Gu, Seongnam-Si 461-701, Republic of Korea.
  • Young-Im Cho
    Department of Computer Engineering, Gachon University, Seongnam 1342, Republic of Korea.