An annotation-free whole-slide training approach to pathological classification of lung cancer types using deep learning.

Journal: Nature Communications

Abstract

Deep learning for digital pathology is hindered by the extremely high spatial resolution of whole-slide images (WSIs). Most studies have employed patch-based methods, which often require detailed annotation of image patches, typically produced through laborious free-hand contouring on WSIs. To alleviate the burden of such contouring and to benefit from scaling up training with numerous WSIs, we develop a method for training neural networks on entire WSIs using only slide-level diagnoses. Our method leverages the unified memory mechanism to overcome the memory constraint of compute accelerators. Experiments conducted on a data set of 9662 lung cancer WSIs reveal that the proposed method achieves areas under the receiver operating characteristic curve of 0.9594 and 0.9414 for adenocarcinoma and squamous cell carcinoma classification on the testing set, respectively. Furthermore, the method demonstrates higher classification performance than multiple-instance learning, as well as strong localization results for small lesions through class activation mapping.
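The memory constraint motivating the unified-memory approach can be illustrated with a back-of-envelope calculation. The slide dimensions and byte sizes below are illustrative assumptions, not figures from the paper:

```python
# Illustrative sketch (assumed numbers): why holding an entire WSI as a
# dense tensor exceeds typical accelerator memory, motivating training
# with unified (host+device) memory as described in the abstract.

def wsi_memory_gib(width_px: int, height_px: int,
                   channels: int = 3, bytes_per_value: int = 4) -> float:
    """Memory needed to store one WSI as a float32 RGB tensor, in GiB."""
    return width_px * height_px * channels * bytes_per_value / 2**30

# A slide on the order of 100,000 x 100,000 pixels at high magnification
# (hypothetical size for illustration):
input_gib = wsi_memory_gib(100_000, 100_000)
print(f"Input tensor alone: {input_gib:.1f} GiB")
# This is well beyond the 16-80 GiB of a single GPU, before counting the
# much larger activation memory of a convolutional network's forward pass.
```

Unified memory lets the accelerator oversubscribe its physical memory by paging such tensors to and from host RAM on demand, trading bandwidth for capacity.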

Authors

  • Chi-Long Chen
    Department of Pathology, School of Medicine, College of Medicine, Taipei Medical University, Taipei, Taiwan.
  • Chi-Chung Chen
    Department of Electrical Engineering, National Chiayi University, 300 Syuefu Road, Chiayi City 60004, Taiwan.
  • Wei-Hsiang Yu
  • Szu-Hua Chen
    aetherAI Co., Ltd., Taipei, Taiwan.
  • Yu-Chan Chang
    Department of Biomedical Imaging and Radiological Sciences, National Yang-Ming University, Taipei, Taiwan.
  • Tai-I Hsu
    Genomics Research Center, Academia Sinica, Taipei, Taiwan.
  • Michael Hsiao
    Genomics Research Center, Academia Sinica, Taipei, Taiwan.
  • Chao-Yuan Yeh
  • Cheng-Yu Chen
    Department of Medical Imaging, Taipei Medical University Hospital, No.250, Wu-Hsing St, Taipei, 11031, Taiwan. sandy0932@gmail.com.