Interpretable and Lightweight 3-D Deep Learning Model for Automated ACL Diagnosis.

Journal: IEEE Journal of Biomedical and Health Informatics
Published Date:

Abstract

We propose an interpretable and lightweight 3D deep neural network model that diagnoses anterior cruciate ligament (ACL) tears from a knee MRI exam. Previous works focused primarily on achieving better diagnostic accuracy but paid less attention to practical aspects such as explainability and model size. They mainly relied on ImageNet-pretrained 2D deep neural network backbones, such as AlexNet or ResNet, which are computationally expensive. Some interpreted their models with post-inference visualization tools, such as CAM or Grad-CAM, which often fail to generate accurate heatmaps. Our work addresses these two limitations by exploiting the characteristics of ACL tear diagnosis. We argue that the semantic features required for classifying ACL tears are locally confined and highly homogeneous. We harness these unique characteristics of the task by incorporating: 1) attention modules and Gaussian positional encoding to reinforce the search for local features; and 2) squeeze modules and fewer convolutional filters to reflect the homogeneity of the features. As a result, our model is interpretable: its attention modules precisely highlight the ACL region without being given any location information. Our model is extremely lightweight: it consists of only 43 K trainable parameters and 7.1 G floating-point operations (FLOPs), i.e., 225 times fewer parameters and 91 times fewer FLOPs than the previous state-of-the-art. Our model is accurate: it outperforms the previous state-of-the-art with average ROC-AUCs of 0.983 and 0.980 on the Chiba and Stanford knee datasets, respectively.
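To make the two mechanisms named in the abstract concrete, the sketch below shows one plausible reading of a 3-D Gaussian positional encoding combined with a simple spatial attention pooling step. This is an illustrative toy in NumPy, not the authors' implementation: the function names, the choice of a single isotropic Gaussian, the `center` and `sigma` parameters, and the channel-mean scoring are all assumptions made for the example.

```python
import numpy as np

def gaussian_positional_encoding(depth, height, width, center, sigma):
    """Hypothetical 3-D Gaussian prior over voxel positions, peaking at
    `center` (e.g. an expected ACL locus); values in (0, 1].
    The paper's exact formulation may differ -- illustrative only."""
    z, y, x = np.meshgrid(
        np.arange(depth), np.arange(height), np.arange(width), indexing="ij"
    )
    cz, cy, cx = center
    dist2 = (z - cz) ** 2 + (y - cy) ** 2 + (x - cx) ** 2
    return np.exp(-dist2 / (2.0 * sigma ** 2))

def spatial_attention(features, prior):
    """Toy spatial attention: per-voxel scores from the channel mean,
    modulated by the Gaussian prior, softmax-normalised over all voxels,
    then used to pool the feature volume into one channel descriptor."""
    scores = features.mean(axis=0) * prior           # (D, H, W)
    weights = np.exp(scores - scores.max())          # numerically stable softmax
    weights /= weights.sum()
    return (features * weights).sum(axis=(1, 2, 3))  # (C,) pooled descriptor

# Example: an 8-channel feature volume of size 16 x 16 x 16
rng = np.random.default_rng(0)
feats = rng.standard_normal((8, 16, 16, 16))
prior = gaussian_positional_encoding(16, 16, 16, center=(8, 8, 8), sigma=3.0)
pooled = spatial_attention(feats, prior)
print(pooled.shape)  # (8,)
```

The point of the prior is that voxels far from the expected region receive near-zero weight before the softmax, which biases the attention toward the locally confined features the abstract argues are sufficient for ACL classification.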

Authors

  • YoungSeok Jeon
  • Kensuke Yoshino
  • Shigeo Hagiwara
  • Atsuya Watanabe
  • Swee Tian Quek
    From the Department of Diagnostic Imaging, National University Hospital, 5 Lower Kent Ridge Rd, Singapore 119074 (J.T.P.D.H., A.M., Y.L.T., S.L., Y.S.C., S.E.E., S.T.Q.); Department of Diagnostic Radiology, Yong Loo Lin School of Medicine, National University of Singapore, Singapore (J.T.P.D.H., A.M., Y.L.T., S.L., Y.S.C., S.E.E., S.T.Q.); NUS Graduate School, Integrative Sciences and Engineering Programme, National University of Singapore, Singapore (L.Z.); Department of Computer Science, School of Computing, National University of Singapore, Singapore (K.Y., B.C.O.); Department of Radiology, Dammam Medical Complex, Dammam, Saudi Arabia (D.A.R.A.); Biostatistics Unit, Yong Loo Lin School of Medicine, Singapore (Q.V.Y., Y.H.C.); University Spine Centre, Department of Orthopaedic Surgery, National University Health System, Singapore (J.H.T., N.K.); and Department of Radiological Sciences, University of California, Irvine, Orange, Calif (H.Y.).
  • Hiroshi Yoshioka
  • Mengling Feng
    Saw Swee Hock School of Public Health, National University Health System, National University of Singapore, Singapore.