Better Rough Than Scarce: Proximal Femur Fracture Segmentation With Rough Annotations.

Journal: IEEE Transactions on Medical Imaging

Abstract

Proximal femoral fracture segmentation in computed tomography (CT) is essential for preoperative planning in orthopedic surgery. Recently, numerous deep learning-based approaches have been proposed for segmenting various structures within CT scans. Nevertheless, distinguishing fracture fragments from surrounding soft tissue in CT scans remains challenging and has received comparatively little research attention. Moreover, contemporary deep learning methods depend on annotated data, while detailed CT annotations remain scarce. To address these challenges, we propose a novel weakly-supervised framework, namely Rough Turbo Net (RT-Net), for the segmentation of proximal femoral fractures. We emphasize using human resources to produce rough annotations at scale, rather than relying on a limited number of fine-grained annotations that require substantial time to create. In RT-Net, rough annotations impose fractured-region constraints, which have proved highly effective in improving the accuracy of the network, while fine annotations provide additional detail for recognizing edges and soft tissues. In addition, we design a spatial adaptive attention module (SAAM) that adapts to the spatial distribution of the fracture regions and aligns features in each decoder. Moreover, we propose a fine-edge loss, applied through an edge discrimination network, that penalizes absent or imprecise edge features. Extensive quantitative and qualitative experiments demonstrate the superiority of RT-Net over state-of-the-art approaches. Furthermore, additional experiments show that RT-Net can produce pseudo labels for raw CT images that further improve fracture segmentation performance, and that it has the potential to improve segmentation performance on public datasets. The code is available at: https://github.com/zyairelu/RT-Net.
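The abstract does not give the exact form of the training objective, but the described combination of supervision signals (rough fractured-region constraints, optional fine annotations, and a fine-edge penalty) can be sketched as a weighted composite loss. The following is a minimal illustrative sketch; the function names, Dice formulation, and loss weights are assumptions, not the authors' implementation.

```python
import numpy as np

def dice_loss(pred, target, eps=1e-6):
    """Soft Dice loss between a predicted probability map and a binary mask."""
    inter = (pred * target).sum()
    return 1.0 - (2.0 * inter + eps) / (pred.sum() + target.sum() + eps)

def composite_loss(pred, rough_mask, fine_mask=None, edge_penalty=0.0,
                   w_rough=0.5, w_fine=1.0, w_edge=0.1):
    """Illustrative RT-Net-style objective:
    - rough annotations constrain the overall fracture region,
    - fine annotations (when available) refine edges and soft tissue,
    - an edge term (e.g. from an edge discrimination network) penalizes
      missing or imprecise boundaries.
    The weights here are placeholders, not values from the paper."""
    loss = w_rough * dice_loss(pred, rough_mask)
    if fine_mask is not None:
        loss += w_fine * dice_loss(pred, fine_mask)
    loss += w_edge * edge_penalty
    return loss
```

In this hypothetical setup, scans with only rough annotations contribute through the first term alone, while the smaller fine-annotated subset additionally drives the fine and edge terms.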

Authors

  • Xu Lu
    Department of Computer Science, Guangdong Polytechnic Normal University, Guangzhou, 510006, China. Electronic address: bruda@126.com.
  • Zengzhen Cui
  • Yihua Sun
  • Hee Guan Khor
  • Ao Sun
    School of Mechanical and Electronic Engineering, Nanjing Forestry University, Nanjing 210037, China.
  • Longfei Ma
    Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, China.
  • Fang Chen
  • Shan Gao
    Department of Mathematics and Statistics, Yunnan University, China.
  • Yun Tian
  • Fang Zhou
    Center of Robot Minimally Invasive Surgery, Sichuan Academy of Medical Sciences, Sichuan Provincial People's Hospital, Chengdu, Sichuan 61000, China.
  • Yang Lv
  • Hongen Liao