Semiautomated segmentation of breast tumor on automatic breast ultrasound image using a large-scale model with customized modules.

Journal: Scientific Reports
Published Date:

Abstract

To verify the capability of the Segment Anything Model for medical images in 3D (SAM-Med3D), tailored with low-rank adaptation (LoRA) strategies, in segmenting breast tumors in Automated Breast Ultrasound (ABUS) images. This retrospective study collected data from 329 patients diagnosed with breast cancer (average age 54 years). The dataset was randomly divided into training (n = 204), validation (n = 29), and test (n = 59) sets. Two experienced radiologists manually annotated the regions of interest of each sample in the dataset, which served as ground truth for training and evaluating the SAM-Med3D model with additional customized modules. For semi-automatic tumor segmentation, points were randomly sampled within the lesion areas to simulate radiologists' clicks in real-world scenarios. Segmentation performance was evaluated using the Dice Similarity Coefficient (DSC). A total of 492 cases (200 from the "Tumor Detection, Segmentation, and Classification Challenge on Automated 3D Breast Ultrasound (TDSC-ABUS) 2023 challenge") were subjected to semi-automatic segmentation inference. The average DSC scores for the training, validation, and test sets of the Lishui dataset were 0.75, 0.78, and 0.75, respectively. The Breast Imaging Reporting and Data System (BI-RADS) categories of the samples ranged from BI-RADS 3 to 6, yielding average DSCs between 0.73 and 0.77. When the samples were grouped by lesion size (volumes ranging from 1.64 to 100.03 cm³), the average DSC fell between 0.72 and 0.77. The overall average DSC for the TDSC-ABUS 2023 challenge dataset was 0.79, with the test set achieving a state-of-the-art score of 0.79. The SAM-Med3D model with additional customized modules demonstrates good performance in semi-automatic 3D ABUS breast tumor segmentation, indicating its feasibility for application in computer-aided diagnosis systems.
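The LoRA tailoring mentioned in the abstract can be illustrated with a minimal sketch. The snippet below is not the paper's actual module; it assumes a PyTorch nn.Linear layer and standard LoRA hyperparameters (rank r, scaling alpha), and simply shows how a frozen pretrained weight can be augmented with a trainable low-rank update.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Frozen linear layer plus a trainable low-rank update:
    y = W x + (alpha / r) * B A x. Illustrative sketch, not the paper's code."""
    def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # keep the pretrained weights frozen
        # A starts with small random values and B with zeros,
        # so the low-rank update is initially a no-op.
        self.A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, r))
        self.scale = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + self.scale * (x @ self.A.t() @ self.B.t())

# Hypothetical usage on one projection layer of an image encoder:
proj = nn.Linear(768, 768)
adapted = LoRALinear(proj, r=8, alpha=16.0)
y = adapted(torch.randn(2, 768))  # only A and B receive gradients
```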
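Similarly, the two evaluation-protocol ingredients described above (random point prompts sampled inside the lesion, and the DSC metric) can be sketched in NumPy. The function names and the toy cubic lesion are assumptions for illustration only, not the authors' implementation.

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, gt: np.ndarray, eps: float = 1e-8) -> float:
    """Dice Similarity Coefficient between two binary 3D masks."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    intersection = np.logical_and(pred, gt).sum()
    return 2.0 * intersection / (pred.sum() + gt.sum() + eps)

def sample_point_prompts(lesion_mask: np.ndarray, n_points: int = 1, rng=None) -> np.ndarray:
    """Randomly sample voxel coordinates inside a lesion mask,
    mimicking a radiologist clicking within the tumor."""
    rng = np.random.default_rng() if rng is None else rng
    coords = np.argwhere(lesion_mask)  # (N, 3) foreground voxel indices
    idx = rng.choice(len(coords), size=n_points, replace=False)
    return coords[idx]

# Toy example: a synthetic 32^3 volume with a cubic "lesion".
gt = np.zeros((32, 32, 32), dtype=np.uint8)
gt[10:20, 10:20, 10:20] = 1
pred = np.zeros_like(gt)
pred[12:20, 10:20, 10:20] = 1  # deliberately imperfect prediction

print("DSC:", round(dice_coefficient(pred, gt), 3))  # ~0.889
print("Click prompts:\n", sample_point_prompts(gt, n_points=3))
```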

Authors

  • Yi Zhou
    Eye Center of Xiangya Hospital, Central South University, Changsha, Hunan, China.
  • Mingtao Ye
    College of Computer Science and Technology, Zhejiang University of Technology, Hangzhou, Zhejiang, China.
  • Haiyang Ye
    Institute of Intelligent Media Computing, Hangzhou Dianzi University, Hangzhou, 310018, China.
  • Shuqi Zeng
    Department of Breast Surgery, The Fifth Affiliated Hospital of Wenzhou Medical University, Lishui Municipal Central Hospital, Lishui, 323000, China.
  • Xi Shu
    Department of Breast Surgery, The Fifth Affiliated Hospital of Wenzhou Medical University, Lishui Municipal Central Hospital, Lishui, 323000, China.
  • Ying Pan
    Department of Endocrinology, Kunshan Hospital Affiliated to Jiangsu University, Kunshan, China.
  • Aifen Wu
    Department of Ultrasound, The Fifth Affiliated Hospital of Wenzhou Medical University, Lishui Municipal Central Hospital, Lishui, 323000, China.
  • Pengpeng Liu
    Department of Breast Surgery, The Fifth Affiliated Hospital of Wenzhou Medical University, Lishui Municipal Central Hospital, Lishui, 323000, China.
  • Guodao Zhang
    Department of Digital Media Technology, Hangzhou Dianzi University, Hangzhou, 310018, China. Electronic address: guodaozhang@zjut.edu.cn.
  • Shibin Cai
    Department of Breast Surgery, The Fifth Affiliated Hospital of Wenzhou Medical University, Lishui Municipal Central Hospital, Lishui, 323000, China. caisb614@foxmail.com.
  • Shuzheng Chen
    Department of Breast Surgery, The Fifth Affiliated Hospital of Wenzhou Medical University, Lishui Municipal Central Hospital, Lishui, 323000, China. dr.susan@163.com.