Syn-Net: A Synchronous Frequency-Perception Fusion Network for Breast Tumor Segmentation in Ultrasound Images.
Journal:
IEEE Journal of Biomedical and Health Informatics
Published Date:
Mar 6, 2025
Abstract
Accurate breast tumor segmentation in ultrasound images is a crucial step in medical diagnosis and in locating the tumor region. However, segmentation faces numerous challenges due to the complexity of ultrasound images: similar intensity distributions, variable tumor morphology, and speckle noise. To address these challenges and achieve precise segmentation of breast tumors in complex ultrasound images, we propose a Synchronous Frequency-perception Fusion Network (Syn-Net). First, we design a synchronous dual-branch encoder that extracts local and global feature information simultaneously from complex ultrasound images. Second, we introduce a novel Frequency-perception Cross-Feature Fusion (FrCFusion) Block, which uses the Discrete Cosine Transform (DCT) to learn all-frequency features and effectively fuse local and global features while mitigating issues arising from similar intensity distributions. In addition, we develop a Full-Scale Deep Supervision method that not only corrects the influence of speckle noise on segmentation but also effectively guides decoder features toward the ground truth. We conduct extensive experiments on three publicly available ultrasound breast tumor datasets. Comparison with 14 state-of-the-art deep learning segmentation methods shows that our approach is more robust to differences across ultrasound images, variations in tumor size and shape, speckle noise, and intensity similarity between tumors and surrounding tissue. On the BUSI and Dataset B datasets, our method achieves higher Dice scores than state-of-the-art methods, indicating superior performance in ultrasound breast tumor segmentation.
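To make the FrCFusion idea concrete, the sketch below shows one plausible reading of a DCT-based fusion block: transform each branch's features into the frequency domain, reweight every frequency with a learnable map (the "all-frequency" aspect, since no band is discarded a priori), transform back, and gate the local and global branches together. The module and parameter names (FrCFusionSketch, freq_weight, gate) and the gating design are hypothetical illustrations under these assumptions, not the authors' implementation.

```python
# Minimal sketch of a frequency-perception cross-feature fusion block,
# assuming: 2D DCT -> learnable per-frequency reweighting -> inverse DCT
# -> pixel-wise gated fusion of the local (CNN) and global (transformer)
# branch features. All names here are illustrative, not from the paper.
import math
import torch
import torch.nn as nn

def dct_matrix(n: int) -> torch.Tensor:
    """Orthonormal DCT-II basis; its transpose is the inverse transform."""
    k = torch.arange(n).unsqueeze(1).float()   # frequency index
    i = torch.arange(n).unsqueeze(0).float()   # spatial index
    basis = torch.cos(math.pi * (2 * i + 1) * k / (2 * n))
    basis[0] *= 1.0 / math.sqrt(2.0)           # DC row normalization
    return basis * math.sqrt(2.0 / n)

class FrCFusionSketch(nn.Module):
    def __init__(self, channels: int, size: int):
        super().__init__()
        self.register_buffer("dct", dct_matrix(size))
        # One learnable weight per (channel, frequency) cell: every
        # frequency band stays in play and is emphasized by learning.
        self.freq_weight = nn.Parameter(torch.ones(channels, size, size))
        self.gate = nn.Sequential(
            nn.Conv2d(2 * channels, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def spectral_reweight(self, x: torch.Tensor) -> torch.Tensor:
        # 2D DCT over the spatial dims: C @ X @ C^T (orthonormal basis).
        freq = self.dct @ x @ self.dct.T
        freq = freq * self.freq_weight          # learned frequency emphasis
        return self.dct.T @ freq @ self.dct     # inverse 2D DCT

    def forward(self, local_feat: torch.Tensor, global_feat: torch.Tensor):
        l = self.spectral_reweight(local_feat)
        g = self.spectral_reweight(global_feat)
        a = self.gate(torch.cat([l, g], dim=1)) # per-pixel fusion weight
        return a * l + (1.0 - a) * g

if __name__ == "__main__":
    block = FrCFusionSketch(channels=64, size=32)
    loc = torch.randn(2, 64, 32, 32)   # local-branch features
    glb = torch.randn(2, 64, 32, 32)   # global-branch features
    print(block(loc, glb).shape)       # torch.Size([2, 64, 32, 32])
```

One motivation for operating in the DCT domain is that tumors and surrounding tissue with near-identical intensity distributions can still separate in frequency content (edges and texture versus smooth regions), which is consistent with the abstract's claim that the block mitigates the similar-intensity problem.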