Prior-guided deep difference meta-learner for fast adaptation to stylized segmentation.

Journal: Machine Learning: Science and Technology
Published Date:

Abstract

Radiotherapy treatment planning requires segmenting anatomical structures in various styles, influenced by guidelines, protocols, clinician preferences, or dose-planning needs. Deep learning-based auto-segmentation models, trained on anatomical definitions, may not match local clinicians' styles at new institutions, and adapting these models can be challenging without sufficient resources. We hypothesize that consistent differences between segmentation styles and anatomical definitions can be learned from the initial patients and applied to pre-trained models for more precise segmentation. We propose a Prior-guided deep difference meta-learner (DDL) to learn and adapt to these differences. We collected data from 440 patients for model development and 30 patients for testing. The dataset includes contours of the prostate clinical target volume (CTV), parotid gland, and rectum. We developed a deep learning framework that segments new images in a matching style, using example styles as a prior, without retraining the model. The pre-trained segmentation models were adapted to three different clinician styles for post-operative prostate CTV, parotid gland, and rectum segmentation. We tested the model's ability to learn unseen styles and compared its performance with transfer learning, using varying amounts of prior patient style data (0-10 patients). Performance was quantitatively evaluated using the Dice similarity coefficient (DSC) and Hausdorff distance. With exposure to only three example patients, the average DSC (%) improved from 78.6, 71.9, 63.0, 69.6, 52.2, and 46.3 to 84.4, 77.8, 73.0, 77.8, 70.5, and 68.1 for the CTV, CTV, CTV, parotid, rectum, and rectum styles, respectively. The proposed Prior-guided DDL is a fast and effortless network for adapting a structure to new styles. The improved segmentation accuracy may reduce contour-editing time, providing a more efficient and streamlined clinical workflow.
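
The abstract evaluates segmentation quality with the Dice similarity coefficient (DSC) and the Hausdorff distance. The sketch below is not the authors' code; it is a minimal illustration, assuming binary 3D masks stored as NumPy arrays, of how these two metrics are commonly computed with NumPy and SciPy. The function names, the default isotropic voxel spacing, and the use of all foreground voxels (rather than an extracted surface) for the Hausdorff computation are assumptions made for illustration only.

```python
# Minimal sketch of the two evaluation metrics named in the abstract:
# Dice similarity coefficient (DSC) and symmetric Hausdorff distance (HD).
import numpy as np
from scipy.spatial.distance import directed_hausdorff


def dice_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """DSC = 2|A ∩ B| / (|A| + |B|); the paper reports it as a percentage."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    denom = pred.sum() + truth.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(pred, truth).sum() / denom


def hausdorff_distance(pred: np.ndarray, truth: np.ndarray,
                       spacing=(1.0, 1.0, 1.0)) -> float:
    """Symmetric Hausdorff distance between the foreground voxel sets,
    scaled by voxel spacing so the result is in physical units (e.g. mm)."""
    p = np.argwhere(pred.astype(bool)) * np.asarray(spacing)
    t = np.argwhere(truth.astype(bool)) * np.asarray(spacing)
    return max(directed_hausdorff(p, t)[0], directed_hausdorff(t, p)[0])


if __name__ == "__main__":
    # Toy example with random 3D masks standing in for predicted and
    # clinician-style reference contours (e.g. CTV, parotid, rectum).
    rng = np.random.default_rng(0)
    pred = rng.random((32, 64, 64)) > 0.7
    truth = rng.random((32, 64, 64)) > 0.7
    print(f"DSC: {100 * dice_coefficient(pred, truth):.1f}%")
    print(f"HD:  {hausdorff_distance(pred, truth, spacing=(3.0, 1.0, 1.0)):.1f} mm")
```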

Authors

  • Dan Nguyen
    University of Massachusetts Chan Medical School, Worcester, Massachusetts.
  • Anjali Balagopal
    Medical Artificial Intelligence and Automation (MAIA) Laboratory, Department of Radiation Oncology, UT Southwestern Medical Center, United States of America.
  • Ti Bai
  • Michael Dohopolski
    Department of Radiation Oncology, UT Southwestern Medical Center, Dallas, TX, United States of America.
  • Mu-Han Lin
    Department of Radiation Oncology, University of Texas Southwestern Medical Center, Dallas, Texas.
  • Steve Jiang
