Learning With Privileged Multimodal Knowledge for Unimodal Segmentation

Journal: IEEE Transactions on Medical Imaging

Abstract

Multimodal learning usually requires a complete set of modalities at inference time to maintain performance. Although training data can be well prepared with multiple high-quality modalities, in many clinical scenarios only one modality can be acquired, and important clinical evaluations must be made from this limited single-modality information. In this work, we propose a privileged knowledge learning framework with a 'Teacher-Student' architecture, in which the complete multimodal knowledge that is available only in the training data (called privileged information) is transferred from a multimodal teacher network to a unimodal student network via both a pixel-level and an image-level distillation scheme. Specifically, for pixel-level distillation, we introduce a regularized knowledge distillation loss that encourages the student to mimic the teacher's softened outputs in a pixel-wise manner and incorporates a regularization factor to reduce the effect of incorrect teacher predictions. For image-level distillation, we propose a contrastive knowledge distillation loss that encodes image-level structured information to enrich the knowledge transferred alongside the pixel-level distillation. We extensively evaluate our method on two multi-class segmentation tasks, i.e., cardiac substructure segmentation and brain tumor segmentation. Experimental results on both tasks demonstrate that our privileged knowledge learning effectively improves unimodal segmentation and outperforms previous methods.
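The abstract does not give the loss formulas, so the sketch below is only one plausible NumPy instantiation of the two distillation ideas it describes, under stated assumptions: the pixel-level loss is taken as a temperature-softened, per-pixel KL divergence down-weighted where the teacher's hard prediction is wrong (a hypothetical choice of regularization factor), and the image-level loss as an InfoNCE-style contrast between student and teacher image embeddings. Function names, shapes, and hyperparameters (`T`, `tau`) are assumptions, not the paper's actual design.

```python
import numpy as np

def softmax(x, T=1.0, axis=-1):
    """Temperature-softened softmax along the class axis."""
    z = x / T
    z = z - z.max(axis=axis, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def regularized_kd_loss(student_logits, teacher_logits, labels, T=2.0):
    """Pixel-level distillation (hypothetical form): per-pixel KL between
    softened teacher and student distributions, masked to zero where the
    teacher's hard prediction disagrees with the ground-truth label so that
    incorrect teacher predictions contribute nothing.
    Shapes: logits (H, W, C), labels (H, W)."""
    p_t = softmax(teacher_logits, T=T)
    p_s = softmax(student_logits, T=T)
    kl = np.sum(p_t * (np.log(p_t + 1e-8) - np.log(p_s + 1e-8)), axis=-1)
    correct = (teacher_logits.argmax(-1) == labels).astype(float)  # regularization mask
    return float((correct * kl).mean() * T * T)  # T^2 rescaling, as in standard KD

def contrastive_kd_loss(student_emb, teacher_emb, tau=0.1):
    """Image-level distillation (hypothetical form): InfoNCE that pulls each
    student embedding toward its own teacher embedding and pushes it away
    from other images' teacher embeddings in the batch.
    Shapes: embeddings (B, D)."""
    s = student_emb / np.linalg.norm(student_emb, axis=1, keepdims=True)
    t = teacher_emb / np.linalg.norm(teacher_emb, axis=1, keepdims=True)
    logits = s @ t.T / tau                                   # (B, B) similarities
    logits = logits - logits.max(axis=1, keepdims=True)      # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_prob)))                # matched pairs on the diagonal
```

In this sketch the pixel-level term vanishes when student and teacher agree exactly, and the contrastive term is minimized when each student embedding is closest to its own teacher embedding; the paper's actual regularization factor and contrastive formulation may differ.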

Authors

  • Cheng Chen
    Key Laboratory of Precision and Intelligent Chemistry, School of Chemistry and Materials Science, University of Science and Technology of China, China.
  • Qi Dou
  • Yueming Jin
    Department of Computer Science and Engineering, The Chinese University of Hong Kong, Hong Kong, China.
  • Quande Liu
  • Pheng Ann Heng