Continual Nuclei Segmentation via Prototype-Wise Relation Distillation and Contrastive Learning

Journal: IEEE Transactions on Medical Imaging
Published Date:

Abstract

Deep learning models have achieved remarkable success in multi-type nuclei segmentation. These models are mostly trained once with full annotations for all nuclei types available, and lack the ability to continually learn new classes due to catastrophic forgetting. In this paper, we study the practical and important problem of class-incremental continual learning, where the model is incrementally updated with new classes without access to previous data. We propose a novel continual nuclei segmentation method that avoids forgetting knowledge of old classes and facilitates the learning of new classes by performing feature-level knowledge distillation through prototype-wise relation distillation and contrastive learning. Concretely, prototype-wise relation distillation imposes constraints on inter-class relation similarity, encouraging the encoder to extract similar class distributions for old classes in the feature space. Prototype-wise contrastive learning with a hard sampling strategy enhances the intra-class compactness and inter-class separability of features, improving performance on both old and new classes. Experiments on two multi-type nuclei segmentation benchmarks, i.e., MoNuSAC and CoNSeP, demonstrate the effectiveness of our method, with superior performance over many competitive methods. Code is available at https://github.com/zzw-szu/CoNuSeg.
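To make the two components of the abstract concrete, below is a minimal NumPy sketch of how prototype-wise relation distillation and prototype-wise contrastive learning could look. All function names, the choice of cosine similarity for the relation matrix, the mean-squared matching of relation matrices, and the InfoNCE form of the contrastive loss are illustrative assumptions, not the paper's exact formulation; the hard sampling strategy mentioned in the abstract is omitted for brevity.

```python
import numpy as np

def class_prototypes(features, labels, num_classes):
    # Mean feature vector per class: features (N, D), labels (N,) -> (C, D).
    protos = np.zeros((num_classes, features.shape[1]))
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            protos[c] = features[mask].mean(axis=0)
    return protos

def relation_matrix(protos):
    # Pairwise cosine similarity between class prototypes (C, C).
    normed = protos / (np.linalg.norm(protos, axis=1, keepdims=True) + 1e-8)
    return normed @ normed.T

def relation_distillation_loss(protos_old, protos_new):
    # Encourage the new model to preserve the old model's
    # inter-class relation structure (assumed MSE matching).
    diff = relation_matrix(protos_old) - relation_matrix(protos_new)
    return np.mean(diff ** 2)

def prototype_contrastive_loss(features, labels, protos, tau=0.1):
    # InfoNCE-style loss: pull each feature toward its own class
    # prototype, push it away from the other prototypes.
    f = features / (np.linalg.norm(features, axis=1, keepdims=True) + 1e-8)
    p = protos / (np.linalg.norm(protos, axis=1, keepdims=True) + 1e-8)
    logits = f @ p.T / tau
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(len(labels)), labels].mean()
```

In a continual-learning step, the old (frozen) and new encoders would each produce prototypes from the same batch; the relation loss ties the two relation matrices together while the contrastive loss shapes the new feature space.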

Authors

  • Huisi Wu
  • Zhaoze Wang
  • Zebin Zhao
    School of Management, Harbin Institute of Technology, Harbin, 150001, China. zhaozebin@hit.edu.cn.
  • Cheng Chen
    Key Laboratory of Precision and Intelligent Chemistry, School of Chemistry and Materials Science, University of Science and Technology of China, China.
  • Jing Qin
    School of Nursing, The Hong Kong Polytechnic University, Hong Kong, China.