Neighborhood relation-based knowledge distillation for image classification.
Journal:
Neural Networks: the official journal of the International Neural Network Society
Published Date:
Apr 1, 2025
Abstract
As an efficient model compression technique, recent knowledge distillation methods primarily transfer knowledge from a large teacher model to a small student model by minimizing the differences between the teacher's and student's predictions. However, the relationships between different samples have not been well investigated: recent relational distillation methods mainly construct knowledge from all randomly selected samples, e.g., the similarity matrix of mini-batch samples. In this paper, we propose Neighborhood Relation-Based Knowledge Distillation (NRKD), which exploits the local neighborhood structure as a novel form of relational knowledge for better knowledge transfer. Specifically, we first identify, for each sample, its K-nearest neighbors according to the similarity matrix of mini-batch samples, and then build the neighborhood relationship knowledge for distillation; this relational knowledge can be transferred through both intermediate feature maps and output logits. We perform extensive experiments on several popular image classification datasets for knowledge distillation, including CIFAR10, CIFAR100, Tiny ImageNet, and ImageNet. Experimental results demonstrate that the proposed NRKD yields competitive results compared with state-of-the-art distillation methods. Our code is available at: https://github.com/xinxiaoxiaomeng/NRKD.git.
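To make the idea concrete, the sketch below illustrates one plausible reading of the abstract, not the authors' actual implementation (see the linked repository for that): it assumes cosine similarity over mini-batch features, neighborhoods defined by the teacher's K nearest neighbors, and an MSE matching loss; the function name nrkd_loss and the default K are our own choices for illustration.

```python
import torch
import torch.nn.functional as F

def nrkd_loss(f_s: torch.Tensor, f_t: torch.Tensor, k: int = 4) -> torch.Tensor:
    """Neighborhood relation loss over a mini-batch (illustrative sketch).

    f_s, f_t: student/teacher features of shape (B, ...); k: neighbors per sample.
    """
    # Cosine similarity matrices of the mini-batch, for student and teacher.
    z_s = F.normalize(f_s.flatten(1), dim=1)
    z_t = F.normalize(f_t.flatten(1), dim=1)
    sim_s = z_s @ z_s.t()                        # (B, B) student similarities
    sim_t = z_t @ z_t.t()                        # (B, B) teacher similarities

    # Define each sample's neighborhood by the teacher's K nearest neighbors,
    # excluding the sample itself (assumption: teacher-defined neighborhoods).
    masked = sim_t.clone().fill_diagonal_(float('-inf'))
    idx = masked.topk(k, dim=1).indices          # (B, K) neighbor indices

    # Match student to teacher similarities restricted to those neighborhoods.
    rel_s = sim_s.gather(1, idx)
    rel_t = sim_t.gather(1, idx)
    return F.mse_loss(rel_s, rel_t)
```

In a training loop, such a term would typically be added to the usual task and distillation losses, with the teacher features detached so no gradient flows into the teacher, e.g. `loss = ce_loss + alpha * nrkd_loss(student_feats, teacher_feats.detach())`; the same relation loss can be applied to intermediate feature maps or to output logits, as the abstract describes.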