Heterogeneous context interaction network for vehicle re-identification.

Journal: Neural Networks: the official journal of the International Neural Network Society
PMID:

Abstract

Capturing global and subtle discriminative information with attention mechanisms is essential to address the challenge of high inter-class similarity in the vehicle re-identification (Re-ID) task. Mixing the self-information of nodes, or modeling context from pairwise dependencies between nodes, are the core ideas of current advanced attention mechanisms. This paper explores how to exploit both dependency context and self-context efficiently so that attention can be learned more effectively. We propose a heterogeneous context interaction (HCI) attention mechanism that infers node weights from the interactions of global dependency contexts and local self-contexts to enhance the effect of attention learning. To reduce computational complexity, global dependency contexts are modeled by aggregating number-compressed pairwise dependencies, and the interactions of heterogeneous contexts are restricted to a certain range. Based on this mechanism, we propose a heterogeneous context interaction network (HCI-Net), which uses a channel heterogeneous context interaction (CHCI) module and a spatial heterogeneous context interaction (SHCI) module, and introduces a rigid partitioning strategy to extract important global and fine-grained features. In addition, we design a non-similarity constraint (NSC) that forces HCI-Net to learn diverse subtle discriminative information. Experimental results on two large datasets, VeRi-776 and VehicleID, show that the proposed HCI-Net achieves state-of-the-art performance; in particular, the mean average precision (mAP) reaches 83.8% on the VeRi-776 dataset.
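
The abstract gives no implementation details, so the following PyTorch sketch is only a minimal illustration, under assumed design choices, of the general idea it describes: inferring attention weights from the interaction of a compressed global dependency context with a local self-context. The class name `ChannelHCISketch`, the `compressed_tokens` pooling size, the `reduction` ratio, and the interaction MLP are all hypothetical and are not the authors' CHCI module.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ChannelHCISketch(nn.Module):
    """Illustrative channel-attention sketch (not the authors' code).

    Global dependency context: channel-to-channel affinities computed from a
    spatially compressed copy of the feature map, then aggregated per channel.
    Local self-context: a per-channel statistic (global average pooling).
    The two contexts interact through a small bottleneck MLP that predicts
    per-channel attention weights.
    """

    def __init__(self, channels: int, compressed_tokens: int = 8, reduction: int = 4):
        super().__init__()
        # Compress H*W positions to compressed_tokens**2 "tokens" so the
        # pairwise-dependency matrix stays cheap to build (the
        # "number-compressed pairwise dependencies" idea, simplified).
        self.compress = nn.AdaptiveAvgPool2d(compressed_tokens)
        hidden = max(channels // reduction, 1)
        # Interaction MLP: maps [dependency context, self-context] to one
        # weight per channel.
        self.interact = nn.Sequential(
            nn.Linear(2 * channels, hidden),
            nn.ReLU(inplace=True),
            nn.Linear(hidden, channels),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        # Global dependency context from compressed pairwise dependencies.
        z = self.compress(x).flatten(2)              # (B, C, M), M = tokens**2
        affinity = torch.bmm(z, z.transpose(1, 2))   # (B, C, C) channel affinities
        affinity = F.softmax(affinity, dim=-1)
        dep_ctx = affinity.mean(dim=1)               # (B, C): attention each channel receives

        # Local self-context: per-channel global statistic.
        self_ctx = x.mean(dim=(2, 3))                # (B, C)

        # Interaction of the heterogeneous contexts yields channel weights.
        weights = torch.sigmoid(self.interact(torch.cat([dep_ctx, self_ctx], dim=1)))
        return x * weights.view(b, c, 1, 1)


if __name__ == "__main__":
    # Usage example on a dummy feature map.
    feat = torch.randn(2, 256, 16, 16)
    out = ChannelHCISketch(256)(feat)
    print(out.shape)  # torch.Size([2, 256, 16, 16])
```

A spatial variant would follow the same pattern with the roles of channels and positions exchanged; both are sketches of the stated concept, not reproductions of the paper's modules.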

Authors

  • Ke Sun
  • Xiyu Pang
    School of Information Science and Electrical Engineering, Shandong Jiaotong University, No. 5001, Haitang Road, Changqing District, Jinan, 250357, Shandong, China; School of Software, Shandong University, No. 1500, Shunhua Road, High-tech Industrial Development Zone, Jinan, 250101, Shandong, China. Electronic address: xiyupang@126.com.
  • Meifeng Zheng
    School of Information Science and Electrical Engineering, Shandong Jiaotong University, No. 5001, Haitang Road, Changqing District, Jinan, 250357, Shandong, China. Electronic address: fhhh2023@126.com.
  • Xiushan Nie
    School of Computer Science and Technology, Shandong Jianzhu University, No. 1000, Fengming Road, Lingang Development Zone, Jinan, 250101, Shandong, China. Electronic address: niexsh@hotmail.com.
  • Xi Li
  • Houren Zhou
    School of Information Science and Electrical Engineering, Shandong Jiaotong University, No. 5001, Haitang Road, Changqing District, Jinan, 250357, Shandong, China. Electronic address: lovercherry996@163.com.
  • Yilong Yin