Deep parameter-free attention hashing for image retrieval.

Journal: Scientific Reports
Published Date:

Abstract

Deep hashing methods are widely applied in image retrieval because of their low storage consumption and fast retrieval speed. However, existing deep hashing methods extract insufficient semantic features when they rely on a convolutional neural network (CNN) to encode images. Some studies propose adding channel-based or spatial-based attention modules, but embedding these modules into the network increases model complexity and can lead to overfitting during training. In this study, a novel deep parameter-free attention hashing (DPFAH) method is proposed to address these problems by incorporating a parameter-free attention (PFA) module into a ResNet18 network. PFA is a lightweight module that defines an energy function to measure the importance of each neuron and infers 3-D attention weights for the feature map of a layer. A fast closed-form solution to this energy function shows that the PFA module adds no parameters to the network. In addition, this paper designs a novel hashing framework with a hash-code learning branch and a classification branch to exploit more label information. The like-binary codes are constrained by a regularization term to reduce the quantization error introduced by continuous relaxation. Experiments on CIFAR-10, NUS-WIDE and ImageNet-100 show that DPFAH achieves better performance.
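
The abstract's description of PFA (a per-neuron energy function with a fast closed-form solution that yields 3-D attention weights and adds no parameters) matches a SimAM-style attention module. The following is a minimal PyTorch sketch under that assumption; the paper's exact energy function is not given in the abstract, and the constant lambda_e is an assumed hyperparameter.

    import torch
    import torch.nn as nn

    class ParameterFreeAttention(nn.Module):
        """SimAM-style parameter-free attention: no learnable parameters.

        lambda_e is the regularization constant in the closed-form energy
        (an assumed hyperparameter, not specified in the abstract).
        """
        def __init__(self, lambda_e: float = 1e-4):
            super().__init__()
            self.lambda_e = lambda_e

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (B, C, H, W); n = number of neurons per channel minus one
            b, c, h, w = x.shape
            n = h * w - 1
            # squared deviation of each neuron from its channel mean
            d = (x - x.mean(dim=(2, 3), keepdim=True)).pow(2)
            # channel-wise variance estimate
            v = d.sum(dim=(2, 3), keepdim=True) / n
            # inverse of the minimal energy (up to constants);
            # lower energy means a more important neuron
            e_inv = d / (4 * (v + self.lambda_e)) + 0.5
            # 3-D attention weights applied to the feature map
            return x * torch.sigmoid(e_inv)

Because the module has no learnable weights, it can be dropped into any stage of a ResNet18 backbone without increasing the parameter count.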
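The abstract also mentions a two-branch framework (hash-code learning plus classification) and a regularization term on the like-binary codes. The abstract does not spell out the full objective, so the sketch below only illustrates one common formulation of such a head: the bit width, layer sizes, loss weight eta, and the choice to classify from the relaxed codes are all assumptions for illustration, not the paper's exact design.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class HashingHead(nn.Module):
        """Two-branch head: a hash-code learning branch and a classification branch."""
        def __init__(self, feat_dim: int = 512, hash_bits: int = 48, num_classes: int = 10):
            super().__init__()
            self.hash_layer = nn.Linear(feat_dim, hash_bits)     # like-binary codes via tanh
            self.classifier = nn.Linear(hash_bits, num_classes)  # label-information branch

        def forward(self, feats: torch.Tensor):
            u = torch.tanh(self.hash_layer(feats))  # relaxed codes in (-1, 1)
            logits = self.classifier(u)
            return u, logits

    def quantization_regularizer(u: torch.Tensor) -> torch.Tensor:
        # Penalize the gap between relaxed codes and the binary values {-1, +1}
        return (u.abs() - 1.0).pow(2).mean()

    def illustrative_loss(u, logits, labels, eta: float = 0.1):
        # Classification term plus quantization term; the paper's full objective
        # (e.g. its similarity-preserving hashing term) is not reproduced here.
        return F.cross_entropy(logits, labels) + eta * quantization_regularizer(u)

At test time the binary codes would be obtained as sign(u), so the regularizer above keeps the relaxed codes close to the values they are eventually quantized to.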

Authors

  • Wenjing Yang
    State Key Laboratory of High Performance Computing, National University of Defense Technology, Changsha 410073, China.
  • Liejun Wang
    College of Information Science and Engineering, Xinjiang University, Urumqi, Xinjiang, China.
  • Shuli Cheng
    College of Information Science and Engineering, Xinjiang University, Urumqi, 830046, China.