Word Embedding Distribution Propagation Graph Network for Few-Shot Learning.

Journal: Sensors (Basel, Switzerland)
Published Date:

Abstract

Few-shot learning (FSL) is of great significance to the field of machine learning. The ability to learn and generalize from only a small number of samples remains a clear distinction between current artificial intelligence and humans. In the FSL domain, most graph neural networks (GNNs) focus on transferring labeled sample information to an unlabeled query sample, ignoring the important role of semantic information during classification. Our proposed method embeds class-level semantic information into a GNN, creating a word embedding distribution propagation graph network (WPGN) for FSL. We integrate an attention mechanism into the backbone network, use the Mahalanobis distance to measure class similarity, select the Funnel ReLU (FReLU) function as the activation function of the Transform layer, and update the point graph and word embedding distribution graph. In extensive experiments on FSL benchmarks, compared with the baseline model, the accuracy of WPGN on the 5-way 1-shot, 2-shot, and 5-shot tasks increased by 9.03%, 4.56%, and 4.15%, respectively.
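The abstract names two concrete components: the FReLU activation used in the Transform layer and the Mahalanobis distance used for class similarity. The sketch below is a minimal, hypothetical PyTorch illustration of those two pieces, not the authors' implementation; the module and function names (FReLU, mahalanobis_distance) and the regularization of the covariance matrix are assumptions for illustration only.

```python
import torch
import torch.nn as nn


class FReLU(nn.Module):
    """Funnel ReLU: y = max(x, T(x)), where T(x) is a depthwise 3x3
    convolution followed by batch normalization (Funnel Activation)."""

    def __init__(self, channels: int):
        super().__init__()
        self.spatial = nn.Conv2d(
            channels, channels, kernel_size=3, padding=1,
            groups=channels, bias=False,
        )
        self.bn = nn.BatchNorm2d(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Element-wise maximum of the input and its spatial condition.
        return torch.max(x, self.bn(self.spatial(x)))


def mahalanobis_distance(query: torch.Tensor,
                         prototype: torch.Tensor,
                         cov: torch.Tensor,
                         eps: float = 1e-3) -> torch.Tensor:
    """Squared Mahalanobis distance between a query embedding and a class
    prototype, given a class covariance matrix (regularized with eps * I
    so the inverse is well defined)."""
    d = query.shape[-1]
    diff = query - prototype                          # (d,)
    inv_cov = torch.linalg.inv(cov + eps * torch.eye(d))  # (d, d)
    return diff @ inv_cov @ diff
```

A smaller distance can then be mapped to a higher class similarity (for example via a softmax over negated distances) when propagating label information through the graph.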

Authors

  • Chaoran Zhu
    College of Computer Science and Technology, Changchun University of Science and Technology, Changchun 130022, China.
  • Ling Wang
    The State Key Laboratory of Ophthalmology, Zhongshan Ophthalmic Center, Sun Yat-sen University, #7 Jinsui Road, Guangzhou, Guangdong 510230, China.
  • Cheng Han
    State Key Laboratory of Polymer Materials Engineering, Polymer Research Institute, Sichuan University, Chengdu, 610065, China.