Graph explicit pooling for graph-level representation learning.

Journal: Neural Networks: the official journal of the International Neural Network Society
PMID:

Abstract

Graph pooling has been increasingly recognized as crucial for Graph Neural Networks (GNNs) to facilitate hierarchical graph representation learning. Existing graph pooling methods commonly consist of two stages: selecting top-ranked nodes and discarding the remaining ones to construct coarsened graph representations. However, this paper highlights two key issues with these methods: (1) The process of selecting nodes to discard frequently employs additional Graph Convolutional Networks or Multilayer Perceptrons, without thoroughly evaluating each node's impact on the final graph representation and subsequent prediction tasks. (2) Current graph pooling methods tend to directly discard the dropped (noisy) portion of the graph without accounting for the latent information it may contain. To address the first issue, we introduce a novel Graph explicit Pooling (GrePool) method, which selects nodes by explicitly leveraging the relationships between the nodes and the final representation vectors crucial for classification. The second issue is addressed using an extended version of GrePool (i.e., GrePool+), which applies a uniform loss to the discarded nodes. This addition is designed to augment the training process and improve classification accuracy. Furthermore, we conduct comprehensive experiments across 12 widely used datasets, including the Open Graph Benchmark datasets, to validate our proposed method's effectiveness. Our experimental results consistently demonstrate that GrePool outperforms 14 baseline methods on most datasets. Likewise, implementing GrePool+ enhances GrePool's performance without incurring additional computational costs. The code is available at https://github.com/LiuChuang0059/GrePool.
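
To make the abstract's description more concrete, below is a minimal sketch of the two ideas it outlines: scoring nodes directly against a classification-related representation, and applying a uniform loss to the discarded nodes. This is not the authors' implementation (see the linked repository for that); the function names (score_nodes, explicit_pool, uniform_loss), the dot-product scoring against a class query vector, and the reading of the "uniform loss" as a KL divergence toward a uniform class distribution are illustrative assumptions only.

# Minimal, self-contained sketch of the ideas described in the abstract.
# NOT the authors' implementation; names and design choices here are
# assumptions made for illustration only.
import torch
import torch.nn.functional as F


def score_nodes(node_feats: torch.Tensor, class_query: torch.Tensor) -> torch.Tensor:
    # node_feats:  [num_nodes, dim] node embeddings from a GNN encoder.
    # class_query: [dim] vector tied to the classification/graph-level
    #              representation (an assumed realization of "explicit" scoring).
    return node_feats @ class_query  # [num_nodes] relevance scores


def explicit_pool(node_feats, class_query, keep_ratio=0.5):
    # Keep the top-scoring nodes; return kept and dropped node embeddings.
    scores = score_nodes(node_feats, class_query)
    k = max(1, int(keep_ratio * node_feats.size(0)))
    keep_idx = torch.topk(scores, k).indices
    mask = torch.zeros(node_feats.size(0), dtype=torch.bool)
    mask[keep_idx] = True
    return node_feats[mask], node_feats[~mask]


def uniform_loss(dropped_feats, classifier, num_classes):
    # Push class predictions on dropped nodes toward a uniform distribution
    # (one plausible reading of the "uniform loss" in GrePool+).
    if dropped_feats.numel() == 0:
        return dropped_feats.new_zeros(())
    log_probs = F.log_softmax(classifier(dropped_feats), dim=-1)  # [num_dropped, C]
    uniform = torch.full_like(log_probs, 1.0 / num_classes)
    return F.kl_div(log_probs, uniform, reduction="batchmean")


if __name__ == "__main__":
    torch.manual_seed(0)
    dim, num_classes = 16, 3
    node_feats = torch.randn(10, dim)      # stand-in for GNN node embeddings
    class_query = torch.randn(dim)         # stand-in for the classification vector
    classifier = torch.nn.Linear(dim, num_classes)

    kept, dropped = explicit_pool(node_feats, class_query, keep_ratio=0.5)
    aux = uniform_loss(dropped, classifier, num_classes)
    print(kept.shape, dropped.shape, float(aux))

In this sketch the auxiliary uniform loss would simply be added to the standard classification loss during training, so it adds no extra pooling machinery, which is consistent with the abstract's claim that GrePool+ improves GrePool without additional computational cost.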

Authors

  • Chuang Liu
    Logistics Engineering School, Chengdu Vocational & Technical College of Industry, Chengdu, China.
  • Wenhang Yu
    Changjiang Schinta Software Technology Co. LTD., Wuhan, China; Internet+ Intelligent Water Conservancy Key Laboratory of Changjiang Water Resources Commission, Wuhan, China. Electronic address: haldate@whu.edu.cn.
  • Kuang Gao
    School of Computer Science, Wuhan University, Wuhan, 430071, China.
  • Xueqi Ma
    School of Computing and Information Systems, The University of Melbourne, Parkville, VIC 3010, Australia. Electronic address: xueqim@student.unimelb.edu.au.
  • Yibing Zhan
    JD Explore Academy, Beijing, China. Electronic address: zhanyibing@jd.com.
  • Jia Wu
  • Wenbin Hu
  • Bo Du
    School of Computer Science, Wuhan University, Wuhan, 430072, China. Electronic address: remoteking@whu.edu.cn.