Beyond low-pass filtering on large-scale graphs via Adaptive Filtering Graph Neural Networks.

Journal: Neural Networks: the official journal of the International Neural Network Society
Published Date:

Abstract

Graph Neural Networks (GNNs) have emerged as a crucial deep learning framework for graph-structured data. However, existing GNNs suffer from limited scalability, which hinders their practical deployment in industrial settings. Many scalable GNNs have been proposed to address this limitation, but they have been proven to act as low-pass graph filters, discarding valuable middle- and high-frequency information. This paper proposes a novel graph neural network, Adaptive Filtering Graph Neural Networks (AFGNN), which captures information across all frequencies on large-scale graphs. AFGNN consists of two stages. The first stage applies low-, middle-, and high-pass graph filters to extract comprehensive frequency information without introducing additional parameters; this computation is a one-off task performed before training, which preserves scalability. The second stage uses a node-level attention-based feature combination, generating a customized graph filter for each node, in contrast to existing spectral GNNs that apply a uniform graph filter to the entire graph. AFGNN supports mini-batch training, enhancing scalability while efficiently capturing all frequency information from large-scale graphs. We evaluate AFGNN against spectral GNNs in terms of its ability to capture all frequency information, and against scalable GNNs in terms of scalability. Experimental results show that AFGNN surpasses both scalable GNNs and spectral GNNs.
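The two-stage structure described above can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the specific filter definitions (here low = Â, high = L, middle = L(I − L)) and the attention parameterization (`W`, `a`) are assumptions for illustration only; the paper does not specify them in the abstract.

```python
import numpy as np

def normalized_adjacency(A):
    # Symmetric normalization with self-loops: A_hat = D^{-1/2} (A + I) D^{-1/2}
    A = A + np.eye(A.shape[0])
    d = A.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A @ D_inv_sqrt

def precompute_filtered_features(A, X):
    """Stage 1 (computed once, before training): apply hypothetical low-,
    middle-, and high-pass graph filters to node features X (n x d)."""
    A_hat = normalized_adjacency(A)
    I = np.eye(A.shape[0])
    L = I - A_hat                       # normalized graph Laplacian
    low = A_hat @ X                     # low-pass: smooths neighboring features
    high = L @ X                        # high-pass: emphasizes differences
    mid = (L @ (I - L)) @ X             # middle-pass (one illustrative choice)
    return np.stack([low, mid, high])   # shape (3, n, d)

def node_level_attention(filtered, W, a):
    """Stage 2: per-node softmax attention over the three filtered views,
    so each node gets its own filter mixture. W (d x h) and a (h,) stand in
    for learnable parameters."""
    scores = np.einsum('knd,dh,h->kn', filtered, W, a)   # (3, n) view scores
    alpha = np.exp(scores - scores.max(axis=0))
    alpha = alpha / alpha.sum(axis=0)                    # per-node softmax
    return np.einsum('kn,knd->nd', alpha, filtered)      # combined (n, d)
```

Because Stage 1 depends only on the graph and raw features, it can be pre-computed and the filtered views fed to mini-batch training, which is the source of the scalability claim.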

Authors

  • Qi Zhang
    Department of Gastroenterology, The Affiliated Hospital of Qingdao University, Qingdao, China.
  • Jinghua Li
    Institute of Information on Traditional Chinese Medicine, China Academy of Chinese Medical Sciences, Beijing 100700, China. Electronic address: zingarlee@hotmail.com.
  • Yanfeng Sun
    Beijing Key Laboratory of Multimedia and Intelligent Software Technology, Beijing Artificial Intelligence Institute, Faculty of Information Technology, Beijing University of Technology, Beijing 100124, China. Electronic address: yfsun@bjut.edu.cn.
  • Shaofan Wang
Beijing Key Laboratory of Multimedia and Intelligent Software Technology, Beijing Artificial Intelligence Institute, Faculty of Information Technology, Beijing University of Technology, Beijing 100124, China. Electronic address: wangshaofan@bjut.edu.cn.
  • Junbin Gao
  • Baocai Yin
    iFLYTEK Research, Hefei, China.