Label as Equilibrium: A performance booster for Graph Neural Networks on node classification.
Journal:
Neural Networks: The Official Journal of the International Neural Network Society
PMID:
40015034
Abstract
Graph Neural Networks (GNNs) are effective in graph mining and have become a dominant solution to the node classification task. Recently, a series of label reuse approaches emerged to boost the node classification performance of GNNs. They repeatedly feed the predicted node class labels back into the underlying GNN to update the predictions. However, two issues prevent label reuse from performing better. First, re-inputting predictions that are close to the training labels causes the GNN to over-fit, resulting in a loss of generalization and degraded performance. Second, the repeated iterations consume prohibitive amounts of memory for gradient descent, leading to compromised optimization and suboptimal results. To address these issues, we propose an advanced label reuse approach termed Label as Equilibrium (LaE). It has (1) an improved masking strategy with supervision concealment that resolves prediction over-fitting and (2) an infinite number of iterations that can be optimized within constant memory consumption. Extensive node classification experiments demonstrate the superiority of LaE. It significantly increases the accuracy scores of prevailing GNNs by 2.31% on average and outperforms previous label reuse approaches on eight real-world datasets by 1.60% on average. Considering the wide application of label reuse, many state-of-the-art GNNs can benefit from our techniques. Code to reproduce all our experiments is released at https://github.com/cf020031308/LaE.
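The label-reuse loop described above can be sketched as follows. This is a minimal illustration, not the authors' LaE method: the `gnn` callable is a placeholder (here a simple normalized-adjacency propagation stands in for a trained GNN), and the re-injection of training labels is only the basic masking scheme that LaE's supervision concealment refines.

```python
import numpy as np

def label_reuse(gnn, X, Y_train, train_mask, n_iters=10):
    """Generic label-reuse loop (a sketch).

    gnn        : any callable mapping (features, label channel) -> class scores
    X          : node feature matrix, shape (n, d)
    Y_train    : one-hot training labels, shape (n, c) (rows outside the
                 training set are ignored)
    train_mask : boolean array of shape (n,) marking training nodes
    """
    Y = np.zeros_like(Y_train, dtype=float)  # initial label channel: all zeros
    for _ in range(n_iters):
        Y = gnn(X, Y)                        # update predictions from labels
        # re-inject ground-truth labels on training nodes after each step
        Y[train_mask] = Y_train[train_mask]
    return Y

# Toy example: 3-node path graph 0 - 1 - 2, two classes.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
A = A / A.sum(axis=1, keepdims=True)         # row-normalized adjacency

X = np.eye(3)                                # dummy features (unused by the stand-in)
gnn = lambda X, Y: A @ Y                     # label propagation as a stand-in GNN

Y_train = np.array([[1, 0],
                    [0, 0],
                    [0, 1]], dtype=float)
train_mask = np.array([True, False, True])

Y = label_reuse(gnn, X, Y_train, train_mask)
# The unlabeled middle node ends up between its two labeled neighbors.
```

Iterating this loop to a fixed point, rather than unrolling a finite number of steps, is what allows an equilibrium formulation to be optimized with constant memory.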