GHNN: Graph Harmonic Neural Networks for semi-supervised graph-level classification

Neural Netw. 2022 Jul;151:70-79. doi: 10.1016/j.neunet.2022.03.018. Epub 2022 Mar 24.

Abstract

Graph classification aims to predict the property of a whole graph, a problem that has attracted growing attention in the graph learning community. It has been extensively studied in the literature on both graph convolutional networks and graph kernels. Graph convolutional networks can learn effective node representations via message passing, mining graph topology implicitly, whereas graph kernels exploit graph structural knowledge explicitly for classification. Because labeled data are scarce in real-world applications, semi-supervised algorithms are desirable for this problem. In this paper, we propose the Graph Harmonic Neural Network (GHNN), which combines the advantages of both worlds to fully leverage the unlabeled data, and thus overcomes label scarcity in semi-supervised scenarios. Specifically, GHNN consists of a graph convolutional network (GCN) module and a graph kernel network (GKN) module that explore graph topology from complementary perspectives. To fully exploit the unlabeled data, we develop a novel harmonic contrastive loss and a harmonic consistency loss that harmonize the training of the two modules by prioritizing high-quality unlabeled data, thereby enforcing prediction consistency between them. In this manner, the two modules mutually enhance each other to fully explore the graph topology of both labeled and unlabeled data. Extensive experiments on a variety of benchmarks demonstrate the effectiveness of our approach over competitive baselines.
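The consistency idea described above — training two modules to agree on unlabeled graphs while prioritizing high-quality (i.e., confidently predicted) ones — can be illustrated with a minimal PyTorch sketch. This is a hypothetical reconstruction, not the paper's exact loss: the confidence threshold, the use of the averaged prediction as a quality measure, and the symmetric KL form are all assumptions for illustration.

```python
import torch
import torch.nn.functional as F

def harmonic_consistency_loss(logits_gcn, logits_gkn, threshold=0.8):
    """Confidence-weighted agreement between two modules' predictions
    on unlabeled graphs (illustrative sketch, not the paper's exact loss).

    logits_gcn, logits_gkn: (num_graphs, num_classes) raw scores from
    the GCN and GKN modules on the same batch of unlabeled graphs.
    """
    p_gcn = F.softmax(logits_gcn, dim=1)
    p_gkn = F.softmax(logits_gkn, dim=1)

    # "High-quality" unlabeled graphs: those whose averaged prediction
    # is confident, i.e. max class probability exceeds the threshold.
    p_mean = 0.5 * (p_gcn + p_gkn)
    confidence, _ = p_mean.max(dim=1)
    mask = (confidence >= threshold).float()

    # Symmetric KL divergence pushes the two modules toward agreement.
    kl = 0.5 * (
        F.kl_div(p_gcn.log(), p_gkn, reduction="none").sum(dim=1)
        + F.kl_div(p_gkn.log(), p_gcn, reduction="none").sum(dim=1)
    )

    # Average only over the selected high-confidence graphs.
    denom = mask.sum().clamp(min=1.0)
    return (mask * kl).sum() / denom
```

In a full training loop, a loss of this shape would be added to the supervised cross-entropy on labeled graphs, so that confidently predicted unlabeled graphs steer both modules toward consistent decisions while uncertain ones are ignored.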

Keywords: Graph classification; Graph kernels; Graph neural networks; Semi-supervised learning.

MeSH terms

  • Algorithms*
  • Attention
  • Benchmarking
  • Learning
  • Neural Networks, Computer*