research-article

Global-local graph neural networks for node-classification

Published: 18 October 2024 Publication History

Abstract

The task of graph node classification is often approached with a local Graph Neural Network (GNN) that learns only local information from the node input features and their adjacency. In this paper, we propose to improve the performance of node-classification GNNs by utilizing both global and local information, specifically by learning both label and node features. We therefore call our method Global-Local-GNN (GLGNN). To learn proper label features, for each label we maximize the similarity between its features and the features of nodes that belong to that label, while maximizing the distance between its features and those of nodes that do not. We then use the learnt label features to predict the node-classification map. We demonstrate our GLGNN using three different GNN backbones and show that our approach improves baseline performance, revealing the importance of utilizing global information for node classification.

Highlights

We propose to learn label features to capture global information of the input graph.
We fuse label and node features to predict a node-classification map.
We qualitatively demonstrate our method by illustrating the learnt label and node features.
We quantitatively demonstrate the benefit of using our global label features approach on 12 real-world datasets.
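The learning scheme described above (pull each node's features toward its label's features, push them away from the other labels' features, then classify by node-label similarity) can be sketched as a cross-entropy over node-label similarity scores. The function name, the cosine-similarity choice, and the softmax formulation below are illustrative assumptions, not the paper's exact GLGNN loss:

```python
import numpy as np

def label_feature_loss(H, C, y):
    """Contrastive-style loss over node and label features.

    H: (n, d) node features, C: (k, d) learnable label features,
    y: (n,) integer labels. Cosine similarity is an assumed choice.
    """
    # Normalize so the dot product is a cosine similarity
    Hn = H / np.linalg.norm(H, axis=1, keepdims=True)
    Cn = C / np.linalg.norm(C, axis=1, keepdims=True)
    S = Hn @ Cn.T  # (n, k) node-label similarity scores

    # Softmax over labels; cross-entropy pulls each node toward its
    # own label's features and pushes it away from the others
    logits = S - S.max(axis=1, keepdims=True)
    P = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    loss = -np.log(P[np.arange(len(y)), y]).mean()
    return loss, P

# Toy example: 6 nodes with 4-dim features, 3 labels
rng = np.random.default_rng(0)
H = rng.normal(size=(6, 4))
C = rng.normal(size=(3, 4))
y = np.array([0, 0, 1, 1, 2, 2])
loss, P = label_feature_loss(H, C, y)
pred = P.argmax(axis=1)  # node-classification map from node-label similarity
```

In the paper's setting, both the node features (from the GNN backbone) and the label features would be trained jointly against such an objective; here the gradient step is omitted and only the loss and the similarity-based prediction are shown.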



Published In

Pattern Recognition Letters, Volume 184, Issue C, August 2024, 246 pages

Publisher

Elsevier Science Inc., United States


Author Tags

  1. Graph Neural Networks
  2. Global features
  3. Node classification
