Abstract
Many types of data can be represented by graphs consisting of nodes with features. The analysis of such graphs is facilitated by representing nodes as vectors, such that neighboring nodes and nodes with similar features have representations that lie close together in the vector space. Graph neural networks (GNNs) have been developed to learn such representations automatically, but they have a disadvantage: the aggregation distance, which determines how many other nodes are considered relevant to each node, is fixed by the number of layers in the GNN. If the aggregation distance is too large, over-smoothing can result, in which the representations of distinct nodes become nearly indistinguishable. This paper proposes TwinGNN for learning high-quality representations of nodes. Instead of using a fixed distance for the entire graph, TwinGNN learns a suitable distance for each node. We evaluated TwinGNN experimentally on several benchmark graph datasets. The results showed that TwinGNN successfully avoided over-smoothing: its performance did not deteriorate as the number of GNN layers increased, and it outperformed existing GNNs on almost all datasets.
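The two phenomena the abstract relies on can be made concrete with a small sketch. The code below is an illustration under our own assumptions, not the authors' TwinGNN implementation: part (i) shows how repeated mean aggregation over-smooths node features as depth grows, and part (ii) uses a crude jumping-knowledge-style readout that picks a per-node aggregation depth, in the spirit of learning a suitable distance for each node. All names (mean_aggregate, the depth score, the toy graph) are ours.

```python
import numpy as np

def mean_aggregate(H, adj):
    """One hop of mean aggregation over each node's closed neighborhood."""
    A_hat = adj + np.eye(adj.shape[0])              # add self-loops
    return (A_hat @ H) / A_hat.sum(axis=1, keepdims=True)

# Toy graph: two triangles {0,1,2} and {3,4,5} joined by the edge 2-3.
adj = np.array([[0,1,1,0,0,0],
                [1,0,1,0,0,0],
                [1,1,0,1,0,0],
                [0,0,1,0,1,1],
                [0,0,0,1,0,1],
                [0,0,0,1,1,0]], dtype=float)
H = np.random.default_rng(0).normal(size=(6, 4))    # random node features

# (i) Over-smoothing: the spread of node representations shrinks with depth.
reps = [H]
for _ in range(8):                                  # 8 "layers" of aggregation
    reps.append(mean_aggregate(reps[-1], adj))
for k in (1, 4, 8):
    print(f"depth {k}: mean feature std = {np.std(reps[k], axis=0).mean():.4f}")

# (ii) Per-node depth: keep, for each node, the depth whose representation
# deviates most from the per-depth mean (a stand-in for a learned choice).
stack = np.stack(reps[1:])                          # shape (depth, node, feature)
scores = np.linalg.norm(stack - stack.mean(axis=1, keepdims=True), axis=2)
best = scores.argmax(axis=0)                        # best depth per node
H_out = stack[best, np.arange(adj.shape[0])]        # per-node representations
print("chosen depth per node:", best + 1)
```

The hand-written deviation score is only a placeholder for whatever criterion TwinGNN learns; the point of the sketch is that letting each node read out at its own depth decouples the receptive-field size from the network depth, so stacking more layers no longer forces every node toward the same vector.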
Notes
- 1. In this paper, we address the binary classification of nodes, although the proposed method can be extended to multi-class node classification, multi-label node classification, and graph classification.
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Yajima, Y., Inokuchi, A. (2022). Why Deeper Graph Neural Network Performs Worse? Discussion and Improvement About Deep GNNs. In: Pimenidis, E., Angelov, P., Jayne, C., Papaleonidas, A., Aydin, M. (eds) Artificial Neural Networks and Machine Learning – ICANN 2022. ICANN 2022. Lecture Notes in Computer Science, vol 13530. Springer, Cham. https://doi.org/10.1007/978-3-031-15931-2_60
Print ISBN: 978-3-031-15930-5
Online ISBN: 978-3-031-15931-2