Abstract
Graph Neural Networks (GNNs) achieve great success on node-level and graph-level tasks by encoding meaningful topological structures of networks in various domains, ranging from social to biological networks. However, repeated aggregation operations lead to excessive mixing of node representations, particularly in dense regions when multiple GNN layers are stacked, resulting in nearly indistinguishable embeddings. This phenomenon, known as oversmoothing, hampers downstream graph analytics tasks. To overcome this issue, we propose a novel and flexible truss-based graph sparsification model that prunes edges from dense regions of the graph. Pruning redundant edges in dense regions helps to prevent the aggregation of excessive neighborhood information during hierarchical message passing and pooling in GNN models. We then apply our sparsification model to state-of-the-art baseline GNN and pooling models, such as GIN, SAGPool, GMT, DiffPool, MinCutPool, HGP-SL, DMonPool, and AdamGNN. Extensive experiments on different real-world datasets show that our model significantly improves the performance of the baseline GNN models on the graph classification task.
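The core idea, pruning edges from dense (high-truss) regions so message passing aggregates less redundant neighborhood information, can be sketched as follows. This is a simplified illustration using edge support (triangle counts, the quantity underlying truss decomposition), not the authors' full TGS model; the function name and the connectivity guard are assumptions for the sketch.

```python
import networkx as nx

def truss_sparsify(G, k):
    """Illustrative truss-based sparsification (not the paper's exact TGS
    algorithm): drop edges whose support (number of triangles containing
    the edge) is at least k - 2, i.e. edges lying in dense, k-truss-like
    regions, while keeping the graph connected."""
    H = G.copy()
    # Support of edge (u, v) = number of common neighbors of u and v.
    dense_edges = [
        (u, v) for u, v in H.edges()
        if len(list(nx.common_neighbors(H, u, v))) >= k - 2
    ]
    for u, v in dense_edges:
        H.remove_edge(u, v)
        if not nx.is_connected(H):
            # Heuristic guard: undo a removal that disconnects the graph.
            H.add_edge(u, v)
    return H
```

On a clique with a pendant node, only the clique's internal edges qualify as dense, so the sparsifier thins the clique while leaving the bridge edge untouched.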
Notes
- 1.
Code available at: https://github.com/TanvirKu/TGS-ECML-PKDD-24.
- 2.
\(\text{Gain}(\mathbb{G}) = \frac{\texttt{TGS} - \text{Original}}{\text{Original}} \times 100\%\).
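The Gain metric above is the relative improvement of the TGS-augmented model over the baseline, expressed as a percentage. A minimal sketch with hypothetical accuracy values (the function name and the numbers are illustrative, not results from the paper):

```python
def gain(tgs_score, original_score):
    """Relative improvement (%) of the TGS-augmented score over the
    original baseline, per the Gain formula in the notes."""
    return (tgs_score - original_score) / original_score * 100.0

# Hypothetical example: baseline accuracy 80.0, TGS-augmented 84.0.
print(round(gain(84.0, 80.0), 2))  # 5.0 (% gain)
```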
References
Miao, X., et al.: DeGNN: improving graph neural networks with graph decomposition. In: Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery & Data Mining, pp. 1223–1233 (2021)
Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018)
Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735. PMLR (2020)
Wang, Y., Derr, T.: Tree decomposed graph neural network. In: Proceedings of the 30th ACM International Conference on Information & Knowledge Management, pp. 2040–2049 (2021)
Wang, J., Cheng, J.: Truss decomposition in massive networks. arXiv preprint arXiv:1205.6693 (2012)
Scarselli, F., Gori, M., Tsoi, A.C., Hagenbuchner, M., Monfardini, G.: The graph neural network model. IEEE Trans. Neural Netw. 20(1), 61–80 (2008)
Kipf, T.N., Welling, M.: Semi-supervised classification with graph convolutional networks. arXiv preprint arXiv:1609.02907 (2016)
Ying, Z., You, J., Morris, C., Ren, X., Hamilton, W., Leskovec, J.: Hierarchical graph representation learning with differentiable pooling. Advances in Neural Information Processing Systems, vol. 31 (2018)
Bianchi, F.M., Grattarola, D., Alippi, C.: Spectral clustering with graph neural networks for graph pooling. In: International Conference on Machine Learning, pp. 874–883. PMLR (2020)
Tsitsulin, A., Palowitch, J., Perozzi, B., Müller, E.: Graph clustering with graph neural networks. J. Mach. Learn. Res. 24(127), 1–21 (2023)
Baek, J., Kang, M., Hwang, S.J.: Accurate learning of graph representations with graph multiset pooling. arXiv preprint arXiv:2102.11533 (2021)
Zhang, Z., et al.: Hierarchical graph pooling with structure learning. arXiv preprint arXiv:1911.05954 (2019)
Lee, J., Lee, I., Kang, J.: Self-attention graph pooling. In: International Conference on Machine Learning, pp. 3734–3743. PMLR (2019)
Zhong, Z., Li, C.T., Pang, J.: Multi-grained semantics-aware graph neural networks. IEEE Trans. Knowl. Data Eng. (2022)
Morris, C., Kriege, N.M., Bause, F., Kersting, K., Mutzel, P., Neumann, M.: TUDataset: a collection of benchmark datasets for learning with graphs. arXiv preprint arXiv:2007.08663 (2020)
Van Belle, R., Van Damme, C., Tytgat, H., De Weerdt, J.: Inductive graph representation learning for fraud detection. Expert Syst. Appl. 193, 116463 (2022)
Saifuddin, K.M., Bumgardner, B., Tanvir, F., Akbas, E.: HyGNN: drug-drug interaction prediction via hypergraph neural network. In: 2023 IEEE 39th International Conference on Data Engineering (ICDE), pp. 1503–1516. IEEE (2023)
Zhang, S., Guo, Y., Zhao, P., Zheng, C., Chen, X.: A graph-based temporal attention framework for multi-sensor traffic flow forecasting. IEEE Trans. Intell. Transp. Syst. 23(7), 7743–7758 (2021)
Deng, Y.: Recommender systems based on graph embedding techniques: a review. IEEE Access 10, 51587–51633 (2022)
Rong, Y., Huang, W., Xu, T., Huang, J.: DropEdge: towards deep graph convolutional networks on node classification. arXiv preprint arXiv:1907.10903 (2019)
Huang, R., Li, P.: Hub-hub connections matter: improving edge dropout to relieve over-smoothing in graph neural networks. Knowl.-Based Syst. 270, 110556 (2023)
Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, no. 04, pp. 3438–3445 (2020)
Wang, Y., Wang, H., Jin, H., Huang, X., Wang, X.: Exploring graph capsule network for graph classification. Inf. Sci. 581, 932–950 (2021)
Huang, X., Cheng, H., Qin, L., Tian, W., Yu, J.X.: Querying k-truss community in large and dynamic graphs. In: Proceedings of the 2014 ACM SIGMOD International Conference on Management of Data, pp. 1311–1322 (2014)
Akbas, E., Zhao, P.: Truss-based community search: a truss-equivalence based indexing approach. Proc. VLDB Endow. 10(11), 1298–1309 (2017)
Diab, S., Olabi, M.G., El Hajj, I.: KTrussExplorer: exploring the design space of k-truss decomposition optimizations on GPUs. In: 2020 IEEE High Performance Extreme Computing Conference (HPEC), pp. 1–8. IEEE (2020)
Wang, Z., Ji, S.: Second-order pooling for graph neural networks. IEEE Trans. Pattern Anal. Mach. Intell. (2020)
Ranjan, E., Sanyal, S., Talukdar, P.: ASAP: adaptive structure aware pooling for learning hierarchical graph representations. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, no. 04, pp. 5470–5477 (2020)
Zuo, X., Yuan, H., Yang, B., Wang, H., Wang, Y.: Exploring graph capsule network and graphormer for graph classification. Inf. Sci. 640, 119045 (2023)
Bacciu, D., Conte, A., Grossi, R., Landolfi, F., Marino, A.: K-plex cover pooling for graph neural networks. Data Min. Knowl. Discov. 35(5), 2200–2220 (2021)
Zhang, W., et al.: Node dependent local smoothing for scalable graph learning. In: Advances in Neural Information Processing Systems, vol. 34, pp. 20321–20332 (2021)
Liu, C., et al.: Graph pooling for graph neural networks: progress, challenges, and opportunities. arXiv preprint arXiv:2204.07321 (2022)
Acknowledgements
This work is partially supported by the U.S. National Science Foundation (NSF) under Grant No. 2308206 (CRII: III).
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Hossain, T., Saifuddin, K.M., Islam, M.I.K., Tanvir, F., Akbas, E. (2024). Tackling Oversmoothing in GNN via Graph Sparsification. In: Bifet, A., et al. (eds.) Machine Learning and Knowledge Discovery in Databases. Research Track and Demo Track. ECML PKDD 2024. Lecture Notes in Computer Science, vol. 14948. Springer, Cham. https://doi.org/10.1007/978-3-031-70371-3_10
DOI: https://doi.org/10.1007/978-3-031-70371-3_10
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-70370-6
Online ISBN: 978-3-031-70371-3