
Tackling Oversmoothing in GNN via Graph Sparsification

  • Conference paper
  • First Online:
Machine Learning and Knowledge Discovery in Databases. Research Track and Demo Track (ECML PKDD 2024)

Abstract

Graph Neural Networks (GNNs) achieve great success on node-level and graph-level tasks by encoding meaningful topological structures of networks in various domains, ranging from social to biological networks. However, repeated aggregation operations lead to excessive mixing of node representations, particularly in dense regions with multiple GNN layers, resulting in nearly indistinguishable embeddings. This phenomenon, known as the oversmoothing problem, hampers downstream graph analytics tasks. To overcome this issue, we propose a novel and flexible truss-based graph sparsification model that prunes edges from dense regions of the graph. Pruning redundant edges in dense regions helps prevent the aggregation of excessive neighborhood information during hierarchical message passing and pooling in GNN models. We then apply our sparsification model to state-of-the-art baseline GNN and pooling models, namely GIN, SAGPool, GMT, DiffPool, MinCutPool, HGP-SL, DMonPool, and AdamGNN. Extensive experiments on several real-world datasets show that our model significantly improves the performance of the baseline GNN models on the graph classification task.
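To make the pruning idea concrete, here is a minimal sketch of truss-based edge sparsification built on NetworkX's k_truss routine. This is not the authors' TGS implementation (their code is linked in the Notes below); the edge_trussness helper, the k_max threshold, and the demo graph are illustrative assumptions only.

```python
# Minimal sketch of truss-based edge pruning, assuming NetworkX's
# nx.k_truss (maximal subgraph where every edge lies in >= k-2 triangles).
# Not the paper's TGS model; helper names and k_max are assumptions.
import networkx as nx

def edge_trussness(G):
    """Return {edge: largest k such that the edge survives in the k-truss}."""
    truss = {frozenset(e): 2 for e in G.edges()}
    k = 3
    H = nx.k_truss(G, k)
    while H.number_of_edges() > 0:
        for e in H.edges():          # edges still present at truss level k
            truss[frozenset(e)] = k
        k += 1
        H = nx.k_truss(G, k)
    return truss

def truss_sparsify(G, k_max=4):
    """Drop edges whose trussness exceeds k_max, i.e. edges buried in
    dense, triangle-rich regions where repeated aggregation oversmooths."""
    truss = edge_trussness(G)
    S = G.copy()
    S.remove_edges_from(tuple(e) for e, t in truss.items() if t > k_max)
    return S

if __name__ == "__main__":
    G = nx.karate_club_graph()
    S = truss_sparsify(G, k_max=4)
    print(G.number_of_edges(), "edges ->", S.number_of_edges(), "edges")
```

In the paper's pipeline the sparsified graph, not the original, is what the downstream GNN and pooling models consume; the sketch above reproduces only the pruning step.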


Notes

  1. Code available at: https://github.com/TanvirKu/TGS-ECML-PKDD-24.

  2. \(Gain(\mathbb{G}) = \frac{\texttt{TGS} - \text{Original}}{\text{Original}} \times 100\%\).
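As a quick worked example of the Gain metric (the scores below are hypothetical, not results from the paper):

```python
# Hypothetical illustration of the Gain formula from note 2: relative
# improvement of the TGS-sparsified model over the original baseline.
def gain(tgs_score, original_score):
    return (tgs_score - original_score) / original_score * 100.0

print(f"{gain(0.84, 0.80):.1f}%")  # accuracy 0.80 -> 0.84 is a 5.0% gain
```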

References

  1. Miao, X., et al.: DeGNN: improving graph neural networks with graph decomposition. In: Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery & Data Mining, pp. 1223–1233 (2021)

  2. Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018)

  3. Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735. PMLR (2020)

  4. Wang, Y., Derr, T.: Tree decomposed graph neural network. In: Proceedings of the 30th ACM International Conference on Information & Knowledge Management, pp. 2040–2049 (2021)

  5. Wang, J., Cheng, J.: Truss decomposition in massive networks. arXiv preprint arXiv:1205.6693 (2012)

  6. Scarselli, F., Gori, M., Tsoi, A.C., Hagenbuchner, M., Monfardini, G.: The graph neural network model. IEEE Trans. Neural Netw. 20(1), 61–80 (2008)

  7. Kipf, T.N., Welling, M.: Semi-supervised classification with graph convolutional networks. arXiv preprint arXiv:1609.02907 (2016)

  8. Ying, Z., You, J., Morris, C., Ren, X., Hamilton, W., Leskovec, J.: Hierarchical graph representation learning with differentiable pooling. In: Advances in Neural Information Processing Systems, vol. 31 (2018)

  9. Bianchi, F.M., Grattarola, D., Alippi, C.: Spectral clustering with graph neural networks for graph pooling. In: International Conference on Machine Learning, pp. 874–883. PMLR (2020)

  10. Tsitsulin, A., Palowitch, J., Perozzi, B., Müller, E.: Graph clustering with graph neural networks. J. Mach. Learn. Res. 24(127), 1–21 (2023)

  11. Baek, J., Kang, M., Hwang, S.J.: Accurate learning of graph representations with graph multiset pooling. arXiv preprint arXiv:2102.11533 (2021)

  12. Zhang, Z., et al.: Hierarchical graph pooling with structure learning. arXiv preprint arXiv:1911.05954 (2019)

  13. Lee, J., Lee, I., Kang, J.: Self-attention graph pooling. In: International Conference on Machine Learning, pp. 3734–3743. PMLR (2019)

  14. Zhong, Z., Li, C.T., Pang, J.: Multi-grained semantics-aware graph neural networks. IEEE Trans. Knowl. Data Eng. (2022)

  15. Morris, C., Kriege, N.M., Bause, F., Kersting, K., Mutzel, P., Neumann, M.: TUDataset: a collection of benchmark datasets for learning with graphs. arXiv preprint arXiv:2007.08663 (2020)

  16. Van Belle, R., Van Damme, C., Tytgat, H., De Weerdt, J.: Inductive graph representation learning for fraud detection. Expert Syst. Appl. 193, 116463 (2022)

  17. Saifuddin, K.M., Bumgardner, B., Tanvir, F., Akbas, E.: HyGNN: drug-drug interaction prediction via hypergraph neural network. In: 2023 IEEE 39th International Conference on Data Engineering (ICDE), pp. 1503–1516. IEEE (2023)

  18. Zhang, S., Guo, Y., Zhao, P., Zheng, C., Chen, X.: A graph-based temporal attention framework for multi-sensor traffic flow forecasting. IEEE Trans. Intell. Transp. Syst. 23(7), 7743–7758 (2021)

  19. Deng, Y.: Recommender systems based on graph embedding techniques: a review. IEEE Access 10, 51587–51633 (2022)

  20. Rong, Y., Huang, W., Xu, T., Huang, J.: DropEdge: towards deep graph convolutional networks on node classification. arXiv preprint arXiv:1907.10903 (2019)

  21. Huang, R., Li, P.: Hub-hub connections matter: improving edge dropout to relieve over-smoothing in graph neural networks. Knowl.-Based Syst. 270, 110556 (2023)

  22. Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, no. 04, pp. 3438–3445 (2020)

  23. Wang, Y., Wang, H., Jin, H., Huang, X., Wang, X.: Exploring graph capsule network for graph classification. Inf. Sci. 581, 932–950 (2021)

  24. Huang, X., Cheng, H., Qin, L., Tian, W., Yu, J.X.: Querying k-truss community in large and dynamic graphs. In: Proceedings of the 2014 ACM SIGMOD International Conference on Management of Data, pp. 1311–1322 (2014)

  25. Akbas, E., Zhao, P.: Truss-based community search: a truss-equivalence based indexing approach. Proc. VLDB Endow. 10(11), 1298–1309 (2017)

  26. Diab, S., Olabi, M.G., El Hajj, I.: KTrussExplorer: exploring the design space of k-truss decomposition optimizations on GPUs. In: 2020 IEEE High Performance Extreme Computing Conference (HPEC), pp. 1–8. IEEE (2020)

  27. Wang, Z., Ji, S.: Second-order pooling for graph neural networks. IEEE Trans. Pattern Anal. Mach. Intell. (2020)

  28. Ranjan, E., Sanyal, S., Talukdar, P.: ASAP: adaptive structure aware pooling for learning hierarchical graph representations. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, no. 04, pp. 5470–5477 (2020)

  29. Zuo, X., Yuan, H., Yang, B., Wang, H., Wang, Y.: Exploring graph capsule network and graphormer for graph classification. Inf. Sci. 640, 119045 (2023)

  30. Bacciu, D., Conte, A., Grossi, R., Landolfi, F., Marino, A.: K-plex cover pooling for graph neural networks. Data Min. Knowl. Discov. 35(5), 2200–2220 (2021)

  31. Zhang, W., et al.: Node dependent local smoothing for scalable graph learning. In: Advances in Neural Information Processing Systems, vol. 34, pp. 20321–20332 (2021)

  32. Liu, C., et al.: Graph pooling for graph neural networks: progress, challenges, and opportunities. arXiv preprint arXiv:2204.07321 (2022)


Acknowledgements

This work is partially supported by the U.S. National Science Foundation (NSF) under Grant No. 2308206 (CRII: III).

Author information

Corresponding author

Correspondence to Tanvir Hossain.


Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Hossain, T., Saifuddin, K.M., Islam, M.I.K., Tanvir, F., Akbas, E. (2024). Tackling Oversmoothing in GNN via Graph Sparsification. In: Bifet, A., et al. Machine Learning and Knowledge Discovery in Databases. Research Track and Demo Track. ECML PKDD 2024. Lecture Notes in Computer Science, vol. 14948. Springer, Cham. https://doi.org/10.1007/978-3-031-70371-3_10


  • DOI: https://doi.org/10.1007/978-3-031-70371-3_10

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-70370-6

  • Online ISBN: 978-3-031-70371-3

  • eBook Packages: Computer Science, Computer Science (R0)
