DOI: 10.1145/3580305.3599476
Research-article | Free access

PROSE: Graph Structure Learning via Progressive Strategy

Published: 04 August 2023

Abstract

Graph Neural Networks (GNNs) are a powerful tool for learning high-quality node representations on graphs, but their performance strongly depends on a reliable graph structure. In real-world scenarios, noise in the graph topology is inevitable. To protect GNNs from the disturbance of irrelevant or missing edges, graph structure learning has been proposed and has attracted considerable attention in recent years. In this paper, we argue that current graph structure learning methods disregard the status of individual nodes and judge all of their connections simultaneously with a single uniform standard, which leads to indeterminacy and instability in the optimization process. We designate these methods as status-unaware models. To demonstrate the validity of this view, we conduct exploratory experiments on publicly available datasets and report several notable observations. Guided by these observations, we propose a new model named Graph Structure Learning via Progressive Strategy (PROSE), which uses a progressive strategy to acquire an ideal graph structure in a status-aware way. Concretely, PROSE consists of a progressive structure splitting module (PSS) and a progressive structure refining module (PSR) that modify node connections according to their global potency, and we further introduce horizontal and vertical position encodings to capture graph topology information ignored by previous methods. Extensive experiments on several widely used graph datasets demonstrate the effectiveness of our model, and the source code is available at https://github.com/tigerbunny2023/PROSE.
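The abstract contrasts PROSE with "status-unaware" methods that judge every potential edge at once under one uniform standard. As a point of reference, such a baseline can be sketched as a single thresholding pass over pairwise similarities; this is a minimal illustration of the critiqued approach, not PROSE itself, and the function name and threshold are hypothetical.

```python
import numpy as np

def one_shot_refine(feats, threshold=0.5):
    """Status-unaware baseline sketched from the abstract: judge every
    candidate edge simultaneously with a single similarity threshold."""
    # normalize features, then take cosine similarity between all node pairs
    x = feats / np.linalg.norm(feats, axis=1, keepdims=True)
    sim = x @ x.T
    adj = (sim >= threshold).astype(float)  # one uniform standard for all edges
    np.fill_diagonal(adj, 0.0)              # no self-loops
    return adj
```

Because every node's connections are decided in the same pass with the same rule, a small change in the threshold can flip many edges at once, which is the instability the paper attributes to status-unaware models.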

Supplementary Material

MP4 File (0018#-2min-promo.mp4)
Considering that each node has its own characteristics, we believe graph structure learning should adopt a progressive strategy rather than refining all connections at once. In other words, the topology refinement of the whole graph should be completed in multiple steps, with only a few nodes' edges modified at each step. The proposed method PROSE contains two main modules, namely the Progressive Structure Splitting module (PSS) and the Progressive Structure Refining module (PSR). They are endowed with a symmetrical architecture, and the former guides the latter in determining which nodes' topology is adjusted at each step. Specifically, PSS is responsible for selecting celebrity nodes through a scoring function. The actual graph structure optimization is carried out in PSR, and the subgraph obtained last in PSS is adjusted first in PSR.
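The PSS/PSR interplay described above can be sketched as a stack-like loop: splitting repeatedly keeps high-scoring "celebrity" nodes, and refinement then visits the recorded node sets in reverse order, so the subgraph split off last is refined first. This is a minimal sketch under assumptions: the scoring function (a feature-norm proxy), the fraction kept per step, and the `refine_fn` hook are all hypothetical stand-ins for the paper's actual components.

```python
import numpy as np

def score(x):
    # Hypothetical scoring function: feature norm as a stand-in for
    # the paper's measure of a node's global potency.
    return np.linalg.norm(x, axis=1)

def progressive_refine(adj, feats, n_steps=3, top_frac=0.5, refine_fn=None):
    """Sketch of a PSS/PSR-style loop: progressively split off high-scoring
    nodes, then refine the recorded subgraphs in last-in, first-out order."""
    stack = []
    active = np.arange(adj.shape[0])
    for _ in range(n_steps):                 # PSS: progressive splitting
        s = score(feats[active])
        k = max(1, int(top_frac * len(active)))
        order = np.argsort(-s)               # rank nodes by score, descending
        stack.append(active)                 # record this level's node set
        active = active[order[:k]]           # celebrity nodes survive to next level
    while stack:                             # PSR: refine in reverse order
        nodes = stack.pop()                  # last split is adjusted first
        if refine_fn is not None:
            adj = refine_fn(adj, nodes)      # only these nodes' edges change here
    return adj
```

The stack makes the symmetry between the two modules explicit: each splitting step pushes a node set, and each refining step pops one, so only a few nodes' edges are modified at a time.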


    Published In

    KDD '23: Proceedings of the 29th ACM SIGKDD Conference on Knowledge Discovery and Data Mining
    August 2023, 5996 pages
    ISBN: 9798400701030
    DOI: 10.1145/3580305

    Publisher

    Association for Computing Machinery, New York, NY, United States

    Author Tags

    1. graph neural networks
    2. graph structure learning
    3. node representation learning
    4. progressive strategy

    Qualifiers

    • Research-article

    Conference

    KDD '23

    Acceptance Rates

    Overall Acceptance Rate: 1,133 of 8,635 submissions, 13%
