H-Diffu: Hyperbolic Representations for Information Diffusion Prediction

Published: 01 September 2023

Abstract

With the proliferation of online social networks, a great deal of online user action data has been generated. Such data has enabled the study of information diffusion prediction, a fundamental problem for understanding how information propagates on social media platforms. Diffusion prediction models have two standard components: a social graph and information diffusion cascades. We observe that both components exhibit latent hierarchical structures. However, most existing models are designed in Euclidean space and hence cannot effectively capture complex patterns, especially hierarchical structures. Therefore, we investigate a novel research problem: learning hyperbolic representations for information diffusion prediction. To reflect the different characteristics of social graphs and diffusion cascades, we encode them into two latent hyperbolic spaces with different trainable curvatures. In addition, to model influence dependencies, we propose a co-attention mechanism that captures the processes of diffusion cascades using positional embeddings. Given a set of activated seed users, we jointly exploit diffusion cascades and social links to predict which users will be influenced. We conduct extensive experiments on four real-world datasets. Empirical results demonstrate that the proposed H-Diffu model significantly outperforms several state-of-the-art diffusion prediction frameworks.
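The paper's implementation is not reproduced on this page. As a minimal, hypothetical sketch of the hyperbolic machinery the abstract refers to, the snippet below implements the standard Poincaré-ball distance with an explicit curvature parameter c — the quantity that a model like H-Diffu would make trainable separately for the social-graph space and the cascade space. The function names and pure-Python setup are illustrative assumptions, not H-Diffu's actual code.

```python
import math

def _dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def mobius_add(x, y, c):
    """Mobius addition x (+)_c y on the Poincare ball of curvature -c."""
    xy, x2, y2 = _dot(x, y), _dot(x, x), _dot(y, y)
    denom = 1 + 2 * c * xy + c * c * x2 * y2
    return [((1 + 2 * c * xy + c * y2) * a + (1 - c * x2) * b) / denom
            for a, b in zip(x, y)]

def poincare_dist(x, y, c):
    """Geodesic distance: d_c(x, y) = (2/sqrt(c)) * artanh(sqrt(c) * ||(-x) (+)_c y||)."""
    diff = mobius_add([-a for a in x], y, c)
    norm = math.sqrt(_dot(diff, diff))
    return (2.0 / math.sqrt(c)) * math.atanh(math.sqrt(c) * norm)
```

As c approaches 0 this metric flattens toward twice the Euclidean distance, which is why a trainable curvature lets each latent space interpolate between near-Euclidean and strongly hierarchical (tree-like) geometry.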

Cited By

  • (2023) "Two-Stage Denoising Diffusion Model for Source Localization in Graph Inverse Problems," in Machine Learning and Knowledge Discovery in Databases: Research Track, pp. 325–340, DOI: 10.1007/978-3-031-43418-1_20, online publication date: 18 Sep. 2023.


Published In

IEEE Transactions on Knowledge and Data Engineering, Volume 35, Issue 9, September 2023, 1110 pages.

Publisher: IEEE Educational Activities Department, United States.


Qualifiers

  • Research-article

