
Learning Graph ODE for Continuous-Time Sequential Recommendation

Published: 03 January 2024

Abstract

Sequential recommendation aims to understand user preference by capturing successive behavior correlations, which are usually represented as item-purchasing sequences derived from users' past interactions. Existing efforts generally predict the next item by modeling these sequential patterns. Despite their effectiveness, such methods suffer from two natural deficiencies: (i) user preference is dynamic in nature, and the evolution of collaborative signals is often ignored; and (ii) the observed interactions are often irregularly sampled, whereas existing methods model item transitions assuming uniform time intervals. How to effectively model and predict the underlying dynamics of user preference therefore becomes a critical research problem. To tackle these challenges, in this paper we focus on continuous-time sequential recommendation and propose a principled graph ordinary differential equation framework named GDERec. Technically, GDERec is characterized by an autoregressive graph ordinary differential equation consisting of two components, each parameterized by a tailored graph neural network (GNN), to capture user preference from the perspective of hybrid dynamical systems. On the one hand, we introduce a novel ordinary-differential-equation-based GNN to implicitly model the temporal evolution of the user-item interaction graph. On the other hand, an attention-based GNN is proposed to explicitly incorporate collaborative attention over interaction signals as the interaction graph evolves over time. The two customized GNNs are trained alternately in an autoregressive manner to track the evolution of the underlying system from irregular observations, and thus learn effective representations of users and items that benefit sequential recommendation. Extensive experiments on five benchmark datasets demonstrate the superiority of our model over various state-of-the-art recommendation methods.
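To make the hybrid-dynamics idea concrete, the sketch below shows the general two-component pattern the abstract describes: node embeddings drift continuously under an ODE-parameterized GNN between irregularly spaced observation times, and an attention-based GNN refines them at each observed snapshot. This is a minimal PyTorch illustration under simplifying assumptions; the names (GraphODEFunc, AttentionRefine, gde_encode), the Euler integration, and the dense adjacency handling are illustrative and are not the authors' GDERec implementation.

```python
import torch
import torch.nn as nn


class GraphODEFunc(nn.Module):
    """Vector field dh/dt = A_norm (h W) - h: a simple GCN-style drift that
    implicitly evolves node embeddings along the interaction graph."""

    def __init__(self, dim):
        super().__init__()
        self.lin = nn.Linear(dim, dim, bias=False)

    def forward(self, h, a_norm):
        return a_norm @ self.lin(h) - h


class AttentionRefine(nn.Module):
    """One round of masked graph attention over observed interactions,
    explicitly re-weighting collaborative signals at an observation time."""

    def __init__(self, dim):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)

    def forward(self, h, adj_mask):
        scores = self.q(h) @ self.k(h).T / h.size(-1) ** 0.5
        scores = scores.masked_fill(adj_mask == 0, float("-inf"))
        return h + torch.softmax(scores, dim=-1) @ self.v(h)


def gde_encode(h, a_norm, adj_mask, obs_times, ode_func, refine, steps=4):
    """Alternate continuous evolution (ODE) and discrete refinement (attention)
    over irregularly spaced observation times, in an autoregressive loop."""
    t_prev = 0.0
    for t in obs_times:
        dt = (t - t_prev) / steps
        for _ in range(steps):               # fixed-step Euler integration
            h = h + dt * ode_func(h, a_norm)
        h = refine(h, adj_mask)              # inject collaborative attention
        t_prev = t
    return h


# Toy usage on a random user-item graph with self-loops
# (self-loops keep every attention row non-empty).
n, d = 6, 16
adj = (torch.rand(n, n) > 0.5).float()
adj = torch.maximum(adj, torch.eye(n))
a_norm = adj / adj.sum(-1, keepdim=True)
z = gde_encode(torch.randn(n, d), a_norm, adj,
               obs_times=[0.3, 1.1, 2.7],
               ode_func=GraphODEFunc(d), refine=AttentionRefine(d))
```

A full implementation would presumably replace the Euler loop with a higher-order or adaptive solver (e.g., Runge-Kutta or Dormand-Prince, as cited for neural ODEs) and use the paper's tailored propagation and attention operators; the point of the sketch is only the alternating evolve-then-refine structure over irregular timestamps.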



Information

Published In

IEEE Transactions on Knowledge and Data Engineering, Volume 36, Issue 7, July 2024, 876 pages

Publisher

IEEE Educational Activities Department, United States

Publication History

Published: 03 January 2024

Qualifiers

  • Research-article

Cited By

  • Distributed Recommendation Systems: Survey and Research Directions. ACM Transactions on Information Systems, 2024. DOI: 10.1145/3694783.
  • Federated Recommender System Based on Diffusion Augmentation and Guided Denoising. ACM Transactions on Information Systems, 2024. DOI: 10.1145/3688570.
  • Information Diffusion Prediction with Graph Neural Ordinary Differential Equation Network. In Proceedings of the 32nd ACM International Conference on Multimedia, 2024, pp. 9699–9708. DOI: 10.1145/3664647.3681363.
  • Dynamic Stage-aware User Interest Learning for Heterogeneous Sequential Recommendation. In Proceedings of the 18th ACM Conference on Recommender Systems, 2024, pp. 465–474. DOI: 10.1145/3640457.3688103.
  • Learning Knowledge-diverse Experts for Long-tailed Graph Classification. ACM Transactions on Knowledge Discovery from Data. DOI: 10.1145/3705323.
