LHGCN: A Laminated Heterogeneous Graph Convolutional Network for Modeling User–Item Interaction in E-Commerce
Figure 1. Alibaba dataset structure pattern (**left**) and feature distribution (**right**).
Figure 2. The overall architecture of LHGCN.
Figure 3. Laminate generation module.
Figure 4. Adaptive convolution module (the gate outputs z and r control two types of laminates with different receptive fields); the detailed implementation of the attention and gate is shown in Figure 5.
Figure 5. Information flow in convolutional layers, exemplified by laminate U.
Figure 6. Laminate fusion module (the AC layer is the adaptive convolution layer).
Figure 7. Effects of the module design: **full** denotes the full model, **-laminate** excludes the LGM, **-gate** excludes the ACM, and **base** omits both modules.
Figure 8. Comparison of experimental results between GRU and LSTM.
Figure 9. Multiplex effectiveness analysis.
Figure 10. Parameter sensitivity analysis.
Abstract
1. Introduction
- We design a laminated heterogeneous graph convolutional network (LHGCN) framework that effectively aggregates the multiple types of semantic information carried by nodes in multiplex graphs, improving the accuracy of link prediction between nodes. Its laminate generation module alleviates the semantic confusion that node features introduce during message propagation, thereby enabling efficient extraction of the distinct semantic features of nodes in the multiplex graph and improving model performance;
- To address the challenges of modeling complex relations, we propose an adaptive convolution module built on a gating mechanism. Message passing within the framework is designed to flexibly control the influence range of near and far neighbors, thereby adaptively determining the neighborhood influence on each node;
- Extensive experiments are conducted on the DTI, Amazon, Alibaba-s, Alibaba, and Douban datasets, demonstrating that the proposed network achieves competitive performance against ten state-of-the-art approaches in quantitative comparisons. Ablation studies confirm the effectiveness of the module design.
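The gated, receptive-field-adaptive convolution described in the second contribution can be sketched as follows. This is a minimal NumPy illustration under stated assumptions, not the authors' implementation: the gate parameterization (sigmoid gates z and r blending one-hop and two-hop messages, GRU-style) and all names are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def adaptive_gated_conv(h, A, Wz, Wr):
    """One adaptive convolution step on a laminate (illustrative).

    h      : (N, d) node features of one laminate
    A      : (N, N) row-normalized adjacency for one edge type
    Wz, Wr : (2d, d) gate weight matrices (hypothetical parameterization)
    """
    m1 = A @ h   # message from near neighbors (1-hop receptive field)
    m2 = A @ m1  # message from far neighbors (2-hop receptive field)
    z = sigmoid(np.concatenate([h, m1], axis=1) @ Wz)  # update gate
    r = sigmoid(np.concatenate([h, m2], axis=1) @ Wr)  # range gate
    # z decides how much of the node's own state to keep;
    # r blends the two receptive fields.
    return (1 - z) * h + z * (r * m1 + (1 - r) * m2)

# Tiny example: a 4-node path graph with 3-dimensional features
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], float)
A = A / A.sum(axis=1, keepdims=True)  # row-normalize
h = rng.standard_normal((4, 3))
Wz = rng.standard_normal((6, 3)) * 0.1
Wr = rng.standard_normal((6, 3)) * 0.1
out = adaptive_gated_conv(h, A, Wz, Wr)
print(out.shape)  # (4, 3)
```

Because z and r are computed per node from its own state and its messages, each node can weight near and far neighborhoods differently, which is the intent of the adaptive convolution module.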
2. Related Works
3. Proposed Method
3.1. Problem Formulation
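The multiplex heterogeneous graph considered in this formulation (one node set connected by several edge types, e.g., click and purchase behaviors) can be held in a minimal container such as the sketch below; the class and relation names are illustrative, not the paper's notation.

```python
from collections import defaultdict

class MultiplexGraph:
    """Minimal multiplex user-item graph: one edge set per behavior type."""

    def __init__(self):
        self.edges = defaultdict(set)  # relation -> {(u, v), ...}

    def add_edge(self, relation, u, v):
        self.edges[relation].add((u, v))

    def neighbors(self, relation, u):
        """Neighbors of u under one relation (one layer of the multiplex)."""
        return {v for (a, v) in self.edges[relation] if a == u}

g = MultiplexGraph()
g.add_edge("click", "user1", "item9")
g.add_edge("purchase", "user1", "item9")
g.add_edge("click", "user1", "item3")
print(sorted(g.neighbors("click", "user1")))  # ['item3', 'item9']
```

The same node pair can be connected under several relations at once, which is exactly the multiplexity that a single homogeneous adjacency matrix cannot express.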
3.2. Network Architecture
3.2.1. Laminate Generation Module
3.2.2. Adaptive Convolution Module
3.2.3. Laminate Fusion Module
3.3. Loss Function
3.3.1. Positive Sample Loss and Negative Sample Loss
3.3.2. L2 Regularization Loss
3.3.3. Total Loss
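As a rough sketch of how the three terms above combine, the following illustrates a total loss of the form L = L_pos + L_neg + λ·L_reg over sigmoid dot-product edge scores; the scoring function and all names are assumptions, not the paper's exact formulation.

```python
import numpy as np

def link_loss(emb, pos_edges, neg_edges, params, lam=1e-4):
    """Total loss = positive sample loss + negative sample loss + lam * L2.

    emb       : (N, d) node embeddings
    pos_edges : observed (u, v) pairs, pushed toward score 1
    neg_edges : sampled non-edges, pushed toward score 0
    params    : weight arrays entering the L2 regularization term
    lam       : weight coefficient of the regularization loss
    """
    def score(edges):
        u, v = zip(*edges)
        s = np.sum(emb[list(u)] * emb[list(v)], axis=1)  # dot-product score
        return 1.0 / (1.0 + np.exp(-s))                  # sigmoid

    eps = 1e-12  # numerical guard for log
    l_pos = -np.mean(np.log(score(pos_edges) + eps))        # positive sample loss
    l_neg = -np.mean(np.log(1.0 - score(neg_edges) + eps))  # negative sample loss
    l_reg = sum(np.sum(w ** 2) for w in params)             # L2 regularization loss
    return l_pos + l_neg + lam * l_reg

emb = np.array([[1.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
loss = link_loss(emb, pos_edges=[(0, 1)], neg_edges=[(0, 2)], params=[emb], lam=0.0)
print(round(loss, 3))  # 1.006
```

The weight coefficient `lam` is the quantity varied in the parameter sensitivity analysis of the loss function.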
4. Experiments
4.1. Datasets
- Amazon: This dataset comes from the e-commerce platform Amazon and is widely used in recommender system research. Nodes represent users or products, and edges represent users’ behaviors;
- Alibaba: This dataset is a large-scale dataset extracted from the Alibaba e-commerce platform for recommendation systems and advertising research. The dataset contains a large amount of user behavior data. Nodes represent users or products, and edges represent various user behaviors such as clicks and inquiries;
- Alibaba-s: This dataset is a small-scale version of Alibaba, mainly used to quickly verify the performance of the model in large-scale recommendation tasks;
- DTI: This dataset is used to study the interaction between drugs and targets. The nodes in the dataset represent drugs or targets, and the edges represent the interactions between them;
- Douban: This dataset comes from the Douban platform. It is a social network dataset in which nodes represent users and edges represent social relationships between users, such as friendships or common interest groups.
4.2. Experimental Configurations
4.3. Implementation Details
4.4. Quantitative Comparative Experiments for Link Prediction
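AUROC and AUPRC, the two metrics reported in these comparisons, can be computed from scored candidate edges as sketched here in plain NumPy (assuming untied scores; this is not the evaluation code used in the paper):

```python
import numpy as np

def auroc(scores, labels):
    """Area under the ROC curve via the rank (Mann-Whitney) formula.
    Assumes no tied scores; a sketch, not a library-grade implementation."""
    scores, labels = np.asarray(scores, float), np.asarray(labels)
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = labels == 1
    n_pos, n_neg = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

def auprc(scores, labels):
    """Average precision, a standard estimator of the area under the PR curve."""
    order = np.argsort(-np.asarray(scores, float))
    labels = np.asarray(labels)[order]
    tp = np.cumsum(labels)                          # true positives at each cutoff
    precision = tp / np.arange(1, len(labels) + 1)  # precision at each cutoff
    return precision[labels == 1].mean()            # mean precision at the hits

scores = [0.9, 0.8, 0.3, 0.2]
labels = [1, 1, 0, 0]  # a perfect ranking
print(auroc(scores, labels), auprc(scores, labels))  # 1.0 1.0
```

AUPRC is the more informative of the two when positive edges are rare, which is typical in link prediction.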
4.5. Ablation Experiment
4.5.1. Effects of the Module Design
4.5.2. Analyses of the Attention Strategy
4.5.3. Analyses of the Aggregation Across Edges
4.5.4. Analyses of the Gating Mechanism
4.5.5. Analyses of the Activation Function
4.6. Multiplex Effectiveness Analysis
4.7. Parameter Sensitivity Analysis
4.7.1. Effect of the Number of Convolution Layers
4.7.2. Effect of the Weight Coefficient of the Loss Function
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
Abbreviation | Full Name |
---|---|
ACM | Adaptive Convolution Module |
AUPRC | Area Under the Precision–Recall Curve |
AUROC | Area Under the Receiver Operating Characteristic Curve |
GAT | Graph Attention Networks |
GCN | Graph Convolutional Network |
GNNs | Graph Neural Networks |
GRU | Gated Recurrent Unit |
LFM | Laminate Fusion Module |
LGM | Laminate Generation Module |
LHGCN | Laminated Heterogeneous Graph Convolutional Network |
LSTM | Long Short-Term Memory |
References
- Blondel, V.D.; Guillaume, J.L.; Lambiotte, R.; Lefebvre, E. Fast unfolding of communities in large networks. J. Stat. Mech. Theory Exp. 2008, 2008, P10008. [Google Scholar] [CrossRef]
- Jiang, F.; Wang, Z. Pagerank-based collaborative filtering recommendation. In Proceedings of the International Conference on Information Computing and Applications, Tangshan, China, 15–18 October 2010; Springer: Berlin/Heidelberg, Germany, 2010; pp. 597–604. [Google Scholar]
- Abbe, E. Community detection and stochastic block models: Recent developments. J. Mach. Learn. Res. 2018, 18, 1–86. [Google Scholar]
- Yang, X.; Guo, Y.; Liu, Y. Bayesian-inference-based recommendation in online social networks. IEEE Trans. Parallel Distrib. Syst. 2012, 24, 642–651. [Google Scholar] [CrossRef]
- Scarselli, F.; Gori, M.; Tsoi, A.C.; Hagenbuchner, M.; Monfardini, G. The graph neural network model. IEEE Trans. Neural Netw. 2008, 20, 61–80. [Google Scholar] [CrossRef] [PubMed]
- Micheli, A. Neural network for graphs: A contextual constructive approach. IEEE Trans. Neural Netw. 2009, 20, 498–511. [Google Scholar] [CrossRef]
- Defferrard, M.; Bresson, X.; Vandergheynst, P. Convolutional neural networks on graphs with fast localized spectral filtering. Adv. Neural Inf. Process. Syst. 2016, 29, 3844–3852. [Google Scholar]
- Kipf, T.N.; Welling, M. Semi-Supervised Classification with Graph Convolutional Networks. In Proceedings of the International Conference on Learning Representations (ICLR), Toulon, France, 24–26 April 2017. [Google Scholar]
- Wu, Z.; Pan, S.; Chen, F.; Long, G.; Zhang, C.; Philip, S.Y. A comprehensive survey on graph neural networks. IEEE Trans. Neural Netw. Learn. Syst. 2020, 32, 4–24. [Google Scholar] [CrossRef] [PubMed]
- Veličković, P.; Cucurull, G.; Casanova, A.; Romero, A.; Lio, P.; Bengio, Y. Graph attention networks. arXiv 2017, arXiv:1710.10903. [Google Scholar]
- Xu, K.; Hu, W.; Leskovec, J.; Jegelka, S. How powerful are graph neural networks? arXiv 2018, arXiv:1810.00826. [Google Scholar]
- Wu, S.; Sun, F.; Zhang, W.; Xie, X.; Cui, B. Graph neural networks in recommender systems: A survey. ACM Comput. Surv. 2022, 55, 1–37. [Google Scholar] [CrossRef]
- Gao, C.; Zheng, Y.; Li, N.; Li, Y.; Qin, Y.; Piao, J.; Quan, Y.; Chang, J.; Jin, D.; He, X.; et al. A survey of graph neural networks for recommender systems: Challenges, methods, and directions. ACM Trans. Recomm. Syst. 2023, 1, 1–51. [Google Scholar] [CrossRef]
- Lin, X.; Quan, Z.; Wang, Z.J.; Ma, T.; Zeng, X. KGNN: Knowledge Graph Neural Network for Drug-Drug Interaction Prediction. In Proceedings of the IJCAI International Joint Conference on Artificial Intelligence, Yokohama, Japan, 11–17 July 2020; Volume 380, pp. 2739–2745. [Google Scholar]
- Jiang, D.; Wu, Z.; Hsieh, C.Y.; Chen, G.; Liao, B.; Wang, Z.; Shen, C.; Cao, D.; Wu, J.; Hou, T. Could graph neural networks learn better molecular representation for drug discovery? A comparison study of descriptor-based and graph-based models. J. Cheminform. 2021, 13, 1–23. [Google Scholar] [CrossRef] [PubMed]
- Zhang, Z.; Chen, L.; Zhong, F.; Wang, D.; Jiang, J.; Zhang, S.; Jiang, H.; Zheng, M.; Li, X. Graph neural network approaches for drug-target interactions. Curr. Opin. Struct. Biol. 2022, 73, 102327. [Google Scholar] [CrossRef] [PubMed]
- Tremblay, N.; Gonçalves, P.; Borgnat, P. Design of graph filters and filterbanks. In Cooperative and Graph Signal Processing; Elsevier: Amsterdam, The Netherlands, 2018; pp. 299–324. [Google Scholar]
- Zheng, Y.; Jin, M.; Pan, S.; Li, Y.F.; Peng, H.; Li, M.; Li, Z. Toward graph self-supervised learning with contrastive adjusted zooming. IEEE Trans. Neural Netw. Learn. Syst. 2022, 35, 8882–8896. [Google Scholar] [CrossRef] [PubMed]
- Zhao, J.; Wang, X.; Shi, C.; Hu, B.; Song, G.; Ye, Y. Heterogeneous graph structure learning for graph neural networks. Proc. AAAI Conf. Artif. Intell. 2021, 35, 4697–4705. [Google Scholar] [CrossRef]
- Lv, Q.; Ding, M.; Liu, Q.; Chen, Y.; Feng, W.; He, S.; Zhou, C.; Jiang, J.; Dong, Y.; Tang, J. Are we really making much progress? Revisiting, benchmarking and refining heterogeneous graph neural networks. In Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery & Data Mining, Virtual Event, 14–18 August 2021; pp. 1150–1160. [Google Scholar]
- Yang, X.; Yan, M.; Pan, S.; Ye, X.; Fan, D. Simple and efficient heterogeneous graph neural network. Proc. AAAI Conf. Artif. Intell. 2023, 37, 10816–10824. [Google Scholar] [CrossRef]
- Simonovsky, M.; Komodakis, N. Graphvae: Towards generation of small graphs using variational autoencoders. In Proceedings of the Artificial Neural Networks and Machine Learning–ICANN 2018: 27th International Conference on Artificial Neural Networks, Rhodes, Greece, 4–7 October 2018; Proceedings, Part I 27. Springer: Berlin/Heidelberg, Germany, 2018; pp. 412–422. [Google Scholar]
- Shi, C.; Hu, B.; Zhao, W.X.; Philip, S.Y. Heterogeneous information network embedding for recommendation. IEEE Trans. Knowl. Data Eng. 2018, 31, 357–370. [Google Scholar] [CrossRef]
- Wang, X.; Ji, H.; Shi, C.; Wang, B.; Ye, Y.; Cui, P.; Yu, P.S. Heterogeneous graph attention network. In Proceedings of the World Wide Web Conference, San Francisco, CA, USA, 13–17 May 2019; pp. 2022–2032. [Google Scholar]
- Chen, Y.; Yan, J.; Jiang, M.; Zhang, T.; Zhao, Z.; Zhao, W.; Zheng, J.; Yao, D.; Zhang, R.; Kendrick, K.M.; et al. Adversarial learning based node-edge graph attention networks for autism spectrum disorder identification. IEEE Trans. Neural Netw. Learn. Syst. 2022, 35, 7275–7286. [Google Scholar] [CrossRef]
- Yue, H.; Hong, P.; Liu, H. Graph–Graph Similarity Network. IEEE Trans. Neural Netw. Learn. Syst. 2022, 35, 9136–9146. [Google Scholar] [CrossRef]
- Brody, S.; Alon, U.; Yahav, E. How attentive are graph attention networks? arXiv 2021, arXiv:2105.14491. [Google Scholar]
- Greff, K.; Srivastava, R.K.; Koutník, J.; Steunebrink, B.R.; Schmidhuber, J. LSTM: A search space odyssey. IEEE Trans. Neural Netw. Learn. Syst. 2016, 28, 2222–2232. [Google Scholar] [CrossRef]
- Dey, R.; Salem, F.M. Gate-variants of gated recurrent unit (GRU) neural networks. In Proceedings of the 2017 IEEE 60th International Midwest Symposium on Circuits and Systems (MWSCAS), Boston, MA, USA, 6–9 August 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 1597–1600. [Google Scholar]
- Hamilton, W.; Ying, Z.; Leskovec, J. Inductive representation learning on large graphs. Adv. Neural Inf. Process. Syst. 2017, 30, 1025–1035. [Google Scholar]
- Liang, X.; Ma, Y.; Cheng, G.; Fan, C.; Yang, Y.; Liu, Z. Meta-path-based heterogeneous graph neural networks in academic network. Int. J. Mach. Learn. Cybern. 2022, 13, 1553–1569. [Google Scholar] [CrossRef]
- Salamat, A.; Luo, X.; Jafari, A. HeteroGraphRec: A heterogeneous graph-based neural networks for social recommendations. Knowl.-Based Syst. 2021, 217, 106817. [Google Scholar] [CrossRef]
- Ji, S.; Pan, S.; Cambria, E.; Marttinen, P.; Philip, S.Y. A survey on knowledge graphs: Representation, acquisition, and applications. IEEE Trans. Neural Netw. Learn. Syst. 2021, 33, 494–514. [Google Scholar] [CrossRef] [PubMed]
- Chen, H.; Yin, H.; Wang, W.; Wang, H.; Nguyen, Q.V.H.; Li, X. PME: Projected metric embedding on heterogeneous networks for link prediction. In Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, London, UK, 19–23 August 2018; pp. 1177–1186. [Google Scholar]
- Xu, L.; Wei, X.; Cao, J.; Yu, P.S. Embedding of embedding (EOE) joint embedding for coupled heterogeneous networks. In Proceedings of the Tenth ACM International Conference on Web Search and Data Mining, Cambridge, UK, 6–10 February 2017; pp. 741–749. [Google Scholar]
- Shi, Y.; Gui, H.; Zhu, Q.; Kaplan, L.; Han, J. Aspem: Embedding learning by aspects in heterogeneous information networks. In Proceedings of the 2018 SIAM International Conference on Data Mining, San Diego, CA, USA, 3–5 May 2018; SIAM: Philadelphia, PA, USA, 2018; pp. 144–152. [Google Scholar]
- Dong, Y.; Chawla, N.V.; Swami, A. metapath2vec: Scalable representation learning for heterogeneous networks. In Proceedings of the 23rd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, Halifax, NS, Canada, 13–17 August 2017; pp. 135–144. [Google Scholar]
- He, Y.; Song, Y.; Li, J.; Ji, C.; Peng, J.; Peng, H. Hetespaceywalk: A heterogeneous spacey random walk for heterogeneous information network embedding. In Proceedings of the 28th ACM International Conference on Information and Knowledge Management, Beijing, China, 3–7 November 2019; pp. 639–648. [Google Scholar]
- Fu, X.; Zhang, J.; Meng, Z.; King, I. Magnn: Metapath aggregated graph neural network for heterogeneous graph embedding. In Proceedings of the Web Conference 2020, Taipei, Taiwan, 20–24 April 2020; pp. 2331–2341. [Google Scholar]
- Hong, H.; Guo, H.; Lin, Y.; Yang, X.; Li, Z.; Ye, J. An attention-based graph neural network for heterogeneous structural learning. In Proceedings of the AAAI Conference on Artificial Intelligence, New York, NY, USA, 7–12 February 2020; Volume 34, pp. 4132–4139. [Google Scholar]
- Schlichtkrull, M.; Kipf, T.N.; Bloem, P.; Van Den Berg, R.; Titov, I.; Welling, M. Modeling relational data with graph convolutional networks. In Proceedings of the Semantic Web: 15th International Conference, ESWC 2018, Heraklion, Greece, 3–7 June 2018; proceedings 15. Springer: Berlin/Heidelberg, Germany, 2018; pp. 593–607. [Google Scholar]
- Liu, Y.; Fan, L.; Wang, X.; Xiao, Z.; Ma, S.; Pang, Y.; Lin, J.C.W. HGBER: Heterogeneous graph neural network with bidirectional encoding representation. IEEE Trans. Neural Netw. Learn. Syst. 2023, 35, 9340–9351. [Google Scholar] [CrossRef]
- Hu, Z.; Dong, Y.; Wang, K.; Sun, Y. Heterogeneous graph transformer. In Proceedings of the Web Conference 2020, Taipei, Taiwan, 20–24 April 2020; pp. 2704–2710. [Google Scholar]
- Xue, H.; Yang, L.; Rajan, V.; Jiang, W.; Wei, Y.; Lin, Y. Multiplex bipartite network embedding using dual hypergraph convolutional networks. In Proceedings of the Web Conference 2021, Ljubljana, Slovenia, 19–23 April 2021; pp. 1649–1660. [Google Scholar]
- Fu, C.; Zheng, G.; Huang, C.; Yu, Y.; Dong, J. Multiplex heterogeneous graph neural network with behavior pattern modeling. In Proceedings of the 29th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, Long Beach, CA, USA, 6–10 August 2023; pp. 482–494. [Google Scholar]
- He, C.; Xie, T.; Rong, Y.; Huang, W.; Li, Y.; Huang, J.; Ren, X.; Shahabi, C. Bipartite graph neural networks for efficient node representation learning. arXiv 2019, arXiv:1906.11994. [Google Scholar]
- Tang, J.; Ravikumar, B.; Alam, Z.; Rebane, A.; Vähä-Koskela, M.; Peddinti, G.; van Adrichem, A.J.; Wakkinen, J.; Jaiswal, A.; Karjalainen, E.; et al. Drug target commons: A community effort to build a consensus knowledge base for drug-target interactions. Cell Chem. Biol. 2018, 25, 224–229. [Google Scholar] [CrossRef]
- Liu, J.; Shi, C.; Hu, B.; Liu, S.; Yu, P.S. Personalized ranking recommendation via integrating multiple feedbacks. In Proceedings of the Advances in Knowledge Discovery and Data Mining: 21st Pacific-Asia Conference, PAKDD 2017, Jeju, Republic of Korea, 23–26 May 2017; Proceedings, Part II 21. Springer: Berlin/Heidelberg, Germany, 2017; pp. 131–143. [Google Scholar]
- Grover, A.; Leskovec, J. node2vec: Scalable feature learning for networks. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, San Francisco, CA, USA, 13–17 August 2016; pp. 855–864. [Google Scholar]
- Brockschmidt, M. Gnn-film: Graph neural networks with feature-wise linear modulation. In Proceedings of the International Conference on Machine Learning, PMLR, Virtual, 13–18 July 2020; pp. 1144–1152. [Google Scholar]
- He, M.; Wei, Z.; Feng, S.; Huang, Z.; Li, W.; Sun, Y.; Yu, D. Spectral Heterogeneous Graph Convolutions via Positive Noncommutative Polynomials. In Proceedings of the ACM on Web Conference 2024, Singapore, 13–17 May 2024; pp. 685–696. [Google Scholar]
- Rowshan, Y. The m-bipartite Ramsey number of the K_{2,2} versus K_{6,6}. arXiv 2022, arXiv:2202.13167. [Google Scholar]
Statistics of the datasets (Amazon, Alibaba-s, and Alibaba are e-commerce datasets; DTI and Douban are used to test generalization).

Datasets | Amazon | Alibaba-s | Alibaba | DTI | Douban |
---|---|---|---|---|---|
Nodes | 9530 | 15,218 | 22,649 | 4837 | 34,890 |
Edges | 60,658 | 27,036 | 45,734 | 16,458 | 307,088 |
Edge types | 2 | 3 | 3 | 5 | 6 |
Features | 4 | N/A | 18 | N/A | N/A |
Methods | Amazon AUROC | Amazon AUPRC | Alibaba-s AUROC | Alibaba-s AUPRC | Alibaba AUROC | Alibaba AUPRC | DTI AUROC | DTI AUPRC | Douban AUROC | Douban AUPRC |
---|---|---|---|---|---|---|---|---|---|---|
Louvain | 87.05 | 84.79 | 33.99 | 41.41 | 32.05 | 42.49 | 53.86 | 93.31 | 49.62 | 83.28 |
Node2vec | 50.30 | 55.44 | 50.43 | 51.42 | 50.10 | 51.52 | 50.88 | 57.45 | 50.27 | 60.61 |
GCN | 64.93 | 77.45 | 63.08 | 79.59 | 56.87 | 77.66 | 56.95 | 76.00 | 85.44 | 92.66 |
GAT | 66.70 | 70.16 | 53.28 | 54.29 | 55.38 | 54.49 | 76.33 | 80.64 | 67.13 | 87.29 |
GraphSAGE | 69.99 | 69.39 | 64.91 | 65.76 | 66.49 | 60.36 | 79.34 | 82.36 | 78.15 | 92.40 |
FiLM | 88.80 | 92.41 | 90.98 | 87.10 | 85.99 | 81.46 | 92.74 | 93.59 | 82.62 | 90.33 |
HGT | 93.75 | 95.59 | 90.68 | 83.47 | 83.59 | 71.75 | 93.16 | 92.23 | 71.07 | 90.09 |
HAN | 71.07 | 81.56 | 67.08 | 52.84 | 67.29 | 55.84 | 81.25 | 84.94 | 63.19 | 86.41 |
PSHGCN | 87.10 | 92.22 | 90.84 | 82.71 | 85.39 | 82.71 | 90.26 | 92.75 | 76.38 | 92.78 |
DualHGCN | 86.69 | 88.69 | 87.57 | 89.02 | 85.54 | 87.51 | 93.85 | 95.00 | 84.86 | 85.17 |
BPHGNN | 93.79 | 93.67 | 93.64 | 89.41 | 90.53 | 86.50 | 93.91 | 93.53 | 89.20 | 90.17 |
Ours | 95.82 | 95.90 | 93.34 | 94.16 | 91.36 | 93.06 | 94.87 | 94.93 | 87.03 | 96.15 |
Methods | Amazon AUROC | Amazon AUPRC | Alibaba AUROC | Alibaba AUPRC |
---|---|---|---|---|
Louvain | 87.05 | 84.79 | 32.05 | 42.49 |
Node2vec | 50.30 | 55.44 | 50.10 | 51.52 |
GCN | 56.26 | 58.19 | 71.38 | 69.16 |
GAT | 62.44 | 67.11 | 59.12 | 59.73 |
GraphSAGE | 85.70 | 78.86 | 84.41 | 81.40 |
FiLM | 63.81 | 73.69 | 85.08 | 83.36 |
HGT | 77.89 | 84.97 | 86.72 | 86.00 |
HAN | 69.45 | 72.22 | 50.51 | 52.09 |
PSHGCN | 57.17 | 74.25 | 77.25 | 75.68 |
DualHGCN | 86.55 | 88.55 | 86.76 | 88.59 |
BPHGNN | 81.85 | 82.20 | 85.08 | 87.55 |
Ours | 90.46 | 91.02 | 89.75 | 92.33 |
Methods | Amazon AUROC | Amazon AUPRC | Alibaba-s AUROC | Alibaba-s AUPRC | Alibaba AUROC | Alibaba AUPRC | DTI AUROC | DTI AUPRC | Douban AUROC | Douban AUPRC |
---|---|---|---|---|---|---|---|---|---|---|
static-one | 96.13 | 96.01 | 93.23 | 94.19 | 90.98 | 92.57 | 93.86 | 93.65 | 85.00 | 95.42 |
static-multi | 95.60 | 95.46 | 92.95 | 93.88 | 91.10 | 92.79 | 94.52 | 94.69 | 86.38 | 95.95 |
dynamic | 95.82 | 95.90 | 93.34 | 94.16 | 91.36 | 93.06 | 94.87 | 94.93 | 87.03 | 96.15 |
Methods | Amazon AUROC | Amazon AUPRC | Alibaba-s AUROC | Alibaba-s AUPRC | Alibaba AUROC | Alibaba AUPRC | DTI AUROC | DTI AUPRC | Douban AUROC | Douban AUPRC |
---|---|---|---|---|---|---|---|---|---|---|
proportion | 95.68 | 95.71 | 92.09 | 93.33 | 90.20 | 92.46 | 93.63 | 93.08 | 86.53 | 95.99 |
semantic | 95.81 | 95.76 | 92.72 | 93.76 | 90.23 | 91.84 | 93.93 | 94.00 | 86.82 | 96.09 |
average | 95.82 | 95.90 | 93.34 | 94.16 | 91.36 | 93.06 | 94.87 | 94.93 | 87.03 | 96.15 |
Methods | Amazon AUROC | Amazon AUPRC | Alibaba-s AUROC | Alibaba-s AUPRC | Alibaba AUROC | Alibaba AUPRC | DTI AUROC | DTI AUPRC | Douban AUROC | Douban AUPRC |
---|---|---|---|---|---|---|---|---|---|---|
active | 87.95 | 89.29 | 84.61 | 85.92 | 84.49 | 82.69 | 88.95 | 91.65 | 83.38 | 95.13 |
non-active | 95.82 | 95.90 | 93.34 | 94.16 | 91.36 | 93.06 | 94.87 | 94.93 | 87.03 | 96.15 |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Liu, K.; Kang, M.; Li, X.; Dai, W. LHGCN: A Laminated Heterogeneous Graph Convolutional Network for Modeling User–Item Interaction in E-Commerce. Symmetry 2024, 16, 1695. https://doi.org/10.3390/sym16121695