
Efficient block contrastive learning via parameter-free meta-node approximation

Published: 07 December 2023

Abstract

Contrastive learning has recently achieved remarkable success in many domains, including graphs. However, contrastive loss, especially for graphs, requires a large number of negative samples, which is unscalable and computationally prohibitive with a quadratic time complexity. Sub-sampling is not optimal, and incorrect negative sampling leads to sampling bias. In this work, we propose a meta-node based approximation technique that is (a) simple, (b) able to proxy all negative combinations, (c) quadratic in cluster size, (d) applied at graph level rather than node level, and (e) able to exploit graph sparsity. By replacing node-pairs with additive cluster-pairs, we compute the negatives in cluster-time at graph level. The resulting Proxy approximated meta-node Contrastive (PamC) loss, based on simple optimized GPU operations, captures the full set of negatives, yet is efficient with a linear time complexity. By avoiding sampling, we effectively eliminate sample bias. We meet the criterion for a larger number of samples, thus achieving block-contrastiveness, which is proven to outperform pair-wise losses. We use learnt soft cluster assignments for the meta-node construction, and avoid possible heterophily and noise added during edge creation. Theoretically, we show that real world graphs easily satisfy the conditions necessary for our approximation. Empirically, we show promising accuracy gains over state-of-the-art graph clustering on 6 benchmarks. Importantly, we gain substantially in efficiency: over 2x reduction in training time and over 5x reduction in GPU memory. Additionally, our embeddings, combined with a single learnt linear transformation, are sufficient for node classification; we achieve state-of-the-art on the Citeseer classification benchmark. Code: https://github.com/gayanku/PAMC
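The node-pair to cluster-pair replacement described above can be sketched numerically. The snippet below is a minimal illustration, not the paper's implementation: it uses a hypothetical hard one-hot assignment matrix `P` (PamC uses learnt soft cluster assignments) and a linear dot-product similarity, for which collapsing each cluster to an additive meta-node is exact. With exponentiated similarities, the same collapse becomes the approximation analysed in the paper, reducing the pair count from O(n^2) to O(k^2).

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 200, 16, 4            # nodes, embedding dim, clusters

Z = rng.normal(size=(n, d))     # node embeddings
labels = rng.integers(0, k, size=n)
P = np.eye(k)[labels]           # (n, k) hard one-hot cluster assignment

def naive_negative_energy(Z):
    """O(n^2) pairs: sum dot-product similarities over all node pairs."""
    return (Z @ Z.T).sum()

def meta_node_energy(Z, P):
    """O(k^2) pairs: collapse each cluster to an additive meta-node first."""
    M = P.T @ Z                 # (k, d) meta-node embeddings (cluster sums)
    return (M @ M.T).sum()

# For linear similarity the two agree exactly, since summing node pairs
# factorises through the cluster sums.
print(np.allclose(naive_negative_energy(Z), meta_node_energy(Z, P)))  # True
```

The design point is that the meta-node term touches only k x k cluster pairs per graph, independent of n, which is what allows the negatives to be computed at graph level in linear time overall.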



Published In

Neurocomputing, Volume 561, Issue C, December 2023, 359 pages

Publisher

Elsevier Science Publishers B. V., Netherlands
          Author Tags

          1. Contrastive learning
          2. Graph neural networks
          3. Contrastive approximation
          4. Efficiency
          5. Mathematical proof
          6. Surrogate loss
          7. Proxy loss

          Qualifiers

          • Research-article
