Research Article | Free Access
DOI: 10.1145/3580305.3599490

Representation Learning on Hyper-Relational and Numeric Knowledge Graphs with Transformers

Published: 04 August 2023

Abstract

In a hyper-relational knowledge graph, a triplet can be associated with a set of qualifiers, where a qualifier is composed of a relation and an entity, providing auxiliary information for the triplet. While existing hyper-relational knowledge graph embedding methods assume that the entities are discrete objects, some information should be represented using numeric values, e.g., (J.R.R., was born in, 1892). Also, a triplet (J.R.R., educated at, Oxford Univ.) can be associated with a qualifier such as (start time, 1911). In this paper, we propose a unified framework named HyNT that learns representations of a hyper-relational knowledge graph containing numeric literals in either triplets or qualifiers. We define a context transformer and a prediction transformer to learn the representations based not only on the correlations between a triplet and its qualifiers but also on the numeric information. By learning compact representations of triplets and qualifiers and feeding them into the transformers, we reduce the computation cost of using transformers. Using HyNT, we can predict missing numeric values in addition to missing entities or relations in a hyper-relational knowledge graph. Experimental results show that HyNT significantly outperforms state-of-the-art methods on real-world datasets.
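
To make the setting concrete, the following minimal Python sketch shows how a hyper-relational fact can carry either discrete entities or numeric literals in its primary triplet and in its qualifiers, and how a completion query masks one of those positions. This is an illustrative sketch only, not the authors' implementation; the names Qualifier, HyperRelationalFact, and their fields are assumptions made for this example.

    from dataclasses import dataclass, field
    from typing import List, Union

    # A position in a fact holds either a discrete entity (id string) or a numeric literal.
    Value = Union[str, float]

    @dataclass
    class Qualifier:
        relation: str   # qualifier relation, e.g., "start time"
        value: Value    # qualifier value: an entity id or a numeric literal, e.g., 1911

    @dataclass
    class HyperRelationalFact:
        head: str       # head entity, e.g., "J.R.R."
        relation: str   # primary relation, e.g., "educated at"
        tail: Value     # tail entity or numeric literal
        qualifiers: List[Qualifier] = field(default_factory=list)

    # Numeric literal in the primary triplet: (J.R.R., was born in, 1892).
    born = HyperRelationalFact("J.R.R.", "was born in", 1892.0)

    # Numeric literal inside a qualifier of a discrete triplet:
    # (J.R.R., educated at, Oxford Univ.) with qualifier (start time, 1911).
    educated = HyperRelationalFact(
        "J.R.R.", "educated at", "Oxford Univ.",
        qualifiers=[Qualifier("start time", 1911.0)],
    )

    # Completion masks one position and predicts it, e.g., the "?" in
    # (J.R.R., educated at, Oxford Univ.) with qualifier (start time, ?),
    # which may be an entity, a relation, or a numeric value.

In HyNT itself, each primary triplet and each qualifier pair is first encoded into a compact representation, and these representations are fed to the context and prediction transformers; the sketch above only fixes the data format, not the model.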

Supplementary Material

MP4 File (rtfp1281-20min-video.mp4)
In a hyper-relational knowledge graph, a triplet can have qualifiers, providing auxiliary information for the triplet. While existing methods assume that the entities are discrete objects, some information should be represented using numeric values, e.g., (Avatar, duration, 162 minutes). Also, a triplet (Avatar, award received, Saturn Award) can have a qualifier such as (point in time, 2010). In this work, we propose a unified framework named HyNT that learns representations of a hyper-relational knowledge graph containing numeric literals in either triplets or qualifiers. We define a context transformer and a prediction transformer to learn the representations based not only on the correlations between a triplet and its qualifiers but also on the numeric information. Using HyNT, we can predict missing numeric values in addition to missing entities or relations in a hyper-relational knowledge graph. Experiments show that HyNT significantly outperforms state-of-the-art methods on real-world datasets.
MP4 File (rtfp1281-2min-promo.mp4)


Cited By

  • Multi-task Learning for Hyper-Relational Knowledge Graph Completion. Advanced Intelligent Computing Technology and Applications (2024), 115-126. https://doi.org/10.1007/978-981-97-5669-8_10
  • Dynamic Relation-Attentive Graph Neural Networks for Fraud Detection. 2023 IEEE International Conference on Data Mining Workshops (ICDMW) (2023), 1092-1096. https://doi.org/10.1109/ICDMW60847.2023.00143



      Published In

      KDD '23: Proceedings of the 29th ACM SIGKDD Conference on Knowledge Discovery and Data Mining
      August 2023
      5996 pages
      ISBN:9798400701030
      DOI:10.1145/3580305
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

      Publisher

      Association for Computing Machinery

      New York, NY, United States



      Author Tags

      1. hyper-relational fact
      2. knowledge graph
      3. knowledge graph completion
      4. numeric literals
      5. representation learning
      6. transformer

      Qualifiers

      • Research-article

      Funding Sources

      • National Research Foundation of Korea (NRF), Ministry of Science and ICT
      • Institute of Information & communications Technology Planning & Evaluation (IITP), Ministry of Science and ICT

      Conference

KDD '23

      Acceptance Rates

      Overall Acceptance Rate 1,133 of 8,635 submissions, 13%



