
Learning to Update Knowledge Graphs by Reading News

Jizhi Tang, Yansong Feng, Dongyan Zhao


Abstract
News streams contain rich up-to-date information which can be used to update knowledge graphs (KGs). Most current text-based KG updating methods rely on elaborately designed information extraction (IE) systems and carefully crafted rules, which are often domain-specific and hard to maintain. Besides, such methods often pay little attention to the implicit information that lies beneath the surface of texts. In this paper, we propose a novel neural network method, GUpdater, to tackle these problems. GUpdater is built upon graph neural networks (GNNs) with a text-based attention mechanism to guide the updating message passing through the KG structures. Experiments on a real-world KG updating dataset show that our model can effectively broadcast the news information to the KG structures and perform necessary link-adding or link-deleting operations to keep the KG up-to-date according to news snippets.
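The core idea described above can be illustrated with a minimal sketch: a news embedding attends over KG edges, and each edge is re-scored to decide whether it survives the update. All names, embeddings, and the fixed threshold below are illustrative assumptions, not the authors' actual GUpdater implementation (which learns these components end-to-end with GNN message passing).

```python
# Hypothetical sketch of text-attention-guided KG updating (illustrative
# names and toy 2-d embeddings; NOT the paper's actual model).
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def attention(edge_vec, news_vec):
    # Text-based attention: edges relevant to the news get a score near 1,
    # irrelevant or contradicted edges a score near 0 (sigmoid of a dot product).
    return 1.0 / (1.0 + math.exp(-dot(edge_vec, news_vec)))

# Toy KG: (head, relation, tail) triples with illustrative entity embeddings.
entity_emb = {"TeamA": [1.0, 0.0], "PlayerX": [0.0, 1.0], "TeamB": [-1.0, 0.0]}
edges = [("PlayerX", "plays_for", "TeamA"),
         ("PlayerX", "plays_for", "TeamB")]

# Embedding of a news snippet, e.g. "PlayerX transfers to TeamB" (made up).
news_vec = [-2.0, 0.5]

def edge_score(h, r, t):
    # A crude edge representation: sum of endpoint embeddings.
    edge_vec = [a + b for a, b in zip(entity_emb[h], entity_emb[t])]
    return attention(edge_vec, news_vec)

# Keep edges whose text-conditioned score passes a threshold. In the paper
# this link-adding/link-deleting decision comes from a learned predictor
# after GNN message passing, not a fixed cutoff.
updated = [e for e in edges if edge_score(*e) >= 0.5]
```

With these toy values, the stale edge to TeamA is deleted while the edge to TeamB is retained, mirroring the link-deleting/link-adding behavior the abstract describes.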
Anthology ID:
D19-1265
Volume:
Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)
Month:
November
Year:
2019
Address:
Hong Kong, China
Editors:
Kentaro Inui, Jing Jiang, Vincent Ng, Xiaojun Wan
Venues:
EMNLP | IJCNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
2632–2641
URL:
https://aclanthology.org/D19-1265
DOI:
10.18653/v1/D19-1265
Cite (ACL):
Jizhi Tang, Yansong Feng, and Dongyan Zhao. 2019. Learning to Update Knowledge Graphs by Reading News. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), pages 2632–2641, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
Learning to Update Knowledge Graphs by Reading News (Tang et al., EMNLP-IJCNLP 2019)
PDF:
https://aclanthology.org/D19-1265.pdf