
A Multi-Gate Encoder for Joint Entity and Relation Extraction

  • Conference paper
  • First Online:
Chinese Computational Linguistics (CCL 2022)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 13603)

Abstract

Named entity recognition and relation extraction are core sub-tasks of relational triple extraction. Recent studies have used parameter sharing or joint decoding to create interaction between these two tasks. However, it remains difficult to preserve the specificity of task-specific features while allowing the two tasks to interact properly. In this paper, we propose a multi-gate encoder that uses a gating mechanism to model bidirectional task interaction while retaining sufficient feature specificity. Specifically, we design two types of independent gates: task gates, which generate task-specific features, and interaction gates, which generate instructive features to guide the opposite task. Our experiments show that our method raises the state-of-the-art (SOTA) relation F1 scores on the ACE04, ACE05, and SciERC datasets to 63.8% (+1.3%), 68.2% (+1.4%), and 39.4% (+1.0%), respectively, with higher inference speed than the previous SOTA model.
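
To make the gating idea in the abstract concrete, the following is a minimal PyTorch sketch of one encoder layer with separate task gates and interaction gates, assuming the common sigmoid-gate formulation g = sigmoid(W[x; y]). It is an illustration only, not the authors' implementation; the class name MultiGateLayer, the additive fusion of specific and guidance features, and the hidden size are all hypothetical.

```python
# Illustrative sketch of task gates vs. interaction gates (NOT the paper's code).
import torch
import torch.nn as nn


class MultiGateLayer(nn.Module):
    """Produces task-specific NER/RE features plus cross-task guidance features."""

    def __init__(self, hidden_size: int):
        super().__init__()
        # Task gates: keep features specific to each task.
        self.ner_task_gate = nn.Linear(2 * hidden_size, hidden_size)
        self.re_task_gate = nn.Linear(2 * hidden_size, hidden_size)
        # Interaction gates: produce instructive features for the opposite task.
        self.ner_to_re_gate = nn.Linear(2 * hidden_size, hidden_size)
        self.re_to_ner_gate = nn.Linear(2 * hidden_size, hidden_size)

    def forward(self, ner_hidden: torch.Tensor, re_hidden: torch.Tensor):
        # Shared context seen by every gate: concatenation of both task states.
        joint = torch.cat([ner_hidden, re_hidden], dim=-1)

        # Task-specific features (what each task keeps for itself).
        ner_specific = torch.sigmoid(self.ner_task_gate(joint)) * ner_hidden
        re_specific = torch.sigmoid(self.re_task_gate(joint)) * re_hidden

        # Interaction features (what each task passes on to guide the other).
        ner_guides_re = torch.sigmoid(self.ner_to_re_gate(joint)) * ner_hidden
        re_guides_ner = torch.sigmoid(self.re_to_ner_gate(joint)) * re_hidden

        # Each task sees its own filtered features plus guidance from the other
        # (additive fusion here is an assumption for illustration).
        ner_out = ner_specific + re_guides_ner
        re_out = re_specific + ner_guides_re
        return ner_out, re_out


if __name__ == "__main__":
    # Usage: batch of 2 sentences, 16 tokens, hidden size 768 (e.g. BERT outputs).
    layer = MultiGateLayer(hidden_size=768)
    ner_h = torch.randn(2, 16, 768)
    re_h = torch.randn(2, 16, 768)
    ner_out, re_out = layer(ner_h, re_h)
    print(ner_out.shape, re_out.shape)  # torch.Size([2, 16, 768]) twice
```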


Notes

  1. We process the datasets with scripts provided by Luan et al. [20]: https://github.com/luanyi/DyGIE/tree/master/preprocessing.

  2. http://nlp.cs.washington.edu/sciIE/.

References

  1. Ba, J.L., Kiros, J.R., Hinton, G.E.: Layer normalization. arXiv preprint arXiv:1607.06450 (2016)

  2. Bekoulis, G., Deleu, J., Demeester, T., Develder, C.: Adversarial training for multi-context joint entity and relation extraction. In: Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, Brussels, Belgium, October-November 2018, pp. 2830–2836. Association for Computational Linguistics (2018). https://doi.org/10.18653/v1/D18-1307. https://aclanthology.org/D18-1307

  3. Bekoulis, G., Deleu, J., Demeester, T., Develder, C.: Joint entity recognition and relation extraction as a multi-head selection problem. CoRR abs/1804.07847 (2018)

  4. Beltagy, I., Lo, K., Cohan, A.: SciBERT: a pretrained language model for scientific text. In: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), Hong Kong, China, November 2019, pp. 3615–3620. Association for Computational Linguistics (2019). https://doi.org/10.18653/v1/D19-1371. https://aclanthology.org/D19-1371

  5. Chan, Y.S., Roth, D.: Exploiting syntactico-semantic structures for relation extraction. In: Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies, Portland, Oregon, USA, June 2011, pp. 551–560. Association for Computational Linguistics (2011). https://aclanthology.org/P11-1056

  6. Clevert, D.A., Unterthiner, T., Hochreiter, S.: Fast and accurate deep network learning by exponential linear units (ELUs). arXiv preprint arXiv:1511.07289 (2015)

  7. Devlin, J., Chang, M.W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, vol. 1 (Long and Short Papers), Minneapolis, Minnesota, June 2019, pp. 4171–4186. Association for Computational Linguistics (2019). https://doi.org/10.18653/v1/N19-1423. https://aclanthology.org/N19-1423

  8. Dixit, K., Al-Onaizan, Y.: Span-level model for relation extraction. In: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, Florence, Italy, July 2019, pp. 5308–5314. Association for Computational Linguistics (2019). https://doi.org/10.18653/v1/P19-1525. https://aclanthology.org/P19-1525

  9. Doddington, G., Mitchell, A., Przybocki, M., Ramshaw, L., Strassel, S., Weischedel, R.: The automatic content extraction (ACE) program - tasks, data, and evaluation. In: Proceedings of the Fourth International Conference on Language Resources and Evaluation (LREC 2004), Lisbon, Portugal, May 2004. European Language Resources Association (ELRA) (2004). https://www.lrec-conf.org/proceedings/lrec2004/pdf/5.pdf

  10. Eberts, M., Ulges, A.: Span-based joint entity and relation extraction with transformer pre-training. In: ECAI 2020, pp. 2006–2013. IOS Press (2020)

  11. Florian, R., Ittycheriah, A., Jing, H., Zhang, T.: Named entity recognition through classifier combination. In: Proceedings of the Seventh Conference on Natural Language Learning at HLT-NAACL 2003, pp. 168–171 (2003). https://aclanthology.org/W03-0425

  12. Fu, T.J., Li, P.H., Ma, W.Y.: GraphRel: modeling text as relational graphs for joint entity and relation extraction. In: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, Florence, Italy, July 2019, pp. 1409–1418. Association for Computational Linguistics (2019). https://doi.org/10.18653/v1/P19-1136. https://aclanthology.org/P19-1136

  13. Gormley, M.R., Yu, M., Dredze, M.: Improved relation extraction with feature-rich compositional embedding models. In: Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing, Lisbon, Portugal, September 2015, pp. 1774–1784. Association for Computational Linguistics (2015). https://doi.org/10.18653/v1/D15-1205. https://aclanthology.org/D15-1205

  14. Katiyar, A., Cardie, C.: Going out on a limb: joint extraction of entity mentions and relations without dependency trees. In: Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), Vancouver, Canada, July 2017, pp. 917–928. Association for Computational Linguistics (2017). https://doi.org/10.18653/v1/P17-1085. https://aclanthology.org/P17-1085

  15. Lan, Z., Chen, M., Goodman, S., Gimpel, K., Sharma, P., Soricut, R.: ALBERT: a lite BERT for self-supervised learning of language representations. In: ICLR (2020)

  16. Li, Q., Ji, H.: Incremental joint extraction of entity mentions and relations. In: Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), Baltimore, Maryland, June 2014, pp. 402–412. Association for Computational Linguistics (2014). https://doi.org/10.3115/v1/P14-1038. https://aclanthology.org/P14-1038

  17. Li, X., et al.: Entity-relation extraction as multi-turn question answering. In: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pp. 1340–1350. Association for Computational Linguistics (2019). https://doi.org/10.18653/v1/P19-1129. https://aclanthology.org/P19-1129

  18. Lin, Y., Liu, Z., Sun, M., Liu, Y., Zhu, X.: Learning entity and relation embeddings for knowledge graph completion. In: Twenty-Ninth AAAI Conference on Artificial Intelligence (2015)

  19. Luan, Y., He, L., Ostendorf, M., Hajishirzi, H.: Multi-task identification of entities, relations, and coreference for scientific knowledge graph construction. In: Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pp. 3219–3232. Association for Computational Linguistics (2018). https://doi.org/10.18653/v1/D18-1360. https://aclanthology.org/D18-1360

  20. Luan, Y., Wadden, D., He, L., Shah, A., Ostendorf, M., Hajishirzi, H.: A general framework for information extraction using dynamic span graphs. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, vol. 1 (Long and Short Papers), Minneapolis, Minnesota, June 2019, pp. 3036–3046. Association for Computational Linguistics (2019). https://doi.org/10.18653/v1/N19-1308. https://aclanthology.org/N19-1308

  21. Luoma, J., Pyysalo, S.: Exploring cross-sentence contexts for named entity recognition with BERT. In: Proceedings of the 28th International Conference on Computational Linguistics, Barcelona, Spain, December 2020, pp. 904–914. International Committee on Computational Linguistics (2020). https://doi.org/10.18653/v1/2020.coling-main.78. https://aclanthology.org/2020.coling-main.78

  22. Miwa, M., Bansal, M.: End-to-end relation extraction using LSTMs on sequences and tree structures. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), Berlin, Germany, August 2016, pp. 1105–1116. Association for Computational Linguistics (2016). https://doi.org/10.18653/v1/P16-1105. https://aclanthology.org/P16-1105

  23. Miwa, M., Sasaki, Y.: Modeling joint entity and relation extraction with table representation. In: Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), Doha, Qatar, October 2014, pp. 1858–1869. Association for Computational Linguistics (2014). https://doi.org/10.3115/v1/D14-1200. https://aclanthology.org/D14-1200

  24. Ren, F., et al.: A novel global feature-oriented relational triple extraction model based on table filling. In: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, Online and Punta Cana, Dominican Republic, November 2021, pp. 2646–2656. Association for Computational Linguistics (2021). https://doi.org/10.18653/v1/2021.emnlp-main.208. https://aclanthology.org/2021.emnlp-main.208

  25. Seo, M., Kembhavi, A., Farhadi, A., Hajishirzi, H.: Bidirectional attention flow for machine comprehension. arXiv preprint arXiv:1611.01603 (2018). https://arxiv.org/abs/1611.01603

  26. Walker, C., Strassel, S., Medero, J., Maeda, K.: ACE 2005 multilingual training corpus. Linguist. Data Consort. Philadelphia 57, 45 (2006)

  27. Wang, J., Lu, W.: Two are better than one: joint entity and relation extraction with table-sequence encoders. In: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pp. 1706–1721. Association for Computational Linguistics, November 2020. https://doi.org/10.18653/v1/2020.emnlp-main.133. https://aclanthology.org/2020.emnlp-main.133

  28. Wang, S., Zhang, Y., Che, W., Liu, T.: Joint extraction of entities and relations based on a novel graph scheme. In: Proceedings of the 27th International Joint Conference on Artificial Intelligence, IJCAI 2018, pp. 4461–4467. AAAI Press (2018)

  29. Wang, Y., Sun, C., Wu, Y., Yan, J., Gao, P., Xie, G.: Pre-training entity relation encoder with intra-span and inter-span information. In: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pp. 1692–1705. Association for Computational Linguistics, November 2020. https://doi.org/10.18653/v1/2020.emnlp-main.132. https://aclanthology.org/2020.emnlp-main.132

  30. Wang, Y., Yu, B., Zhang, Y., Liu, T., Zhu, H., Sun, L.: TPLinker: single-stage joint extraction of entities and relations through token pair linking. In: Proceedings of the 28th International Conference on Computational Linguistics, Barcelona, Spain, December 2020, pp. 1572–1582. International Committee on Computational Linguistics (2020). https://doi.org/10.18653/v1/2020.coling-main.138. https://aclanthology.org/2020.coling-main.138

  31. Xue, F., Sun, A., Zhang, H., Chng, E.S.: GDPNet: refining latent multi-view graph for relation extraction. In: Thirty-Fifth AAAI Conference on Artificial Intelligence, AAAI, pp. 2–9 (2021)

  32. Yan, Z., Zhang, C., Fu, J., Zhang, Q., Wei, Z.: A partition filter network for joint entity and relation extraction. In: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, Punta Cana, Dominican Republic, November 2021, pp. 185–197. Association for Computational Linguistics (2021). https://doi.org/10.18653/v1/2021.emnlp-main.17. https://aclanthology.org/2021.emnlp-main.17

  33. Ye, D., Lin, Y., Li, P., Sun, M.: Pack together: entity and relation extraction with levitated marker. In: Proceedings of ACL 2022 (2022)

  34. Yu, X., Lam, W.: Jointly identifying entities and extracting relations in encyclopedia text via a graphical model approach. In: Coling 2010: Posters, Beijing, China, August 2010, pp. 1399–1407. Coling 2010 Organizing Committee (2010). https://aclanthology.org/C10-2160

  35. Zelenko, D., Aone, C., Richardella, A.: Kernel methods for relation extraction. In: Proceedings of the 2002 Conference on Empirical Methods in Natural Language Processing (EMNLP 2002), pp. 71–78. Association for Computational Linguistics, July 2002. https://doi.org/10.3115/1118693.1118703. https://aclanthology.org/W02-1010

  36. Zeng, X., Zeng, D., He, S., Liu, K., Zhao, J.: Extracting relational facts by an end-to-end neural model with copy mechanism. In: Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), Melbourne, Australia, July 2018, pp. 506–514. Association for Computational Linguistics (2018). https://doi.org/10.18653/v1/P18-1047. https://aclanthology.org/P18-1047

  37. Zhang, M., Zhang, Y., Fu, G.: End-to-end neural relation extraction with global optimization. In: Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, Copenhagen, Denmark, September 2017, pp. 1730–1740. Association for Computational Linguistics (2017). https://doi.org/10.18653/v1/D17-1182. https://aclanthology.org/D17-1182

  38. Zheng, S., Wang, F., Bao, H., Hao, Y., Zhou, P., Xu, B.: Joint extraction of entities and relations based on a novel tagging scheme. In: Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), Vancouver, Canada, July 2017, pp. 1227–1236. Association for Computational Linguistics (2017). https://doi.org/10.18653/v1/P17-1113. https://aclanthology.org/P17-1113

  39. Zhong, Z., Chen, D.: A frustratingly easy approach for entity and relation extraction. In: Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pp. 50–61. Association for Computational Linguistics, June 2021. https://doi.org/10.18653/v1/2021.naacl-main.5. https://aclanthology.org/2021.naacl-main.5

  40. Zhou, G., Su, J., Zhang, J., Zhang, M.: Exploring various knowledge in relation extraction. In: Proceedings of the 43rd Annual Meeting of the Association for Computational Linguistics (ACL 2005), Ann Arbor, Michigan, June 2005, pp. 427–434. Association for Computational Linguistics (2005). https://doi.org/10.3115/1219840.1219893. https://aclanthology.org/P05-1053

Acknowledgements

This work was supported by the National Defense Science and Technology Key Laboratory Fund Project of the Chinese Academy of Sciences (Space Science and Application of Big Data Knowledge Graph Construction and Intelligent Application Research) and the Manned Space Engineering Project (Research on Technology and Method of Engineering Big Data Knowledge Mining).

Author information

Corresponding author

Correspondence to Shengyang Li.

Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Xiong, X., Liu, Y., Liu, A., Gong, S., Li, S. (2022). A Multi-Gate Encoder for Joint Entity and Relation Extraction. In: Sun, M., et al. Chinese Computational Linguistics. CCL 2022. Lecture Notes in Computer Science, vol 13603. Springer, Cham. https://doi.org/10.1007/978-3-031-18315-7_11

  • DOI: https://doi.org/10.1007/978-3-031-18315-7_11

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-18314-0

  • Online ISBN: 978-3-031-18315-7

  • eBook Packages: Computer Science, Computer Science (R0)
