
Going out on a limb: Joint Extraction of Entity Mentions and Relations without Dependency Trees

Arzoo Katiyar, Claire Cardie


Abstract
We present a novel attention-based recurrent neural network for joint extraction of entity mentions and relations. We show that attention, combined with a long short-term memory (LSTM) network, can extract semantic relations between entity mentions without access to dependency trees. Experiments on the Automatic Content Extraction (ACE) corpora show that our model significantly outperforms the feature-based joint model of Li and Ji (2014). We also compare our model with the end-to-end tree-based LSTM model (SPTree) of Miwa and Bansal (2016) and show that our model performs within 1% on entity mentions and within 2% on relations. Our fine-grained analysis further shows that our model performs significantly better on Agent-Artifact relations, while SPTree performs better on Physical and Part-Whole relations.
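The core idea the abstract describes is scoring, for each token, its likely relation partners among the preceding tokens via attention over the LSTM hidden states, rather than walking a dependency tree. The sketch below illustrates that scoring step only, using additive attention over precomputed hidden states; the function name, the parameters `W` and `v`, and the exact parameterization are illustrative assumptions, not the paper's precise formulation.

```python
import numpy as np

def partner_attention(h, t, W, v):
    """Score each earlier position j < t as a candidate relation partner
    of token t, given per-token hidden states h (shape [T, d]).

    Uses an additive-attention form: u_j = v . tanh(W [h_j; h_t]).
    (Assumed form for illustration; the paper's parameterization may differ.)
    Returns a probability distribution over positions 0..t-1.
    """
    scores = np.array([v @ np.tanh(W @ np.concatenate([h[j], h[t]]))
                       for j in range(t)])
    # numerically stable softmax over the preceding positions
    e = np.exp(scores - scores.max())
    return e / e.sum()

# Toy usage with random "hidden states" standing in for BiLSTM outputs.
rng = np.random.default_rng(0)
T, d, k = 5, 4, 6
h = rng.normal(size=(T, d))          # hypothetical encoder states
W = rng.normal(size=(k, 2 * d))      # attention projection (assumed shape)
v = rng.normal(size=(k,))            # attention scoring vector
probs = partner_attention(h, t=3, W=W, v=v)  # distribution over tokens 0..2
```

In the full model these attention weights would be trained jointly with the entity tagger; here they simply show how relation partners can be scored directly from sequence states, with no dependency parse required.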
Anthology ID:
P17-1085
Volume:
Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2017
Address:
Vancouver, Canada
Editors:
Regina Barzilay, Min-Yen Kan
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
917–928
URL:
https://aclanthology.org/P17-1085
DOI:
10.18653/v1/P17-1085
Cite (ACL):
Arzoo Katiyar and Claire Cardie. 2017. Going out on a limb: Joint Extraction of Entity Mentions and Relations without Dependency Trees. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 917–928, Vancouver, Canada. Association for Computational Linguistics.
Cite (Informal):
Going out on a limb: Joint Extraction of Entity Mentions and Relations without Dependency Trees (Katiyar & Cardie, ACL 2017)
PDF:
https://aclanthology.org/P17-1085.pdf
Video:
https://aclanthology.org/P17-1085.mp4
Data:
ACE 2004, ACE 2005