
Deep Semantic Role Labeling: What Works and What’s Next

Luheng He, Kenton Lee, Mike Lewis, Luke Zettlemoyer


Abstract
We introduce a new deep learning model for semantic role labeling (SRL) that significantly improves the state of the art, along with detailed analyses to reveal its strengths and limitations. We use a deep highway BiLSTM architecture with constrained decoding, while observing a number of recent best practices for initialization and regularization. Our 8-layer ensemble model achieves 83.2 F1 on the CoNLL 2005 test set and 83.4 F1 on CoNLL 2012, roughly a 10% relative error reduction over the previous state of the art. Extensive empirical analysis of these gains shows that (1) deep models excel at recovering long-distance dependencies but can still make surprisingly obvious errors, and (2) there is still room for syntactic parsers to improve these results.
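As a rough illustration of the constrained decoding the abstract refers to, the sketch below runs Viterbi decoding over per-token label scores while forbidding invalid BIO transitions (an I-X tag may only follow B-X or I-X, and may not start a sequence). The label set, the random scores, and the specific constraint rule are illustrative assumptions, not the authors' implementation; see the linked luheng/deep_srl repository for the actual code.

# Hedged sketch: BIO-constrained Viterbi decoding over per-token label scores.
# Labels, scores, and constraints here are illustrative, not the paper's code.
import numpy as np

LABELS = ["O", "B-ARG0", "I-ARG0", "B-ARG1", "I-ARG1"]

def allowed(prev, curr):
    """BIO constraint: I-X may only follow B-X or I-X."""
    if curr.startswith("I-"):
        role = curr[2:]
        return prev in ("B-" + role, "I-" + role)
    return True

def constrained_viterbi(scores):
    """scores: (num_tokens, num_labels) array of per-token label scores."""
    n, k = scores.shape
    neg_inf = -1e9
    # Transition matrix: disallowed prev -> curr moves get a huge penalty.
    trans = np.array([[0.0 if allowed(p, c) else neg_inf
                       for c in LABELS] for p in LABELS])
    best = scores[0].copy()
    for i, lab in enumerate(LABELS):
        if lab.startswith("I-"):
            best[i] = neg_inf  # a sequence may not start inside a span
    back = np.zeros((n, k), dtype=int)
    for t in range(1, n):
        cand = best[:, None] + trans        # cand[prev, curr]
        back[t] = cand.argmax(axis=0)       # best previous label per current label
        best = cand.max(axis=0) + scores[t]
    path = [int(best.argmax())]
    for t in range(n - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return [LABELS[i] for i in reversed(path)]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    print(constrained_viterbi(rng.normal(size=(6, len(LABELS)))))

The same dynamic program applies to any hard transition constraint (e.g., forbidding duplicate core roles), since only the transition matrix changes.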
Anthology ID: P17-1044
Volume: Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month: July
Year: 2017
Address: Vancouver, Canada
Editors: Regina Barzilay, Min-Yen Kan
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 473–483
URL: https://aclanthology.org/P17-1044
DOI: 10.18653/v1/P17-1044
Cite (ACL): Luheng He, Kenton Lee, Mike Lewis, and Luke Zettlemoyer. 2017. Deep Semantic Role Labeling: What Works and What’s Next. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 473–483, Vancouver, Canada. Association for Computational Linguistics.
Cite (Informal): Deep Semantic Role Labeling: What Works and What’s Next (He et al., ACL 2017)
PDF: https://aclanthology.org/P17-1044.pdf
Video: https://aclanthology.org/P17-1044.mp4
Code: luheng/deep_srl
Data: CoNLL, OntoNotes 5.0