
Optimizing Differentiable Relaxations of Coreference Evaluation Metrics

Phong Le, Ivan Titov


Abstract
Coreference evaluation metrics are hard to optimize directly as they are non-differentiable functions, not easily decomposable into elementary decisions. Consequently, most approaches optimize objectives only indirectly related to the end goal, resulting in suboptimal performance. Instead, we propose a differentiable relaxation that lends itself to gradient-based optimisation, thus bypassing the need for reinforcement learning or heuristic modification of cross-entropy. We show that by modifying the training objective of a competitive neural coreference system, we obtain a substantial gain in performance. This suggests that our approach can be regarded as a viable alternative to using reinforcement learning or more computationally expensive imitation learning.
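
The paper's relaxations are defined over the standard coreference evaluation metrics themselves; the sketch below is only a hypothetical, minimal illustration of the underlying idea, namely that replacing hard antecedent decisions with softmax probabilities turns a link-level score into a differentiable function that can be maximised directly by gradient ascent. The toy scores, gold links, and the simple "soft link F1" used here are illustrative assumptions, not the authors' construction (see the linked code repository for that).

```python
# Hypothetical sketch (PyTorch): soft antecedent probabilities make a
# link-based score differentiable, so it can be optimised by gradient ascent.
# Everything below (toy scores, gold links, the "soft F1") is illustrative,
# not the paper's actual relaxation of the coreference metrics.
import torch

torch.manual_seed(0)

n = 5  # number of mentions in a toy document
# Learnable pairwise antecedent scores s(i, j); in a real system these come
# from a neural mention-pair scorer rather than a free parameter.
scores = torch.randn(n, n, requires_grad=True)

# Gold links: gold[i, j] = 1 if mention j is a correct antecedent of mention i.
gold = torch.zeros(n, n)
gold[2, 0] = gold[3, 1] = gold[4, 2] = 1.0

# Mention i may only link to earlier mentions j < i.
ante_mask = torch.tril(torch.ones(n, n), diagonal=-1).bool()

def relaxed_link_f1(scores):
    # Column 0 is a fixed-score "null antecedent" for discourse-new mentions.
    null = torch.zeros(n, 1)
    logits = torch.cat([null, scores.masked_fill(~ante_mask, -1e9)], dim=1)
    # Soft link probabilities p(j | i): the relaxation of the argmax decision.
    probs = torch.softmax(logits, dim=1)[:, 1:]
    tp = (probs * gold).sum()               # expected true-positive links
    precision = tp / (probs.sum() + 1e-8)
    recall = tp / (gold.sum() + 1e-8)
    return 2 * precision * recall / (precision + recall + 1e-8)

opt = torch.optim.Adam([scores], lr=0.1)
for _ in range(200):
    opt.zero_grad()
    (-relaxed_link_f1(scores)).backward()   # gradient ascent on the relaxed metric
    opt.step()

print(f"relaxed link F1 after optimisation: {relaxed_link_f1(scores).item():.3f}")
```

Because every operation above is differentiable, the same pattern can in principle be applied to softened versions of metric-specific quantities, which is the spirit of training directly against a relaxed evaluation score rather than a surrogate cross-entropy loss.
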
Anthology ID: K17-1039
Volume: Proceedings of the 21st Conference on Computational Natural Language Learning (CoNLL 2017)
Month: August
Year: 2017
Address: Vancouver, Canada
Editors: Roger Levy, Lucia Specia
Venue: CoNLL
SIG: SIGNLL
Publisher: Association for Computational Linguistics
Pages: 390–399
URL: https://aclanthology.org/K17-1039
DOI: 10.18653/v1/K17-1039
Cite (ACL): Phong Le and Ivan Titov. 2017. Optimizing Differentiable Relaxations of Coreference Evaluation Metrics. In Proceedings of the 21st Conference on Computational Natural Language Learning (CoNLL 2017), pages 390–399, Vancouver, Canada. Association for Computational Linguistics.
Cite (Informal): Optimizing Differentiable Relaxations of Coreference Evaluation Metrics (Le & Titov, CoNLL 2017)
PDF: https://aclanthology.org/K17-1039.pdf
Code: lephong/diffmetric_coref
Data: CoNLL-2012