
Annotating and Detecting Fine-grained Factual Errors for Dialogue Summarization

Rongxin Zhu, Jianzhong Qi, Jey Han Lau


Abstract
A series of datasets and models have been proposed for summaries generated for well-formatted documents such as news articles. Dialogue summaries, however, have been underexplored. In this paper, we present DIASUMFACT, the first dialogue summarization dataset with fine-grained factual error annotations. We define fine-grained factual error detection as a sentence-level multi-label classification problem, and we evaluate two state-of-the-art (SOTA) models on our dataset. Both models yield sub-optimal results, with a macro-averaged F1 score of around 0.25 over 6 error classes. We further propose ENDERANKER, an unsupervised model that ranks candidates using pretrained encoder-decoder models. Our model performs on par with the SOTA models while requiring fewer resources. These observations confirm the challenges in detecting factual errors from dialogue summaries and call for further studies, for which our dataset and results offer a solid foundation.
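The abstract describes ranking candidates with pretrained encoder-decoder models. The snippet below is a minimal, hypothetical sketch of that general idea (not the authors' ENDERANKER implementation): it scores each candidate summary sentence by its conditional log-likelihood under a BART checkpoint and picks the highest-scoring one. The model name, example dialogue, and candidate sentences are illustrative assumptions.

```python
# Hypothetical illustration of candidate ranking with an encoder-decoder model.
# This is NOT the paper's ENDERANKER; it only sketches likelihood-based scoring.
import torch
from transformers import BartForConditionalGeneration, BartTokenizer

MODEL_NAME = "facebook/bart-large-cnn"  # assumed checkpoint, for illustration only
tokenizer = BartTokenizer.from_pretrained(MODEL_NAME)
model = BartForConditionalGeneration.from_pretrained(MODEL_NAME).eval()

def candidate_log_likelihood(dialogue: str, candidate: str) -> float:
    """Mean token log-probability of `candidate` conditioned on `dialogue`."""
    inputs = tokenizer(dialogue, return_tensors="pt", truncation=True)
    labels = tokenizer(candidate, return_tensors="pt", truncation=True).input_ids
    with torch.no_grad():
        out = model(**inputs, labels=labels)
    # out.loss is the mean cross-entropy over label tokens; negate it for log-likelihood.
    return -out.loss.item()

# Toy example (made up for this sketch).
dialogue = "Anna: Are we still meeting at 5? Ben: Yes, see you at the cafe."
candidates = [
    "Anna and Ben will meet at the cafe at 5.",
    "Anna cancelled the meeting with Ben.",
]
ranked = sorted(candidates, key=lambda c: candidate_log_likelihood(dialogue, c), reverse=True)
print(ranked[0])  # the candidate the model deems most consistent with the dialogue
```

In this sketch, a candidate that contradicts the dialogue should receive a lower likelihood than a faithful one; how such scores map onto the paper's six fine-grained error classes is described in the paper itself.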
Anthology ID:
2023.acl-long.377
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
SIG:
Publisher:
Association for Computational Linguistics
Note:
Pages:
6825–6845
Language:
URL:
https://aclanthology.org/2023.acl-long.377
DOI:
10.18653/v1/2023.acl-long.377
Bibkey:
Cite (ACL):
Rongxin Zhu, Jianzhong Qi, and Jey Han Lau. 2023. Annotating and Detecting Fine-grained Factual Errors for Dialogue Summarization. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 6825–6845, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Annotating and Detecting Fine-grained Factual Errors for Dialogue Summarization (Zhu et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-long.377.pdf
Video:
https://aclanthology.org/2023.acl-long.377.mp4