
Generating Extractive Answers: Gated Recurrent Memory Reader for Conversational Question Answering

Xuanyu Zhang, Qing Yang


Abstract
Conversational question answering (CQA) is a more complicated task than traditional single-turn machine reading comprehension (MRC). Unlike large language models (LLMs) such as ChatGPT, CQA models must extract answers from a given passage to answer follow-up questions according to the conversation history. In this paper, we propose a novel architecture, the Gated Recurrent Memory Reader (GRMR), which integrates traditional extractive MRC models into a generalized sequence-to-sequence framework. After the passage is encoded, the decoder generates extractive answers turn by turn. Unlike previous models that superficially and redundantly concatenate previous questions and answers as context, our model uses less storage space and considers historical memory deeply and selectively. Experiments on the Conversational Question Answering (CoQA) dataset show that our model achieves results comparable to most models with the least space occupancy.
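
To make the gated-memory idea concrete, here is a minimal PyTorch sketch of one way a gated recurrent memory update could work: a fixed-size memory vector summarizes prior turns and is interpolated with the current question encoding via a learned gate, rather than re-encoding the concatenated history each turn. This is an illustrative assumption, not the paper's actual formulation; the module name, shapes, and gating form are hypothetical.

import torch
import torch.nn as nn

class GatedMemory(nn.Module):
    """Hypothetical sketch: one fixed-size memory vector per dialogue,
    updated with a GRU-style gate at each question turn."""
    def __init__(self, hidden_size: int):
        super().__init__()
        self.gate = nn.Linear(2 * hidden_size, hidden_size)       # update gate
        self.candidate = nn.Linear(2 * hidden_size, hidden_size)  # candidate memory

    def forward(self, memory: torch.Tensor, turn_repr: torch.Tensor) -> torch.Tensor:
        # memory:    (batch, hidden) -- summary of previous turns
        # turn_repr: (batch, hidden) -- encoding of the current question
        x = torch.cat([memory, turn_repr], dim=-1)
        g = torch.sigmoid(self.gate(x))          # how much old memory to keep
        m_tilde = torch.tanh(self.candidate(x))  # proposed new memory content
        return g * memory + (1.0 - g) * m_tilde  # gated interpolation

# Usage: carry one memory vector across turns instead of concatenating
# all previous questions and answers, so storage stays constant per dialogue.
memory_cell = GatedMemory(128)
mem = torch.zeros(1, 128)
for turn in torch.randn(3, 1, 128):  # three question turns
    mem = memory_cell(mem, turn)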
Anthology ID:
2023.findings-emnlp.516
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
7699–7704
URL:
https://aclanthology.org/2023.findings-emnlp.516
DOI:
10.18653/v1/2023.findings-emnlp.516
Cite (ACL):
Xuanyu Zhang and Qing Yang. 2023. Generating Extractive Answers: Gated Recurrent Memory Reader for Conversational Question Answering. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 7699–7704, Singapore. Association for Computational Linguistics.
Cite (Informal):
Generating Extractive Answers: Gated Recurrent Memory Reader for Conversational Question Answering (Zhang & Yang, Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.516.pdf