
Learning to Ask: Neural Question Generation for Reading Comprehension

Xinya Du, Junru Shao, Claire Cardie


Abstract
We study automatic question generation for sentences from text passages in reading comprehension. We introduce an attention-based sequence learning model for the task and investigate the effect of encoding sentence- vs. paragraph-level information. In contrast to all previous work, our model does not rely on hand-crafted rules or a sophisticated NLP pipeline; it is instead trainable end-to-end via sequence-to-sequence learning. Automatic evaluation results show that our system significantly outperforms the state-of-the-art rule-based system. In human evaluations, questions generated by our system are also rated as more natural (i.e., grammaticality, fluency) and as more difficult to answer (in terms of syntactic and lexical divergence from the original text and the reasoning needed to answer).
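The attention mechanism at the core of such sequence-to-sequence models can be sketched as follows. This is a minimal NumPy illustration of dot-product attention, not the authors' implementation; all names and dimensions are hypothetical:

```python
import numpy as np

def attention_context(decoder_state, encoder_states):
    """Dot-product attention (illustrative sketch, not the paper's exact model):
    score each encoder hidden state against the current decoder state,
    softmax the scores over source positions, and return the weighted
    sum of encoder states (the context vector) plus the weights."""
    scores = encoder_states @ decoder_state      # shape (T,): one score per source position
    weights = np.exp(scores - scores.max())      # numerically stable softmax
    weights /= weights.sum()
    context = weights @ encoder_states           # shape (H,): convex combination of encoder states
    return context, weights

# Toy example: 4 source positions, hidden size 3 (hypothetical sizes).
rng = np.random.default_rng(0)
enc = rng.normal(size=(4, 3))
dec = rng.normal(size=(3,))
ctx, w = attention_context(dec, enc)
print(w)  # attention weights over the 4 source positions (sum ≈ 1.0)
```

In the full model, the context vector is concatenated with the decoder state at each step to predict the next question word; encoding at the paragraph level simply enlarges the set of encoder states attended over.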
Anthology ID:
P17-1123
Volume:
Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2017
Address:
Vancouver, Canada
Editors:
Regina Barzilay, Min-Yen Kan
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
1342–1352
URL:
https://aclanthology.org/P17-1123
DOI:
10.18653/v1/P17-1123
Cite (ACL):
Xinya Du, Junru Shao, and Claire Cardie. 2017. Learning to Ask: Neural Question Generation for Reading Comprehension. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 1342–1352, Vancouver, Canada. Association for Computational Linguistics.
Cite (Informal):
Learning to Ask: Neural Question Generation for Reading Comprehension (Du et al., ACL 2017)
PDF:
https://aclanthology.org/P17-1123.pdf
Code
xinyadu/nqg (+ additional community code)
Data
MCTest, SQuAD, VQG