
Boosting Dialog Response Generation

Wenchao Du, Alan W Black


Abstract
Neural models have become one of the most important approaches to dialog response generation. However, they still tend to produce the most common and generic responses in the corpus. To address this problem, we designed an iterative training process and an ensemble method based on boosting. We combined our method with base models built under different training and decoding paradigms, including mutual-information-based decoding and reward-augmented maximum likelihood learning. Empirical results show that our approach significantly improves the diversity and relevance of the responses generated by all base models, as supported by both objective measurements and human evaluation.
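
The abstract describes a boosting-style ensemble: base response generators are trained iteratively, with training examples reweighted so that later models focus on responses the current ensemble handles poorly, and the models are combined at scoring time. The sketch below illustrates that general idea only; it is not the paper's exact procedure. The UnigramResponder class, the reweighting rule, and all names are hypothetical stand-ins (a real base model would be a neural seq2seq generator).

# Minimal, hypothetical sketch of a boosting-style ensemble for response
# generation. The base model and reweighting rule are illustrative choices,
# not the method published in the paper.

import math
from collections import Counter
from typing import List, Tuple

Pair = Tuple[str, str]  # (context, response)


class UnigramResponder:
    """Toy 'base model': a weighted unigram model over response tokens."""

    def fit(self, pairs: List[Pair], weights: List[float]) -> None:
        counts = Counter()
        total = 0.0
        for (_, resp), w in zip(pairs, weights):
            for tok in resp.split():
                counts[tok] += w
                total += w
        self.logp = {t: math.log(c / total) for t, c in counts.items()}
        self.unk = math.log(1e-6)  # fallback for unseen tokens

    def log_likelihood(self, context: str, response: str) -> float:
        return sum(self.logp.get(t, self.unk) for t in response.split())


def boost_responders(pairs: List[Pair], rounds: int = 3) -> List[UnigramResponder]:
    """Iteratively train base models, upweighting examples the current
    ensemble scores poorly (typically the less generic responses)."""
    n = len(pairs)
    weights = [1.0 / n] * n
    ensemble: List[UnigramResponder] = []
    for _ in range(rounds):
        model = UnigramResponder()
        model.fit(pairs, weights)
        ensemble.append(model)
        # Average per-example log-likelihood under the current ensemble.
        scores = [
            sum(m.log_likelihood(c, r) for m in ensemble) / len(ensemble)
            for c, r in pairs
        ]
        # Reweight: lower (length-normalized) likelihood -> more mass next round.
        raw = [math.exp(-s / max(len(r.split()), 1)) for s, (_, r) in zip(scores, pairs)]
        z = sum(raw)
        weights = [v / z for v in raw]
    return ensemble


def ensemble_score(ensemble: List[UnigramResponder], context: str, response: str) -> float:
    """Combine base models by averaging log-likelihoods, e.g. to rerank candidates."""
    return sum(m.log_likelihood(context, response) for m in ensemble) / len(ensemble)


if __name__ == "__main__":
    data = [("how are you", "i am fine"),
            ("how are you", "doing great today"),
            ("what is your name", "i am a bot")]
    models = boost_responders(data, rounds=2)
    print(ensemble_score(models, "how are you", "i am fine"))

In this toy setup, combining the boosted models at scoring time plays the role the paper assigns to the ensemble at decoding; with neural base models the same weighted combination would instead guide beam search or rerank generated candidates.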
Anthology ID:
P19-1005
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Editors:
Anna Korhonen, David Traum, Lluís Màrquez
Venue:
ACL
SIG:
Publisher:
Association for Computational Linguistics
Note:
Pages:
38–43
Language:
URL:
https://aclanthology.org/P19-1005
DOI:
10.18653/v1/P19-1005
Bibkey:
Cite (ACL):
Wenchao Du and Alan W Black. 2019. Boosting Dialog Response Generation. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 38–43, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Boosting Dialog Response Generation (Du & Black, ACL 2019)
PDF:
https://aclanthology.org/P19-1005.pdf
Video:
https://aclanthology.org/P19-1005.mp4