Space Efficient Context Encoding for Non-Task-Oriented Dialogue Generation with Graph Attention Transformer

Fabian Galetzka, Jewgeni Rose, David Schlangen, Jens Lehmann


Abstract
To improve the coherence and knowledge retrieval capabilities of non-task-oriented dialogue systems, recent Transformer-based models aim to integrate fixed background context. This often comes in the form of knowledge graphs, and the integration is typically done by paraphrasing knowledge triples into pseudo utterances that are then added to the accumulated dialogue context. However, the context length in these architectures is fixed, which restricts how much background or dialogue context can be kept. In this work, we propose a more concise encoding for background context structured as knowledge graphs, expressing the graph connections through restrictions on the attention weights. The results of our human evaluation show that this encoding reduces space requirements without negative effects on the precision of knowledge reproduction and perceived consistency. Further, models trained with our proposed context encoding generate dialogues that are judged to be more comprehensive and interesting.
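
The core idea, expressing graph connections through restrictions on the attention weights, can be illustrated with a small attention-mask sketch. The Python code below is a minimal illustration under assumed conventions, not the paper's implementation: the helper name build_triple_attention_mask is hypothetical, and the input layout (linearized knowledge triples followed by dialogue tokens) is an assumption made for the example.

import torch

# Hypothetical helper (not the paper's code): build a boolean mask where
# True means "may attend". The sequence is assumed to be the concatenation
# of linearized knowledge triples followed by the dialogue tokens.
def build_triple_attention_mask(triple_lengths, n_dialogue_tokens):
    n_kg = sum(triple_lengths)
    n = n_kg + n_dialogue_tokens
    mask = torch.zeros(n, n, dtype=torch.bool)
    start = 0
    for length in triple_lengths:
        # Tokens of one triple attend only within that triple.
        mask[start:start + length, start:start + length] = True
        start += length
    # Dialogue tokens attend to every position (triples and dialogue).
    mask[n_kg:, :] = True
    return mask

# Usage with PyTorch attention, which expects True = "blocked", hence ~mask.
mask = build_triple_attention_mask(triple_lengths=[3, 3], n_dialogue_tokens=5)
attn = torch.nn.MultiheadAttention(embed_dim=16, num_heads=4, batch_first=True)
x = torch.randn(1, mask.size(0), 16)  # (batch, sequence, embedding)
out, _ = attn(x, x, x, attn_mask=~mask)

In this sketch the knowledge part of the mask stays block-diagonal, so each triple occupies only its own tokens instead of a full paraphrased pseudo utterance, reflecting the intuition that unrelated facts need not attend to one another.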
Anthology ID:
2021.acl-long.546
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Month:
August
Year:
2021
Address:
Online
Editors:
Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
7028–7041
URL:
https://aclanthology.org/2021.acl-long.546
DOI:
10.18653/v1/2021.acl-long.546
Cite (ACL):
Fabian Galetzka, Jewgeni Rose, David Schlangen, and Jens Lehmann. 2021. Space Efficient Context Encoding for Non-Task-Oriented Dialogue Generation with Graph Attention Transformer. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 7028–7041, Online. Association for Computational Linguistics.
Cite (Informal):
Space Efficient Context Encoding for Non-Task-Oriented Dialogue Generation with Graph Attention Transformer (Galetzka et al., ACL-IJCNLP 2021)
PDF:
https://aclanthology.org/2021.acl-long.546.pdf
Video:
https://aclanthology.org/2021.acl-long.546.mp4
Code:
fabiangal/space-efficient-context-encoding-acl21
Data:
OpenDialKG