
Stateful Memory-Augmented Transformers for Efficient Dialogue Modeling

Qingyang Wu, Zhou Yu


Abstract
Transformer models have achieved strong performance on dialogue generation tasks. However, their inability to process long dialogue histories often forces truncation of the context. To address this problem, we propose a novel memory-augmented transformer that is compatible with existing pre-trained encoder-decoder models and enables efficient preservation of dialogue history information. The model incorporates a separate memory module alongside the pre-trained transformer, which effectively exchanges information between the memory states and the current input context. We evaluate the efficiency and performance of our model on three dialogue datasets and two language modeling datasets. Experimental results show that our method achieves superior efficiency and performance compared to other pre-trained Transformer baselines.
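The abstract describes a separate memory module that exchanges information with the current input context. The snippet below is a minimal, hypothetical PyTorch sketch of that general idea, not the authors' implementation: the class name `MemoryAugmentedLayer`, the number of memory slots, and the two-step cross-attention exchange (memory reads from the context, then the context reads back from the updated memory) are all assumptions made purely for illustration.

```python
# Hypothetical sketch of a memory-augmented transformer layer (not the paper's code).
from typing import Optional

import torch
import torch.nn as nn


class MemoryAugmentedLayer(nn.Module):
    def __init__(self, d_model: int = 512, n_heads: int = 8, num_memory_slots: int = 16):
        super().__init__()
        # Learnable initial memory states, shared across the batch (assumed design).
        self.init_memory = nn.Parameter(torch.randn(num_memory_slots, d_model) * 0.02)
        # Cross-attention in both directions: memory <- context, context <- memory.
        self.mem_from_ctx = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ctx_from_mem = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm_mem = nn.LayerNorm(d_model)
        self.norm_ctx = nn.LayerNorm(d_model)

    def forward(self, hidden_states: torch.Tensor, memory: Optional[torch.Tensor] = None):
        # hidden_states: (batch, seq_len, d_model), e.g. from a pre-trained encoder layer.
        batch = hidden_states.size(0)
        if memory is None:
            memory = self.init_memory.unsqueeze(0).expand(batch, -1, -1)
        # 1) Update the memory states by attending over the current input context.
        mem_update, _ = self.mem_from_ctx(memory, hidden_states, hidden_states)
        memory = self.norm_mem(memory + mem_update)
        # 2) Enrich the input context by attending over the updated memory.
        ctx_update, _ = self.ctx_from_mem(hidden_states, memory, memory)
        hidden_states = self.norm_ctx(hidden_states + ctx_update)
        # Return both so the memory can be carried across dialogue turns.
        return hidden_states, memory


if __name__ == "__main__":
    layer = MemoryAugmentedLayer()
    x = torch.randn(2, 10, 512)            # hidden states for the current turn
    h, mem = layer(x)                      # first turn: memory initialized from the learned slots
    x_next = torch.randn(2, 12, 512)       # hidden states for the next turn
    h2, mem = layer(x_next, memory=mem)    # carry the memory state forward
```

The key point the sketch tries to convey is that only the fixed-size memory tensor is carried between turns, so the full dialogue history never has to be re-encoded; the exact update rule used in the paper may differ.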
Anthology ID:
2024.findings-eacl.57
Volume:
Findings of the Association for Computational Linguistics: EACL 2024
Month:
March
Year:
2024
Address:
St. Julian’s, Malta
Editors:
Yvette Graham, Matthew Purver
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
853–867
URL:
https://aclanthology.org/2024.findings-eacl.57
Cite (ACL):
Qingyang Wu and Zhou Yu. 2024. Stateful Memory-Augmented Transformers for Efficient Dialogue Modeling. In Findings of the Association for Computational Linguistics: EACL 2024, pages 853–867, St. Julian’s, Malta. Association for Computational Linguistics.
Cite (Informal):
Stateful Memory-Augmented Transformers for Efficient Dialogue Modeling (Wu & Yu, Findings 2024)
PDF:
https://aclanthology.org/2024.findings-eacl.57.pdf
Software:
2024.findings-eacl.57.software.zip