%0 Conference Proceedings
%T Stateful Memory-Augmented Transformers for Efficient Dialogue Modeling
%A Wu, Qingyang
%A Yu, Zhou
%Y Graham, Yvette
%Y Purver, Matthew
%S Findings of the Association for Computational Linguistics: EACL 2024
%D 2024
%8 March
%I Association for Computational Linguistics
%C St. Julian's, Malta
%F wu-yu-2024-stateful
%X Transformer models have achieved strong performance on dialogue generation tasks. However, their inability to process long dialogue histories often forces truncation of the context. To address this problem, we propose a novel memory-augmented transformer that is compatible with existing pre-trained encoder-decoder models and enables efficient preservation of dialogue history information. The new model incorporates a separate memory module alongside the pre-trained transformer, which effectively exchanges information between the memory states and the current input context. We evaluate the efficiency of our model on three dialogue datasets and two language modeling datasets. Experimental results show that our method achieves superior efficiency and performance compared to other pre-trained Transformer baselines.
%U https://aclanthology.org/2024.findings-eacl.57
%P 853-867