
Local and Global Contexts for Conversation

Zuoquan Lin, Xinyi Shen


Abstract
The context in conversation is the dialog history, which is crucial for multi-turn dialogue. Learning from the relevant contexts in the dialog history for grounded conversation is a challenging problem. The local context comprises the utterances nearest to the response and is the most sensitive to it, while the global context spans the whole conversation, reaching far beyond neighboring utterances. Current pretrained transformer models for conversation struggle to capture the correlation and connection between local and global contexts. We introduce the local and global conversation model (LGCM), a general-purpose model for open-domain conversation. LGCM is a local-global hierarchical transformer that accurately discerns and assimilates the contexts relevant for generating responses. It employs a local encoder to grasp the local context at the level of individual utterances and a global encoder to understand the broader context at the dialogue level. The fusion of these locally and globally contextualized encodings yields a comprehensive understanding of the conversation. Experiments on popular datasets show that LGCM outperforms existing conversation models on automatic metrics by significant margins.
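To make the hierarchical idea concrete, the following is a minimal PyTorch sketch of a local-global encoder in the spirit of the abstract: a local transformer encodes each utterance at the token level, a global transformer runs over pooled utterance vectors at the dialogue level, and the two encodings are combined. The layer sizes, mean pooling, and gated fusion here are illustrative assumptions for exposition, not the paper's exact architecture.

```python
import torch
import torch.nn as nn

class LocalGlobalEncoder(nn.Module):
    """Hypothetical local-global hierarchical encoder (sketch, not LGCM itself).

    - Local encoder: token-level self-attention within each utterance.
    - Global encoder: utterance-level self-attention over the whole dialogue.
    - Fusion: a learned gate mixing local and global encodings (an assumption).
    """

    def __init__(self, vocab_size=32000, d_model=512, nhead=8, num_layers=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.local_encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead, batch_first=True),
            num_layers,
        )
        self.global_encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead, batch_first=True),
            num_layers,
        )
        self.gate = nn.Linear(2 * d_model, d_model)

    def forward(self, dialog_tokens):
        # dialog_tokens: (num_utterances, utterance_len) token ids of one dialogue.
        x = self.embed(dialog_tokens)                 # (U, T, d)
        local = self.local_encoder(x)                 # token-level, per utterance
        pooled = local.mean(dim=1)                    # (U, d), one vector per utterance
        global_ctx = self.global_encoder(pooled.unsqueeze(0)).squeeze(0)  # (U, d)
        # Broadcast each utterance's dialogue-level vector to its tokens and fuse.
        g = global_ctx.unsqueeze(1).expand_as(local)  # (U, T, d)
        gate = torch.sigmoid(self.gate(torch.cat([local, g], dim=-1)))
        return gate * local + (1 - gate) * g          # (U, T, d) fused encodings

# Usage: encode a toy dialogue of 3 utterances, 12 tokens each.
encoder = LocalGlobalEncoder()
tokens = torch.randint(0, 32000, (3, 12))
fused = encoder(tokens)  # torch.Size([3, 12, 512])
```

A decoder generating the next response would then attend over these fused encodings; how the paper performs that fusion and decoding is specified in the full text rather than here.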
Anthology ID: 2024.findings-eacl.95
Volume: Findings of the Association for Computational Linguistics: EACL 2024
Month: March
Year: 2024
Address: St. Julian’s, Malta
Editors: Yvette Graham, Matthew Purver
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 1408–1418
URL: https://aclanthology.org/2024.findings-eacl.95
Cite (ACL): Zuoquan Lin and Xinyi Shen. 2024. Local and Global Contexts for Conversation. In Findings of the Association for Computational Linguistics: EACL 2024, pages 1408–1418, St. Julian’s, Malta. Association for Computational Linguistics.
Cite (Informal): Local and Global Contexts for Conversation (Lin & Shen, Findings 2024)
PDF: https://aclanthology.org/2024.findings-eacl.95.pdf