
ECONET: Effective Continual Pretraining of Language Models for Event Temporal Reasoning

Rujun Han, Xiang Ren, Nanyun Peng


Abstract
While pre-trained language models (PTLMs) have achieved noticeable success on many NLP tasks, they still struggle with tasks that require event temporal reasoning, which is essential for event-centric applications. We present a continual pre-training approach that equips PTLMs with targeted knowledge about event temporal relations. We design self-supervised learning objectives to recover masked-out event and temporal indicators and to discriminate sentences from their corrupted counterparts (where event or temporal indicators are replaced). By further pre-training a PTLM with these objectives jointly, we reinforce its attention to event and temporal information, yielding enhanced capability on event temporal reasoning. This Effective CONtinual pre-training framework for Event Temporal reasoning (ECONET) improves the PTLMs' fine-tuning performance across five relation extraction and question answering tasks and achieves new or on-par state-of-the-art performance on most of our downstream tasks.
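The following is a minimal sketch of the two self-supervised objectives the abstract describes: recovering masked-out temporal indicators, and discriminating original sentences from corrupted counterparts. It assumes a RoBERTa-style masked LM from Hugging Face transformers; the toy indicator lexicon, corruption scheme, and equal loss weighting are illustrative assumptions for exposition, not the authors' released implementation (see the pluslabnlp/econet repository linked below for that).

```python
# Sketch of ECONET-style joint objectives: (1) masked indicator recovery and
# (2) original-vs-corrupted sentence discrimination. Lexicon, corruption
# scheme, and loss weighting are illustrative assumptions only.
import random

import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForMaskedLM.from_pretrained("roberta-base")

# Toy lexicon; these indicators tokenize to a single subword, which keeps
# the masked and original sequences aligned token-for-token.
TEMPORAL_INDICATORS = {"before", "after", "during", "while", "until"}

def mask_indicators(sentence: str) -> str:
    """Replace every temporal indicator with the tokenizer's mask token."""
    return " ".join(
        tokenizer.mask_token if w.lower() in TEMPORAL_INDICATORS else w
        for w in sentence.split()
    )

def corrupt_indicators(sentence: str) -> str:
    """Swap each temporal indicator for a random different one."""
    return " ".join(
        random.choice(sorted(TEMPORAL_INDICATORS - {w.lower()}))
        if w.lower() in TEMPORAL_INDICATORS else w
        for w in sentence.split()
    )

sentence = "The market crashed before the banks failed."

# Objective 1: masked indicator recovery (MLM loss on masked positions only).
inputs = tokenizer(mask_indicators(sentence), return_tensors="pt")
labels = tokenizer(sentence, return_tensors="pt")["input_ids"]
labels[inputs["input_ids"] != tokenizer.mask_token_id] = -100  # ignore non-masks
mlm_loss = model(**inputs, labels=labels).loss

# Objective 2: discriminate the original sentence from a corrupted one,
# using a small classification head over the encoder's <s> representation.
disc_head = torch.nn.Linear(model.config.hidden_size, 2)

def disc_logits(text: str) -> torch.Tensor:
    enc = tokenizer(text, return_tensors="pt")
    hidden = model.roberta(**enc).last_hidden_state[:, 0]  # <s> token
    return disc_head(hidden)

logits = torch.cat([disc_logits(sentence),
                    disc_logits(corrupt_indicators(sentence))])
disc_loss = torch.nn.functional.cross_entropy(logits, torch.tensor([1, 0]))

# Joint objective: training both losses together is what the abstract calls
# reinforcing the model's attention to event and temporal information.
loss = mlm_loss + disc_loss
loss.backward()
```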
Anthology ID:
2021.emnlp-main.436
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
5367–5380
URL:
https://aclanthology.org/2021.emnlp-main.436
DOI:
10.18653/v1/2021.emnlp-main.436
Cite (ACL):
Rujun Han, Xiang Ren, and Nanyun Peng. 2021. ECONET: Effective Continual Pretraining of Language Models for Event Temporal Reasoning. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 5367–5380, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
ECONET: Effective Continual Pretraining of Language Models for Event Temporal Reasoning (Han et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.436.pdf
Video:
https://aclanthology.org/2021.emnlp-main.436.mp4
Code:
pluslabnlp/econet + additional community code
Data:
MC-TACO, Torque