
Unsupervised Domain Adaptation for Joint Information Extraction

Nghia Ngo, Bonan Min, Thien Nguyen


Abstract
Joint Information Extraction (JIE) aims to jointly solve multiple tasks in the Information Extraction pipeline (e.g., entity mention, event trigger, relation, and event argument extraction). Due to their ability to leverage task dependencies and avoid error propagation, JIE models have achieved state-of-the-art performance on different IE tasks. However, an issue with current JIE methods is that they only focus on the standard supervised learning setting, where training and test data come from the same domain. Cross-domain/domain adaptation learning, with training and test data in different domains, has not been explored for JIE, thus hindering the application of this technology to different domains in practice. To address this issue, our work introduces the first study to evaluate the performance of JIE models in the unsupervised domain adaptation setting. In addition, we present a novel method to induce domain-invariant representations for the tasks in JIE, called Domain Adaptation for Joint Information Extraction (DA4JIE). In DA4JIE, we propose an Instance-relational Domain Adaptation mechanism that seeks to align representations of task instances in JIE across domains through a generalized version of the domain-adversarial learning approach. We further devise a Context-invariant Structure Learning technique to filter domain-specialized contextual information from induced representations to boost the performance of JIE models in new domains. Extensive experiments and analyses demonstrate that DA4JIE can significantly improve out-of-domain performance of current state-of-the-art JIE systems on all IE tasks.
Anthology ID:
2022.findings-emnlp.434
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
5894–5905
URL:
https://aclanthology.org/2022.findings-emnlp.434
DOI:
10.18653/v1/2022.findings-emnlp.434
Cite (ACL):
Nghia Ngo, Bonan Min, and Thien Nguyen. 2022. Unsupervised Domain Adaptation for Joint Information Extraction. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 5894–5905, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Unsupervised Domain Adaptation for Joint Information Extraction (Ngo et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-emnlp.434.pdf