
Paragraph-based Transformer Pre-training for Multi-Sentence Inference

Luca Di Liello, Siddhant Garg, Luca Soldaini, Alessandro Moschitti


Abstract
Inference tasks such as answer sentence selection (AS2) and fact verification are typically solved by fine-tuning transformer-based models as individual sentence-pair classifiers. Recent studies show that these tasks benefit from jointly modeling dependencies across multiple candidate sentences. In this paper, we first show that popular pre-trained transformers perform poorly when fine-tuned on multi-candidate inference tasks. We then propose a new pre-training objective that models paragraph-level semantics across multiple input sentences. Our evaluation on three AS2 datasets and one fact verification dataset demonstrates the superiority of our pre-training technique over traditional objectives, both for transformers used as joint models over multiple candidates and for transformers used as cross-encoders on sentence-pair formulations of these tasks.
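To make the contrast in the abstract concrete, below is a minimal, hypothetical sketch of the two input formulations it describes: (a) a sentence-pair cross-encoder that scores each candidate independently, and (b) a joint formulation where the question and all candidates share one input, so self-attention can model paragraph-level, cross-candidate semantics. The encoder choice (roberta-base), separator scheme, pooling, and stand-in scoring are illustrative assumptions, not the authors' pre-training objective or released code (see the amazon-research repository linked below for that).

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Generic encoder; the paper's own pre-trained checkpoints are not assumed here.
tokenizer = AutoTokenizer.from_pretrained("roberta-base")
encoder = AutoModel.from_pretrained("roberta-base")

question = "Who wrote The Old Man and the Sea?"
candidates = [
    "Ernest Hemingway wrote the novella in 1951.",
    "The book won the Pulitzer Prize for Fiction in 1953.",
]

def pairwise_scores(question, candidates):
    """(a) Sentence-pair cross-encoder: each (question, candidate) pair is
    encoded independently, so dependencies across candidates are never seen."""
    scores = []
    for cand in candidates:
        enc = tokenizer(question, cand, return_tensors="pt", truncation=True)
        with torch.no_grad():
            out = encoder(**enc)
        cls = out.last_hidden_state[:, 0]   # [CLS]-style pooled token
        scores.append(cls.norm().item())    # stand-in for a trained classifier head
    return scores

def joint_encoding(question, candidates):
    """(b) Joint multi-candidate formulation: one input holds the question and
    all candidates, letting self-attention relate candidates to each other."""
    sep = tokenizer.sep_token
    text = f" {sep} ".join([question] + candidates)
    enc = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        out = encoder(**enc)
    # A real joint model would pool one vector per candidate (e.g. at each
    # separator position) and score them with a shared classification head.
    return out.last_hidden_state
```

The design point the paper targets is that standard pre-training never exposes the model to inputs like those in (b); the proposed paragraph-level objective is meant to close that gap before fine-tuning.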
Anthology ID:
2022.naacl-main.181
Volume:
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
2521–2531
URL:
https://aclanthology.org/2022.naacl-main.181
DOI:
10.18653/v1/2022.naacl-main.181
Cite (ACL):
Luca Di Liello, Siddhant Garg, Luca Soldaini, and Alessandro Moschitti. 2022. Paragraph-based Transformer Pre-training for Multi-Sentence Inference. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 2521–2531, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
Paragraph-based Transformer Pre-training for Multi-Sentence Inference (Di Liello et al., NAACL 2022)
PDF:
https://aclanthology.org/2022.naacl-main.181.pdf
Video:
https://aclanthology.org/2022.naacl-main.181.mp4
Code:
amazon-research/wqa-multi-sentence-inference
Data:
ASNQ, FEVER, TrecQA, WikiQA