
LaPraDoR: Unsupervised Pretrained Dense Retriever for Zero-Shot Text Retrieval

Canwen Xu, Daya Guo, Nan Duan, Julian McAuley


Abstract
In this paper, we propose LaPraDoR, a pretrained dual-tower dense retriever that does not require any supervised data for training. Specifically, we first present Iterative Contrastive Learning (ICoL), which iteratively trains the query and document encoders with a cache mechanism. ICoL not only enlarges the number of negative instances but also keeps the representations of cached examples in the same hidden space. We then propose Lexicon-Enhanced Dense Retrieval (LEDR) as a simple yet effective way to enhance dense retrieval with lexical matching. We evaluate LaPraDoR on the recently proposed BEIR benchmark, which comprises 18 datasets spanning 9 zero-shot text retrieval tasks. Experimental results show that LaPraDoR achieves state-of-the-art performance compared with supervised dense retrieval models, and further analysis reveals the effectiveness of our training strategy and objectives. Compared to re-ranking, our lexicon-enhanced approach can be run in milliseconds (22.5x faster) while achieving superior performance.
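The lexicon-enhanced scoring idea sketched in the abstract can be illustrated with a few lines of Python. The snippet below is a minimal, hypothetical sketch rather than the paper's implementation: it assumes the rank_bm25 package for lexical (BM25) scoring, stubs out the dual-tower dense encoder with a placeholder function (dense_scores is an invented name), and fuses the two signals with a simple elementwise product, which may differ from the exact combination used in LaPraDoR.

```python
import numpy as np
from rank_bm25 import BM25Okapi  # lexical scorer; pip install rank-bm25


def dense_scores(query: str, docs: list[str]) -> np.ndarray:
    """Placeholder for the dual-tower dense retriever: in the real model,
    encode the query and documents with the query/document encoders and take
    their dot products. Random scores keep this sketch runnable end to end."""
    rng = np.random.default_rng(0)
    return rng.random(len(docs))


def lexicon_enhanced_scores(query: str, docs: list[str]) -> np.ndarray:
    """Fuse lexical (BM25) and dense scores; shown here as an elementwise
    product, one simple way to combine the two signals."""
    bm25 = BM25Okapi([d.lower().split() for d in docs])
    lexical = np.asarray(bm25.get_scores(query.lower().split()))
    dense = dense_scores(query, docs)
    return lexical * dense


docs = [
    "LaPraDoR is an unsupervised pretrained dense retriever.",
    "BM25 is a classic lexical matching function.",
    "Zero-shot retrieval evaluates models on unseen datasets.",
]
scores = lexicon_enhanced_scores("unsupervised dense retrieval", docs)
print(np.argsort(-scores))  # document indices ranked by combined score
```

Because the lexical scores come from a fast term-matching lookup rather than a cross-encoder re-ranker, this kind of fusion adds little latency on top of the dense retrieval step, which is the intuition behind the speedup over re-ranking reported in the abstract.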
Anthology ID:
2022.findings-acl.281
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
3557–3569
URL:
https://aclanthology.org/2022.findings-acl.281
DOI:
10.18653/v1/2022.findings-acl.281
Cite (ACL):
Canwen Xu, Daya Guo, Nan Duan, and Julian McAuley. 2022. LaPraDoR: Unsupervised Pretrained Dense Retriever for Zero-Shot Text Retrieval. In Findings of the Association for Computational Linguistics: ACL 2022, pages 3557–3569, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
LaPraDoR: Unsupervised Pretrained Dense Retriever for Zero-Shot Text Retrieval (Xu et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-acl.281.pdf
Video:
https://aclanthology.org/2022.findings-acl.281.mp4
Code:
jetrunner/laprador
Data:
BEIR, C4, CLIMATE-FEVER, FEVER, HotpotQA, MS MARCO, Natural Questions, SciFact