
Aggregating Bidirectional Encoder Representations Using MatchLSTM for Sequence Matching

Bo Shao, Yeyun Gong, Weizhen Qi, Nan Duan, Xiaola Lin


Abstract
In this work, we propose an aggregation method that combines Bidirectional Encoder Representations from Transformers (BERT) with a MatchLSTM layer for sequence matching. Given a sentence pair, we extract its output representations from BERT. We then extend BERT with a MatchLSTM layer to capture further interaction between the two sentences for sequence matching tasks. Taking natural language inference as an example, we split the BERT output into two parts, one from the premise sentence and one from the hypothesis sentence. At each position of the hypothesis sentence, both the weighted representation of the premise sentence and the representation of the current token are fed into an LSTM. We jointly train the aggregation layer and the pre-trained layers for sequence matching. We conduct experiments on two publicly available datasets, WikiQA and SNLI. The experiments show that our model achieves significant improvements over state-of-the-art methods on both datasets.
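The aggregation step described in the abstract can be sketched as follows: the BERT output is split into premise and hypothesis representations; at each hypothesis position an attention-weighted premise vector is computed, concatenated with the current token's representation, and fed into an LSTM. This is a minimal NumPy sketch under stated assumptions — the toy dimensions, dot-product attention, and the hand-rolled single-layer LSTM cell are illustrative choices, not the authors' actual implementation:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def lstm_cell(x, h, c, W, U, b):
    """One LSTM step; gate pre-activations are stacked as [i, f, o, g]."""
    z = W @ x + U @ h + b
    d = h.shape[0]
    i, f, o = (1.0 / (1.0 + np.exp(-z[k * d:(k + 1) * d])) for k in range(3))
    g = np.tanh(z[3 * d:])
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

def matchlstm_aggregate(Hp, Hh, W, U, b, d):
    """Run MatchLSTM-style aggregation over hypothesis positions.

    Hp: premise representations from BERT, shape (len_p, d_bert)
    Hh: hypothesis representations from BERT, shape (len_h, d_bert)
    Returns the final LSTM hidden state, shape (d,).
    """
    h, c = np.zeros(d), np.zeros(d)
    for j in range(Hh.shape[0]):
        # Attention weights of the current hypothesis token over the premise.
        a = softmax(Hp @ Hh[j]) @ Hp
        # Concatenate weighted premise vector with the current token vector.
        x = np.concatenate([a, Hh[j]])
        h, c = lstm_cell(x, h, c, W, U, b)
    return h

# Toy example with made-up sizes (d_bert would be 768 for BERT-base).
rng = np.random.default_rng(0)
d_bert, d = 8, 4
Hp = rng.standard_normal((5, d_bert))   # premise: 5 tokens
Hh = rng.standard_normal((3, d_bert))   # hypothesis: 3 tokens
W = 0.1 * rng.standard_normal((4 * d, 2 * d_bert))
U = 0.1 * rng.standard_normal((4 * d, d))
b = np.zeros(4 * d)
out = matchlstm_aggregate(Hp, Hh, W, U, b, d)
print(out.shape)  # (4,)
```

The final hidden state would then feed a classifier head (entailment labels for SNLI, answer-selection scores for WikiQA); in the paper this aggregation layer is trained jointly with the pre-trained BERT layers.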
Anthology ID:
D19-1626
Volume:
Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)
Month:
November
Year:
2019
Address:
Hong Kong, China
Editors:
Kentaro Inui, Jing Jiang, Vincent Ng, Xiaojun Wan
Venues:
EMNLP | IJCNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
6059–6063
URL:
https://aclanthology.org/D19-1626
DOI:
10.18653/v1/D19-1626
Cite (ACL):
Bo Shao, Yeyun Gong, Weizhen Qi, Nan Duan, and Xiaola Lin. 2019. Aggregating Bidirectional Encoder Representations Using MatchLSTM for Sequence Matching. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), pages 6059–6063, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
Aggregating Bidirectional Encoder Representations Using MatchLSTM for Sequence Matching (Shao et al., EMNLP-IJCNLP 2019)
PDF:
https://aclanthology.org/D19-1626.pdf