
Inspecting Unification of Encoding and Matching with Transformer: A Case Study of Machine Reading Comprehension

Hangbo Bao, Li Dong, Furu Wei, Wenhui Wang, Nan Yang, Lei Cui, Songhao Piao, Ming Zhou


Abstract
Most machine reading comprehension (MRC) models separately handle encoding and matching with different network architectures. In contrast, pre-trained language models with Transformer layers, such as GPT (Radford et al., 2018) and BERT (Devlin et al., 2018), have achieved competitive performance on MRC. A research question that naturally arises is: apart from the benefits of pre-training, how much of the performance gain comes from the unified network architecture? In this work, we evaluate and analyze unifying the encoding and matching components with Transformer for the MRC task. Experimental results on SQuAD show that the unified model outperforms previous networks that treat encoding and matching separately. We also introduce a metric to inspect whether a Transformer layer tends to perform encoding or matching. The analysis shows that the unified model learns different modeling strategies from previous manually designed models.
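The paper's exact layer-inspection metric is not given on this page, but the idea can be illustrated with a small proxy: for a given Transformer layer, measure how much attention mass crosses the question/passage boundary. A layer whose attention mostly stays within a segment behaves more like an encoding layer, while one that attends heavily across segments behaves more like a matching layer. The sketch below is a minimal, hypothetical implementation of that proxy in PyTorch; the function name, tensor shapes, and segment convention are assumptions, not the authors' code.

```python
import torch

def cross_segment_attention_ratio(attn: torch.Tensor,
                                  segment_ids: torch.Tensor) -> float:
    """Illustrative proxy metric (not the paper's exact definition):
    the fraction of a layer's attention mass that crosses the
    question/passage boundary.

    attn:        (batch, heads, seq_len, seq_len) softmax attention weights
    segment_ids: (batch, seq_len), e.g. 0 for question tokens, 1 for passage
    """
    # cross[b, i, j] is True when tokens i and j lie in different segments
    cross = segment_ids.unsqueeze(2) != segment_ids.unsqueeze(1)  # (batch, seq, seq)
    cross = cross.unsqueeze(1).to(attn.dtype)                      # broadcast over heads

    # Attention mass each position sends across the segment boundary
    cross_mass = (attn * cross).sum(dim=-1)                        # (batch, heads, seq)

    # Average over batch, heads, and positions; rows of attn sum to 1,
    # so the result lies in [0, 1]
    return cross_mass.mean().item()
```

Applied per layer, such a statistic would let one plot how "matching-like" each layer of a unified Transformer is, which is the kind of analysis the abstract describes.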
Anthology ID:
D19-5802
Volume:
Proceedings of the 2nd Workshop on Machine Reading for Question Answering
Month:
November
Year:
2019
Address:
Hong Kong, China
Editors:
Adam Fisch, Alon Talmor, Robin Jia, Minjoon Seo, Eunsol Choi, Danqi Chen
Venue:
WS
Publisher:
Association for Computational Linguistics
Pages:
14–18
URL:
https://aclanthology.org/D19-5802
DOI:
10.18653/v1/D19-5802
Cite (ACL):
Hangbo Bao, Li Dong, Furu Wei, Wenhui Wang, Nan Yang, Lei Cui, Songhao Piao, and Ming Zhou. 2019. Inspecting Unification of Encoding and Matching with Transformer: A Case Study of Machine Reading Comprehension. In Proceedings of the 2nd Workshop on Machine Reading for Question Answering, pages 14–18, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
Inspecting Unification of Encoding and Matching with Transformer: A Case Study of Machine Reading Comprehension (Bao et al., 2019)
PDF:
https://aclanthology.org/D19-5802.pdf