
Contextual and Non-Contextual Word Embeddings: an in-depth Linguistic Investigation

Alessio Miaschi, Felice Dell’Orletta


Abstract
In this paper we present a comparison between the linguistic knowledge encoded in the internal representations of a contextual language model (BERT) and a context-independent one (Word2vec). We use a wide set of probing tasks, each of which corresponds to a distinct sentence-level feature extracted from different levels of linguistic annotation. We show that, although BERT is capable of understanding the full context of each word in an input sequence, the implicit knowledge encoded in its aggregated sentence representations is still comparable to that of a context-independent model. We also find that BERT is able to encode sentence-level properties even within single-word embeddings, obtaining results that are comparable to or even better than those obtained with sentence representations.
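
The abstract describes sentence-level probing over aggregated representations. The snippet below is a rough, hypothetical sketch of how such a probe can be set up, not the authors' actual code or feature set: it builds mean-pooled sentence vectors from BERT and from pretrained Word2vec embeddings and fits a linear regressor to predict one simple sentence-level property (token count, as a stand-in for the paper's linguistic features). The pooling strategy, the Ridge probe, and the use of the Google News Word2vec vectors are illustrative assumptions.

```python
import numpy as np
import torch
import gensim.downloader as api
from transformers import AutoTokenizer, AutoModel
from sklearn.linear_model import Ridge

# Toy sentences and a toy sentence-level feature (token count); the paper probes
# a much wider set of linguistic features, which are not reproduced here.
sentences = [
    "The cat sat on the mat .",
    "Colorless green ideas sleep furiously .",
    "She read the book that her brother had recommended .",
    "It rains .",
]
labels = np.array([len(s.split()) for s in sentences], dtype=float)

# BERT sentence representations: mean over the last-layer token embeddings.
tok = AutoTokenizer.from_pretrained("bert-base-uncased")
bert = AutoModel.from_pretrained("bert-base-uncased")
with torch.no_grad():
    enc = tok(sentences, padding=True, return_tensors="pt")
    hidden = bert(**enc).last_hidden_state      # (batch, seq_len, 768)
    mask = enc["attention_mask"].unsqueeze(-1)  # ignore padding when averaging
    bert_X = ((hidden * mask).sum(1) / mask.sum(1)).numpy()

# Word2vec sentence representations: mean over pretrained word vectors
# (downloads the Google News vectors via gensim-data on first use).
w2v = api.load("word2vec-google-news-300")

def w2v_sentence(s):
    vecs = [w2v[w] for w in s.split() if w in w2v]
    return np.mean(vecs, axis=0) if vecs else np.zeros(w2v.vector_size)

w2v_X = np.stack([w2v_sentence(s) for s in sentences])

# Linear probe per representation. A real probing setup would train and evaluate
# on held-out splits; fitting and scoring on the same toy data is only meant to
# show the shape of the pipeline.
for name, X in [("BERT", bert_X), ("Word2vec", w2v_X)]:
    probe = Ridge().fit(X, labels)
    print(f"{name}: R^2 on toy data = {probe.score(X, labels):.3f}")
```

In this framing, the better a linear model can recover the feature from a representation, the more of that linguistic property the representation is taken to encode, which is how the two embedding types are compared.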
Anthology ID:
2020.repl4nlp-1.15
Volume:
Proceedings of the 5th Workshop on Representation Learning for NLP
Month:
July
Year:
2020
Address:
Online
Editors:
Spandana Gella, Johannes Welbl, Marek Rei, Fabio Petroni, Patrick Lewis, Emma Strubell, Minjoon Seo, Hannaneh Hajishirzi
Venue:
RepL4NLP
SIG:
SIGREP
Publisher:
Association for Computational Linguistics
Pages:
110–119
URL:
https://aclanthology.org/2020.repl4nlp-1.15
DOI:
10.18653/v1/2020.repl4nlp-1.15
Cite (ACL):
Alessio Miaschi and Felice Dell’Orletta. 2020. Contextual and Non-Contextual Word Embeddings: an in-depth Linguistic Investigation. In Proceedings of the 5th Workshop on Representation Learning for NLP, pages 110–119, Online. Association for Computational Linguistics.
Cite (Informal):
Contextual and Non-Contextual Word Embeddings: an in-depth Linguistic Investigation (Miaschi & Dell’Orletta, RepL4NLP 2020)
PDF:
https://aclanthology.org/2020.repl4nlp-1.15.pdf
Video:
http://slideslive.com/38929781