
Analysing Syntactic and Semantic Features in Pre-trained Language Models in a Fully Unsupervised Setting

Necva Bölücü, Burcu Can


Abstract
Transformer-based pre-trained language models (PLMs) have been applied across NLP tasks with great success. This raises the question of whether the knowledge they encode can be transferred to syntactic or semantic parsing in a completely unsupervised setting. In this study, we leverage PLMs as a source of external knowledge to build fully unsupervised parsers for semantic, constituency, and dependency parsing. We analyse the results for English, German, French, and Turkish to understand the impact of PLMs on different languages for syntactic and semantic parsing. We visualize the attention layers and heads of the PLMs to understand what information is learned at each layer and attention head for the different levels of parsing. The results for dependency, constituency, and semantic parsing are similar to each other: the middle layers and those closest to the final layers carry the most syntactic and semantic information.
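
The abstract describes probing the attention layers and heads of a PLM for parse structure. As a minimal, hypothetical sketch (not the authors' exact pipeline; the model name and head-averaging step are assumptions for illustration), per-layer, per-head attention matrices can be extracted with the HuggingFace transformers library and then inspected for syntactic or semantic signal:

# Minimal sketch, assuming a BERT-style PLM via HuggingFace transformers;
# this is illustrative probing, not the paper's own parser implementation.
import torch
from transformers import AutoTokenizer, AutoModel

MODEL_NAME = "bert-base-cased"  # assumed choice of PLM

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME, output_attentions=True)
model.eval()

sentence = "The model parses this sentence without supervision."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# outputs.attentions is a tuple with one tensor per layer,
# each shaped (batch, num_heads, seq_len, seq_len).
for layer_idx, layer_attn in enumerate(outputs.attentions):
    # Average over heads to get one token-to-token affinity matrix
    # per layer; such matrices are what layer/head analyses visualize.
    mean_attn = layer_attn[0].mean(dim=0)
    print(f"layer {layer_idx}: attention matrix {tuple(mean_attn.shape)}")

Comparing these matrices across layers is one way to see, as the abstract reports, that middle and late layers align more closely with syntactic and semantic structure than early ones.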
Anthology ID:
2022.icon-main.3
Volume:
Proceedings of the 19th International Conference on Natural Language Processing (ICON)
Month:
December
Year:
2022
Address:
New Delhi, India
Editors:
Md. Shad Akhtar, Tanmoy Chakraborty
Venue:
ICON
Publisher:
Association for Computational Linguistics
Pages:
19–31
URL:
https://aclanthology.org/2022.icon-main.3
Cite (ACL):
Necva Bölücü and Burcu Can. 2022. Analysing Syntactic and Semantic Features in Pre-trained Language Models in a Fully Unsupervised Setting. In Proceedings of the 19th International Conference on Natural Language Processing (ICON), pages 19–31, New Delhi, India. Association for Computational Linguistics.
Cite (Informal):
Analysing Syntactic and Semantic Features in Pre-trained Language Models in a Fully Unsupervised Setting (Bölücü & Can, ICON 2022)
PDF:
https://aclanthology.org/2022.icon-main.3.pdf
Software:
2022.icon-main.3.software.zip