
Pre-training is a Hot Topic: Contextualized Document Embeddings Improve Topic Coherence

Federico Bianchi, Silvia Terragni, Dirk Hovy


Abstract
Topic models extract groups of words from documents; interpreting these groups as topics ideally allows for a better understanding of the data. However, the resulting word groups are often not coherent, making them harder to interpret. Recently, neural topic models have shown improvements in overall coherence. Concurrently, contextual embeddings have advanced the state of the art of neural models in general. In this paper, we combine contextualized representations with neural topic models. We find that our approach produces more meaningful and coherent topics than traditional bag-of-words topic models and recent neural models. Our results indicate that future improvements in language models will translate into better topic models.
Anthology ID:
2021.acl-short.96
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers)
Month:
August
Year:
2021
Address:
Online
Editors:
Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
759–766
URL:
https://aclanthology.org/2021.acl-short.96
DOI:
10.18653/v1/2021.acl-short.96
Bibkey:
Cite (ACL):
Federico Bianchi, Silvia Terragni, and Dirk Hovy. 2021. Pre-training is a Hot Topic: Contextualized Document Embeddings Improve Topic Coherence. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pages 759–766, Online. Association for Computational Linguistics.
Cite (Informal):
Pre-training is a Hot Topic: Contextualized Document Embeddings Improve Topic Coherence (Bianchi et al., ACL-IJCNLP 2021)
PDF:
https://aclanthology.org/2021.acl-short.96.pdf
Video:
https://aclanthology.org/2021.acl-short.96.mp4
Code
MilaNLProc/contextualized-topic-models + additional community code
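The linked repository implements the paper's combined model (CombinedTM), which feeds both a bag-of-words representation and a contextualized document embedding into the topic model. A minimal training sketch, following the repository's documented usage: the sentence-transformers encoder name, the document lists, and the topic count below are illustrative placeholders, and class or parameter names may differ between library versions.

# Minimal sketch of training a combined contextualized topic model,
# based on the MilaNLProc/contextualized-topic-models README.
from contextualized_topic_models.models.ctm import CombinedTM
from contextualized_topic_models.utils.data_preparation import TopicModelDataPreparation

raw_docs = ["..."]  # placeholder: unpreprocessed documents, fed to the contextual encoder
bow_docs = ["..."]  # placeholder: preprocessed documents, used to build the bag of words

# Encode documents with a sentence-transformers model (name is an example; 768-dim output)
tp = TopicModelDataPreparation("paraphrase-distilroberta-base-v2")
training_dataset = tp.fit(text_for_contextual=raw_docs, text_for_bow=bow_docs)

# Combine the BoW input with the contextual embedding and train
ctm = CombinedTM(bow_size=len(tp.vocab), contextual_size=768, n_components=50)
ctm.fit(training_dataset)

print(ctm.get_topics(5))  # top-5 words for each of the 50 topics

Swapping in a stronger sentence-transformers encoder only requires changing the model name and contextual_size, which is consistent with the abstract's claim that better language models should yield better topics.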
Data
20 Newsgroups