Probabilistic Approach for Embedding Arbitrary Features of Text
Recommendations
Using Word Embedding to Evaluate the Coherence of Topics from Twitter Data
SIGIR '16: Proceedings of the 39th International ACM SIGIR Conference on Research and Development in Information Retrieval. Scholars often seek to understand topics discussed on Twitter using topic modelling approaches. Several coherence metrics have been proposed for evaluating the coherence of the topics generated by these approaches, including the pre-calculated Pointwise ...
WELDA: Enhancing Topic Models by Incorporating Local Word Context
JCDL '18: Proceedings of the 18th ACM/IEEE on Joint Conference on Digital Libraries. The distributional hypothesis states that similar words tend to have similar contexts in which they occur. Word embedding models exploit this hypothesis by learning word vectors based on the local context of words (see the sketch after this list). Probabilistic topic models on the ...
A Framework for Learning Cross-Lingual Word Embedding with Topics
Web and Big Data. Cross-lingual word embeddings have served as fundamental components for many Web-based applications. However, current models learn cross-lingual word embeddings by projecting two pre-trained monolingual embeddings based on well-known ...
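The WELDA entry above refers to learning word vectors from the local context in which words occur. As a toy illustration of that general idea (not the method of any paper listed here), the sketch below builds a windowed co-occurrence matrix over a tiny made-up corpus and factors it with a truncated SVD; the corpus, window size, and dimensionality are all illustrative assumptions.

```python
# Minimal sketch: word vectors from local context via a windowed co-occurrence
# matrix factored with truncated SVD. Corpus, window size, and dimensionality
# are illustrative assumptions, not taken from the papers listed above.
from collections import Counter

import numpy as np

corpus = [
    "topic models group words that occur in similar documents".split(),
    "word embeddings place words with similar contexts near each other".split(),
]

window = 2  # local-context radius around each target word
vocab = sorted({w for sent in corpus for w in sent})
index = {w: i for i, w in enumerate(vocab)}

# Count how often each word pair co-occurs within the window.
counts = Counter()
for sent in corpus:
    for i, target in enumerate(sent):
        lo, hi = max(0, i - window), min(len(sent), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                counts[(index[target], index[sent[j]])] += 1

cooc = np.zeros((len(vocab), len(vocab)))
for (i, j), c in counts.items():
    cooc[i, j] = c

# Truncated SVD: each row of `vectors` is a low-dimensional word embedding.
U, S, _ = np.linalg.svd(cooc, full_matrices=False)
dim = 4
vectors = U[:, :dim] * S[:dim]
print({w: vectors[index[w]].round(2) for w in ("words", "similar")})
```

Count-based factorization is only one way to act on the distributional hypothesis; predictive models such as skip-gram learn vectors from the same local-context signal.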
Information
Published In
Publisher
Springer-Verlag
Berlin, Heidelberg
Qualifiers
- Article
Bibliometrics & Citations
Article Metrics
- Total Citations: 0
- Total Downloads: 0
- Downloads (Last 12 months): 0
- Downloads (Last 6 weeks): 0