Topic Modeling Enhancement using Word Embeddings
Conference proceedings article
Publication Details
Author list: Limwattana, Siriwat; Prom-On, Santitham
Publisher: IEEE
Publication year: 2021
ISBN: 9781665438315
Language: English (EN-GB)
Abstract
Latent Dirichlet Allocation (LDA) is a powerful technique for extracting topics from documents. The original LDA takes a Bag-of-Words representation as input and produces topic distributions over documents as output. The drawback of Bag-of-Words is that it represents each word with a plain one-hot encoding, which does not capture word-level information. Later research in Natural Language Processing (NLP) demonstrated that word embedding techniques such as the Skip-gram model provide representations that capture the relationships and semantic information between words, and in recent studies many NLP tasks have gained better performance by applying word embeddings as the word representation. In this paper, we propose Deep Word-Topic Latent Dirichlet Allocation (DWT-LDA), a new process for training LDA with word embeddings. A neural network over word embeddings is incorporated into the Collapsed Gibbs Sampling process as an alternative mechanism for word-topic assignment. To quantitatively evaluate our model, we use the topic coherence framework and topic diversity as metrics to compare our approach with the original LDA. The experimental results show that our method generates more coherent and diverse topics. © 2021 IEEE.
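The abstract does not specify the architecture of the neural word-topic component, so the following is only a minimal sketch of the general idea: a collapsed Gibbs sampler whose word-topic conditional is mixed with an embedding-based distribution. The toy corpus, the mixing weight lam, the topic_vec table, and the softmax scorer are illustrative assumptions, not the authors' design; the topic-diversity metric at the end follows the common definition (fraction of unique words among each topic's top words).

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy corpus: documents as lists of word ids.
    docs = [[0, 1, 2, 1], [2, 3, 4, 3], [0, 4, 1, 2]]
    V, K = 5, 2                      # vocabulary size, number of topics
    alpha, beta = 0.1, 0.01          # Dirichlet hyperparameters
    lam = 0.5                        # assumed mixing weight for the embedding term

    # Stand-ins for pretrained word embeddings (e.g., from a Skip-gram model)
    # and topic vectors; the paper's actual neural network may differ.
    emb = rng.normal(size=(V, 8))
    topic_vec = rng.normal(size=(K, 8))

    # Count tables used by collapsed Gibbs sampling.
    ndk = np.zeros((len(docs), K))   # topic counts per document
    nkw = np.zeros((K, V))           # word counts per topic
    nk = np.zeros(K)                 # total words per topic
    z = [[rng.integers(K) for _ in d] for d in docs]
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            k = z[d][i]
            ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    for it in range(100):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]
                # Remove the current assignment from the counts.
                ndk[d, k] -= 1; nkw[k, w] -= 1; nk[k] -= 1
                # Standard collapsed-Gibbs conditional p(z = k | rest).
                p_lda = (ndk[d] + alpha) * (nkw[:, w] + beta) / (nk + V * beta)
                # Embedding-based word-topic distribution (a stand-in for the
                # neural network's word-topic assignment choice).
                p_emb = softmax(topic_vec @ emb[w])
                # Mix the two proposals and resample the topic.
                p = (1 - lam) * p_lda / p_lda.sum() + lam * p_emb
                k = rng.choice(K, p=p / p.sum())
                z[d][i] = k
                ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1

    # Topic diversity: fraction of unique words among the top-n words per topic.
    topn = 2
    top_words = [np.argsort(nkw[k])[-topn:] for k in range(K)]
    diversity = len(set(np.concatenate(top_words).tolist())) / (K * topn)
    print("topic diversity:", diversity)

In this sketch the embedding term pulls semantically related words toward the same topic, which is the intuition behind pairing word embeddings with LDA; the paper's reported coherence and diversity gains would come from its specific network and training procedure.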
Keywords
Latent Dirichlet Allocation, Word Embedding