Topic Modeling Enhancement using Word Embeddings

Conference proceedings article




Publication Details

Author list: Limwattana, Siriwat; Prom-On, Santitham

Publisher: IEEE

Publication year: 2021

ISBN: 9781665438315

ISSN: 0928-4931

eISSN: 1873-0191

URL: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85112361663&doi=10.1109%2fJCSSE53117.2021.9493816&partnerID=40&md5=c632cf7d3975cae3d9817d7eb213ef5c

Languages: English-Great Britain (EN-GB)




Abstract

Latent Dirichlet Allocation (LDA) is one of the most powerful techniques for extracting topics from documents. The original LDA takes a Bag-of-Words representation as input and produces topic distributions over documents as output. The drawback of Bag-of-Words is that it represents each word as a plain one-hot encoding, which does not capture word-level information. Later research in Natural Language Processing (NLP) demonstrated that word embedding techniques such as the Skip-gram model provide representations that capture relationships and semantic information between words. In recent studies, many NLP tasks have achieved better performance by using word embeddings as the word representation. In this paper, we propose Deep Word-Topic Latent Dirichlet Allocation (DWT-LDA), a new process for training LDA with word embeddings. A neural network with word embeddings is incorporated into the Collapsed Gibbs Sampling process as an alternative mechanism for word-topic assignment. To quantitatively evaluate our model, we use topic coherence and topic diversity as metrics to compare our approach with the original LDA. The experimental results show that our method generates more coherent and diverse topics. © 2021 IEEE.
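To make the general idea concrete, below is a minimal sketch (in Python with NumPy) of collapsed Gibbs sampling for LDA in which the usual count-based word-topic term is blended with a similarity score derived from word embeddings. This is not the authors' DWT-LDA implementation, which uses a neural network for the word-topic assignment; the toy corpus, the random vectors standing in for pretrained Skip-gram embeddings, and the blend weight `lam` are all illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Toy corpus: documents as lists of word ids over a vocabulary of size V.
docs = [[0, 1, 2, 1], [2, 3, 4], [0, 4, 4, 3]]
V, K = 5, 2              # vocabulary size, number of topics
alpha, beta = 0.1, 0.01  # Dirichlet hyperparameters
lam = 0.5                # blend weight between counts and embedding score (assumed)

# Stand-in for Skip-gram vectors; in practice these would be pretrained.
emb = rng.normal(size=(V, 8))
emb /= np.linalg.norm(emb, axis=1, keepdims=True)

# Count tables maintained by collapsed Gibbs sampling.
ndk = np.zeros((len(docs), K))  # document-topic counts
nkw = np.zeros((K, V))          # topic-word counts
nk = np.zeros(K)                # total words per topic
z = [[rng.integers(K) for _ in doc] for doc in docs]
for d, doc in enumerate(docs):
    for i, w in enumerate(doc):
        k = z[d][i]
        ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1

def topic_embedding(k):
    """Smoothed mean embedding of the words currently assigned to topic k."""
    weights = nkw[k] + beta
    return weights @ emb / weights.sum()

for it in range(50):
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            k = z[d][i]
            # Remove the current assignment from the counts.
            ndk[d, k] -= 1; nkw[k, w] -= 1; nk[k] -= 1
            # Standard collapsed Gibbs term from the counts...
            count_term = (ndk[d] + alpha) * (nkw[:, w] + beta) / (nk + V * beta)
            # ...blended with an embedding-similarity term, the rough idea
            # behind using word vectors inside the assignment step.
            sim = np.exp([emb[w] @ topic_embedding(k2) for k2 in range(K)])
            sim /= sim.sum()
            p = (1 - lam) * count_term / count_term.sum() + lam * sim
            k = rng.choice(K, p=p / p.sum())
            z[d][i] = k
            ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1

print("topic-word counts:\n", nkw)

In DWT-LDA as described in the abstract, the embedding-similarity heuristic above would be replaced by a neural network that scores word-topic assignments; the point of the sketch is only to show where in the Collapsed Gibbs loop such a component plugs in.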


Keywords

Latent Dirichlet Allocation; Word Embedding


Last updated on 2023-10-18 at 07:44