A network approach to topic models

Martin Gerlach, Tiago P. Peixoto, Eduardo G. Altmann

Research output: Contribution to journal › Article › peer-review

136 Citations (SciVal)


One of the main computational and scientific challenges in the modern age is to extract useful information from unstructured texts. Topic models are one popular machine-learning approach that infers the latent topical structure of a collection of documents. Despite their success—particularly of the most widely used variant called latent Dirichlet allocation (LDA)—and numerous applications in sociology, history, and linguistics, topic models are known to suffer from severe conceptual and practical problems, for example, a lack of justification for the Bayesian priors, discrepancies with statistical properties of real texts, and the inability to properly choose the number of topics. We obtain a fresh view of the problem of identifying topical structures by relating it to the problem of finding communities in complex networks. We achieve this by representing text corpora as bipartite networks of documents and words. By adapting existing community-detection methods (using a stochastic block model (SBM) with nonparametric priors), we obtain a more versatile and principled framework for topic modeling (for example, it automatically detects the number of topics and hierarchically clusters both the words and documents). The analysis of artificial and real corpora demonstrates that our SBM approach leads to better topic models than LDA in terms of statistical model selection. Our work shows how to formally relate methods from community detection and topic modeling, opening the possibility of cross-fertilization between these two fields.
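As a concrete illustration of the bipartite representation the abstract describes, the sketch below builds a document-word network from a toy corpus, with edge multiplicity equal to the number of times a word occurs in a document. The corpus and variable names are purely illustrative; the actual community detection (a nested SBM with nonparametric priors) is performed with dedicated inference tools such as graph-tool and is not shown here.

```python
from collections import Counter

# Toy corpus: each document is a list of word tokens (illustrative only).
corpus = {
    "doc1": ["network", "community", "network", "model"],
    "doc2": ["topic", "model", "topic", "word"],
}

# Bipartite document-word multigraph: one node per document and per word;
# the multiplicity of edge (d, w) is the count of word w in document d.
edges = Counter()
for doc, tokens in corpus.items():
    for w in tokens:
        edges[(doc, w)] += 1

# The two node sets of the bipartite network.
docs = sorted(corpus)
words = sorted({w for tokens in corpus.values() for w in tokens})

print(edges[("doc1", "network")])  # → 2
```

Community detection on this network then clusters documents and words simultaneously, which is how the approach recovers topical structure without fixing the number of topics in advance.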

Original language: English
Article number: eaaq1360
Pages (from-to): 1-12
Number of pages: 12
Journal: Science Advances
Issue number: 7
Publication status: Published - 18 Jul 2018

ASJC Scopus subject areas

  • General


