Open Access Journal Article DOI

Mixed Membership Stochastic Blockmodels

TLDR
In this article, the authors introduce a class of latent variable models for pairwise measurements, called mixed membership stochastic blockmodels, which combine global parameters that instantiate dense patches of connectivity (blockmodel) with local parameters that instantiate node-specific variability in the connections (mixed membership), and develop a general variational inference algorithm for fast approximate posterior inference.
Abstract
Consider data consisting of pairwise measurements, such as the presence or absence of links between pairs of objects. These data arise, for instance, in the analysis of protein interactions and gene regulatory networks, collections of author-recipient email, and social networks. Analyzing pairwise measurements with probabilistic models requires special assumptions, since the usual independence or exchangeability assumptions no longer hold. Here we introduce a class of latent variable models for pairwise measurements: mixed membership stochastic blockmodels. These models combine global parameters that instantiate dense patches of connectivity (blockmodel) with local parameters that instantiate node-specific variability in the connections (mixed membership). We develop a general variational inference algorithm for fast approximate posterior inference. We demonstrate the advantages of mixed membership stochastic blockmodels with applications to social networks and protein interaction networks.
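The generative process sketched in the abstract can be made concrete: each node draws a mixed membership vector from a Dirichlet (the local parameters), and each directed pair draws per-pair roles whose interaction probability is read off a global block matrix. The following is an illustrative sketch, not the authors' code; the function name `sample_mmsb` and its signature are our own, and the block matrix `B` and Dirichlet hyperparameter `alpha` correspond to the global and local parameters described above.

```python
import numpy as np

def sample_mmsb(n_nodes, B, alpha, rng=None):
    """Sample a directed network from a mixed membership stochastic
    blockmodel (illustrative sketch).

    B     : (K, K) block interaction probabilities (global blockmodel)
    alpha : (K,) Dirichlet hyperparameter for per-node memberships
    """
    rng = np.random.default_rng(rng)
    K = len(alpha)
    # Local parameters: each node draws its own mixed membership vector.
    pi = rng.dirichlet(alpha, size=n_nodes)          # shape (n_nodes, K)
    Y = np.zeros((n_nodes, n_nodes), dtype=int)
    for p in range(n_nodes):
        for q in range(n_nodes):
            if p == q:
                continue
            # Each pair draws fresh roles, so a node's role can vary
            # from interaction to interaction (mixed membership).
            z_pq = rng.choice(K, p=pi[p])  # sender's role for this pair
            z_qp = rng.choice(K, p=pi[q])  # receiver's role for this pair
            # Link probability comes from the global blockmodel.
            Y[p, q] = rng.binomial(1, B[z_pq, z_qp])
    return pi, Y
```

With a near-diagonal `B` this produces assortative community structure; off-diagonal mass in `B` yields bridging links, which is what distinguishes the model from a single-membership blockmodel.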



Citations
Journal Article DOI

Discovering political topics in Facebook discussion threads with graph contextualization

TL;DR: In this paper, a graph contextualization method, pairGraphText, is proposed to study political engagement on Facebook during the 2012 French presidential election; it is a spectral algorithm that contextualizes graph data with text data for online discussion threads.
Journal Article DOI

Hierarchical Attention Link Prediction Neural Network

TL;DR: A novel end-to-end neural link prediction model, named Hierarchical Attention Link Prediction Neural Network (HalpNet), is proposed, which comprehensively explores neighborhood information and is able to predict the link score of a target node pair directly and effectively.
Posted Content

Concentration of random graphs and application to community detection

TL;DR: Recent results on regimes of concentration of random graphs around their expectation are reviewed, showing that dense graphs concentrate and sparse graphs concentrate after regularization.
Book Chapter DOI

Attention-Based Graph Evolution

TL;DR: An attention-based graph evolution model built on the neural attention mechanism is proposed, which can not only model graph evolution in both space and time, but can also model the transformation between graphs from one state to another.
Proceedings Article

Deep Relational Topic Modeling via Graph Poisson Gamma Belief Network

TL;DR: A novel hierarchical RTM named graph Poisson gamma belief network (GPGBN) is developed, and two different Weibull-distribution-based variational graph auto-encoders are introduced for efficient model inference and effective network information aggregation.
References
Journal Article DOI

Gene Ontology: tool for the unification of biology

TL;DR: The goal of the Gene Ontology Consortium is to produce a dynamic, controlled vocabulary that can be applied to all eukaryotes even as knowledge of gene and protein roles in cells is accumulating and changing.
Journal Article DOI

Latent Dirichlet Allocation

TL;DR: This work proposes a generative model for text and other collections of discrete data that generalizes or improves on several previous models including naive Bayes/unigram, mixture of unigrams, and Hofmann's aspect model.
Journal Article DOI

Finding scientific topics

TL;DR: A generative model for documents is described, introduced by Blei, Ng, and Jordan, and a Markov chain Monte Carlo algorithm is presented for inference in this model, which is used to analyze abstracts from PNAS by using Bayesian model selection to establish the number of topics.