Mixed Membership Stochastic Blockmodels
TL;DR
In this article, the authors introduce a class of variance allocation models for pairwise measurements, called mixed membership stochastic blockmodels, which combine global parameters that instantiate dense patches of connectivity (blockmodel) with local parameters (mixed membership), and develop a general variational inference algorithm for fast approximate posterior inference.

Abstract
Consider data consisting of pairwise measurements, such as presence or absence of links between pairs of objects. These data arise, for instance, in the analysis of protein interactions and gene regulatory networks, collections of author-recipient email, and social networks. Analyzing pairwise measurements with probabilistic models requires special assumptions, since the usual independence or exchangeability assumptions no longer hold. Here we introduce a class of variance allocation models for pairwise measurements: mixed membership stochastic blockmodels. These models combine global parameters that instantiate dense patches of connectivity (blockmodel) with local parameters that instantiate node-specific variability in the connections (mixed membership). We develop a general variational inference algorithm for fast approximate posterior inference. We demonstrate the advantages of mixed membership stochastic blockmodels with applications to social networks and protein interaction networks.
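The abstract's two ingredients, a global block-to-block connectivity matrix and a per-node mixed-membership vector, can be illustrated with a minimal sketch of the MMSB generative process. This is an illustrative NumPy implementation, not code from the paper; the function name `sample_mmsb` and all parameter values are invented for the example.

```python
import numpy as np

def sample_mmsb(n_nodes, alpha, B, rng):
    """Sample a directed adjacency matrix from the MMSB generative process.

    alpha : length-K Dirichlet prior over the K blocks
    B     : K x K blockmodel of block-to-block link probabilities (global parameters)
    """
    K = len(alpha)
    # Local parameters: each node's mixed-membership vector over the K blocks.
    pi = rng.dirichlet(alpha, size=n_nodes)
    Y = np.zeros((n_nodes, n_nodes), dtype=int)
    for p in range(n_nodes):
        for q in range(n_nodes):
            if p == q:
                continue  # no self-links
            # Per-interaction block indicators: each node picks a role
            # for this specific pair, which is what "mixed membership" means.
            z_pq = rng.choice(K, p=pi[p])  # sender's block for (p, q)
            z_qp = rng.choice(K, p=pi[q])  # receiver's block for (p, q)
            # The link is Bernoulli with the global block-pair probability.
            Y[p, q] = rng.binomial(1, B[z_pq, z_qp])
    return Y, pi

# Two blocks with dense within-block connectivity ("dense patches").
B = np.array([[0.9, 0.05],
              [0.05, 0.9]])
alpha = np.array([0.5, 0.5])
rng = np.random.default_rng(0)
Y, pi = sample_mmsb(20, alpha, B, rng)
```

Because each node draws a fresh block indicator per interaction, a node can connect as a member of different blocks with different neighbors, which is what distinguishes this model from a single-membership stochastic blockmodel.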
Citations
Journal ArticleDOI
Convergence of the groups posterior distribution in latent or stochastic block models
TL;DR: This work establishes sufficient conditions for the groups posterior distribution to converge (as the size of the data increases) to a Dirac mass located at the actual (random) groups configuration.
Journal ArticleDOI
Learning Latent Block Structure in Weighted Networks
TL;DR: This model learns from both the presence and the weight of edges, allowing it to discover structure that would otherwise be hidden if weights were discarded or thresholded; a Bayesian variational algorithm is described for efficiently approximating the model's posterior distribution over latent block structures.
Proceedings Article
Bayesian and L1 Approaches for Sparse Unsupervised Learning
TL;DR: In this article, the authors focus on unsupervised latent variable models, and develop L1 minimising factor models, Bayesian variants of "L1", and Bayesian models with a stronger L0-like sparsity induced through spike-and-slab distributions.
Posted Content
Communication Communities in MOOCs
TL;DR: It is seen that BNMF yields a superior probabilistic generative model for online discussions when compared to other models, and that the communities it learns are differentiated by the demographic and course-performance indicators of their constituent students.
Journal ArticleDOI
Finding Common Modules in a Time-Varying Network with Application to the Drosophila Melanogaster Gene Regulation Network
Jingfei Zhang, Jiguo Cao, et al.
TL;DR: A statistical framework for detecting common modules in the Drosophila melanogaster time-varying gene regulation network is proposed, and both a significance test and a robustness test for the identified modular structure are developed.
References
Journal ArticleDOI
Maximum likelihood from incomplete data via the EM algorithm
Journal ArticleDOI
Gene Ontology: tool for the unification of biology
M. Ashburner, Catherine A. Ball, Judith A. Blake, David Botstein, Heather Butler, J. M. Cherry, Allan Peter Davis, Kara Dolinski, Selina S. Dwight, J. T. Eppig, Midori A. Harris, David P. Hill, Laurie Issel-Tarver, Andrew Kasarskis, Suzanna E. Lewis, John C. Matese, Joel E. Richardson, M. Ringwald, Gerald M. Rubin, Gavin Sherlock, et al.
TL;DR: The goal of the Gene Ontology Consortium is to produce a dynamic, controlled vocabulary that can be applied to all eukaryotes even as knowledge of gene and protein roles in cells is accumulating and changing.
Journal ArticleDOI
Latent Dirichlet allocation
TL;DR: This work proposes a generative model for text and other collections of discrete data that generalizes or improves on several previous models including naive Bayes/unigram, mixture of unigrams, and Hofmann's aspect model.
Journal ArticleDOI
Finding scientific topics
TL;DR: A generative model for documents, introduced by Blei, Ng, and Jordan, is described, and a Markov chain Monte Carlo algorithm is presented for inference in this model; the algorithm is used to analyze abstracts from PNAS, with Bayesian model selection establishing the number of topics.
Journal ArticleDOI
Functional organization of the yeast proteome by systematic analysis of protein complexes
Anne-Claude Gavin, Markus Bösche, Roland Krause, Paola Grandi, Martina Marzioch, Andreas Bauer, Jörg Schultz, Jens Rick, Anne-Marie Michon, Cristina-Maria Cruciat, Marita Remor, Christian Höfert, Malgorzata Schelder, Miro Brajenovic, Heinz Ruffner, Alejandro Merino, Karin Klein, Manuela Hudak, David Dickson, Tatjana Rudi, Volker Gnau, Angela Bauch, Sonja Bastuck, Bettina Huhse, Christina Leutwein, Marie-Anne Heurtier, Richard R. Copley, Angela Edelmann, Erich Querfurth, Vladimir Rybin, Gerard Drewes, Manfred Raida, Tewis Bouwmeester, Peer Bork, Bertrand Séraphin, Bernhard Kuster, Gitte Neubauer, Giulio Superti-Furga, et al.
TL;DR: The analysis provides an outline of the eukaryotic proteome as a network of protein complexes at a level of organization beyond binary interactions, which contains fundamental biological information and offers the context for a more reasoned and informed approach to drug discovery.