Mixed Membership Stochastic Blockmodels
TL;DR
In this article, the authors introduce a class of variance allocation models for pairwise measurements, called mixed membership stochastic blockmodels, which combine global parameters that instantiate dense patches of connectivity (blockmodel) with local parameters (mixed membership), and develop a general variational inference algorithm for fast approximate posterior inference.

Abstract
Consider data consisting of pairwise measurements, such as presence or absence of links between pairs of objects. These data arise, for instance, in the analysis of protein interactions and gene regulatory networks, collections of author-recipient email, and social networks. Analyzing pairwise measurements with probabilistic models requires special assumptions, since the usual independence or exchangeability assumptions no longer hold. Here we introduce a class of variance allocation models for pairwise measurements: mixed membership stochastic blockmodels. These models combine global parameters that instantiate dense patches of connectivity (blockmodel) with local parameters that instantiate node-specific variability in the connections (mixed membership). We develop a general variational inference algorithm for fast approximate posterior inference. We demonstrate the advantages of mixed membership stochastic blockmodels with applications to social networks and protein interaction networks.
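The generative process the abstract describes can be made concrete with a short sketch: each node draws a mixed-membership vector from a Dirichlet prior (the local parameters), and each directed pair interacts through roles drawn from those vectors, with the link probability given by a K-by-K block matrix (the global parameters). This is a minimal illustration of the sampling side of the model only, not the paper's inference algorithm; the function name `sample_mmsb` and the specific values of `B` and `alpha` below are hypothetical choices for the example.

```python
import numpy as np

def sample_mmsb(n_nodes, B, alpha, rng=None):
    """Sample a directed adjacency matrix from the MMSB generative process.

    B     : (K, K) block matrix of between-community link probabilities
    alpha : (K,) Dirichlet prior over each node's mixed-membership vector
    """
    rng = np.random.default_rng(rng)
    K = B.shape[0]
    # Local parameters: one mixed-membership vector per node.
    pi = rng.dirichlet(alpha, size=n_nodes)           # shape (n_nodes, K)
    Y = np.zeros((n_nodes, n_nodes), dtype=int)
    for p in range(n_nodes):
        for q in range(n_nodes):
            if p == q:
                continue
            # Each node draws a fresh role for each interaction.
            z_pq = rng.choice(K, p=pi[p])             # sender's role
            z_qp = rng.choice(K, p=pi[q])             # receiver's role
            # Global parameters: the block matrix sets the link probability.
            Y[p, q] = rng.binomial(1, B[z_pq, z_qp])
    return Y, pi

# Two communities with dense within-block and sparse between-block links.
B = np.array([[0.9, 0.05],
              [0.05, 0.9]])
Y, pi = sample_mmsb(20, B, alpha=np.full(2, 0.1), rng=0)
```

With a small `alpha`, most nodes concentrate on a single community, recovering a classical stochastic blockmodel; larger `alpha` yields nodes that genuinely mix across communities.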
Citations
Journal ArticleDOI
Generative Models for Item Adoptions Using Social Correlation
TL;DR: A social correlation framework is proposed that considers a social correlation matrix representing the degrees of correlation from every user to the user's friends, in addition to a set of latent factors representing the topics of interest of individual users.
Posted Content
Stochastic Blockmodels meet Graph Neural Networks
TL;DR: This work develops a sparse variational autoencoder for graphs that retains the interpretability of SBMs while enjoying the excellent predictive performance of graph neural networks, accompanied by a recognition model that enables fast inference of the node embeddings.
Proceedings ArticleDOI
GraphGen: A Scalable Approach to Domain-agnostic Labeled Graph Generation
TL;DR: GraphGen converts graphs to sequences using minimum DFS codes, which capture the graph structure precisely along with the label information, and learns complex joint distributions between structure and semantic labels through a novel LSTM architecture.
Posted Content
Deep Feature Learning for Graphs
TL;DR: This paper presents a general graph representation learning framework called DeepGL for learning deep node and edge representations from large (attributed) graphs that automatically learns a multi-layered hierarchical graph representation where each successive layer leverages the output from the previous layer to learn features of a higher-order.
Journal ArticleDOI
Community detection in networks without observing edges
TL;DR: A Bayesian hierarchical model is developed to identify communities of time series, providing an end-to-end community detection algorithm that, rather than reducing the data to a sequence of point estimates, propagates uncertainty from the raw data through to the community labels.
References
Journal ArticleDOI
Maximum likelihood from incomplete data via the EM algorithm
Journal ArticleDOI
Gene Ontology: tool for the unification of biology
M. Ashburner, Catherine A. Ball, Judith A. Blake, David Botstein, Heather Butler, J. M. Cherry, Allan Peter Davis, Kara Dolinski, Selina S. Dwight, J. T. Eppig, Midori A. Harris, David P. Hill, Laurie Issel-Tarver, Andrew Kasarskis, Suzanna E. Lewis, John C. Matese, Joel E. Richardson, M. Ringwald, Gerald M. Rubin, Gavin Sherlock +19 more
TL;DR: The goal of the Gene Ontology Consortium is to produce a dynamic, controlled vocabulary that can be applied to all eukaryotes even as knowledge of gene and protein roles in cells is accumulating and changing.
Journal ArticleDOI
Latent Dirichlet Allocation
TL;DR: This work proposes a generative model for text and other collections of discrete data that generalizes or improves on several previous models including naive Bayes/unigram, mixture of unigrams, and Hofmann's aspect model.
Journal ArticleDOI
Finding scientific topics
TL;DR: A generative model for documents, introduced by Blei, Ng, and Jordan, is described, and a Markov chain Monte Carlo algorithm is presented for inference in this model; the approach is used to analyze abstracts from PNAS, with Bayesian model selection establishing the number of topics.
Journal ArticleDOI
Functional organization of the yeast proteome by systematic analysis of protein complexes
Anne-Claude Gavin, Markus Bösche, Roland Krause, Paola Grandi, Martina Marzioch, Andreas Bauer, Jörg Schultz, Jens Rick, Anne-Marie Michon, Cristina-Maria Cruciat, Marita Remor, Christian Höfert, Malgorzata Schelder, Miro Brajenovic, Heinz Ruffner, Alejandro Merino, Karin Klein, Manuela Hudak, David Dickson, Tatjana Rudi, Volker Gnau, Angela Bauch, Sonja Bastuck, Bettina Huhse, Christina Leutwein, Marie-Anne Heurtier, Richard R. Copley, Angela Edelmann, Erich Querfurth, Vladimir Rybin, Gerard Drewes, Manfred Raida, Tewis Bouwmeester, Peer Bork, Bertrand Séraphin, Bernhard Kuster, Gitte Neubauer, Giulio Superti-Furga +37 more
TL;DR: The analysis provides an outline of the eukaryotic proteome as a network of protein complexes at a level of organization beyond binary interactions, which contains fundamental biological information and offers the context for a more reasoned and informed approach to drug discovery.