Danny Bickson
Researcher at Carnegie Mellon University
Publications - 65
Citations - 7426
Danny Bickson is an academic researcher from Carnegie Mellon University. He has contributed to research in the topics Belief propagation & Gaussian, has an h-index of 23, and has co-authored 65 publications receiving 7086 citations. Previous affiliations of Danny Bickson include University of California & University of California, San Diego.
Papers
Proceedings ArticleDOI
PowerGraph: distributed graph-parallel computation on natural graphs
TL;DR: This paper describes the challenges of computation on natural graphs in the context of existing graph-parallel abstractions and introduces the PowerGraph abstraction which exploits the internal structure of graph programs to address these challenges.
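The internal structure PowerGraph exploits is the factoring of vertex programs into Gather-Apply-Scatter (GAS) phases, which lets work on high-degree vertices be split across machines. A minimal single-machine sketch of the GAS decomposition, shown on PageRank over a toy graph (the graph data and helper names are illustrative, not PowerGraph's actual API):

```python
# Toy directed graph as (src, dst) edge pairs; values are illustrative.
edges = [(0, 1), (0, 2), (1, 2), (2, 0)]
out_deg = {v: sum(1 for s, _ in edges if s == v) for v in range(3)}
rank = {v: 1.0 for v in range(3)}

def gather(v):
    # Gather: sum contributions from in-neighbors (edge-parallel in PowerGraph).
    return sum(rank[s] / out_deg[s] for s, d in edges if d == v)

def apply_rank(v, acc):
    # Apply: update the vertex value from the gathered accumulator.
    rank[v] = 0.15 + 0.85 * acc

# Scatter phase omitted: here every vertex is simply re-activated each sweep.
for _ in range(50):
    acc = {v: gather(v) for v in rank}   # synchronous gather phase
    for v in rank:
        apply_rank(v, acc[v])
```

Because gather is a commutative, associative sum, PowerGraph can evaluate it in parallel over a high-degree vertex's edges on different machines and combine the partial accumulators.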
Journal ArticleDOI
Distributed GraphLab: a framework for machine learning and data mining in the cloud
Yucheng Low,Danny Bickson,Joseph E. Gonzalez,Carlos Guestrin,Aapo Kyrola,Joseph M. Hellerstein +5 more
TL;DR: This work extends the shared-memory GraphLab framework to the substantially more challenging distributed setting while preserving strong data consistency guarantees, introducing techniques to reduce network congestion and mitigate the effect of network latency.
Proceedings Article
GraphLab: a new framework for parallel machine learning
Yucheng Low,Joseph E. Gonzalez,Aapo Kyrola,Danny Bickson,Carlos Guestrin,Joseph M. Hellerstein +5 more
TL;DR: The expressiveness of the GraphLab framework is demonstrated by designing and implementing parallel versions of belief propagation, Gibbs sampling, Co-EM, Lasso and Compressed Sensing, and it is shown that GraphLab achieves excellent parallel performance on large-scale real-world problems.
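GraphLab expresses such algorithms as vertex update functions that read a local neighborhood scope, write the vertex's value, and schedule further work when something changes. A hedged, single-threaded sketch of that update/scheduler loop (the graph, values, and averaging update are illustrative stand-ins, not GraphLab's API):

```python
from collections import deque

# Toy undirected graph; the averaging update stands in for BP/Gibbs-style
# computations. All names here are illustrative.
neighbors = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
value = {0: 1.0, 1: 0.0, 2: 0.0}

pending = deque(value)      # scheduler queue of vertices awaiting an update
queued = set(value)

def schedule(u):
    # Re-activate a neighbor whose inputs changed (deduplicated via `queued`).
    if u not in queued:
        queued.add(u)
        pending.append(u)

def update(v):
    # Update function: read the neighborhood scope, write the vertex value,
    # and reschedule neighbors only if the value actually moved.
    new = sum(value[u] for u in neighbors[v]) / len(neighbors[v])
    if abs(new - value[v]) > 1e-9:
        value[v] = new
        for u in neighbors[v]:
            schedule(u)

while pending:              # run until no vertex is scheduled (convergence)
    v = pending.popleft()
    queued.discard(v)
    update(v)
```

The scheduler is where the framework's parallelism lives: multiple update functions whose scopes do not conflict can be executed concurrently under the chosen consistency model.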
Posted Content
Distributed GraphLab: A Framework for Machine Learning in the Cloud
Yucheng Low,Joseph E. Gonzalez,Aapo Kyrola,Danny Bickson,Carlos Guestrin,Joseph M. Hellerstein +5 more
TL;DR: This paper develops graph-based extensions to pipelined locking and data versioning to reduce network congestion and mitigate the effect of network latency, and introduces fault tolerance to the GraphLab abstraction using the classic Chandy-Lamport snapshot algorithm.
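The Chandy-Lamport algorithm referenced here records a consistent global state without stopping the computation: the initiator saves its state and floods markers on outgoing channels, and each process records a channel's in-flight messages between its own snapshot and that channel's marker. A hedged two-process simulation over in-memory FIFO queues (process names, states, and messages are illustrative, not GraphLab code):

```python
from collections import deque

MARKER = "MARKER"
procs = ["a", "b"]
state = {"a": 10, "b": 20}                     # illustrative local states
chan = {(s, d): deque() for s in procs for d in procs if s != d}

snapshot = {}       # pid -> recorded local state
chan_state = {}     # (src, dst) -> messages recorded as in flight
recording = {}      # (src, dst) -> still recording that channel?

def start_snapshot(pid):
    snapshot[pid] = state[pid]                 # record local state first
    for d in procs:
        if d != pid:
            chan[(pid, d)].append(MARKER)      # marker after earlier messages
            recording[(d, pid)] = True         # start recording incoming channels
            chan_state[(d, pid)] = []

def receive(pid, src):
    msg = chan[(src, pid)].popleft()
    if msg == MARKER:
        if pid not in snapshot:
            start_snapshot(pid)                # first marker triggers snapshot
        recording[(src, pid)] = False          # this channel's state is final
    elif recording.get((src, pid)):
        chan_state[(src, pid)].append(msg)     # message was in flight
    else:
        state[pid] += msg                      # normal delivery

chan[("a", "b")].append(5)   # a sends 5, then initiates the snapshot
start_snapshot("a")
chan[("b", "a")].append(7)   # b sends 7 before seeing a's marker

receive("b", "a")   # b applies the 5 sent before the snapshot
receive("b", "a")   # marker: b records its state; channel a->b is empty
receive("a", "b")   # a records 7 as in flight on channel b->a
receive("a", "b")   # marker from b: channel b->a recording ends
```

The recorded cut (local states plus in-flight channel contents) is consistent even though neither process ever paused, which is what makes the algorithm attractive for checkpointing a running GraphLab computation.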