
Dan Alistarh

Researcher at Institute of Science and Technology Austria

Publications -  213
Citations -  4887

Dan Alistarh is an academic researcher at the Institute of Science and Technology Austria. He has contributed to research topics including computer science and stochastic gradient descent. He has an h-index of 27 and has co-authored 175 publications receiving 3,761 citations. Previous affiliations of Dan Alistarh include ETH Zurich and Microsoft.

Papers
Proceedings ArticleDOI

Efficiency Guarantees for Parallel Incremental Algorithms under Relaxed Schedulers

TL;DR: The authors analyze the efficiency guarantees provided by a range of incremental algorithms when parallelized via relaxed schedulers, and show that the overheads of relaxation are outweighed by the improved scalability of the relaxed scheduler.
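As a rough illustration of the relaxation idea behind such schedulers, the following is a minimal sketch (my own, not taken from the paper) of a relaxed priority scheduler in the MultiQueue style: tasks are spread across several heaps, and delete-min pops from the smaller top of two randomly chosen heaps, so the returned task is only approximately the highest-priority one.

```python
import heapq
import random

class RelaxedScheduler:
    """Toy relaxed priority scheduler: delete-min returns an element
    that is close to, but not necessarily, the global minimum."""

    def __init__(self, num_queues=4):
        self.queues = [[] for _ in range(num_queues)]

    def insert(self, priority, task):
        # Insert into a uniformly random sub-heap.
        heapq.heappush(random.choice(self.queues), (priority, task))

    def delete_min(self):
        # "Power of two choices": sample two non-empty sub-heaps and
        # pop from the one with the smaller top element.
        candidates = [q for q in self.queues if q]
        if not candidates:
            return None
        a = random.choice(candidates)
        b = random.choice(candidates)
        return heapq.heappop(a if a[0] <= b[0] else b)
```

The trade-off the paper quantifies is visible even in this sketch: insert and delete-min touch only one or two small heaps (better scalability), at the cost of processing some tasks slightly out of priority order (relaxation overhead).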
Proceedings ArticleDOI

The Splay-List: A Distribution-Adaptive Concurrent Skip-List

TL;DR: Experimental results show that the splay-list can leverage distribution-adaptivity to improve on the performance of classic concurrent designs, and can outperform the only previously known distribution-adaptive design in certain settings.
Posted Content

Time-Space Trade-offs in Molecular Computation

TL;DR: A unified lower bound is proved, relating the space available per node to the time complexity achievable by a protocol; this is the first result to characterize the time complexity of protocols that employ a super-constant number of states per node.
Proceedings ArticleDOI

Randomized loose renaming in O(log log n) time

TL;DR: This paper gives a non-adaptive algorithm with O(log log n) (individual) step complexity, and an adaptive algorithm with step complexity O((log log k)^2), where k is the actual contention in the execution.
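To illustrate the loose renaming problem itself (not the paper's algorithm, which uses a more refined structure to achieve the stated bounds), here is a minimal sequential simulation: each of n processes repeatedly picks a random slot in a namespace of size c·n and claims it with a test-and-set, modelled here by a plain array check.

```python
import random

def loose_renaming(num_procs, namespace_factor=2):
    """Toy simulation of randomized loose renaming: n processes
    acquire distinct names from a namespace of size c * n."""
    size = namespace_factor * num_procs
    claimed = [False] * size
    names = {}
    for p in range(num_procs):
        while True:
            slot = random.randrange(size)
            if not claimed[slot]:   # models a successful test-and-set
                claimed[slot] = True
                names[p] = slot
                break
    return names
```

Because the namespace is a constant factor larger than the number of processes ("loose" renaming), a random probe succeeds with constant probability, which is what makes sub-logarithmic step complexities plausible in the first place.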
Proceedings ArticleDOI

A Brief Tutorial on Distributed and Concurrent Machine Learning

TL;DR: This tutorial focuses on parallelization strategies for the fundamental stochastic gradient descent (SGD) algorithm, a key tool when training machine learning models, from classical instances such as linear regression to state-of-the-art neural network architectures.
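One common parallelization strategy of the kind such tutorials cover is synchronous data-parallel gradient averaging. The following is a minimal sketch of that pattern for linear regression (the worker loop and shard sizes are illustrative choices, and workers are simulated sequentially for clarity; true SGD would additionally sample minibatches within each shard):

```python
import numpy as np

def synchronous_parallel_sgd(X, y, num_workers=4, lr=0.1, epochs=200):
    """Data-parallel gradient descent sketch: each 'worker' computes
    the gradient of the squared loss on its shard of the data, and
    the averaged gradient updates a shared model."""
    w = np.zeros(X.shape[1])
    shards = np.array_split(np.arange(len(X)), num_workers)
    for _ in range(epochs):
        grads = []
        for shard in shards:          # in practice, runs in parallel
            Xs, ys = X[shard], y[shard]
            grads.append(2 * Xs.T @ (Xs @ w - ys) / len(shard))
        w -= lr * np.mean(grads, axis=0)   # synchronous averaging step
    return w
```

The synchronization point after each round is exactly where variants diverge: asynchronous and communication-compressed schemes, of the kind studied in this line of work, relax or cheapen that averaging step.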