
Amar Phanishayee

Researcher at Microsoft

Publications: 87
Citations: 4796

Amar Phanishayee is an academic researcher at Microsoft. He has contributed to research topics including Scheduling (computing) and Computer science, has an h-index of 26, and has co-authored 84 publications receiving 3388 citations. His previous affiliations include Cornell University and Carnegie Mellon University.

Papers
Proceedings Article

FAWN: a fast array of wimpy nodes

TL;DR: The key contributions of this paper are the principles of the FAWN architecture and the design and implementation of FAWN-KV, a consistent, replicated, highly available, and high-performance key-value storage system built on a FAWN prototype.
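
As a rough illustration of the architecture the TL;DR describes, the sketch below maps keys onto a consistent-hashing ring of wimpy nodes with a short replication chain per key. It is a minimal, assumed reconstruction (the class names, chain length, and dict-backed store are not from the paper), not FAWN-KV's actual implementation.

```python
import hashlib
from bisect import bisect_right


class Node:
    """Stand-in for a wimpy node: FAWN-KV pairs an in-memory index with an
    append-only log on flash; here a plain dict plays both roles."""
    def __init__(self, name):
        self.name = name
        self.store = {}


class Ring:
    """Toy consistent-hashing ring with chain replication, loosely in the
    spirit of FAWN-KV's key-to-node mapping. Illustrative assumption only."""
    def __init__(self, nodes, replicas=3):
        self.replicas = replicas
        self.nodes = sorted(nodes, key=lambda n: self._hash(n.name))
        self.points = [self._hash(n.name) for n in self.nodes]

    @staticmethod
    def _hash(text):
        return int(hashlib.sha1(text.encode()).hexdigest(), 16)

    def chain(self, key):
        """Replica chain (head..tail) responsible for a key."""
        i = bisect_right(self.points, self._hash(key)) % len(self.nodes)
        return [self.nodes[(i + k) % len(self.nodes)] for k in range(self.replicas)]


def put(ring, key, value):
    # Writes enter at the chain head and propagate down to the tail.
    for node in ring.chain(key):
        node.store[key] = value


def get(ring, key):
    # Reads are served by the tail, which has seen every committed write.
    return ring.chain(key)[-1].store.get(key)


nodes = [Node(f"wimpy-{i}") for i in range(8)]
ring = Ring(nodes, replicas=3)
put(ring, "user:42", b"profile-bytes")
print(get(ring, "user:42"))
```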
Proceedings Article

PipeDream: generalized pipeline parallelism for DNN training

TL;DR: This paper presents PipeDream, a system that adds inter-batch pipelining to intra-batch parallelism to further improve parallel training throughput, helping to better overlap computation with communication and reduce the amount of communication when possible.
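
The sketch below illustrates inter-batch pipelining in its simplest form: once the pipeline fills, every stage processes a different micro-batch on each tick, so stages stay busy and communication of one micro-batch overlaps with computation on another. It is an assumed toy schedule, not PipeDream's actual 1F1B algorithm, which also interleaves backward passes and manages multiple weight versions.

```python
def pipeline_schedule(num_stages, num_microbatches):
    """Toy timeline of inter-batch pipelining: at clock tick t, stage s runs
    the forward pass of micro-batch t - s if it exists. PipeDream's real
    schedule additionally interleaves backward passes (1F1B) and keeps
    multiple weight versions; this sketch omits both."""
    ticks = []
    for t in range(num_stages + num_microbatches - 1):
        work = {}
        for s in range(num_stages):
            m = t - s
            if 0 <= m < num_microbatches:
                work[f"stage{s}"] = f"F{m}"  # forward pass of micro-batch m
        ticks.append(work)
    return ticks


for t, work in enumerate(pipeline_schedule(num_stages=3, num_microbatches=5)):
    print(f"t={t}: {work}")
```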
Proceedings Article

Safe and effective fine-grained TCP retransmissions for datacenter communication

TL;DR: This paper uses high-resolution timers to enable microsecond-granularity TCP timeouts and shows that eliminating the minimum retransmission timeout bound is safe for all environments, including wide-area networks.
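
To make the argument concrete, the sketch below computes the standard RFC 6298-style retransmission timeout with and without the conventional minimum bound; with microsecond-scale datacenter RTTs, the floor dominates the computed timeout. The constants and parameter values are illustrative assumptions, not the paper's kernel changes.

```python
class RtoEstimator:
    """Jacobson/Karels-style RTT estimator (RFC 6298): RTO = SRTT + 4*RTTVAR,
    conventionally clamped to a floor of roughly 200 ms in many TCP stacks.
    The constants and min_rto_us are illustrative assumptions; the paper's
    point is that microsecond-resolution timers make removing the floor safe."""

    ALPHA, BETA, K = 1 / 8, 1 / 4, 4

    def __init__(self, min_rto_us=200_000):
        self.srtt = None
        self.rttvar = None
        self.min_rto_us = min_rto_us

    def update(self, rtt_us):
        if self.srtt is None:
            self.srtt, self.rttvar = rtt_us, rtt_us / 2
        else:
            self.rttvar = (1 - self.BETA) * self.rttvar + self.BETA * abs(self.srtt - rtt_us)
            self.srtt = (1 - self.ALPHA) * self.srtt + self.ALPHA * rtt_us
        return self.rto()

    def rto(self):
        return max(self.min_rto_us, self.srtt + self.K * self.rttvar)


# With ~100 us datacenter RTTs, the conventional floor dominates the timeout:
est = RtoEstimator(min_rto_us=200_000)
for rtt_us in (90, 110, 95, 120):
    est.update(rtt_us)
print("clamped RTO (us):  ", est.rto())                       # 200000
print("unclamped RTO (us):", est.srtt + est.K * est.rttvar)   # a few hundred us
```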
Proceedings Article

Measurement and analysis of TCP throughput collapse in cluster-based storage systems

TL;DR: This paper analyzes the Incast problem, explores its sensitivity to various system parameters, and examines the effectiveness of alternative TCP- and Ethernet-level strategies in mitigating the TCP throughput collapse.
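
A back-of-the-envelope model, sketched below under assumed parameter values, suggests why synchronized reads collapse goodput: the request acts as a barrier, so it stalls for a full retransmission timeout whenever any striped response is lost at the shared switch buffer. This is a toy model for intuition, not the paper's measurements.

```python
def incast_goodput_mbps(block_mb=1.0, link_gbps=1.0, p_timeout=0.1, rto_ms=200.0):
    """Toy Incast model: a client reads one block striped across servers and
    cannot proceed until all responses arrive, so a dropped burst stalls the
    whole request for one retransmission timeout. All parameter values are
    illustrative assumptions."""
    transfer_ms = block_mb * 8.0 / link_gbps          # Mb / Gbps == milliseconds
    expected_ms = transfer_ms + p_timeout * rto_ms    # expected stall from timeouts
    return block_mb * 8.0 / (expected_ms / 1000.0)


# Crudely assuming drop probability grows with the number of synchronized
# senders sharing one switch buffer reproduces the characteristic collapse:
for n_servers in (1, 4, 8, 16, 32):
    p = min(0.9, 0.05 * n_servers)
    print(n_servers, "servers ->", round(incast_goodput_mbps(p_timeout=p), 1), "Mb/s")
```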
Proceedings Article

The Non-IID Data Quagmire of Decentralized Machine Learning

TL;DR: This paper presents SkewScout, a system-level approach that adapts the communication frequency of decentralized learning algorithms to the (skew-induced) accuracy loss between data partitions; it also shows that group normalization can recover much of the accuracy loss caused by batch normalization.
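
Group normalization matters to that second finding because its statistics are computed per example rather than per mini-batch. The sketch below is a standard NumPy formulation of GroupNorm (the hyperparameters are assumptions), shown only to make that batch-independence concrete.

```python
import numpy as np


def group_norm(x, num_groups=8, eps=1e-5):
    """Group normalization over an NCHW tensor: statistics are computed per
    sample over groups of channels, so they do not depend on which other
    examples share the mini-batch, unlike BatchNorm's batch statistics. That
    batch-independence is why it holds up better under skewed (non-IID) data
    partitions. Standard formulation; hyperparameters here are assumptions."""
    n, c, h, w = x.shape
    assert c % num_groups == 0, "channels must divide evenly into groups"
    xg = x.reshape(n, num_groups, c // num_groups, h, w)
    mean = xg.mean(axis=(2, 3, 4), keepdims=True)
    var = xg.var(axis=(2, 3, 4), keepdims=True)
    return ((xg - mean) / np.sqrt(var + eps)).reshape(n, c, h, w)


x = np.random.randn(2, 16, 4, 4).astype(np.float32)
print(group_norm(x, num_groups=4).shape)  # (2, 16, 4, 4)
```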