Siddharth Samsi
Researcher at Massachusetts Institute of Technology
Publications - 142
Citations - 1725
Siddharth Samsi is an academic researcher at the Massachusetts Institute of Technology. His research focuses on topics including computer science and scalability. He has an h-index of 18 and has co-authored 116 publications receiving 1,175 citations. Previous affiliations include the University of Luxembourg and Ohio State University.
Papers
Proceedings ArticleDOI
Interactive Supercomputing on 40,000 Cores for Machine Learning and Data Analysis
Albert Reuther, Jeremy Kepner, Chansup Byun, Siddharth Samsi, William Arcand, David Bestor, Bill Bergeron, Vijay Gadepally, Michael Houle, Matthew Hubbell, Michael Jones, Anna Klein, Lauren Milechin, Julia Mullen, Andrew Prout, Antonio Rosa, Charles Yee, Peter Michaleas +17 more
TL;DR: In this paper, the authors demonstrate launching tens of thousands of TensorFlow and MATLAB/Octave processes in 40 seconds on a 40,000-core supercomputer, scaling interactive machine learning frameworks by fine-tuning the launches and prepositioning applications.
Proceedings ArticleDOI
Survey and Benchmarking of Machine Learning Accelerators
TL;DR: This paper surveys the current state of processors and accelerators that have been publicly announced with performance and power consumption numbers. It selects and benchmarks two commercially available low size, weight, and power (SWaP) accelerators, as these processors are the most relevant for the embedded and mobile machine learning inference applications most applicable to the DoD and other SWaP-constrained users.
Proceedings ArticleDOI
Survey of Machine Learning Accelerators
TL;DR: This paper collects and summarizes the current accelerators that have been publicly announced with performance and power consumption numbers and highlights interesting trends regarding power consumption, numerical precision, and inference versus training.
Proceedings ArticleDOI
Static graph challenge: Subgraph isomorphism
Siddharth Samsi, Vijay Gadepally, Michael Hurley, Michael Jones, Edward K. Kao, Sanjeev Mohindra, Paul Monticciolo, Albert Reuther, Steven T. Smith, William S. Song, Diane Staheli, Jeremy Kepner +11 more
TL;DR: The proposed Subgraph Isomorphism Graph Challenge draws upon prior challenges from machine learning, high performance computing, and visual analytics to create a graph challenge that is reflective of many real-world graph analytics processing systems.
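The core problem in the challenge above is deciding whether a large graph contains a subgraph isomorphic to a small template graph. As a minimal illustration (not the challenge's reference implementation, which targets high-performance systems), a brute-force sketch in Python:

```python
from itertools import permutations

def is_subgraph_isomorphic(template_edges, graph_edges):
    """Brute-force check: does the graph contain a subgraph
    isomorphic to the template? Exponential time; illustration only."""
    t_nodes = sorted({v for e in template_edges for v in e})
    g_nodes = sorted({v for e in graph_edges for v in e})
    g_set = {frozenset(e) for e in graph_edges}
    # Try every injective mapping of template nodes onto graph nodes;
    # succeed if all template edges map to existing graph edges.
    for perm in permutations(g_nodes, len(t_nodes)):
        mapping = dict(zip(t_nodes, perm))
        if all(frozenset((mapping[a], mapping[b])) in g_set
               for a, b in template_edges):
            return True
    return False

# Look for a triangle template inside a 4-cycle with one chord.
triangle = [(0, 1), (1, 2), (0, 2)]
graph = [("a", "b"), ("b", "c"), ("c", "d"), ("d", "a"), ("a", "c")]
print(is_subgraph_isomorphic(triangle, graph))  # True: a-b-c is a triangle
```

Practical implementations use pruning-based algorithms (e.g. VF2) or linear-algebraic formulations rather than enumerating all mappings, which is what makes the challenge interesting at scale.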