Yuwei Hu
Researcher at Cornell University
Publications - 19
Citations - 1851
Yuwei Hu is an academic researcher from Cornell University whose work focuses on convolution and compilers. The author has an h-index of 12 and has co-authored 16 publications receiving 1,122 citations.
Papers
Proceedings ArticleDOI
TVM: an automated end-to-end optimizing compiler for deep learning
Tianqi Chen,Thierry Moreau,Ziheng Jiang,Lianmin Zheng,Eddie Yan,Meghan Cowan,Haichen Shen,Leyuan Wang,Yuwei Hu,Luis Ceze,Carlos Guestrin,Arvind Krishnamurthy +11 more
TL;DR: TVM is a compiler that exposes graph-level and operator-level optimizations to provide performance portability for deep learning workloads across diverse hardware back-ends, such as mobile phones, embedded devices, and accelerators.
Posted Content
TVM: End-to-End Optimization Stack for Deep Learning
Tianqi Chen,Thierry Moreau,Ziheng Jiang,Haichen Shen,Eddie Yan,Leyuan Wang,Yuwei Hu,Luis Ceze,Carlos Guestrin,Arvind Krishnamurthy +9 more
TL;DR: This work proposes TVM, an end-to-end optimization stack that exposes graph-level and operator-level optimizations to provide performance portability for deep learning workloads across diverse hardware back-ends, and discusses the optimization challenges specific to deep learning that TVM solves.
Posted Content
TVM: An Automated End-to-End Optimizing Compiler for Deep Learning
Tianqi Chen,Thierry Moreau,Ziheng Jiang,Lianmin Zheng,Eddie Yan,Meghan Cowan,Haichen Shen,Leyuan Wang,Yuwei Hu,Luis Ceze,Carlos Guestrin,Arvind Krishnamurthy +11 more
TL;DR: TVM is a compiler that exposes graph-level and operator-level optimizations to provide performance portability to deep learning workloads across diverse hardware back-ends and automates optimization of low-level programs to hardware characteristics by employing a novel, learning-based cost modeling method for rapid exploration of code optimizations.
Proceedings Article
Improving Neural Network Quantization without Retraining using Outlier Channel Splitting
TL;DR: In this article, the authors propose outlier channel splitting (OCS), which duplicates channels containing outliers and then halves the channel values, so that the network remains functionally identical while the affected outliers are moved toward the center of the distribution.
Posted Content
Improving Neural Network Quantization without Retraining using Outlier Channel Splitting
TL;DR: This work proposes outlier channel splitting (OCS), which duplicates channels containing outliers, then halves the channel values, and shows that OCS can outperform state-of-the-art clipping techniques with only minor overhead.
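The core identity behind OCS as summarized above is that duplicating a channel and halving its weights leaves the layer's output unchanged, since w·x = (w/2)·x + (w/2)·x. A minimal numpy sketch of that transformation for a fully connected layer (function and variable names are illustrative, not from the paper's code):

```python
import numpy as np

def split_input_channel(W, x, i):
    """Duplicate input channel i and halve the corresponding weight
    column, so the new layer computes exactly the same output:
    W2 @ x2 == W @ x. W has shape (out, in); x has shape (in,)."""
    halved = W[:, i:i + 1] / 2.0          # halved copy of column i
    W2 = np.hstack([W, halved])            # append the duplicate column
    W2[:, i] = halved[:, 0]                # halve the original column too
    x2 = np.append(x, x[i])                # duplicate the input value
    return W2, x2

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))
x = rng.normal(size=3)

# Split the channel holding the largest-magnitude weight (the "outlier").
i = int(np.argmax(np.abs(W).max(axis=0)))
W2, x2 = split_input_channel(W, x, i)
assert np.allclose(W2 @ x2, W @ x)  # functionally identical
```

After splitting, the largest weight magnitude in the affected channel is halved, which narrows the range a post-training quantizer must cover; the cost is the "minor overhead" the summary mentions, namely one extra channel per split.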