Zhuo Wang
Researcher at Princeton University
Publications - 28
Citations - 1595
Zhuo Wang is an academic researcher at Princeton University whose work focuses on boosting (machine learning) and detectors. The author has an h-index of 13 and has co-authored 27 publications receiving 1,104 citations. Previous affiliations of Zhuo Wang include Tiffany & Co. and IBM.
Papers
Posted Content
PACT: Parameterized Clipping Activation for Quantized Neural Networks
Jungwook Choi, Zhuo Wang, Swagath Venkataramani, Pierce I-Jen Chuang, Vijayalakshmi Srinivasan, Kailash Gopalakrishnan +5 more
TL;DR: It is shown, for the first time, that both weights and activations can be quantized to 4-bits of precision while still achieving accuracy comparable to full precision networks across a range of popular models and datasets.
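The core idea of PACT is a clipped activation with a learnable clipping level α: the activation is clipped to [0, α] and then linearly quantized to k bits. Below is a minimal illustrative sketch of that forward pass in NumPy; the function name, the fixed α, and the example inputs are assumptions for illustration (in the paper, α is trained by backpropagation alongside the weights).

```python
import numpy as np

def pact_quantize(x, alpha, k=4):
    """Sketch of PACT's forward pass: clip activations to [0, alpha],
    then linearly quantize to k bits.

    In the paper alpha is a learnable parameter; here it is passed in
    as a fixed value for illustration.
    """
    # Clipping: 0.5 * (|x| - |x - alpha| + alpha) is equivalent to clip(x, 0, alpha)
    y = 0.5 * (np.abs(x) - np.abs(x - alpha) + alpha)
    # Uniform quantization over [0, alpha] with 2^k - 1 levels
    scale = (2 ** k - 1) / alpha
    return np.round(y * scale) / scale

x = np.array([-1.0, 0.3, 0.7, 2.5])
print(pact_quantize(x, alpha=1.0, k=4))
```

Every output lands on one of the 2^k quantization levels in [0, α]; negative inputs clip to 0 and inputs above α clip to α.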
Journal ArticleDOI
In-Memory Computation of a Machine-Learning Classifier in a Standard 6T SRAM Array
TL;DR: A machine-learning classifier where computations are performed in a standard 6T SRAM array, which stores the machine-learning model; a training algorithm enables a strong classifier through boosting and also overcomes circuit nonidealities by combining multiple columns.
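The boosting idea in the TL;DR — combining several error-prone column classifiers into one stronger classifier — can be sketched as a weighted vote. The weak predictions, error rates, and AdaBoost-style weighting below are illustrative assumptions, not the paper's actual circuit-aware training algorithm.

```python
import numpy as np

def weighted_vote(weak_preds, alphas):
    """Combine rows of +/-1 weak-classifier predictions via a weighted vote."""
    return np.sign(np.dot(alphas, weak_preds))

# Three noisy "column" classifiers on four samples (labels are +/-1)
weak_preds = np.array([
    [ 1, -1,  1,  1],
    [ 1,  1, -1,  1],
    [-1, -1,  1,  1],
])

# AdaBoost-style weights: alpha = 0.5 * ln((1 - err) / err) per weak learner,
# so more accurate learners get a larger say in the vote
errs = np.array([0.2, 0.3, 0.4])
alphas = 0.5 * np.log((1 - errs) / errs)
print(weighted_vote(weak_preds, alphas))
```

Even though each individual classifier makes mistakes, the weighted combination can recover the correct label on samples where the more accurate learners agree.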
Proceedings ArticleDOI
A machine-learning classifier implemented in a standard 6T SRAM array
TL;DR: This paper presents a machine-learning classifier where the computation is performed within a standard 6T SRAM array, eliminating explicit memory operations that otherwise pose energy/performance bottlenecks, especially for emerging algorithms with a high ratio of memory accesses.
Posted Content
Bridging the accuracy gap for 2-bit Quantized Neural Networks (QNN)
Jungwook Choi, Pierce I-Jen Chuang, Zhuo Wang, Swagath Venkataramani, Vijayalakshmi Srinivasan, Kailash Gopalakrishnan +6 more
TL;DR: This paper proposes novel techniques that target weight and activation quantization separately, resulting in an overall quantized neural network (QNN) that achieves state-of-the-art classification accuracy across a range of popular models and datasets.