
Huan Wang

Researcher at Zhejiang University

Publications - 41
Citations - 559

Huan Wang is an academic researcher from Zhejiang University. The author has contributed to research in topics including Computer science and Convolutional neural networks. The author has an h-index of 10 and has co-authored 26 publications receiving 274 citations. Previous affiliations of Huan Wang include Northeastern University.

Papers
Proceedings ArticleDOI

Collaborative Distillation for Ultra-Resolution Universal Style Transfer

TL;DR: A new knowledge distillation method for encoder-decoder based neural style transfer that reduces the number of convolutional filters and achieves ultra-resolution (over 40 megapixels) universal style transfer on a 12GB GPU for the first time.
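
A minimal sketch of the distillation idea described above, assuming a PyTorch setup: a slim student encoder is trained to mimic a larger frozen teacher encoder through a feature-matching loss. The layer sizes, the 1x1 projection, and the MSE loss are illustrative assumptions, not the paper's actual architecture or training objective.

```python
# Hedged sketch (not the paper's code): distill a slim student encoder from a
# larger frozen teacher so that style transfer can run on much larger inputs.
import torch
import torch.nn as nn

teacher = nn.Sequential(  # stands in for a pretrained VGG-style encoder
    nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
    nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
)
student = nn.Sequential(  # fewer filters -> cheaper, fits larger images in memory
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
)
proj = nn.Conv2d(16, 64, 1)  # channel-match student features to the teacher's

opt = torch.optim.Adam(list(student.parameters()) + list(proj.parameters()), lr=1e-4)
mse = nn.MSELoss()

def distill_step(images):
    with torch.no_grad():
        t_feat = teacher(images)      # frozen teacher features
    s_feat = proj(student(images))    # student features, projected to 64 channels
    loss = mse(s_feat, t_feat)        # feature-mimicking distillation loss
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()

distill_step(torch.randn(2, 3, 128, 128))
```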

MNN: A Universal and Efficient Inference Engine

TL;DR: The contributions of MNN include a pre-inference mechanism that performs runtime optimization ahead of execution, thorough kernel optimization on operators to achieve optimal computation performance, and a backend abstraction module that enables hybrid scheduling and keeps the engine lightweight.
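
A conceptual sketch of what a backend-abstraction layer with ahead-of-time scheduling can look like, written in Python for readability; the class names and interface below are assumptions for illustration and are not MNN's actual API.

```python
# Conceptual sketch only: each backend implements the same operator interface,
# and a scheduling pass assigns every operator to a backend once, up front,
# which is what makes hybrid CPU/GPU execution possible.
from abc import ABC, abstractmethod

class Backend(ABC):
    name: str
    @abstractmethod
    def supports(self, op_type: str) -> bool: ...
    @abstractmethod
    def run(self, op_type: str, *tensors): ...

class CPUBackend(Backend):
    name = "cpu"
    def supports(self, op_type): return True            # CPU acts as the fallback
    def run(self, op_type, *tensors): return f"{op_type} on CPU"

class GPUBackend(Backend):
    name = "gpu"
    def supports(self, op_type): return op_type in {"conv", "matmul"}
    def run(self, op_type, *tensors): return f"{op_type} on GPU"

def schedule(graph, backends):
    """'Pre-inference'-style pass: decide each op's backend before running,
    so the execution loop does no per-op dispatch work."""
    plan = []
    for op_type in graph:
        backend = next(b for b in backends if b.supports(op_type))
        plan.append((op_type, backend))
    return plan

plan = schedule(["conv", "softmax", "matmul"], [GPUBackend(), CPUBackend()])
for op_type, backend in plan:
    print(backend.run(op_type))
```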
Posted Content

Structured Probabilistic Pruning for Convolutional Neural Network Acceleration

TL;DR: A novel progressive parameter pruning method, named Structured Probabilistic Pruning (SPP), which prunes the weights of convolutional layers in a probabilistic manner and can be directly applied to accelerate multi-branch CNNs, such as ResNet, without specific adaptations.
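
A hedged sketch of probabilistic filter pruning in this spirit: each filter carries a pruning probability that is nudged up or down according to its L1-norm rank, and a binary mask is sampled from those probabilities. The update rule, ratio, and schedule below are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch of probabilistic structured pruning (not the SPP code).
import torch

def update_prune_probs(weight, probs, delta=0.05, prune_ratio=0.5):
    """weight: conv weight of shape (out_channels, in_channels, k, k)."""
    importance = weight.abs().sum(dim=(1, 2, 3))     # L1 norm per filter
    n_prune = int(prune_ratio * weight.shape[0])
    ranks = importance.argsort()                     # ascending importance
    probs = probs.clone()
    probs[ranks[:n_prune]] += delta                  # weak filters: raise prune prob
    probs[ranks[n_prune:]] -= delta                  # strong filters: lower prune prob
    return probs.clamp(0.0, 1.0)

def sample_filter_mask(probs):
    # 1 = keep the filter this step, 0 = prune it
    return (torch.rand_like(probs) > probs).float()

weight = torch.randn(64, 32, 3, 3)                   # a conv layer's weights
probs = torch.zeros(64)
for _ in range(10):                                  # progressive schedule
    probs = update_prune_probs(weight, probs)
mask = sample_filter_mask(probs)
pruned_weight = weight * mask.view(-1, 1, 1, 1)      # zero out whole filters
```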
Posted Content

Neural Pruning via Growing Regularization

TL;DR: This work proposes an L2 regularization variant with rising penalty factors and shows it can bring significant accuracy gains compared with its one-shot counterpart, even when the same weights are removed.
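
The growing-penalty idea lends itself to a short sketch: an L2 term whose coefficient rises over training is applied to filters selected for removal, driving them toward zero before they are pruned. The filter-selection rule, schedule, and toy loss below are assumptions for illustration, not the paper's code.

```python
# Minimal sketch of pruning with a growing L2 penalty (illustrative only).
import torch
import torch.nn as nn

layer = nn.Conv2d(32, 64, 3, padding=1)
opt = torch.optim.SGD(layer.parameters(), lr=0.01)

# Pick the least important filters once (by L1 norm) as pruning candidates.
importance = layer.weight.detach().abs().sum(dim=(1, 2, 3))
prune_idx = importance.argsort()[: 64 // 2]

penalty = 0.0
for step in range(100):
    x = torch.randn(4, 32, 16, 16)
    task_loss = layer(x).pow(2).mean()                     # stand-in for the real loss
    reg_loss = penalty * layer.weight[prune_idx].pow(2).sum()
    loss = task_loss + reg_loss
    opt.zero_grad(); loss.backward(); opt.step()
    penalty += 1e-4                                        # the "growing" schedule

with torch.no_grad():
    layer.weight[prune_idx] = 0.0                          # remove the driven-down filters
```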
Proceedings ArticleDOI

Structured Pruning for Efficient ConvNets via Incremental Regularization

TL;DR: A novel regularization-based pruning method, named IncReg, is proposed to incrementally assign different regularization factors to different weights based on their relative importance, achieving results comparable to or better than the state of the art.
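
For comparison with the growing-penalty sketch above, an illustrative version of incremental, importance-dependent regularization: each filter keeps its own L2 factor, and the factors of the currently least important filters are incremented every step. The ranking criterion and increment below are assumptions, not the authors' implementation.

```python
# Illustrative sketch of incremental per-filter regularization (not IncReg's code).
import torch
import torch.nn as nn

layer = nn.Conv2d(16, 32, 3, padding=1)
opt = torch.optim.SGD(layer.parameters(), lr=0.01)
reg = torch.zeros(32)                                   # per-filter L2 factors

for step in range(50):
    x = torch.randn(4, 16, 8, 8)
    task_loss = layer(x).pow(2).mean()                  # stand-in task loss
    per_filter_l2 = layer.weight.pow(2).sum(dim=(1, 2, 3))
    loss = task_loss + (reg * per_filter_l2).sum()
    opt.zero_grad(); loss.backward(); opt.step()

    # Re-rank filters by importance; bump the factors of the weakest half only.
    importance = layer.weight.detach().abs().sum(dim=(1, 2, 3))
    weakest = importance.argsort()[: 32 // 2]
    reg[weakest] += 1e-4                                # incremental assignment
```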