SciSpace - Formally Typeset

Gao Huang

Researcher at Tsinghua University

Publications: 164
Citations: 43,663

Gao Huang is an academic researcher from Tsinghua University. The author has contributed to research in topics: Computer science & Feature (computer vision). The author has an h-index of 37 and has co-authored 124 publications receiving 26,697 citations. Previous affiliations of Gao Huang include Cornell University & University of Science and Technology of China.

Papers
Proceedings ArticleDOI

Densely Connected Convolutional Networks

TL;DR: DenseNet, proposed in this paper, connects each layer to every other layer in a feed-forward fashion, which alleviates the vanishing-gradient problem, strengthens feature propagation, encourages feature reuse, and substantially reduces the number of parameters.
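The connectivity pattern described above can be sketched in a few lines. This is a toy illustration, not the paper's implementation: plain Python values stand in for feature maps, and summation stands in for the channel-wise concatenation a real DenseNet uses.

```python
# Toy sketch of DenseNet-style dense connectivity: each layer receives
# the outputs of ALL preceding layers, and its own output is passed on
# to every later layer.

def dense_block(x0, layers):
    """Run layers where each one sees the full list of earlier features."""
    features = [x0]
    for layer in layers:
        out = layer(features)    # layer consumes every earlier feature
        features.append(out)     # and its output feeds all later layers
    return features

# hypothetical toy layers: each sums its inputs and adds an offset k
layers = [lambda fs, k=k: sum(fs) + k for k in range(3)]
feats = dense_block(1, layers)   # -> [1, 1, 3, 7]
```

Because every feature is reused downstream, individual layers can be very narrow, which is where the parameter savings come from.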
Proceedings ArticleDOI

Learning Efficient Convolutional Networks through Network Slimming

TL;DR: In this article, the authors proposed a network slimming method for CNNs to simultaneously reduce the model size, decrease the run-time memory footprint, and lower the number of computing operations without compromising accuracy.
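The pruning step at the heart of network slimming can be sketched as follows. This is an illustrative simplification, with made-up gamma values: during training an L1 penalty pushes batch-normalization scaling factors toward zero, and afterwards channels with the smallest factors are removed.

```python
# Hedged sketch of the channel-selection step in network slimming:
# rank channels by the magnitude of their batch-norm scaling factor
# (gamma) and keep only those above a global percentile threshold.

def slim_channels(gammas, prune_ratio):
    """Return indices of channels whose |gamma| survives pruning."""
    ranked = sorted(abs(g) for g in gammas)
    cutoff = ranked[int(len(ranked) * prune_ratio)]
    return [i for i, g in enumerate(gammas) if abs(g) >= cutoff]

# hypothetical gammas after sparsity-inducing training
gammas = [0.9, 0.01, 0.5, 0.02, 0.7, 0.03]
kept = slim_channels(gammas, prune_ratio=0.5)   # prune the smallest half
```

After pruning, the network is fine-tuned; the surviving channels give a smaller model, lower run-time memory, and fewer operations.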
Book ChapterDOI

Deep Networks with Stochastic Depth

TL;DR: Stochastic depth is proposed: a training procedure that enables the seemingly contradictory setup of training short networks while using deep networks at test time; it substantially reduces training time and significantly improves test error on almost all datasets used for evaluation.
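The train-short/test-deep idea can be sketched as follows, assuming residual-style layers of the form y = x + f(x). This is a minimal illustration with a single survival probability p, not the paper's implementation (which decays p linearly with depth).

```python
import random

# Toy sketch of stochastic depth: during training each residual branch
# survives with probability p and is otherwise skipped entirely, so the
# effective network is shorter; at test time every layer runs, with the
# branch scaled by p to match its expected training contribution.

def forward(x, layers, p, training, rng=random.random):
    for f in layers:
        if training:
            if rng() < p:            # branch survives this pass
                x = x + f(x)
            # else: identity skip -- the layer is dropped for this pass
        else:
            x = x + p * f(x)         # test time: expected contribution
    return x

# hypothetical residual branches
layers = [lambda v: v * 0.1, lambda v: v * 0.2]
y = forward(1.0, layers, p=0.5, training=False)
```

Skipped layers contribute no forward or backward computation, which is the source of the training-time savings.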
Journal ArticleDOI

Trends in extreme learning machines

TL;DR: In this paper, the authors report the current state of theoretical research and practical advances on this subject, providing a comprehensive view of these advances in ELM together with its future perspectives.
Posted Content

Densely Connected Convolutional Networks

TL;DR: The Dense Convolutional Network (DenseNet) connects each layer to every other layer in a feed-forward fashion and has several compelling advantages: it alleviates the vanishing-gradient problem, strengthens feature propagation, encourages feature reuse, and substantially reduces the number of parameters.