SciSpace (formerly Typeset)

Gao Huang

Researcher at Tsinghua University

Publications: 164
Citations: 43,663

Gao Huang is an academic researcher at Tsinghua University. He has contributed to research in the topics of computer science and feature learning (computer vision). He has an h-index of 37 and has co-authored 124 publications receiving 26,697 citations. Previous affiliations of Gao Huang include Cornell University and the University of Science and Technology of China.

Papers
Proceedings Article

Not All Images are Worth 16x16 Words: Dynamic Transformers for Efficient Image Recognition

TL;DR: Wang et al. propose a Dynamic Transformer that automatically configures a proper number of tokens for each input image. This is achieved by cascading multiple Transformers with increasing token counts, which are activated sequentially in an adaptive fashion: easy images exit early from a cheap stage, while hard images proceed to more expensive ones.
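A minimal sketch (not the paper's implementation) of the confidence-based early-exit idea: run the cheapest stage first and stop as soon as its top softmax probability clears a threshold. The toy stages below just return fixed logits; in the paper each stage is a Transformer operating on an increasing number of tokens.

```python
import numpy as np

def dynamic_inference(stages, x, threshold=0.9):
    """Run a cascade of classifiers (cheapest first); stop as soon as
    the current stage's top softmax probability exceeds `threshold`."""
    for stage in stages:
        logits = stage(x)
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()
        if probs.max() >= threshold:
            return probs.argmax(), probs.max()
    # No stage was confident enough: fall back to the last prediction.
    return probs.argmax(), probs.max()

# Toy stand-in stages (hypothetical): each returns fixed logits.
confident = lambda x: np.array([5.0, 0.0, 0.0])   # peaked distribution
uncertain = lambda x: np.array([0.1, 0.0, 0.0])   # nearly uniform

pred, conf = dynamic_inference([uncertain, confident], x=None)
```

Here the cascade skips past the uncertain stage (top probability about 0.35) and exits at the confident one.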
Proceedings ArticleDOI

Learning to Weight Samples for Dynamic Early-exiting Networks

TL;DR: This work adopts a weight prediction network to weight the loss of different training samples at each exit of a multi-exit network; the two networks are jointly optimized under a meta-learning framework with a novel optimization objective.
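A hedged sketch of the training objective this describes: per-sample, per-exit weighted cross-entropy. In the paper the weights come from a learned prediction network optimized by meta-learning; here they are simply passed in as arrays.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def weighted_multi_exit_loss(exit_logits, labels, sample_weights):
    """Cross-entropy summed over exits, each sample's loss scaled by a
    per-sample weight for that exit (assumed given; the paper learns it)."""
    total = 0.0
    for logits, w in zip(exit_logits, sample_weights):
        p = softmax(logits)
        ce = -np.log(p[np.arange(len(labels)), labels])
        total += (w * ce).mean()
    return total
```

Setting a sample's weight to zero at an exit removes its contribution there, which is the mechanism that lets easy samples dominate early exits and hard samples the later ones.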
Journal ArticleDOI

Self-Attention-Based Temporary Curiosity in Reinforcement Learning Exploration

TL;DR: This work proposes a framework combining persisting curiosity and temporary curiosity to address the problem of overprotection against repetition in exploration. It introduces the self-attention mechanism from computer vision and proposes a sequence-based self-attention mechanism for generating temporary curiosity.
Proceedings Article

Revisiting Locally Supervised Learning: an Alternative to End-to-end Training

TL;DR: This work proposes InfoPro, an information propagation loss that encourages local modules to preserve as much useful information as possible while progressively discarding task-irrelevant information. The proposed method boils down to minimizing the combination of a reconstruction loss and a normal cross-entropy/contrastive term.
Journal ArticleDOI

TC3KD: Knowledge distillation via teacher-student cooperative curriculum customization

TL;DR: Zhang et al. propose a knowledge distillation method via teacher-student cooperative curriculum customization, which improves the performance of a lightweight student network by transferring knowledge from a large-scale teacher network.
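For context, a sketch of the standard distillation term such methods build on: KL divergence between temperature-softened teacher and student distributions (Hinton-style KD). The cooperative curriculum customization that is this paper's contribution is not modeled here.

```python
import numpy as np

def softened(z, T):
    """Temperature-softened softmax."""
    e = np.exp((z - z.max(axis=-1, keepdims=True)) / T)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 as is conventional to keep gradient magnitudes stable."""
    p_t = softened(teacher_logits, T)
    p_s = softened(student_logits, T)
    kl = (p_t * (np.log(p_t) - np.log(p_s))).sum(axis=-1).mean()
    return kl * T * T
```

The loss is zero when the student matches the teacher's softened distribution and grows as the two diverge.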