
Zhicheng Yan

Researcher at Facebook

Publications - 70
Citations - 3492

Zhicheng Yan is an academic researcher from Facebook. The author has contributed to research in the topics of Chemistry and Computer science. The author has an h-index of 18 and has co-authored 46 publications receiving 2208 citations. Previous affiliations of Zhicheng Yan include Columbia University and the University of Hong Kong.

Papers
Proceedings Article

Decoupling Representation and Classifier for Long-Tailed Recognition

TL;DR: It is shown that a straightforward approach that decouples representation learning from classification can outperform carefully designed losses, sampling strategies, and even complex modules with memory.
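
Below is a minimal PyTorch sketch of that two-stage recipe (here, the classifier re-training variant), assuming generic `backbone`, `train_loader` (instance-balanced), and `balanced_loader` (class-balanced) objects; it is illustrative, not the paper's reference code.

```python
import torch
import torch.nn as nn

def decoupled_training(backbone, num_features, num_classes,
                       train_loader, balanced_loader, device="cpu"):
    classifier = nn.Linear(num_features, num_classes).to(device)
    criterion = nn.CrossEntropyLoss()

    # Stage 1: jointly train backbone + classifier with the usual
    # instance-balanced sampling.
    params = list(backbone.parameters()) + list(classifier.parameters())
    opt = torch.optim.SGD(params, lr=0.1, momentum=0.9)
    for x, y in train_loader:
        x, y = x.to(device), y.to(device)
        loss = criterion(classifier(backbone(x)), y)
        opt.zero_grad()
        loss.backward()
        opt.step()

    # Stage 2: freeze the representation and re-train only the classifier
    # on class-balanced batches (the "cRT" variant).
    for p in backbone.parameters():
        p.requires_grad_(False)
    classifier.reset_parameters()
    opt = torch.optim.SGD(classifier.parameters(), lr=0.1, momentum=0.9)
    for x, y in balanced_loader:
        x, y = x.to(device), y.to(device)
        with torch.no_grad():
            feats = backbone(x)
        loss = criterion(classifier(feats), y)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return backbone, classifier
```

The key design choice is that Stage 2 touches only the classifier weights, so the long-tailed bias is corrected without re-learning the representation.
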
Proceedings Article

Graph-Based Global Reasoning Networks

TL;DR: A new approach for reasoning globally, in which a set of features is aggregated over the coordinate space and then projected into an interaction space where relational reasoning can be computed efficiently.
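
A rough sketch of such a global reasoning unit follows, with hypothetical sizes for the number of nodes and the node dimension; the projection to the interaction space, the graph-style reasoning over nodes, and the reverse projection are each a single learned layer here.

```python
import torch
import torch.nn as nn

class GlobalReasoningUnit(nn.Module):
    def __init__(self, channels, nodes=16, node_dim=64):
        super().__init__()
        self.reduce = nn.Conv2d(channels, node_dim, kernel_size=1)
        self.project = nn.Conv2d(channels, nodes, kernel_size=1)  # assignment weights
        self.node_conv = nn.Conv1d(nodes, nodes, kernel_size=1)   # reason across nodes
        self.state_conv = nn.Conv1d(node_dim, node_dim, kernel_size=1)
        self.expand = nn.Conv2d(node_dim, channels, kernel_size=1)

    def forward(self, x):
        b, c, h, w = x.shape
        feats = self.reduce(x).flatten(2)                  # B x D x HW
        assign = self.project(x).flatten(2)                # B x N x HW
        # coordinate space -> interaction space
        nodes = torch.bmm(assign, feats.transpose(1, 2))   # B x N x D
        nodes = self.node_conv(nodes)                      # relations between nodes
        nodes = self.state_conv(nodes.transpose(1, 2)).transpose(1, 2)
        # interaction space -> coordinate space
        out = torch.bmm(assign.transpose(1, 2), nodes)     # B x HW x D
        out = out.transpose(1, 2).reshape(b, -1, h, w)
        return x + self.expand(out)                        # residual connection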
Proceedings Article

Drop an Octave: Reducing Spatial Redundancy in Convolutional Neural Networks With Octave Convolution

TL;DR: OctConv, as discussed by the authors, factorizes the mixed feature maps by their frequencies and introduces a novel Octave Convolution operation that stores and processes the feature maps that vary spatially "slower" at a lower spatial resolution.
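
A simplified sketch of an octave convolution layer, where `alpha` denotes the fraction of low-frequency channels as in the paper; the layer layout and the up/down-sampling choices are illustrative assumptions.

```python
import torch.nn as nn
import torch.nn.functional as F

class OctaveConv(nn.Module):
    def __init__(self, in_ch, out_ch, kernel_size=3, alpha=0.5):
        super().__init__()
        in_lo, out_lo = int(alpha * in_ch), int(alpha * out_ch)
        in_hi, out_hi = in_ch - in_lo, out_ch - out_lo
        pad = kernel_size // 2
        # four paths: high->high, high->low, low->high, low->low
        self.hh = nn.Conv2d(in_hi, out_hi, kernel_size, padding=pad)
        self.hl = nn.Conv2d(in_hi, out_lo, kernel_size, padding=pad)
        self.lh = nn.Conv2d(in_lo, out_hi, kernel_size, padding=pad)
        self.ll = nn.Conv2d(in_lo, out_lo, kernel_size, padding=pad)

    def forward(self, x_hi, x_lo):
        # x_lo is kept at half the spatial resolution of x_hi
        hi = self.hh(x_hi) + F.interpolate(self.lh(x_lo),
                                           scale_factor=2, mode="nearest")
        lo = self.ll(x_lo) + self.hl(F.avg_pool2d(x_hi, 2))
        return hi, lo
```

Because the low-frequency half of the tensor lives at half resolution, the convolutions on it cost roughly a quarter of the usual multiply-adds, which is where the spatial-redundancy savings come from.
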
Proceedings Article

HD-CNN: Hierarchical Deep Convolutional Neural Networks for Large Scale Visual Recognition

TL;DR: In this paper, a hierarchical deep CNN (HD-CNN) is proposed by embedding deep CNNs into a two-level category hierarchy, which separates easy classes using a coarse category classifier while distinguishing difficult classes using fine category classifiers.
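
A toy sketch of the two-level routing, assuming a shared `backbone` that produces a flat feature vector; here the coarse classifier's probabilities weight the fine classifiers' predictions, a soft version of the coarse-to-fine scheme.

```python
import torch
import torch.nn as nn

class HDCNN(nn.Module):
    def __init__(self, backbone, feat_dim, num_coarse, num_fine):
        super().__init__()
        self.backbone = backbone
        self.coarse_head = nn.Linear(feat_dim, num_coarse)
        # one fine classifier per coarse category
        self.fine_heads = nn.ModuleList(
            nn.Linear(feat_dim, num_fine) for _ in range(num_coarse))

    def forward(self, x):
        feats = self.backbone(x)                              # B x feat_dim
        coarse_prob = self.coarse_head(feats).softmax(dim=1)  # B x K
        fine_prob = torch.stack(
            [head(feats) for head in self.fine_heads],
            dim=1).softmax(dim=2)                             # B x K x C
        # mix fine predictions, weighted by coarse probabilities
        return (coarse_prob.unsqueeze(2) * fine_prob).sum(dim=1)  # B x C
```
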
Journal Article

Automatic Photo Adjustment Using Deep Neural Networks

TL;DR: In this article, an image descriptor that captures the local semantics of an image is introduced to model semantics-dependent local adjustments, yielding results that are qualitatively and quantitatively better than previous work.
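
A bare-bones sketch of that learning setup, assuming a per-pixel descriptor that already concatenates color, global, and semantic context features; `descriptor_dim` and the network shape are guesses for illustration, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

class PhotoAdjustMLP(nn.Module):
    def __init__(self, descriptor_dim=100, hidden=192):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(descriptor_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3),  # adjusted RGB for the pixel
        )

    def forward(self, descriptors):
        # descriptors: B x descriptor_dim, one row per sampled pixel
        return self.net(descriptors)

# Training would regress predicted colors against the artist-adjusted
# image, e.g. loss = nn.functional.mse_loss(model(descriptors), target_rgb)
```

Because the semantic context is folded into the descriptor, the same network can learn adjustments that differ by region content (sky vs. skin, say) rather than applying one global tone curve.
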