Institution
Beihang University
Education • Beijing, China
About: Beihang University is an education organization based in Beijing, China. It is known for its research contributions in the topics of Control theory and Microstructure. The organization has 67,002 authors who have published 73,507 publications receiving 975,691 citations. The organization is also known as: Beijing University of Aeronautics and Astronautics.
Topics: Control theory, Microstructure, Nonlinear system, Artificial neural network, Feature extraction
Papers published on a yearly basis
Papers
TL;DR: In this article, the effect of preparation conditions on the deposition of ZnO nanorods was systematically studied by scanning electron microscopy, X-ray diffraction and photoluminescence spectroscopy.
449 citations
01 Jun 2019
TL;DR: This paper proposes an effective structured pruning approach that jointly prunes filters as well as other structures in an end-to-end manner, solving the optimization problem by generative adversarial learning (GAL), which learns a sparse soft mask in a label-free, end-to-end manner.
Abstract: Structured pruning of filters or neurons has received increasing attention for compressing convolutional neural networks. Most existing methods rely on multi-stage, layer-wise optimizations that iteratively prune and retrain, which may be suboptimal and computationally intensive. Besides, these methods are designed to prune a specific structure, such as filters or blocks, without jointly pruning heterogeneous structures. In this paper, we propose an effective structured pruning approach that jointly prunes filters as well as other structures in an end-to-end manner. To accomplish this, we first introduce a soft mask to scale the output of these structures, defining a new objective function with sparsity regularization that aligns the outputs of the baseline network and the masked network. We then effectively solve the optimization problem by generative adversarial learning (GAL), which learns a sparse soft mask in a label-free, end-to-end manner. By forcing more scale factors in the soft mask to zero, the fast iterative shrinkage-thresholding algorithm (FISTA) can be leveraged to quickly and reliably remove the corresponding structures. Extensive experiments demonstrate the effectiveness of GAL on different datasets, including MNIST, CIFAR-10 and ImageNet ILSVRC 2012. For example, on ImageNet ILSVRC 2012, the pruned ResNet-50 achieves 10.88% Top-5 error with a 3.7x speedup, significantly outperforming state-of-the-art methods.
447 citations
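The soft-mask pruning step the abstract above describes can be illustrated with a minimal NumPy sketch: FISTA's proximal (soft-thresholding) step shrinks the mask's scale factors toward zero, and structures whose factors hit exactly zero are removed. The array shapes, the `lam` threshold, and all function names here are illustrative assumptions, not the paper's code:

```python
import numpy as np

def soft_threshold(mask, lam):
    # Proximal step for the l1 sparsity term used by FISTA:
    # shrinks each scale factor and clips small ones to exactly zero.
    return np.sign(mask) * np.maximum(np.abs(mask) - lam, 0.0)

def prune_with_mask(filters, mask, lam):
    # Apply one shrinkage step to the soft mask, then drop the
    # filters whose scale factor collapsed to zero.
    mask = soft_threshold(mask, lam)
    keep = mask != 0.0
    return filters[keep] * mask[keep, None], keep

filters = np.ones((4, 8))                 # 4 filters, 8 weights each
mask = np.array([0.9, 0.05, -0.6, 0.02])  # learned soft mask
pruned, keep = prune_with_mask(filters, mask, lam=0.1)
# filters with |scale| <= lam are removed; 2 of 4 remain here
```

In the paper's full pipeline the mask itself is learned adversarially against the baseline network's outputs; the sketch only shows the shrink-and-remove step.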
TL;DR: A high average ZT is obtained by synergistically optimizing electrical- and thermal-transport properties via carrier-concentration tuning, band-structure engineering and hierarchical architecturing, highlighting a realistic prospect of wide application of thermoelectrics.
Abstract: Obtaining highly efficient thermoelectric materials relies on a high ZT, and on this value being consistently high over a wide temperature range. Here, the authors demonstrate a phase-separated PbTe-based material that exhibits a ZT of >2 from 673 to 923 K, and a resultantly high average ZT of 1.56 between 300 and 900 K.
447 citations
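For context on the figures quoted above: ZT is the dimensionless thermoelectric figure of merit. Its standard definition (general background, not taken from this abstract) is

```latex
ZT = \frac{S^{2}\sigma}{\kappa}\,T,
\qquad
ZT_{\mathrm{avg}} = \frac{1}{T_h - T_c}\int_{T_c}^{T_h} ZT \,\mathrm{d}T,
```

where $S$ is the Seebeck coefficient, $\sigma$ the electrical conductivity, $\kappa$ the total thermal conductivity and $T$ the absolute temperature. The "average ZT of 1.56 between 300 and 900 K" quoted above is this kind of integral mean over the operating temperature range.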
15 Jun 2019
TL;DR: This paper investigates knowledge distillation for training compact semantic segmentation networks with the help of cumbersome networks, and proposes to distill structured knowledge from the cumbersome networks into the compact ones.
Abstract: In this paper, we investigate the issue of knowledge distillation for training compact semantic segmentation networks by making use of cumbersome networks. We start from the straightforward scheme, pixel-wise distillation, which applies the distillation scheme originally introduced for image classification and performs knowledge distillation for each pixel separately. We further propose to distill the structured knowledge from cumbersome networks into compact networks, which is motivated by the fact that semantic segmentation is a structured prediction problem. We study two such structured distillation schemes: (i) pair-wise distillation that distills the pairwise similarities, and (ii) holistic distillation that uses adversarial training to distill holistic knowledge. The effectiveness of our knowledge distillation approaches is demonstrated by extensive experiments on three scene parsing datasets: Cityscapes, Camvid and ADE20K.
446 citations
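The pair-wise distillation scheme (i) above can be sketched in NumPy: build a pixel-pair similarity matrix for the teacher's and the student's feature maps, then penalize the squared difference between the two matrices. The cosine-similarity choice, the shapes and all names below are illustrative assumptions, not the authors' implementation; note the similarity matrices let student and teacher have different channel widths:

```python
import numpy as np

def pairwise_similarity(feat):
    # feat: (N, C) pixel embeddings; returns the (N, N) matrix of
    # cosine similarities between every pair of pixels.
    norm = feat / np.linalg.norm(feat, axis=1, keepdims=True)
    return norm @ norm.T

def pairwise_distillation_loss(student_feat, teacher_feat):
    # Mean squared error between the student's and the teacher's
    # pixel-pair similarity structure.
    diff = pairwise_similarity(student_feat) - pairwise_similarity(teacher_feat)
    return float(np.mean(diff ** 2))

rng = np.random.default_rng(0)
teacher = rng.standard_normal((16, 32))  # 16 pixels, 32-dim teacher features
student = teacher[:, :8]                 # hypothetical compact 8-dim student
loss = pairwise_distillation_loss(student, teacher)
```

Pixel-wise distillation would instead match per-pixel class distributions, and holistic distillation adds an adversarial discriminator over whole segmentation maps; this sketch covers only the pair-wise term.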
TL;DR: Experimental results indicate that the effective integration of high catalytic reactivity, high structural stability, and high electronic conductivity into a single material system gives Ni-Fe-OH@Ni3S2/NF remarkable catalytic ability for the OER at large current densities.
Abstract: Developing nonprecious oxygen-evolution electrocatalysts that work well at large current densities is of primary importance for a viable water-splitting technology. Herein, a facile, ultrafast (5 s) synthetic approach is reported that produces a novel, efficient, non-noble-metal oxygen-evolution nano-electrocatalyst composed of amorphous Ni-Fe bimetallic hydroxide film-coated, nickel foam (NF)-supported Ni3S2 nanosheet arrays. The composite nanomaterial (denoted Ni-Fe-OH@Ni3S2/NF) shows highly efficient electrocatalytic activity toward the oxygen evolution reaction (OER) at large current densities, even on the order of 1000 mA cm-2. Ni-Fe-OH@Ni3S2/NF also shows excellent catalytic stability toward the OER both in 1 M KOH solution and in 30 wt% KOH solution. Further experimental results indicate that the effective integration of high catalytic reactivity, high structural stability, and high electronic conductivity into a single material system gives Ni-Fe-OH@Ni3S2/NF remarkable catalytic ability for the OER at large current densities.
443 citations
Authors
Showing all 67500 results
Name | H-index | Papers | Citations |
---|---|---|---|
Yi Chen | 217 | 4342 | 293080 |
H. S. Chen | 179 | 2401 | 178529 |
Alan J. Heeger | 171 | 913 | 147492 |
Lei Jiang | 170 | 2244 | 135205 |
Wei Li | 158 | 1855 | 124748 |
Shu-Hong Yu | 144 | 799 | 70853 |
Jian Zhou | 128 | 3007 | 91402 |
Chao Zhang | 127 | 3119 | 84711 |
Igor Katkov | 125 | 972 | 71845 |
Tao Zhang | 123 | 2772 | 83866 |
Nicholas A. Kotov | 123 | 574 | 55210 |
Shi Xue Dou | 122 | 2028 | 74031 |
Li Yuan | 121 | 948 | 67074 |
Robert O. Ritchie | 120 | 659 | 54692 |
Haiyan Wang | 119 | 1674 | 86091 |