Institution

Beihang University

Education | Beijing, China
About: Beihang University is an education organization based in Beijing, China. It is known for research contributions in the topics Control theory and Microstructure. The organization has 67,002 authors who have published 73,507 publications receiving 975,691 citations. The organization is also known as Beijing University of Aeronautics and Astronautics.


Papers
Journal ArticleDOI
TL;DR: The effect of preparation conditions on the deposition of ZnO nanorods was systematically studied by scanning electron microscopy, X-ray diffraction, and photoluminescence spectroscopy.

449 citations

Proceedings ArticleDOI
01 Jun 2019
TL;DR: This paper proposes an effective structured pruning approach that jointly prunes filters and other structures in an end-to-end manner, solving the resulting optimization problem by generative adversarial learning (GAL), which learns a sparse soft mask in a label-free fashion.
Abstract: Structured pruning of filters or neurons has received increased focus for compressing convolutional neural networks. Most existing methods rely on multi-stage, layer-wise optimizations that iteratively prune and retrain, which may be suboptimal and computation-intensive. Moreover, these methods are designed to prune a specific structure, such as filters or blocks, without jointly pruning heterogeneous structures. In this paper, we propose an effective structured pruning approach that jointly prunes filters and other structures in an end-to-end manner. To accomplish this, we first introduce a soft mask that scales the output of these structures, defining a new objective function with sparsity regularization to align the outputs of the baseline network and the masked network. We then solve the optimization problem by generative adversarial learning (GAL), which learns the sparse soft mask in a label-free, end-to-end manner. By forcing more scale factors in the soft mask to zero, the fast iterative shrinkage-thresholding algorithm (FISTA) can be leveraged to quickly and reliably remove the corresponding structures. Extensive experiments demonstrate the effectiveness of GAL on different datasets, including MNIST, CIFAR-10, and ImageNet ILSVRC 2012. For example, on ImageNet ILSVRC 2012, the pruned ResNet-50 achieves 10.88% Top-5 error with a 3.7x speedup, significantly outperforming state-of-the-art methods.

447 citations
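The FISTA step described in the abstract above zeroes out soft-mask entries via L1 soft-thresholding (the proximal operator of the sparsity penalty). A minimal illustrative sketch, not the authors' code, with invented mask values:

```python
def soft_threshold(mask, lam):
    """Proximal operator of the L1 penalty: shrink each mask entry toward zero.

    Entries whose magnitude falls below lam are set exactly to zero, marking
    the corresponding filter/structure for removal.
    """
    return [max(abs(v) - lam, 0.0) * (1.0 if v >= 0 else -1.0) for v in mask]

# Hypothetical soft mask over 8 filters (values invented for illustration).
mask = [0.9, 0.05, 0.6, -0.02, 0.4, 0.01, 0.8, 0.03]
shrunk = soft_threshold(mask, lam=0.1)
keep = [v != 0.0 for v in shrunk]  # filters whose scale factor survived
```

With threshold 0.1, only the four large-magnitude entries survive; the rest are pruned, which is exactly how a sparser mask translates into removed structures.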

Journal ArticleDOI
TL;DR: A high average ZT is obtained through synergistically optimized electrical- and thermal-transport properties via carrier-concentration tuning, band-structure engineering, and hierarchical architecturing, highlighting a realistic prospect of wide thermoelectric applications.
Abstract: Obtaining highly efficient thermoelectric materials relies on a high ZT, and on this value being consistently high over a wide temperature range. Here, the authors demonstrate a phase-separated PbTe-based material that exhibits a ZT of >2 from 673 to 923 K, and a resultantly high average ZT of 1.56 between 300 and 900 K.

447 citations
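For context, the dimensionless figure of merit ZT quoted above has the standard textbook definition (not specific to this paper):

```latex
ZT = \frac{S^{2}\sigma}{\kappa}\,T, \qquad
ZT_{\mathrm{avg}} = \frac{1}{T_{h}-T_{c}}\int_{T_{c}}^{T_{h}} ZT(T)\,\mathrm{d}T
```

where S is the Seebeck coefficient, σ the electrical conductivity, κ the total thermal conductivity, and T the absolute temperature; the reported average ZT of 1.56 corresponds to T_c = 300 K and T_h = 900 K.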

Proceedings ArticleDOI
15 Jun 2019
TL;DR: This paper investigates knowledge distillation for training compact semantic segmentation networks with the help of cumbersome networks, and proposes distilling structured knowledge from the cumbersome networks into the compact ones.
Abstract: In this paper, we investigate the issue of knowledge distillation for training compact semantic segmentation networks by making use of cumbersome networks. We start from the straightforward scheme, pixel-wise distillation, which applies the distillation scheme originally introduced for image classification and performs knowledge distillation for each pixel separately. We further propose to distill the structured knowledge from cumbersome networks into compact networks, which is motivated by the fact that semantic segmentation is a structured prediction problem. We study two such structured distillation schemes: (i) pair-wise distillation that distills the pairwise similarities, and (ii) holistic distillation that uses adversarial training to distill holistic knowledge. The effectiveness of our knowledge distillation approaches is demonstrated by extensive experiments on three scene parsing datasets: Cityscapes, Camvid and ADE20K.

446 citations
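The pixel-wise distillation scheme described above applies classification-style distillation independently at every pixel, i.e. it matches the student's per-pixel class distribution to the teacher's. A minimal sketch of such a loss (pure Python, invented logits; not the authors' implementation):

```python
import math

def softmax(logits):
    """Numerically stable softmax over one pixel's class logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def pixelwise_distill_loss(student_logits, teacher_logits):
    """Average per-pixel KL(teacher || student).

    Each element of the input lists is one pixel's class-logit vector,
    mirroring 'distillation for each pixel separately'.
    """
    total = 0.0
    for s_l, t_l in zip(student_logits, teacher_logits):
        p_t = softmax(t_l)
        p_s = softmax(s_l)
        total += sum(pt * (math.log(pt) - math.log(ps))
                     for pt, ps in zip(p_t, p_s))
    return total / len(student_logits)

# Two hypothetical pixels, three classes each (values invented).
teacher = [[2.0, 0.5, -1.0], [0.1, 0.1, 3.0]]
loss_zero = pixelwise_distill_loss(teacher, teacher)  # identical outputs
student_uniform = [[0.0, 0.0, 0.0], [0.0, 0.0, 0.0]]
loss_pos = pixelwise_distill_loss(student_uniform, teacher)
```

The pair-wise and holistic schemes in the paper extend this by matching pairwise similarities between spatial locations and by adversarial training, respectively.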

Journal ArticleDOI
TL;DR: Experimental results indicate that the effective integration of high catalytic reactivity, high structural stability, and high electronic conductivity into a single material system gives Ni-Fe-OH@Ni3S2/NF remarkable catalytic ability for the OER at large current densities.
Abstract: Developing nonprecious oxygen-evolution electrocatalysts that work well at large current densities is of primary importance for a viable water-splitting technology. Herein, a facile, ultrafast (5 s) synthetic approach is reported that produces a novel, efficient, non-noble-metal oxygen-evolution nano-electrocatalyst composed of amorphous Ni-Fe bimetallic hydroxide film-coated, nickel foam (NF)-supported Ni3S2 nanosheet arrays. The composite nanomaterial (denoted Ni-Fe-OH@Ni3S2/NF) shows highly efficient electrocatalytic activity toward the oxygen evolution reaction (OER) at large current densities, even on the order of 1000 mA cm-2. Ni-Fe-OH@Ni3S2/NF also exhibits excellent catalytic stability toward the OER both in 1 M KOH solution and in 30 wt% KOH solution. Further experimental results indicate that the effective integration of high catalytic reactivity, high structural stability, and high electronic conductivity into a single material system gives Ni-Fe-OH@Ni3S2/NF remarkable catalytic ability for the OER at large current densities.

443 citations


Authors

Showing all 67500 results

Name | H-index | Papers | Citations
Yi Chen | 217 | 4342 | 293080
H. S. Chen | 179 | 2401 | 178529
Alan J. Heeger | 171 | 913 | 147492
Lei Jiang | 170 | 2244 | 135205
Wei Li | 158 | 1855 | 124748
Shu-Hong Yu | 144 | 799 | 70853
Jian Zhou | 128 | 3007 | 91402
Chao Zhang | 127 | 3119 | 84711
Igor Katkov | 125 | 972 | 71845
Tao Zhang | 123 | 2772 | 83866
Nicholas A. Kotov | 123 | 574 | 55210
Shi Xue Dou | 122 | 2028 | 74031
Li Yuan | 121 | 948 | 67074
Robert O. Ritchie | 120 | 659 | 54692
Haiyan Wang | 119 | 1674 | 86091
Network Information
Related Institutions (5)
Harbin Institute of Technology
109.2K papers, 1.6M citations

96% related

Tsinghua University
200.5K papers, 4.5M citations

92% related

University of Science and Technology of China
101K papers, 2.4M citations

92% related

Nanyang Technological University
112.8K papers, 3.2M citations

92% related

City University of Hong Kong
60.1K papers, 1.7M citations

91% related

Performance
Metrics
No. of papers from the Institution in previous years
Year | Papers
2024 | 1
2023 | 205
2022 | 1,178
2021 | 6,767
2020 | 6,916
2019 | 7,080