Institution
Xi'an Jiaotong University
Education • Xi'an, China
About: Xi'an Jiaotong University is an education organization based in Xi'an, China. It is known for its research contributions in the topics of heat transfer and dielectrics. The organization has 85440 authors who have published 99682 publications receiving 1579683 citations. The organization is also known as Xi'an Jiao Tong University.
Papers published on a yearly basis
Papers
TL;DR: By analyzing the kernels of the convolutional layers of DNCNN via the NAM algorithm, it is found that these kernels act as filters and become more complex in deeper layers, which may help explain what DNCNN has learned in intelligent fault diagnosis of machinery.
405 citations
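The claim above, that learned convolution kernels act as filters, can be illustrated with a minimal sketch: inspecting the magnitude of a kernel's discrete Fourier transform shows which frequencies it passes or suppresses. The kernel values below are made up for illustration and are not taken from DNCNN.

```python
import numpy as np

# A second-difference kernel behaves like a high-pass filter; its DFT
# magnitude reveals this directly. Kernel values are illustrative only.
kernel = np.array([1.0, -2.0, 1.0])
response = np.abs(np.fft.fft(kernel, n=64))  # magnitude response on 64 bins

# The DC (zero-frequency) bin is suppressed relative to the highest
# frequency bin, confirming the high-pass character of this kernel.
dc_suppressed = response[0] < response[32]
```

The same inspection applied layer by layer to a trained network is one way the growing complexity of deeper kernels could be observed.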
15 May 2009
TL;DR: A novel algorithm called Regularized Extreme Learning Machine is proposed, based on the structural risk minimization principle and weighted least squares; it improves generalization performance significantly in most cases without increasing training time.
Abstract: The Extreme Learning Machine (ELM) proposed by Huang G-B has attracted much attention for its extremely fast training speed and good generalization performance. However, it can still be considered an empirical risk minimization scheme and tends to generate over-fitted models. Additionally, since ELM does not consider heteroskedasticity in real applications, its performance is seriously affected when outliers exist in the dataset. To address these drawbacks, we propose a novel algorithm called Regularized Extreme Learning Machine, based on the structural risk minimization principle and weighted least squares. The generalization performance of the proposed algorithm is improved significantly in most cases without increasing training time.
404 citations
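The regularized ELM described above can be sketched in a few lines: input weights and biases are random and fixed, and only the output weights are solved for, with a ridge term supplying the structural risk minimization. This is a toy sketch, not the paper's implementation; the regularization constant `C`, the hidden size, and the data are illustrative choices, and the weighted-least-squares step for outliers is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def relm_fit(X, y, n_hidden=50, C=1.0):
    """Fit a regularized ELM: random fixed hidden layer, ridge solve."""
    W = rng.normal(size=(X.shape[1], n_hidden))  # random input weights (fixed)
    b = rng.normal(size=n_hidden)                # random biases (fixed)
    H = np.tanh(X @ W + b)                       # hidden-layer activations
    # Ridge term I/C shrinks the output weights (structural risk minimization).
    beta = np.linalg.solve(H.T @ H + np.eye(n_hidden) / C, H.T @ y)
    return W, b, beta

def relm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy regression: noisy sine wave.
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0]) + 0.05 * rng.normal(size=200)
W, b, beta = relm_fit(X, y)
mse = np.mean((relm_predict(X, W, b, beta) - y) ** 2)
```

Because only a linear system is solved, training cost is essentially unchanged relative to plain ELM, which matches the abstract's claim that generalization improves without increasing training time.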
01 Oct 2017
TL;DR: A view adaptive recurrent neural network (RNN) with LSTM architecture is proposed, which enables the network itself to adapt to the most suitable observation viewpoints from end to end.
Abstract: Skeleton-based human action recognition has recently attracted increasing attention due to the popularity of 3D skeleton data. One main challenge lies in the large view variations in captured human actions. We propose a novel view adaptation scheme to automatically regulate observation viewpoints during the occurrence of an action. Rather than re-positioning the skeletons based on a human defined prior criterion, we design a view adaptive recurrent neural network (RNN) with LSTM architecture, which enables the network itself to adapt to the most suitable observation viewpoints from end to end. Extensive experiment analyses show that the proposed view adaptive RNN model strives to (1) transform the skeletons of various views to much more consistent viewpoints and (2) maintain the continuity of the action rather than transforming every frame to the same position with the same body orientation. Our model achieves significant improvement over the state-of-the-art approaches on three benchmark datasets.
402 citations
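The core geometric operation behind the view adaptation scheme can be sketched as follows, assuming 2-D joints for brevity (the paper works with 3-D skeletons). In the actual model a learned subnetwork regresses the viewpoint parameters per frame; here the function name and the fixed angles are our own illustrative stand-ins.

```python
import numpy as np

def adapt_view(joints, angle, translation):
    """Re-observe an (N, 2) array of joint coordinates under a new
    viewpoint given by a rotation angle and a translation."""
    c, s = np.cos(angle), np.sin(angle)
    R = np.array([[c, -s], [s, c]])
    return (joints - translation) @ R.T

# Two captures of the same pose from different viewpoints become
# consistent once the adaptation undoes the viewpoint change.
pose = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0]])
capture = adapt_view(pose, angle=-np.pi / 2, translation=np.zeros(2))
recovered = adapt_view(capture, angle=np.pi / 2, translation=np.zeros(2))
```

In the end-to-end model the angle and translation are outputs of LSTM layers rather than fixed constants, so the network learns per-frame viewpoints that make skeletons from different views consistent before recognition.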
15 Jun 2019
TL;DR: A simple baseline deraining network is proposed by considering network architecture, input and output, and loss functions, which can serve as a suitable baseline in future deraining research.
Abstract: Along with the deraining performance improvement of deep networks, their structures and learning become more and more complicated and diverse, making it difficult to analyze the contribution of various network modules when developing new deraining networks. To handle this issue, this paper provides a better and simpler baseline deraining network by considering network architecture, input and output, and loss functions. Specifically, by repeatedly unfolding a shallow ResNet, progressive ResNet (PRN) is proposed to take advantage of recursive computation. A recurrent layer is further introduced to exploit the dependencies of deep features across stages, forming our progressive recurrent network (PReNet). Furthermore, intra-stage recursive computation of ResNet can be adopted in PRN and PReNet to notably reduce network parameters with unsubstantial degradation in deraining performance. For network input and output, we take both stage-wise result and original rainy image as input to each ResNet and finally output the prediction of residual image. As for loss functions, single MSE or negative SSIM losses are sufficient to train PRN and PReNet. Experiments show that PRN and PReNet perform favorably on both synthetic and real rainy images. Considering its simplicity, efficiency and effectiveness, our models are expected to serve as a suitable baseline in future deraining research. The source codes are available at https://github.com/csdwren/PReNet.
402 citations
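The stage-wise recursion described in the abstract, where each stage receives both the previous estimate and the original rainy input and shares its weights across stages, can be sketched on a toy 1-D signal. The smoothing filter below is a made-up stand-in for the learned ResNet stage; none of this is the paper's implementation.

```python
import numpy as np

def stage(estimate, rainy):
    """Stand-in for one shared ResNet stage: blend the current estimate
    with the original input and smooth. Real stages are learned convs."""
    mixed = 0.5 * (estimate + rainy)
    return np.convolve(mixed, np.ones(3) / 3, mode="same")

rainy = np.array([1.0, 1.0, 5.0, 1.0, 1.0])  # toy signal with a rain "spike"
x = rainy.copy()
for _ in range(4):          # T recursive stages with shared weights
    x = stage(x, rainy)     # each stage sees both x_{t-1} and the input y
residual = rainy - x        # predicted rain layer (residual output)
```

Sharing one stage across the recursion is what keeps the parameter count small, which mirrors the abstract's point that intra-stage recursive computation reduces parameters with little loss in deraining performance.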
TL;DR: It is demonstrated that the fine-modification of the flexible side chains of NFAs can yield 17% PCE for OPV cells, suggesting that optimization of the chemical structures of the OPV materials can improve device performance.
Abstract: The development of organic photoactive materials, especially the newly emerging non-fullerene electron acceptors (NFAs), has enabled rapid progress in organic photovoltaic (OPV) cells in recent years. Although the power conversion efficiencies (PCEs) of the top-performance OPV cells have surpassed 16%, the devices are usually fabricated via a spin-coating method and are not suitable for large-area production. Here, we demonstrate that the fine-modification of the flexible side chains of NFAs can yield 17% PCE for OPV cells. More crucially, as the optimal NFA has a suitable solubility and thus a desirable morphology, the high efficiencies of spin-coated devices can be maintained when using scalable blade-coating processing technology. Our results suggest that optimization of the chemical structures of the OPV materials can improve device performance. This has great significance in larger-area production technologies that provide important scientific insights for the commercialization of OPV cells.
402 citations
Authors
Showing all 86109 results
Name | H-index | Papers | Citations |
---|---|---|---|
Feng Zhang | 172 | 1278 | 181865 |
Yang Yang | 164 | 2704 | 144071 |
Jian Yang | 142 | 1818 | 111166 |
Lei Zhang | 130 | 2312 | 86950 |
Yang Liu | 129 | 2506 | 122380 |
Jian Zhou | 128 | 3007 | 91402 |
Chao Zhang | 127 | 3119 | 84711 |
Bin Wang | 126 | 2226 | 74364 |
Xin Wang | 121 | 1503 | 64930 |
Bo Wang | 119 | 2905 | 84863 |
Xuan Zhang | 119 | 1530 | 65398 |
Jian Liu | 117 | 2090 | 73156 |
Andrey L. Rogach | 117 | 576 | 46820 |
Yadong Yin | 115 | 431 | 64401 |
Xin Li | 114 | 2778 | 71389 |