Institution
Beihang University
Education • Beijing, China
About: Beihang University is an education organization based in Beijing, China. It is known for its research contributions in the topics of Control theory and Microstructure. The organization has 67,002 authors who have published 73,507 publications receiving 975,691 citations. The organization is also known as: Beijing University of Aeronautics and Astronautics.
Topics: Control theory, Microstructure, Nonlinear system, Artificial neural network, Feature extraction
Papers published on a yearly basis
Papers
TL;DR: An efficient transformer-based model for LSTF, named Informer, with three distinctive characteristics, including a ProbSparse self-attention mechanism, which achieves $O(L \log L)$ in time complexity and memory usage and has comparable performance on sequences' dependency alignment.
Abstract: Many real-world applications require the prediction of long sequence time-series, such as electricity consumption planning. Long sequence time-series forecasting (LSTF) demands a high prediction capacity of the model, which is the ability to capture precise long-range dependency coupling between output and input efficiently. Recent studies have shown the potential of Transformer to increase the prediction capacity. However, there are several severe issues with Transformer that prevent it from being directly applicable to LSTF, including quadratic time complexity, high memory usage, and inherent limitation of the encoder-decoder architecture. To address these issues, we design an efficient transformer-based model for LSTF, named Informer, with three distinctive characteristics: (i) a $ProbSparse$ self-attention mechanism, which achieves $O(L \log L)$ in time complexity and memory usage, and has comparable performance on sequences' dependency alignment. (ii) the self-attention distilling highlights dominating attention by halving cascading layer input, and efficiently handles extreme long input sequences. (iii) the generative style decoder, while conceptually simple, predicts the long time-series sequences at one forward operation rather than a step-by-step way, which drastically improves the inference speed of long-sequence predictions. Extensive experiments on four large-scale datasets demonstrate that Informer significantly outperforms existing methods and provides a new solution to the LSTF problem.
832 citations
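The abstract above names three mechanisms; the ProbSparse self-attention idea in particular — scoring each query by how far its attention distribution departs from uniform, running full attention only for the top-u "dominant" queries, and giving the rest a lazy default — can be sketched briefly. The code below is a minimal, dense-for-clarity illustration of that selection step, not the authors' implementation (which additionally samples keys so that computing the measure itself stays in $O(L \log L)$):

```python
import numpy as np

def probsparse_attention(Q, K, V, c=5):
    """Sketch of ProbSparse self-attention: score all queries by a
    sparsity measure, run softmax attention only for the top-u queries,
    and fill the remaining outputs with the mean of V."""
    L, d = Q.shape
    u = min(L, max(1, int(c * np.log(L))))        # active queries: O(log L) of them
    scores = Q @ K.T / np.sqrt(d)                 # dense (L, L) here for clarity
    M = scores.max(axis=1) - scores.mean(axis=1)  # max-mean sparsity measure per query
    top = np.argsort(M)[-u:]                      # indices of the u dominant queries
    out = np.repeat(V.mean(axis=0, keepdims=True), L, axis=0)  # lazy default rows
    w = np.exp(scores[top] - scores[top].max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)             # row-wise softmax over keys
    out[top] = w @ V                              # full attention for dominant queries
    return out
```

Setting u proportional to ln L is what makes the active-query attention cost $O(L \log L)$ rather than the quadratic cost the abstract identifies as the bottleneck.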
13 Dec 2010
TL;DR: A detailed study of 11 widely used internal clustering validation measures for crisp clustering, showing that S_Dbw is the only internal validation measure that performs well in all five aspects, while other measures have certain limitations in different application scenarios.
Abstract: Clustering validation has long been recognized as one of the vital issues essential to the success of clustering applications. In general, clustering validation can be categorized into two classes, external clustering validation and internal clustering validation. In this paper, we focus on internal clustering validation and present a detailed study of 11 widely used internal clustering validation measures for crisp clustering. From five conventional aspects of clustering, we investigate their validation properties. Experiment results show that S_Dbw is the only internal validation measure which performs well in all five aspects, while other measures have certain limitations in different application scenarios.
830 citations
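Internal validation indices of the kind surveyed above all trade off intra-cluster compactness against inter-cluster separation, with no reference to external labels (S_Dbw specifically combines average cluster scattering with inter-cluster density). As an illustration of that trade-off — a simplified stand-in, not an implementation of S_Dbw — a minimal compactness-to-separation ratio can be written as:

```python
import numpy as np

def compactness_separation(X, labels):
    """Minimal internal validation sketch: ratio of mean intra-cluster
    spread to the minimum inter-centroid distance. Lower is better.
    Uses only the data and the candidate partition, as internal
    validation requires."""
    ks = np.unique(labels)
    centroids = np.array([X[labels == k].mean(axis=0) for k in ks])
    intra = np.mean([np.linalg.norm(X[labels == k] - centroids[i], axis=1).mean()
                     for i, k in enumerate(ks)])          # average cluster spread
    inter = min(np.linalg.norm(a - b) for i, a in enumerate(centroids)
                for b in centroids[i + 1:])               # closest centroid pair
    return intra / inter
```

Because the score needs no ground-truth labels, the same function can compare partitions with different numbers of clusters — the core use case of internal validation that the paper studies.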
01 Jun 2014
TL;DR: AdaRNN adaptively propagates the sentiments of words to the target depending on the context and syntactic relationships between them, and is shown to improve on the baseline methods.
Abstract: We propose the Adaptive Recursive Neural Network (AdaRNN) for target-dependent Twitter sentiment classification. AdaRNN adaptively propagates the sentiments of words to the target depending on the context and syntactic relationships between them. It consists of multiple composition functions, and we model the adaptive sentiment propagations as distributions over these composition functions. The experimental studies illustrate that AdaRNN improves on the baseline methods. Furthermore, we introduce a manually annotated dataset for target-dependent Twitter sentiment analysis.
809 citations
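The core AdaRNN idea — keeping several candidate composition functions and combining them with a softmax distribution chosen at each merge of two child vectors — can be sketched as follows. The selector matrix `S` and the tanh compositions here are illustrative assumptions; the paper conditions the distribution on context and syntactic relations rather than on the child vectors alone:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def adaptive_compose(left, right, Ws, S):
    """Sketch of adaptive composition: mix several candidate composition
    matrices Ws with a softmax distribution predicted from the two child
    vectors, instead of using one fixed composition function.

    left, right: child vectors of dimension d
    Ws: list of (d, 2d) candidate composition matrices
    S:  hypothetical (len(Ws), 2d) selector matrix (an assumption here)
    """
    x = np.concatenate([left, right])              # stacked children, shape (2d,)
    probs = softmax(S @ x)                         # distribution over compositions
    h = sum(p * np.tanh(W @ x) for p, W in zip(probs, Ws))  # weighted mixture
    return h, probs
```

Applying this merge bottom-up over a dependency tree toward the target word is what lets sentiment propagate differently depending on syntax, which is the adaptivity the abstract describes.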
01 Mar 2019
TL;DR: Shui and co-workers report a class of concave Fe–N–C single-atom catalysts possessing an enhanced external surface area and mesoporosity that meets the 2018 PGM-free catalyst activity target.
Abstract: To achieve the US Department of Energy 2018 target set for platinum-group metal-free catalysts (PGM-free catalysts) in proton exchange membrane fuel cells, the low density of active sites must be overcome. Here, we report a class of concave Fe–N–C single-atom catalysts possessing an enhanced external surface area and mesoporosity that meets the 2018 PGM-free catalyst activity target, and a current density of 0.047 A cm−2 at 0.88 V (iR-free) under 1.0 bar H2–O2. This performance stems from the high density of active sites, which is realized through exposing inaccessible Fe–N4 moieties (that is, increasing their utilization) and enhancing the mass transport of the catalyst layer. Further, we establish structure–property correlations that provide a route for designing highly efficient PGM-free catalysts for practical application, achieving a power density of 1.18 W cm−2 under 2.5 bar H2–O2, and an activity of 129 mA cm−2 at 0.8 V (iR-free) under 1.0 bar H2–air. Iron single-atom catalysts are among the most promising fuel cell cathode materials in acid electrolyte solution. Now, Shui, Xu and co-workers report concave-shaped Fe–N–C nanoparticles with increased availability of active sites and improved mass transport, meeting the US Department of Energy 2018 target for platinum-group metal-free fuel cell catalysts.
803 citations
TL;DR: This paper extensively reviews 400+ papers of object detection in light of its technical evolution, spanning over a quarter-century (from the 1990s to 2019), and makes an in-depth analysis of their challenges as well as technical improvements in recent years.
Abstract: Object detection, as one of the most fundamental and challenging problems in computer vision, has received great attention in recent years. Its development in the past two decades can be regarded as an epitome of computer vision history. If we think of today's object detection as a technical aesthetics under the power of deep learning, then turning back the clock 20 years we would witness the wisdom of the cold weapon era. This paper extensively reviews 400+ papers of object detection in light of its technical evolution, spanning over a quarter-century (from the 1990s to 2019). A number of topics have been covered in this paper, including the milestone detectors in history, detection datasets, metrics, fundamental building blocks of the detection system, speed-up techniques, and the recent state-of-the-art detection methods. This paper also reviews some important detection applications, such as pedestrian detection, face detection, text detection, etc., and makes an in-depth analysis of their challenges as well as technical improvements in recent years.
802 citations
Authors
Showing all 67500 results
Name | H-index | Papers | Citations |
---|---|---|---|
Yi Chen | 217 | 4342 | 293080 |
H. S. Chen | 179 | 2401 | 178529 |
Alan J. Heeger | 171 | 913 | 147492 |
Lei Jiang | 170 | 2244 | 135205 |
Wei Li | 158 | 1855 | 124748 |
Shu-Hong Yu | 144 | 799 | 70853 |
Jian Zhou | 128 | 3007 | 91402 |
Chao Zhang | 127 | 3119 | 84711 |
Igor Katkov | 125 | 972 | 71845 |
Tao Zhang | 123 | 2772 | 83866 |
Nicholas A. Kotov | 123 | 574 | 55210 |
Shi Xue Dou | 122 | 2028 | 74031 |
Li Yuan | 121 | 948 | 67074 |
Robert O. Ritchie | 120 | 659 | 54692 |
Haiyan Wang | 119 | 1674 | 86091 |