Institution
Beihang University
Education · Beijing, China
About: Beihang University is an education organization based in Beijing, China. It is known for its research contributions in the topics of Control theory & Microstructure. The organization has 67002 authors who have published 73507 publications receiving 975691 citations. The organization is also known as: Beijing University of Aeronautics and Astronautics.
Topics: Control theory, Microstructure, Nonlinear system, Artificial neural network, Feature extraction
Papers published on a yearly basis
Papers
TL;DR: This paper provides a practical means to evaluate the ACC systems applying the sliding-mode controller and provides a reasonable proposal to design the ACC controller from the perspective of the practical string stability.
Abstract: In this paper, the practical string stability of both homogeneous and heterogeneous platoons of adaptive cruise control (ACC) vehicles, which apply the constant time headway spacing policy, is investigated by considering the parasitic time delays and lags of the actuators and sensors when building the vehicle longitudinal dynamics model. The proposed control law based on the sliding-mode controller can guarantee both homogeneous and heterogeneous string stability, if the control parameters and system parameters meet certain requirements. The analysis of the negative effect of the parasitic time delays and lags on the string stability indicates that the negative effect of the time delays is larger than that of the time lags. This paper provides a practical means to evaluate the ACC systems applying the sliding-mode controller and provides a reasonable proposal to design the ACC controller from the perspective of the practical string stability.
403 citations
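The constant time headway spacing policy and actuator lag described in the abstract can be illustrated with a toy platoon simulation. This is a minimal sketch with hypothetical gains and lag values, using a simple proportional spacing controller rather than the paper's actual sliding-mode law; string stability shows up as spacing errors that stay bounded (and ideally shrink) down the platoon.

```python
# Toy ACC platoon under a constant time headway policy (hypothetical
# parameters; NOT the paper's sliding-mode controller). A first-order
# lag with time constant tau stands in for actuator/sensor dynamics.
import numpy as np

def simulate_platoon(n=5, h=1.5, k=0.8, tau=0.3, dt=0.01, t_end=30.0):
    steps = int(t_end / dt)
    # Start at 20 m/s with zero spacing error: gap = h*v + 5 m standstill.
    pos = np.array([-(h * 20.0 + 5.0) * i for i in range(n)], dtype=float)
    vel = np.full(n, 20.0)
    acc = np.zeros(n)
    max_err = np.zeros(n - 1)  # worst spacing error per follower
    for s in range(steps):
        t = s * dt
        u = np.zeros(n)
        u[0] = -2.0 if 5.0 <= t < 8.0 else 0.0  # leader brakes briefly
        for i in range(1, n):
            gap = pos[i - 1] - pos[i] - 5.0      # net gap to predecessor
            err = gap - h * vel[i]               # spacing error (CTH policy)
            u[i] = k * err + 0.5 * (vel[i - 1] - vel[i])
            max_err[i - 1] = max(max_err[i - 1], abs(err))
        acc += dt / tau * (u - acc)              # first-order actuator lag
        vel += dt * acc
        pos += dt * vel
    return max_err

errs = simulate_platoon()
print(errs)
```

With the time lag folded into the dynamics, raising `tau` (or replacing the lag with a pure delay) degrades the error propagation, which is the qualitative effect the paper analyzes.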
TL;DR: The excellent absorbing performance together with lightweight and ultrathin thickness endows the CNTs/Co composite with the potential for application in the electromagnetic wave absorbing field.
Abstract: A porous carbon nanotubes/cobalt nanoparticles (CNTs/Co) composite with dodecahedron morphology was synthesized by in situ pyrolysis of a Co-based zeolitic imidazolate framework in a reducing atmosphere. The morphology and microstructure of the composite can be well tuned by controlling the pyrolysis conditions. At lower pyrolysis temperature, the CNTs/Co composite is composed of well-dispersed Co nanoparticles and short CNT clusters with a low graphitic degree. Increasing the pyrolysis temperature/time promotes the growth and graphitization of CNTs and leads to the aggregation of Co nanoparticles. The optimized CNTs/Co composite exhibits strong dielectric and magnetic losses as well as a good impedance matching property. Interestingly, the CNTs/Co composite displays extremely strong electromagnetic wave absorption with a maximum reflection loss of −60.4 dB. More importantly, the matching thickness of the absorber is as thin as 1.81 mm, and the filler loading of the composite in the matrix is only 20 wt %. The...
402 citations
TL;DR: DS-SLAM, as discussed by the authors, combines a semantic segmentation network with a moving consistency check method to reduce the impact of dynamic objects, so that localization accuracy is greatly improved in dynamic environments.
Abstract: Simultaneous Localization and Mapping (SLAM) is considered a fundamental capability for intelligent mobile robots. Over the past decades, many impressive SLAM systems have been developed and have achieved good performance under certain circumstances. However, some problems are still not well solved, for example, how to tackle moving objects in dynamic environments and how to make robots truly understand their surroundings and accomplish advanced tasks. In this paper, a robust semantic visual SLAM system for dynamic environments, named DS-SLAM, is proposed. Five threads run in parallel in DS-SLAM: tracking, semantic segmentation, local mapping, loop closing, and dense semantic map creation. DS-SLAM combines a semantic segmentation network with a moving consistency check method to reduce the impact of dynamic objects, so that localization accuracy is greatly improved in dynamic environments. Meanwhile, a dense semantic octo-tree map is produced, which can be employed for high-level tasks. We conduct experiments both on the TUM RGB-D dataset and in real-world environments. The results demonstrate that the absolute trajectory accuracy of DS-SLAM can be improved by one order of magnitude compared with ORB-SLAM2, making it one of the state-of-the-art SLAM systems in high-dynamic environments. The code is available at our GitHub: this https URL
401 citations
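The moving consistency check mentioned above is, at its core, an epipolar-geometry test. The following is a simplified sketch of that idea (DS-SLAM's actual pipeline also fuses semantic masks from a segmentation network, which is not shown): a matched keypoint whose distance to its epipolar line exceeds a threshold is flagged as belonging to a moving object.

```python
# Simplified epipolar moving-consistency check (illustrative only).
# F is the fundamental matrix between two frames; pts1/pts2 are Nx2
# arrays of matched pixel coordinates.
import numpy as np

def epipolar_distances(F, pts1, pts2):
    """Distance of each point in pts2 to the epipolar line F @ p1."""
    ones = np.ones((len(pts1), 1))
    p1 = np.hstack([pts1, ones])            # homogeneous coordinates
    p2 = np.hstack([pts2, ones])
    lines = p1 @ F.T                        # epipolar lines in image 2
    num = np.abs(np.sum(lines * p2, axis=1))
    den = np.hypot(lines[:, 0], lines[:, 1])
    return num / den

def flag_dynamic(F, pts1, pts2, thresh=1.0):
    """True where a match violates the epipolar constraint."""
    return epipolar_distances(F, pts1, pts2) > thresh
```

In a full system, `F` would be estimated robustly (e.g. RANSAC over the matches) so that the static background dominates, and the flagged points would be excluded from tracking.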
25 May 2019
TL;DR: This paper proposes a novel AST-based Neural Network (ASTNN) for source code representation that splits each large AST into a sequence of small statement trees, and encodes the statement trees to vectors by capturing the lexical and syntactical knowledge of statements.
Abstract: Exploiting machine learning techniques for analyzing programs has attracted much attention. One key problem is how to represent code fragments well for follow-up analysis. Traditional information retrieval based methods often treat programs as natural language texts, which could miss important semantic information of source code. Recently, state-of-the-art studies demonstrate that abstract syntax tree (AST) based neural models can better represent source code. However, the sizes of ASTs are usually large and the existing models are prone to the long-term dependency problem. In this paper, we propose a novel AST-based Neural Network (ASTNN) for source code representation. Unlike existing models that work on entire ASTs, ASTNN splits each large AST into a sequence of small statement trees, and encodes the statement trees to vectors by capturing the lexical and syntactical knowledge of statements. Based on the sequence of statement vectors, a bidirectional RNN model is used to leverage the naturalness of statements and finally produce the vector representation of a code fragment. We have applied our neural network based source code representation method to two common program comprehension tasks: source code classification and code clone detection. Experimental results on the two tasks indicate that our model is superior to state-of-the-art approaches.
400 citations
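ASTNN's first step, splitting a large AST into a sequence of statement-level subtrees, can be sketched with Python's own `ast` module. Note this is only an illustration of the splitting idea: the paper works on C and Java ASTs (via pycparser/javalang) and then encodes each statement tree with a recursive encoder before the bidirectional RNN.

```python
# Sketch of ASTNN's statement-tree splitting, using Python's ast
# module as a stand-in parser (the paper itself targets C/Java).
import ast

def statement_trees(source):
    """Split a code fragment into its sequence of statement subtrees,
    returned here as node-type names for brevity."""
    tree = ast.parse(source)
    return [type(node).__name__
            for node in ast.walk(tree)
            if isinstance(node, ast.stmt)]

code = """
def add(a, b):
    total = a + b
    return total
"""
print(statement_trees(code))  # → ['FunctionDef', 'Assign', 'Return']
```

Each of these subtrees is small, which is exactly what sidesteps the long-term dependency problem the abstract describes: the RNN only has to model the statement *sequence*, not one huge tree.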
14 Jun 2020
TL;DR: In this paper, the authors propose a pair similarity optimization viewpoint on deep feature learning, aiming to maximize the within-class similarity $s_p$ and minimize the between-class similarity $s_n$.
Abstract: This paper provides a pair similarity optimization viewpoint on deep feature learning, aiming to maximize the within-class similarity $s_p$ and minimize the between-class similarity $s_n$. We find a majority of loss functions, including the triplet loss and the softmax cross-entropy loss, embed $s_n$ and $s_p$ into similarity pairs and seek to reduce $(s_n-s_p)$. Such an optimization manner is inflexible, because the penalty strength on every single similarity score is restricted to be equal. Our intuition is that if a similarity score deviates far from the optimum, it should be emphasized. To this end, we simply re-weight each similarity to highlight the less-optimized similarity scores. It results in a Circle loss, which is named due to its circular decision boundary. The Circle loss has a unified formula for two elemental deep feature learning paradigms, \emph {i.e.}, learning with class-level labels and pair-wise labels. Analytically, we show that the Circle loss offers a more flexible optimization approach towards a more definite convergence target, compared with the loss functions optimizing $(s_n-s_p)$. Experimentally, we demonstrate the superiority of the Circle loss on a variety of deep feature learning tasks. On face recognition, person re-identification, as well as several fine-grained image retrieval datasets, the achieved performance is on par with the state of the art.
400 citations
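The re-weighting idea in the abstract can be written down compactly. Below is a minimal NumPy sketch of the Circle loss in its unified pair-similarity form, following the paper's notation ($O_p = 1+m$, $O_n = -m$, $\Delta_p = 1-m$, $\Delta_n = m$); the `gamma` and `m` values are the illustrative defaults, and the per-similarity weights $\alpha_p, \alpha_n$ are what make the penalty strength unequal across scores.

```python
# Minimal NumPy sketch of the Circle loss (unified formula).
import numpy as np

def logsumexp(x):
    """Numerically stable log(sum(exp(x)))."""
    c = np.max(x)
    return c + np.log(np.sum(np.exp(x - c)))

def circle_loss(sp, sn, gamma=64.0, m=0.25):
    sp, sn = np.asarray(sp, float), np.asarray(sn, float)
    op, on = 1.0 + m, -m                 # optima for s_p and s_n
    dp, dn = 1.0 - m, m                  # decision margins
    ap = np.clip(op - sp, 0.0, None)     # self-paced weight: far-from-
    an = np.clip(sn - on, 0.0, None)     # optimum scores get emphasized
    logit_p = -gamma * ap * (sp - dp)
    logit_n = gamma * an * (sn - dn)
    # L = log(1 + sum_j e^{logit_n} * sum_i e^{logit_p})
    return np.log1p(np.exp(logsumexp(logit_n) + logsumexp(logit_p)))
```

A well-separated pair set (high $s_p$, low $s_n$) yields a near-zero loss, while a poorly separated one is penalized heavily, and unlike a plain $(s_n - s_p)$ margin, the gradient magnitude on each score depends on how far it is from its own optimum.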
Authors
Showing all 67500 results
Name | H-index | Papers | Citations |
---|---|---|---|
Yi Chen | 217 | 4342 | 293080 |
H. S. Chen | 179 | 2401 | 178529 |
Alan J. Heeger | 171 | 913 | 147492 |
Lei Jiang | 170 | 2244 | 135205 |
Wei Li | 158 | 1855 | 124748 |
Shu-Hong Yu | 144 | 799 | 70853 |
Jian Zhou | 128 | 3007 | 91402 |
Chao Zhang | 127 | 3119 | 84711 |
Igor Katkov | 125 | 972 | 71845 |
Tao Zhang | 123 | 2772 | 83866 |
Nicholas A. Kotov | 123 | 574 | 55210 |
Shi Xue Dou | 122 | 2028 | 74031 |
Li Yuan | 121 | 948 | 67074 |
Robert O. Ritchie | 120 | 659 | 54692 |
Haiyan Wang | 119 | 1674 | 86091 |