Institution
National University of Defense Technology
Education • Changsha, China
About: National University of Defense Technology is an education organization based in Changsha, China. It is known for its research contributions in the topics of Computer science and Radar. The organization has 39430 authors who have published 40181 publications receiving 358979 citations. The organization is also known as Guófáng Kēxuéjìshù Dàxué and NUDT.
Topics: Computer science, Radar, Laser, Synthetic aperture radar, Fiber laser
Papers
TL;DR: This work studies a new coverage scenario, sweep coverage, which differs from previous static coverage; it proposes a centralized algorithm with constant approximation ratio 3 for the min-sensor sweep-coverage problem and designs a distributed sweep algorithm, DSWEEP, which coordinates sensors to provide best-effort efficiency.
Abstract: Many efforts have been made to address coverage problems in sensor networks. They fall into two categories, full coverage and barrier coverage, both characterized as static coverage. In this work, we study a new coverage scenario, sweep coverage, which differs from previous static coverage. In sweep coverage, we only need to monitor certain points of interest (POIs) periodically, so the coverage at each POI is time-variant, and thus we are able to utilize a small number of mobile sensors to achieve sweep coverage among a much larger number of POIs. We investigate the definitions and model for sweep coverage. Given a set of POIs and their sweep period requirements, we prove that determining the minimum number of required sensors (the min-sensor sweep-coverage problem) is NP-hard and cannot be approximated within a factor of 2. We propose a centralized algorithm with constant approximation ratio 3 for the min-sensor sweep-coverage problem. We further characterize the nonlocality of the problem and design a distributed sweep algorithm, DSWEEP, which coordinates sensors to provide best-effort efficiency. We conduct extensive simulations to study the performance of the proposed algorithms. Our simulations show that DSWEEP outperforms the randomized scheme in both effectiveness and efficiency.
138 citations
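The tour-based idea behind a centralized sweep-coverage algorithm can be illustrated with a small sketch: build a closed tour over the POIs, then split it into segments that one mobile sensor can traverse within the sweep period. This is only a simplified illustration under assumed names and parameters (a nearest-neighbour tour stands in for a TSP tour), not the paper's exact 3-approximation algorithm.

```python
import math

def min_sweep_sensors(pois, speed, period):
    """Estimate the number of mobile sensors needed to sweep all POIs
    within one period: build a closed tour over the POIs, then count
    how many segments of length speed * period it splits into."""
    # nearest-neighbour tour as a cheap stand-in for a TSP tour
    tour = [pois[0]]
    unvisited = list(pois[1:])
    while unvisited:
        nxt = min(unvisited, key=lambda p: math.dist(tour[-1], p))
        unvisited.remove(nxt)
        tour.append(nxt)
    # total length of the closed tour
    length = sum(math.dist(tour[i], tour[(i + 1) % len(tour)])
                 for i in range(len(tour)))
    # each sensor covers speed * period distance per sweep period
    return max(1, math.ceil(length / (speed * period)))
```

For four POIs spaced one unit apart on a line, a unit-speed sensor with period 2 covers a third of the closed tour per period, so three sensors are needed; with period 6 a single sensor suffices.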
TL;DR: This paper presents a simple but effective scene classification approach based on the incorporation of a multi-resolution representation into a bag-of-features model and shows that the proposed approach performs competitively against previous methods across all data sets.
138 citations
TL;DR: In this article, a learning to learn approach is proposed to train a domain-invariant feature extractor for heterogeneous domain generalization, where the unseen domains do not share label space with the seen ones and the goal is to learn a feature representation that is useful off-the-shelf for novel data and novel categories.
Abstract: The well known domain shift issue causes model performance to degrade when deployed to a new target domain with different statistics to training. Domain adaptation techniques alleviate this, but need some instances from the target domain to drive adaptation. Domain generalisation is the recently topical problem of learning a model that generalises to unseen domains out of the box, and various approaches aim to train a domain-invariant feature extractor, typically by adding some manually designed losses. In this work, we propose a learning to learn approach, where the auxiliary loss that helps generalisation is itself learned. Beyond conventional domain generalisation, we consider a more challenging setting of heterogeneous domain generalisation, where the unseen domains do not share label space with the seen ones, and the goal is to train a feature representation that is useful off-the-shelf for novel data and novel categories. Experimental evaluation demonstrates that our method outperforms state-of-the-art solutions in both settings.
138 citations
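The episodic meta-train/meta-test structure described above can be sketched in a few lines. The toy below uses a first-order approximation on synthetic linear-regression domains; the model, domains, hyperparameters, and the first-order shortcut are all assumptions for illustration, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(1)

def domain_loss_grad(w, X, y):
    # squared-error loss and its gradient for a linear model
    r = X @ w - y
    return (r @ r) / len(y), 2.0 * X.T @ r / len(y)

# synthetic domains sharing one true weight vector but with
# different input statistics (the "domain shift")
w_true = np.array([1.0, -2.0, 0.5])
domains = []
for scale in (0.5, 1.0, 2.0, 4.0):
    X = rng.normal(0.0, scale, (100, 3))
    domains.append((X, X @ w_true))

w = np.zeros(3)
alpha, beta, eta = 0.01, 1.0, 0.01
for step in range(300):
    i = rng.integers(len(domains))            # held-out "meta-test" domain
    train = [d for j, d in enumerate(domains) if j != i]
    Xs = np.vstack([d[0] for d in train])
    ys = np.concatenate([d[1] for d in train])
    _, g1 = domain_loss_grad(w, Xs, ys)
    w_inner = w - alpha * g1                  # virtual update on meta-train
    _, g2 = domain_loss_grad(w_inner, *domains[i])
    w = w - eta * (g1 + beta * g2)            # first-order meta-update
```

Each step simulates domain shift inside training: the update must reduce the loss on the seen domains *and* still help on a domain held out from that step.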
••
TL;DR: Novel l1-regularized space-time adaptive processing algorithms with a generalized sidelobe canceler architecture are proposed for airborne radar applications, imposing a sparse (l1-norm) regularization on the minimum variance criterion.
Abstract: In this paper, we propose novel l1-regularized space-time adaptive processing (STAP) algorithms with a generalized sidelobe canceler architecture for airborne radar applications. The proposed methods assume that a number of samples at the output of the blocking process are not needed for sidelobe canceling, which leads to sparsity of the STAP filter weight vector. The core idea is to impose a sparse regularization (of l1-norm type) on the minimum variance criterion. By solving this optimization problem, an l1-regularized recursive least squares (l1-based RLS) adaptive algorithm is developed. We also discuss the steady-state SINR performance and the penalty parameter setting of the proposed algorithm. To adaptively set the penalty parameter, two switched schemes are proposed for l1-based RLS algorithms. The computational complexity analysis shows that the proposed algorithms have the same complexity level as the conventional RLS algorithm (O((NM)^2), where NM is the filter weight vector length), but a significantly lower complexity level than the loaded sample covariance matrix inversion algorithm (O((NM)^3)) and the compressive sensing STAP algorithm (O((NsNd)^3), where NsNd > NM is the angle-Doppler plane size). The simulation results show that the proposed STAP algorithms converge rapidly and provide an SINR improvement using a small number of snapshots.
138 citations
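The combination of a recursive least squares update with an l1 penalty can be sketched as a standard RLS recursion followed by a soft-threshold step on the weight vector. This is an illustrative sketch only; the parameter values and the per-iteration thresholding schedule are assumptions, not the paper's derivation or its switched penalty schemes.

```python
import numpy as np

def l1_rls(X, d, lam=0.99, gamma=1e-4, delta=100.0):
    """RLS adaptive filter with an l1 (soft-threshold) step after each
    weight update to encourage a sparse weight vector.
    X: (n, m) input snapshots, d: (n,) desired output."""
    n, m = X.shape
    w = np.zeros(m)
    P = delta * np.eye(m)                     # inverse-correlation estimate
    for i in range(n):
        x = X[i]
        k = P @ x / (lam + x @ P @ x)         # gain vector
        e = d[i] - w @ x                      # a priori error
        w = w + k * e                         # standard RLS update
        P = (P - np.outer(k, x) @ P) / lam    # correlation inverse update
        # soft threshold: shrink weights toward zero (l1 penalty)
        w = np.sign(w) * np.maximum(np.abs(w) - gamma, 0.0)
    return w
```

On a sparse system-identification toy problem, the recovered weight vector keeps the two true taps and leaves the remaining taps at (near) zero, while the per-sample cost stays at the RLS O(m^2) level.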
TL;DR: A Bayesian framework for secondary batteries in the field, consisting of off-line population degradation modeling and on-line degradation assessment with residual life prediction, together with a particle-filter-based method for joint estimation of the state and static parameters.
138 citations
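The joint state/static-parameter estimation idea can be illustrated with a toy particle filter: the static degradation-rate parameter is carried alongside the state in each particle (with a small artificial evolution so resampling does not collapse it), and the filtered mean trajectory is extrapolated to a failure threshold for a residual-life estimate. The linear drift model and all constants below are illustrative assumptions, not the paper's battery model.

```python
import numpy as np

def particle_filter_rul(obs, n_p=2000, fail_thresh=1.0, seed=0):
    """Toy particle filter jointly estimating a degradation state x and a
    static drift parameter r in the model x_k = x_{k-1} + r + noise, then
    extrapolating the mean trajectory to a failure threshold."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 0.05, n_p)                # state particles
    r = rng.uniform(0.0, 0.1, n_p)                # static-parameter particles
    for z in obs:
        r = r + rng.normal(0.0, 1e-3, n_p)        # artificial parameter evolution
        x = x + r + rng.normal(0.0, 0.01, n_p)    # degradation propagation
        w = np.exp(-0.5 * ((z - x) / 0.05) ** 2)  # Gaussian measurement likelihood
        w /= w.sum()
        idx = rng.choice(n_p, size=n_p, p=w)      # multinomial resampling
        x, r = x[idx], r[idx]
    x_m, r_m = x.mean(), r.mean()
    return max(0.0, (fail_thresh - x_m) / r_m)    # periods until threshold
```

Fed ten noiseless observations of a degradation growing by 0.05 per period, the filter's residual-life estimate comes out near the true ten remaining periods.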
Authors
Showing all 39659 results
| Name | H-index | Papers | Citations |
| --- | --- | --- | --- |
| Rui Zhang | 151 | 2625 | 107917 |
| Jian Li | 133 | 2863 | 87131 |
| Chi Lin | 125 | 1313 | 102710 |
| Wei Xu | 103 | 1492 | 49624 |
| Lei Liu | 98 | 2041 | 51163 |
| Xiang Li | 97 | 1472 | 42301 |
| Chang Liu | 97 | 1099 | 39573 |
| Jian Huang | 97 | 1189 | 40362 |
| Tao Wang | 97 | 2720 | 55280 |
| Wei Liu | 96 | 1538 | 42459 |
| Jian Chen | 96 | 1718 | 52917 |
| Wei Wang | 95 | 3544 | 59660 |
| Peng Li | 95 | 1548 | 45198 |
| Jianhong Wu | 93 | 726 | 36427 |
| Jianhua Zhang | 92 | 415 | 28085 |