Institution
Nanjing University of Information Science and Technology
Education • Nanjing, China
About: Nanjing University of Information Science and Technology is an education organization based in Nanjing, China. It is known for its research contributions in the topics of Precipitation & Aerosol. The organization has 14129 authors who have published 17985 publications, receiving 267578 citations. The organization is also known as: Nan Xin Da.
Papers published on a yearly basis
Papers
TL;DR: Comparative experiments demonstrate that the use of a waveform representation and deep Boltzmann machines improves tree-species classification accuracy.
Abstract: Our work addresses the problem of extracting and classifying tree species from mobile LiDAR data. The work includes tree preprocessing and tree classification. In tree preprocessing, voxel-based upward-growing filtering is proposed to remove ground points from the mobile LiDAR data, followed by a tree segmentation that extracts individual trees via Euclidean distance clustering and voxel-based normalized cut segmentation. In tree classification, first, a waveform representation is developed to model the geometric structures of trees. Then, deep learning techniques are used to generate high-level feature abstractions of the trees’ waveform representations. Quantitative analysis shows that our algorithm achieves an overall accuracy of 86.1% and a kappa coefficient of 0.8 in classifying urban tree species using mobile LiDAR data. Comparative experiments demonstrate that the use of the waveform representation and deep Boltzmann machines improves the classification accuracy of tree species.
155 citations
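The Euclidean distance clustering step in the segmentation stage above can be sketched as a simple single-linkage grouping: two points end up in the same cluster if they are connected by a chain of neighbors within a fixed radius. This is a minimal illustrative version, not the authors' implementation; the function name and `radius` parameter are ours.

```python
import math

def euclidean_cluster(points, radius):
    """Group points so that any two points linked by a chain of
    neighbors within `radius` fall into the same cluster."""
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        queue, cluster = [seed], [seed]
        while queue:
            i = queue.pop()
            # collect still-unvisited neighbors of point i
            near = [j for j in unvisited
                    if math.dist(points[i], points[j]) <= radius]
            for j in near:
                unvisited.remove(j)
                queue.append(j)
                cluster.append(j)
        clusters.append([points[k] for k in cluster])
    return clusters
```

On real LiDAR data this step would run on 3-D points after ground removal; here a 2-D toy input suffices to show the chaining behavior.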
TL;DR: This paper examines the dynamical behavior of the Lorenz system in a previously unexplored region of parameter space, in particular, where r is zero and b is negative, and finds that the system is bistable or tristable under certain conditions.
Abstract: In this paper, the dynamical behavior of the Lorenz system is examined in a previously unexplored region of parameter space, in particular, where r is zero and b is negative. For certain values of the parameters, the classic butterfly attractor is broken into a symmetric pair of strange attractors, or it shrinks into a small attractor basin intermingled with the basins of a symmetric pair of limit cycles, which means that the system is bistable or tristable under certain conditions. Although the resulting system is no longer a plausible model of fluid convection, it may have application to other physical systems.
155 citations
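The regime described above can be explored numerically with the standard Lorenz equations. The sketch below assumes the usual (sigma, r, b) parameterization; the defaults r = 0 and b = -1 are illustrative values in the "r zero, b negative" region the paper studies, not values taken from the paper itself.

```python
def lorenz_deriv(state, sigma=10.0, r=0.0, b=-1.0):
    """Right-hand side of the Lorenz system:
       dx/dt = sigma*(y - x)
       dy/dt = x*(r - z) - y
       dz/dt = x*y - b*z
    with r = 0 and b < 0 (the regime examined in the paper)."""
    x, y, z = state
    return (sigma * (y - x), x * (r - z) - y, x * y - b * z)

def rk4_step(f, state, dt):
    """One classical fourth-order Runge-Kutta step."""
    k1 = f(state)
    k2 = f(tuple(s + 0.5 * dt * k for s, k in zip(state, k1)))
    k3 = f(tuple(s + 0.5 * dt * k for s, k in zip(state, k2)))
    k4 = f(tuple(s + dt * k for s, k in zip(state, k3)))
    return tuple(s + dt / 6.0 * (a + 2 * b2 + 2 * c + d)
                 for s, a, b2, c, d in zip(state, k1, k2, k3, k4))
```

Iterating `rk4_step` from several initial conditions is the standard way to probe bistability or tristability: different basins of attraction lead trajectories to different coexisting attractors.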
TL;DR: A new regime of chaotic flows is explored in which one of the variables has the freedom of offset boosting, so that the variable can switch between a bipolar signal and a unipolar signal according to the value of the boosting constant.
154 citations
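The offset-boosting idea in the TL;DR above can be illustrated on any zero-mean signal: adding a constant shifts the whole signal, and a large enough constant turns a bipolar signal (crossing zero) into a unipolar one. The sketch below uses a sine wave as a stand-in; in the paper the signal is a state variable of a chaotic flow.

```python
import math

def sample_signal(n=200):
    """A bipolar stand-in signal (zero-mean sine over one period)."""
    return [math.sin(2 * math.pi * k / n) for k in range(n)]

def offset_boost(signal, c):
    """Offset boosting: adding a constant c shifts the signal's level.
    Choosing c > |min(signal)| makes a bipolar signal unipolar."""
    return [s + c for s in signal]
```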
TL;DR: A multiscale deep feature learning method for high-resolution satellite image scene classification that warps the original satellite image into multiple different scales and develops a multiple kernel learning method to automatically learn the optimal combination of the extracted features.
Abstract: In this paper, we propose a multiscale deep feature learning method for high-resolution satellite image scene classification. Specifically, we first warp the original satellite image into multiple different scales. The images in each scale are employed to train a deep convolutional neural network (DCNN). However, simultaneously training multiple DCNNs is time-consuming. To address this issue, we explore DCNN with spatial pyramid pooling (SPP-net). Because the different SPP-nets have the same number of parameters, which share identical initial values, only the parameters in the fully connected layers need fine-tuning to ensure the effectiveness of each network, thereby greatly accelerating the training process. Then, the multiscale satellite images are fed into their corresponding SPP-nets, respectively, to extract multiscale deep features. Finally, a multiple kernel learning method is developed to automatically learn the optimal combination of such features. Experiments on two difficult data sets show that the proposed method achieves favorable performance compared with other state-of-the-art methods.
154 citations
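The key property of spatial pyramid pooling, which lets a single network accept inputs of different scales, is that it always produces a fixed-length vector: each pyramid level splits the feature map into an n x n grid and keeps one pooled value per bin. A minimal max-pooling sketch (our own simplification of the SPP layer, operating on a single-channel map):

```python
import math

def spp_max_pool(feature_map, levels=(1, 2)):
    """Spatial pyramid pooling (max variant): for each level n, split the
    map into an n x n grid and keep the max of each bin, so any input
    size yields a fixed-length vector (here 1 + 4 = 5 values)."""
    h, w = len(feature_map), len(feature_map[0])
    pooled = []
    for n in levels:
        for i in range(n):
            for j in range(n):
                r0, r1 = (i * h) // n, math.ceil((i + 1) * h / n)
                c0, c1 = (j * w) // n, math.ceil((j + 1) * w / n)
                pooled.append(max(feature_map[r][c]
                                  for r in range(r0, r1)
                                  for c in range(c0, c1)))
    return pooled
```

Because the output length depends only on `levels`, feature maps produced from differently warped input scales can all feed the same fully connected layers.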
TL;DR: A dynamic resource allocation method, named DRAM, for load balancing in the fog environment is proposed in this paper, together with a system framework for fog computing and a load-balance analysis for various types of computing nodes.
Abstract: Fog computing is emerging as a powerful and popular computing paradigm for performing IoT (Internet of Things) applications; it extends the cloud computing paradigm to make it possible to execute IoT applications at the edge of the network. IoT applications can choose fog or cloud computing nodes to meet their resource requirements, and load balancing is one of the key factors in achieving resource efficiency and avoiding bottlenecks, overload, and under-load. However, it is still a challenge to balance the load across computing nodes in the fog environment during the execution of IoT applications. In view of this challenge, a dynamic resource allocation method, named DRAM, for load balancing in the fog environment is proposed in this paper. Technically, a system framework for fog computing and a load-balance analysis for various types of computing nodes are presented first. Then, a corresponding resource allocation method in the fog environment is designed, using static resource allocation and dynamic service migration to achieve load balance for fog computing systems. Experimental evaluation and comparison analysis are conducted to validate the efficiency and effectiveness of DRAM.
154 citations
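The dynamic service migration step of the abstract above can be illustrated with a toy greedy scheme: repeatedly move the smallest service from the most-loaded node to the least-loaded node while that narrows the load spread. This is only our illustration of the migration idea, not the published DRAM algorithm.

```python
def spread(nodes):
    """Load spread: max node load minus min node load."""
    loads = [sum(s) for s in nodes.values()]
    return max(loads) - min(loads)

def rebalance(nodes):
    """Greedy dynamic service migration: move the smallest service from
    the most-loaded node to the least-loaded node while the load spread
    keeps shrinking. `nodes` maps node name -> list of service loads."""
    while True:
        loads = {n: sum(s) for n, s in nodes.items()}
        src = max(loads, key=loads.get)
        dst = min(loads, key=loads.get)
        if not nodes[src] or src == dst:
            break
        svc = min(nodes[src])
        nodes[src].remove(svc)
        nodes[dst].append(svc)
        if spread(nodes) >= loads[src] - loads[dst]:
            # the move did not narrow the spread: revert and stop
            nodes[dst].remove(svc)
            nodes[src].append(svc)
            break
    return nodes
```

Each kept move strictly reduces the spread, and the number of service placements is finite, so the loop terminates.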
Authors
Showing all 14448 results
| Name | H-index | Papers | Citations |
| --- | --- | --- | --- |
| Ashok Kumar | 151 | 5654 | 164086 |
| Lei Zhang | 135 | 2240 | 99365 |
| Bin Wang | 126 | 2226 | 74364 |
| Shuicheng Yan | 123 | 810 | 66192 |
| Zeshui Xu | 113 | 752 | 48543 |
| Xiaoming Li | 113 | 1932 | 72445 |
| Qiang Yang | 112 | 1117 | 71540 |
| Yan Zhang | 107 | 2410 | 57758 |
| Fei Wang | 107 | 1824 | 53587 |
| Yongfa Zhu | 105 | 355 | 33765 |
| James C. McWilliams | 104 | 535 | 47577 |
| Zhi-Hua Zhou | 102 | 626 | 52850 |
| Tao Li | 102 | 2483 | 60947 |
| Lei Liu | 98 | 2041 | 51163 |
| Jian Feng Ma | 97 | 305 | 32310 |