Institution
Stevens Institute of Technology
Education • Hoboken, New Jersey, United States
About: Stevens Institute of Technology is an education organization based in Hoboken, New Jersey, United States. It is known for its research contributions in the topics of Computer science and Cognitive radio. The organization has 5440 authors who have published 12684 publications receiving 296875 citations. The organization is also known as: Stevens & Stevens Tech.
Topics: Computer science, Cognitive radio, Communication channel, Wireless network, Artificial neural network
Papers published on a yearly basis
Papers
••
16 May 2016
TL;DR: A novel algorithm to produce descriptive online 3D occupancy maps using Gaussian processes, which may serve both as an improved-accuracy classifier, and as a predictive tool to support autonomous navigation.
Abstract: We present a novel algorithm to produce descriptive online 3D occupancy maps using Gaussian processes (GPs). GP regression and classification have met with recent success in their application to robot mapping, as GPs are capable of expressing rich correlation among map cells and sensor data. However, the cubic computational complexity has limited its application to large-scale mapping and online use. In this paper we address this issue first by proposing test-data octrees, octrees within blocks of the map that prune away nodes of the same state, condensing the number of test data used in a regression, in addition to allowing fast data retrieval. We also propose a nested Bayesian committee machine which, after new sensor data is partitioned among several GP regressions, fuses the result and updates the map with greatly reduced complexity. Finally, by adjusting the range of influence of the training data and tuning a variance threshold implemented in our method's binary classification step, we are able to control the richness of inference achieved by GPs - and its tradeoff with classification accuracy. The performance of the proposed approach is evaluated with both simulated and real data, demonstrating that the method may serve both as an improved-accuracy classifier, and as a predictive tool to support autonomous navigation.
75 citations
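The nested Bayesian committee machine described in the abstract fuses the outputs of several independent GP regressions into a single estimate per map cell. A minimal sketch of the standard BCM fusion rule (precision-weighted combination with a prior-variance correction) is below; the function name and toy numbers are illustrative, not the authors' implementation:

```python
import numpy as np

def bcm_fuse(means, variances, prior_var):
    """Fuse per-expert GP predictions for one map cell with a
    Bayesian committee machine (precision-weighted combination)."""
    means = np.asarray(means, dtype=float)
    variances = np.asarray(variances, dtype=float)
    M = len(means)
    # Fused precision: sum of expert precisions, minus the prior
    # precision counted (M - 1) extra times across the experts.
    fused_prec = np.sum(1.0 / variances) - (M - 1) / prior_var
    fused_var = 1.0 / fused_prec
    # Fused mean: precision-weighted average of the expert means.
    fused_mean = fused_var * np.sum(means / variances)
    return fused_mean, fused_var

# Two experts that agree: the fused estimate keeps their mean
# but reports a tighter (smaller) variance than either expert.
m, v = bcm_fuse([1.0, 1.0], [0.5, 0.5], prior_var=1.0)
```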
••
TL;DR: This paper proposes four detectors based on the generalized likelihood ratio test principle, treating the unknown target signal as either deterministic or stochastic and the noise variance as either known or unknown; the proposed detectors outperform the conventional cross-correlation-based detector, which ignores the noise in the reference signal.
75 citations
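In the simplest setting the TL;DR touches on — an unknown deterministic signal in white Gaussian noise of known variance — maximizing the likelihood over the signal makes the GLRT collapse to a normalized energy statistic. A toy sketch under that assumption (the data and names are illustrative; the paper's detectors additionally model noise in the reference signal):

```python
import numpy as np

def glrt_energy_statistic(x, noise_var):
    """GLRT for an unknown deterministic signal in white Gaussian noise
    with known variance: the MLE of the signal equals the observation,
    so the test statistic reduces to the normalized energy of x."""
    x = np.asarray(x, dtype=float)
    return np.sum(x ** 2) / noise_var

# Toy data: the same noise samples with and without a constant offset
# standing in for a target signal.
x_noise = np.array([0.1, -0.2, 0.05, -0.1])
x_signal = x_noise + 1.0
stat_noise = glrt_energy_statistic(x_noise, noise_var=1.0)
stat_signal = glrt_energy_statistic(x_signal, noise_var=1.0)
# The statistic is compared against a threshold; it is larger
# when a signal is present.
```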
••
01 Jan 2018
TL;DR: In this article, the authors summarize the problems related to heavy metal pollution and various heavy metal remediation technologies, including phytoremediation, a green technology that is environmentally friendly and less expensive than other conventional methods.
Abstract: Although heavy metals are naturally occurring compounds, anthropogenic activities introduce them in excessive quantities in different environmental matrices, which impose severe threats on both human and ecosystem health. Heavy metals are nondegradable and can bioaccumulate in living organisms; hence they can contaminate the entire food chain. Remediation of heavy metals requires proper attention to protect soil quality, the ecosystem, and human health. Physical and chemical heavy metal remediation technologies are very expensive, often destructive to the local ecosystem, and require handling of a large amount of hazardous waste. On the other hand, emerging technologies such as phytoremediation have great potential. Phytoremediation is a “green” technology that is environment friendly and less expensive compared with other conventional methods. This chapter summarizes the problems related to heavy metal pollution and various heavy metal remediation technologies.
75 citations
••
TL;DR: Mamunur et al. as discussed by the authors proposed DeepCervix, a hybrid deep feature fusion (HDFF) technique based on DL, to classify cervical cells accurately; it achieved state-of-the-art classification accuracies of 99.85%, 99.38%, and 99.14% for 2-class, 3-class, and 5-class classification, respectively.
75 citations
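The "hybrid deep feature fusion" idea above combines feature vectors extracted by several networks into a single descriptor before classification. A minimal concatenation-based sketch (names and toy values are ours, not from DeepCervix):

```python
import numpy as np

def fuse_features(feature_sets):
    """Late feature fusion: concatenate per-model feature vectors
    into one descriptor for a downstream classifier."""
    return np.concatenate([np.asarray(f, dtype=float) for f in feature_sets])

# Toy features standing in for the outputs of two CNN backbones.
f_a = [0.2, 0.7]
f_b = [0.1, 0.9, 0.4]
fused = fuse_features([f_a, f_b])  # a single 5-dimensional descriptor
```

The fused descriptor is then fed to a classifier head; fusion lets complementary backbones contribute evidence the classifier can weigh jointly.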
••
21 Sep 2009
TL;DR: A runtime enforcement mechanism is presented that monitors the dynamic nature of interactions via tree structures and prevents a range of attacks that exploit the tree structure in order to transfer sensitive information.
Abstract: This paper explores the problem of tracking information flow in dynamic tree structures. Motivated by the problem of manipulating the Document Object Model (DOM) trees by browser-run client-side scripts, we address the dynamic nature of interactions via tree structures. We present a runtime enforcement mechanism that monitors this interaction and prevents a range of attacks, some of them missed by previous approaches, that exploit the tree structure in order to transfer sensitive information. We formalize our approach for a simple language with DOM-like tree operations and show that the monitor prevents scripts from disclosing secrets.
75 citations
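The monitor idea can be illustrated with a toy label check on tree updates (this sketch is ours, not the paper's formalization): each node carries a security label, and attaching a secret-labeled subtree under a publicly observable node is blocked at runtime, since structural changes to a public part of the tree are visible to an observer.

```python
from dataclasses import dataclass, field

SECRET, PUBLIC = "secret", "public"

@dataclass
class Node:
    value: object
    label: str = PUBLIC
    children: list = field(default_factory=list)

def monitored_append(parent, child):
    """Runtime check: refuse to attach a secret-labeled subtree under
    a public parent, because the structure change is publicly observable."""
    if child.label == SECRET and parent.label == PUBLIC:
        raise PermissionError("flow from secret to public context blocked")
    parent.children.append(child)

root = Node("root", PUBLIC)
monitored_append(root, Node("banner", PUBLIC))  # allowed
```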
Authors
Showing all 5536 results
Name | H-index | Papers | Citations |
---|---|---|---|
Paul M. Thompson | 183 | 2271 | 146736 |
Roger Jones | 138 | 998 | 114061 |
Georgios B. Giannakis | 137 | 1321 | 73517 |
Li-Jun Wan | 113 | 639 | 52128 |
Joel L. Lebowitz | 101 | 754 | 39713 |
David Smith | 100 | 994 | 42271 |
Derong Liu | 77 | 608 | 19399 |
Robert R. Clancy | 77 | 293 | 18882 |
Karl H. Schoenbach | 75 | 494 | 19923 |
Robert M. Gray | 75 | 371 | 39221 |
Jin Yu | 74 | 480 | 32123 |
Sheng Chen | 71 | 688 | 27847 |
Hui Wu | 71 | 347 | 19666 |
Amir H. Gandomi | 67 | 375 | 22192 |
Haibo He | 66 | 482 | 22370 |