Institution

Stevens Institute of Technology

Education · Hoboken, New Jersey, United States
About: Stevens Institute of Technology is an education organization based in Hoboken, New Jersey, United States. It is known for research contributions in the topics of Computer science & Cognitive radio. The organization has 5440 authors who have published 12684 publications receiving 296875 citations. The organization is also known as: Stevens & Stevens Tech.


Papers
Journal ArticleDOI
TL;DR: In this article, a comprehensive experimental investigation of gas-liquid absorption in a shell-and-tube type microporous hydrophobic hollow fiber device in a parallel flow configuration was carried out.
Abstract: A comprehensive experimental investigation of gas-liquid absorption in a shell-and-tube type microporous hydrophobic hollow fiber device in a parallel flow configuration was carried out. Two modes of countercurrent gas-liquid contacting were studied: the wetted mode (absorbent-liquid-filled pores) and the nonwetted mode (gas-filled pores). The absorbent flowed through the fiber bore in most of the experiments. The systems studied include pure CO₂, pure SO₂, CO₂-N₂ mixtures, and SO₂-air mixtures. The absorbent was pure water. The absorption process was simulated for each case with a numerical model for species transport with and without chemical reaction. A laminar parabolic velocity profile was used for the tube-side flow, and Happel's free surface model was used to characterize the shell-side flow. The model simulations agreed well with the experimental observations in most cases. SO₂ removals as high as 99% were obtained in small compact contactors. High K_La and low height of transfer unit (HTU) values were obtained with hollow fiber contactors compared to those of conventional contactors. The applications of direct interest here are those for acid gas cleanup.

309 citations
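The abstract above reports 99% SO₂ removal together with high K_La and low HTU. As a rough illustration of how those quantities relate through transfer units, here is a minimal sketch assuming a simple plug-flow, liquid-phase-controlled model — this is not the paper's numerical simulation, and all names and numbers below are hypothetical:

```python
import math

def contactor_performance(kla, volume, liquid_flow, length):
    """Removal and HTU from a simple plug-flow transfer-unit model
    for a liquid-phase-controlled absorber (illustrative only).

    kla: overall volumetric mass-transfer coefficient K_L*a (1/s)
    volume: liquid-side contactor volume (m^3)
    liquid_flow: liquid volumetric flow rate (m^3/s)
    length: contactor length (m)
    """
    ntu = kla * volume / liquid_flow   # number of transfer units
    removal = 1.0 - math.exp(-ntu)     # fractional solute removal
    htu = length / ntu                 # height of a transfer unit (m)
    return removal, htu

# Hypothetical numbers for a 0.2 m module with a high K_L*a:
removal, htu = contactor_performance(kla=0.5, volume=1e-5,
                                     liquid_flow=1e-6, length=0.2)
print(f"removal = {removal:.1%}, HTU = {htu * 100:.1f} cm")
# → removal = 99.3%, HTU = 4.0 cm
```

The qualitative shape matches the paper's finding: a compact module with a high K_La gives >99% removal and an HTU of only a few centimeters.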

Journal ArticleDOI
TL;DR: Two resilience-based component importance measures are provided, along with an algorithm for the stochastic ordering of network components given the uncertain nature of network disruptions; both are illustrated with a 20-node, 30-link network example.

309 citations
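The core idea in the TL;DR above — rank components by how much network performance degrades when they fail — can be sketched in a few lines. This is not the paper's resilience measures (which handle stochastic disruptions and ordering under uncertainty); it is a deterministic toy using pairwise connectivity as an assumed performance proxy:

```python
def connected_pairs(nodes, edges):
    """Fraction of node pairs that can still communicate: a crude,
    deterministic stand-in for a network performance measure."""
    adj = {n: set() for n in nodes}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    seen, pairs = set(), 0
    for n in nodes:
        if n in seen:
            continue
        comp, stack = {n}, [n]       # depth-first component search
        while stack:
            for w in adj[stack.pop()]:
                if w not in comp:
                    comp.add(w)
                    stack.append(w)
        seen |= comp
        pairs += len(comp) * (len(comp) - 1) // 2
    total = len(nodes) * (len(nodes) - 1) // 2
    return pairs / total

def link_importance(nodes, edges):
    """Importance of each link = performance drop when it fails."""
    base = connected_pairs(nodes, edges)
    return {e: base - connected_pairs(nodes, [f for f in edges if f != e])
            for e in edges}

# Toy network: two triangles joined by the bridge (3, 4)
nodes = [1, 2, 3, 4, 5, 6]
edges = [(1, 2), (2, 3), (1, 3), (3, 4), (4, 5), (5, 6), (4, 6)]
imp = link_importance(nodes, edges)
print(max(imp, key=imp.get))  # → (3, 4): the bridge matters most
```

Redundant links inside the triangles score zero, while the bridge whose failure disconnects the network scores highest — the deterministic analogue of what the paper's measures quantify under uncertainty.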

Journal ArticleDOI
17 Jul 2019
TL;DR: A simple yet effective Horizontal Pyramid Matching (HPM) approach to fully exploit various partial information of a given person, so that correct person candidates can still be identified even if some key parts are missing.
Abstract: Despite the remarkable progress in person re-identification (Re-ID), such approaches still suffer from the failure cases where the discriminative body parts are missing. To mitigate this type of failure, we propose a simple yet effective Horizontal Pyramid Matching (HPM) approach to fully exploit various partial information of a given person, so that correct person candidates can be identified even if some key parts are missing. With HPM, we make the following contributions to produce more robust feature representations for the Re-ID task: 1) we learn to classify using partial feature representations at different horizontal pyramid scales, which successfully enhance the discriminative capabilities of various person parts; 2) we exploit average and max pooling strategies to account for person-specific discriminative information in a global-local manner. To validate the effectiveness of our proposed HPM method, extensive experiments are conducted on three popular datasets including Market-1501, DukeMTMC-reID and CUHK03. Respectively, we achieve mAP scores of 83.1%, 74.5% and 59.7% on these challenging benchmarks, setting a new state of the art.

308 citations
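The pooling scheme the abstract describes — horizontal strips at several pyramid scales, each summarized by both average and max pooling — can be sketched in NumPy. The scale choices and shapes here are simplified assumptions, and the paper's full HPM additionally trains a classifier on each strip feature:

```python
import numpy as np

def horizontal_pyramid_features(fmap, scales=(1, 2, 4)):
    """Split an (H, W, C) feature map into horizontal strips at each
    pyramid scale and pool every strip with average + max pooling.
    Returns one (2*C,) descriptor per strip."""
    h, _, c = fmap.shape
    feats = []
    for s in scales:
        for i in range(s):
            strip = fmap[i * h // s:(i + 1) * h // s]
            avg = strip.mean(axis=(0, 1))   # part-level average response
            mx = strip.max(axis=(0, 1))     # strongest local response
            feats.append(np.concatenate([avg, mx]))
    return np.stack(feats)                  # (sum(scales), 2*C)

fmap = np.random.rand(24, 8, 16)            # toy backbone feature map
print(horizontal_pyramid_features(fmap).shape)  # → (7, 32)
```

Because each strip is matched independently, a person remains identifiable from the surviving strips even when one horizontal region (say, occluded legs) is uninformative — the failure mode the paper targets.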

Proceedings Article
30 Apr 2020
TL;DR: In this paper, the authors analyzed the convergence of Federated Averaging on non-iid data and established a convergence rate of O(1/T) for strongly convex and smooth problems, where T is the number of SGD iterations.
Abstract: Federated learning enables a large amount of edge computing devices to jointly learn a model without data sharing. As a leading algorithm in this setting, Federated Averaging (\texttt{FedAvg}) runs Stochastic Gradient Descent (SGD) in parallel on a small subset of the total devices and averages the sequences only once in a while. Despite its simplicity, it lacks theoretical guarantees under realistic settings. In this paper, we analyze the convergence of \texttt{FedAvg} on non-iid data and establish a convergence rate of $\mathcal{O}(\frac{1}{T})$ for strongly convex and smooth problems, where $T$ is the number of SGDs. Importantly, our bound demonstrates a trade-off between communication-efficiency and convergence rate. As user devices may be disconnected from the server, we relax the assumption of full device participation to partial device participation and study different averaging schemes; low device participation rate can be achieved without severely slowing down the learning. Our results indicate that heterogeneity of data slows down the convergence, which matches empirical observations. Furthermore, we provide a necessary condition for \texttt{FedAvg} on non-iid data: the learning rate $\eta$ must decay, even if full-gradient is used; otherwise, the solution will be $\Omega (\eta)$ away from the optimal.

307 citations
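The FedAvg loop the abstract analyzes — local SGD on a sampled subset of devices, periodic server averaging, and a decaying learning rate — can be sketched on toy 1-D quadratics. The objectives, participation scheme, and step-size schedule below are illustrative assumptions, not the paper's experimental setup:

```python
import random

def fedavg_quadratic(optima, rounds=200, local_steps=5, frac=0.5, seed=0):
    """Minimal FedAvg sketch on 1-D device objectives
    f_k(w) = (w - m_k)^2 / 2, whose global optimum is mean(m_k).
    Each round, a random fraction of devices runs local gradient
    steps and the server averages the results; the step size decays,
    as the paper's necessary condition requires."""
    rng = random.Random(seed)
    w = 0.0
    k = max(1, int(frac * len(optima)))
    for t in range(rounds):
        eta = 1.0 / (t + 2)               # decaying learning rate
        locals_ = []
        for m in rng.sample(optima, k):   # partial participation
            wi = w
            for _ in range(local_steps):
                wi -= eta * (wi - m)      # gradient of (w - m)^2 / 2
            locals_.append(wi)
        w = sum(locals_) / k              # server-side averaging
    return w

# Heterogeneous (non-iid) device optima; the global optimum is 2.5
print(fedavg_quadratic([0.0, 1.0, 4.0, 5.0]))
```

With full participation the iterate converges to the mean of the device optima; under partial participation it hovers near it, and a constant step size would leave a persistent O(eta) gap — the behavior the paper's lower bound formalizes.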

Journal ArticleDOI
TL;DR: In this article, the authors examined 69 new product development projects and found that team stability, team member familiarity, and interpersonal trust had a positive impact on the transactive memory system and also a positive influence on team learning, speed-to-market, and new product success.

305 citations


Authors

Showing all 5536 results

Name | H-index | Papers | Citations
Paul M. Thompson | 183 | 2271 | 146736
Roger Jones | 138 | 998 | 114061
Georgios B. Giannakis | 137 | 1321 | 73517
Li-Jun Wan | 113 | 639 | 52128
Joel L. Lebowitz | 101 | 754 | 39713
David Smith | 100 | 994 | 42271
Derong Liu | 77 | 608 | 19399
Robert R. Clancy | 77 | 293 | 18882
Karl H. Schoenbach | 75 | 494 | 19923
Robert M. Gray | 75 | 371 | 39221
Jin Yu | 74 | 480 | 32123
Sheng Chen | 71 | 688 | 27847
Hui Wu | 71 | 347 | 19666
Amir H. Gandomi | 67 | 375 | 22192
Haibo He | 66 | 482 | 22370
Network Information
Network Information
Related Institutions (5)
Georgia Institute of Technology
119K papers, 4.6M citations

94% related

Nanyang Technological University
112.8K papers, 3.2M citations

92% related

Massachusetts Institute of Technology
268K papers, 18.2M citations

91% related

University of Maryland, College Park
155.9K papers, 7.2M citations

91% related

Purdue University
163.5K papers, 5.7M citations

91% related

Performance
Metrics
No. of papers from the Institution in previous years
Year | Papers
2023 | 42
2022 | 139
2021 | 765
2020 | 820
2019 | 799
2018 | 563