Institution
Worcester Polytechnic Institute
Education • Worcester, Massachusetts, United States
About: Worcester Polytechnic Institute is an education organization based in Worcester, Massachusetts, United States. It is known for its research contributions in the topics of Population and Data envelopment analysis. The organization has 6270 authors who have published 12704 publications receiving 332081 citations. The organization is also known as WPI.
Topics: Population, Data envelopment analysis, Supply chain, Nonlinear system, Finite element method
Papers published on a yearly basis
Papers
TL;DR: This review will nicely bridge ML with biosensors, and greatly expand chemometrics for detection, analysis, and diagnosis.
Abstract: Chemometrics plays a critical role in biosensor-based detection, analysis, and diagnosis. Nowadays, as a branch of artificial intelligence (AI), machine learning (ML) has achieved impressive advances. However, advanced ML methods, especially deep learning, which is known for image analysis, facial recognition, and speech recognition, have remained relatively elusive to the biosensor community. Herein, how ML can benefit biosensors is systematically discussed. The advantages and drawbacks of the most popular ML algorithms are summarized on the basis of sensing data analysis. In particular, deep learning methods such as the convolutional neural network (CNN) and the recurrent neural network (RNN) are emphasized. Diverse ML-assisted electrochemical biosensors, wearable electronics, SERS and other spectra-based biosensors, fluorescence biosensors, and colorimetric biosensors are comprehensively discussed. Furthermore, biosensor networks and multibiosensor data fusion are introduced. This review bridges ML with biosensors and greatly expands chemometrics for detection, analysis, and diagnosis.
214 citations
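The chemometric pattern recognition surveyed in the review above can be illustrated with a minimal sketch. This is not code from the review: it uses a hypothetical nearest-centroid classifier on synthetic "sensor spectra" as a stand-in for the far richer ML methods (CNNs, RNNs) the authors discuss.

```python
# Minimal sketch of ML-assisted sensing (illustrative only): classify a
# sensor spectrum by its distance to per-class mean spectra (centroids).

def centroid(spectra):
    """Element-wise mean of a list of equal-length spectra."""
    n = len(spectra)
    return [sum(vals) / n for vals in zip(*spectra)]

def distance(a, b):
    """Euclidean distance between two spectra."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def fit(training_data):
    """training_data: {label: [spectrum, ...]} -> {label: centroid}."""
    return {label: centroid(s) for label, s in training_data.items()}

def predict(model, spectrum):
    """Assign the label of the nearest class centroid."""
    return min(model, key=lambda label: distance(model[label], spectrum))

# Two synthetic classes: "analyte" spectra peak high, "blank" spectra stay low.
train = {
    "analyte": [[0.1, 0.9, 0.8, 0.2], [0.2, 1.0, 0.7, 0.1]],
    "blank":   [[0.1, 0.2, 0.1, 0.1], [0.0, 0.1, 0.2, 0.0]],
}
model = fit(train)
print(predict(model, [0.15, 0.95, 0.75, 0.15]))  # analyte
```

A deep-learning version would replace the centroid model with a CNN over the raw spectrum, but the workflow (fit on labeled sensing data, predict on new readings) is the same.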
TL;DR: Applications such as the detection of an astronomical object, forward-scattered radiation, and incoherent light are described whereby signal enhancements of at least 7 orders of magnitude may be achieved.
Abstract: I propose to use as a window the dark core of an optical vortex to examine a weak background signal hidden in the glare of a bright coherent source. Applications such as the detection of an astronomical object, forward-scattered radiation, and incoherent light are described whereby signal enhancements of at least 7 orders of magnitude may be achieved.
214 citations
01 Jan 2014
TL;DR: In this paper, 100% recycled hot mix asphalt lab samples were modified with five generic rejuvenators and one proprietary rejuvenator at a 12% dose and tested for binder and mixture properties; the rejuvenated mixtures showed excellent rutting resistance, longer fatigue life than virgin mixtures, and, for most products, a lower critical cracking temperature.
Abstract: 100% recycled hot mix asphalt lab samples were modified with five generic rejuvenators and one proprietary rejuvenator at a 12% dose and tested for binder and mixture properties. Waste Vegetable Oil, Waste Vegetable Grease, Organic Oil, Distilled Tall Oil, and Aromatic Extract reduced the Superpave performance grade (PG) of the extracted binder from PG 94-12 to PG 64-22, while waste engine oil required a higher dose. All products ensured excellent rutting resistance while providing longer fatigue life compared to virgin mixtures, and most lowered the critical cracking temperature. Rejuvenated samples required more compaction energy than virgin mixtures, and some oils slightly reduced moisture resistance.
214 citations
TL;DR: Integrated computational materials engineering (ICME) is a field of study whose time has come: it promises to link manufacturing and design via advanced materials models in a seamless, integrated computational environment.
Abstract: Integrated computational materials engineering is a field of study whose time has come. It promises to link manufacturing and design via advanced materials models in a seamless, integrated computational environment. The feasibility of ICME and its benefits have been demonstrated by several projects that have developed methods now in use in the aerospace and automotive industries. To fully realize the potential of ICME, a number of technical, cultural, and organizational challenges have been identified and must be overcome.
214 citations
19 Jul 2018
TL;DR: This work presents a novel RNN model, called the Graph Attention Model (GAM), that processes only a portion of the graph by adaptively selecting a sequence of "informative" nodes, and shows that the proposed method is competitive against various well-known methods in graph classification.
Abstract: Graph classification is a problem with practical applications in many different domains. To solve this problem, one usually calculates certain graph statistics (i.e., graph features) that help discriminate between graphs of different classes. When calculating such features, most existing approaches process the entire graph. In a graphlet-based approach, for instance, the entire graph is processed to get the total count of different graphlets or subgraphs. In many real-world applications, however, graphs can be noisy with discriminative patterns confined to certain regions in the graph only. In this work, we study the problem of attention-based graph classification. The use of attention allows us to focus on small but informative parts of the graph, avoiding noise in the rest of the graph. We present a novel RNN model, called the Graph Attention Model (GAM), that processes only a portion of the graph by adaptively selecting a sequence of "informative" nodes. Experimental results on multiple real-world datasets show that the proposed method is competitive against various well-known methods in graph classification even though our method is limited to only a portion of the graph.
213 citations
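The core idea of the GAM paper above, attending to a short sequence of informative nodes rather than processing the whole graph, can be sketched in a toy form. This is hypothetical illustration, not the authors' code: the RNN and learned attention are replaced with fixed per-node "informativeness" scores and a greedy walk.

```python
# Illustrative sketch of attention-based graph classification: greedily
# visit a few high-scoring nodes, then classify from only those nodes.

def attention_walk(graph, scores, start, steps):
    """graph: {node: [neighbors]}; scores: {node: informativeness}.
    From `start`, repeatedly step to the highest-scoring unvisited
    neighbor, so only a small portion of the graph is examined."""
    visited = [start]
    current = start
    for _ in range(steps):
        candidates = [n for n in graph[current] if n not in visited]
        if not candidates:
            break
        current = max(candidates, key=lambda n: scores[n])
        visited.append(current)
    return visited

def classify(features, visited):
    """Toy readout: label the graph by the mean feature of visited nodes."""
    mean = sum(features[n] for n in visited) / len(visited)
    return "class_A" if mean > 0.5 else "class_B"

# A small 4-node graph whose discriminative pattern sits at nodes 2 and 3.
graph = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
scores = {0: 0.1, 1: 0.2, 2: 0.9, 3: 0.8}
features = {0: 0.0, 1: 0.1, 2: 0.9, 3: 0.8}
walk = attention_walk(graph, scores, start=0, steps=2)
print(walk, classify(features, walk))  # [0, 2, 3] class_A
```

In the actual GAM, the scores would come from a learned attention mechanism and the readout from an RNN over the visited sequence; the sketch only conveys why restricting computation to informative regions helps on noisy graphs.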
Authors
Showing all 6336 results
| Name | H-index | Papers | Citations |
|---|---|---|---|
| Andrew G. Clark | 140 | 823 | 123333 |
| Ming Li | 103 | 1669 | 62672 |
| Joseph Sarkis | 101 | 482 | 45116 |
| Arthur C. Graesser | 95 | 614 | 38549 |
| Kevin J. Harrington | 85 | 682 | 33625 |
| Kui Ren | 83 | 501 | 32490 |
| Bart Preneel | 82 | 844 | 25572 |
| Ming-Hui Chen | 82 | 525 | 29184 |
| Yuguang Fang | 79 | 572 | 20715 |
| Wenjing Lou | 77 | 311 | 29405 |
| Bernard Lown | 73 | 330 | 20320 |
| Joe Zhu | 72 | 231 | 19017 |
| Y.S. Lin | 71 | 304 | 16100 |
| Kevin Talbot | 71 | 268 | 15669 |
| Christof Paar | 69 | 399 | 21790 |