Institution
University of Notre Dame
Education · Notre Dame, Indiana, United States
About: University of Notre Dame is an education organization based in Notre Dame, Indiana, United States. It is known for its research contributions in the topics: Population & Context (language use). The organization has 22,238 authors who have published 55,201 publications receiving 2,032,925 citations. The organization is also known as: University of Notre Dame du Lac & University of Notre Dame, South Bend.
Papers
TL;DR: Contrary to the long-standing view that working memory depends on sustained, elevated activity, evidence is presented suggesting that humans can hold information in working memory via "activity-silent" synaptic mechanisms, supporting a synaptic theory of working memory.
Abstract: The ability to hold information in working memory is fundamental for cognition. Contrary to the long-standing view that working memory depends on sustained, elevated activity, we present evidence suggesting that humans can hold information in working memory via “activity-silent” synaptic mechanisms. Using multivariate pattern analyses to decode brain activity patterns, we found that the active representation of an item in working memory drops to baseline when attention shifts away. A targeted pulse of transcranial magnetic stimulation produced a brief reemergence of the item in concurrently measured brain activity. This reactivation effect occurred and influenced memory performance only when the item was potentially relevant later in the trial, which suggests that the representation is dynamic and modifiable via cognitive control. The results support a synaptic theory of working memory.
369 citations
01 Dec 2015
TL;DR: This paper studies, analytically and experimentally, how undersampling affects the posterior probability of a machine learning model, uses Bayes Minimum Risk theory to find the correct classification threshold, and shows how to adjust it after undersampling.
Abstract: Undersampling is a popular technique for unbalanced datasets to reduce the skew in class distributions. However, it is well known that undersampling one class modifies the priors of the training set and consequently biases the posterior probabilities of a classifier. In this paper, we study analytically and experimentally how undersampling affects the posterior probability of a machine learning model. We formalize the problem of undersampling and explore the relationship between conditional probability in the presence and absence of undersampling. Although the bias due to undersampling does not affect the ranking order returned by the posterior probability, it significantly impacts the classification accuracy and probability calibration. We use Bayes Minimum Risk theory to find the correct classification threshold and show how to adjust it after undersampling. Experiments on several real-world unbalanced datasets validate our results.
369 citations
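The correction described in the abstract above can be sketched numerically. A minimal sketch, assuming `beta` denotes the fraction of majority-class examples retained by undersampling; the symbol and the closed-form correction below are the standard ones from the undersampling-calibration literature, not quoted from the paper itself:

```python
def calibrate_posterior(p_s, beta):
    """Map a posterior p_s, produced by a model trained on undersampled
    data, back to an approximately unbiased posterior. `beta` is assumed
    to be the fraction of majority-class examples kept (beta = 1 means
    no undersampling, and the function is then the identity)."""
    return beta * p_s / (beta * p_s - p_s + 1.0)


def adjusted_threshold(tau, beta):
    """Threshold to apply to the raw, biased posterior p_s so that the
    decision is equivalent to thresholding the calibrated posterior at
    tau (tau = 0.5 is the Bayes decision under equal misclassification
    costs). Derived by inverting calibrate_posterior."""
    return tau / (tau + beta * (1.0 - tau))
```

With `beta = 1` both functions reduce to the identity; the more aggressive the undersampling (smaller `beta`), the higher the threshold that must be applied to the uncalibrated posterior, which is the threshold-adjustment step the abstract refers to.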
TL;DR: In this paper, the authors measured the lift, drag, and pitching moment about the quarter chord on a series of thin flat plates and cambered plates at chord Reynolds numbers varying between 60,000 and 200,000.
Abstract: The design of micro aerial vehicles requires a better understanding of the aerodynamics of small low-aspect-ratio wings. An experimental investigation has focused on measuring the lift, drag, and pitching moment about the quarter chord on a series of thin flat plates and cambered plates at chord Reynolds numbers varying between 60,000 and 200,000. Results show that the cambered plates offer better aerodynamic characteristics and performance. It also appears that the trailing-edge geometry of the wings and the turbulence intensity in the wind tunnel do not have a strong effect on the lift and drag for thin wings at low Reynolds numbers. Moreover, the results did not show the presence of any hysteresis, which is usually observed with thick airfoils/wings.
369 citations
01 Dec 2017
TL;DR: A transient Preisach model is developed that accurately predicts minor loop trajectories and remnant polarization charge of FeFET synapses for arbitrary pulse width, voltage, and history, revealing a 10³ to 10⁶ acceleration in online learning latency over multi-state RRAM-based analog synapses.
Abstract: The memory requirements of at-scale deep neural networks (DNN) dictate that synaptic weight values be stored and updated in off-chip memory such as DRAM, limiting the energy efficiency and training time. Monolithic cross-bar / pseudo cross-bar arrays with analog non-volatile memories capable of storing and updating weights on-chip offer the possibility of accelerating DNN training. Here, we harness the dynamics of voltage-controlled partial polarization switching in ferroelectric FETs (FeFET) to demonstrate such an analog synapse. We develop a transient Preisach model that accurately predicts minor loop trajectories and remnant polarization charge (Pr) for arbitrary pulse width, voltage, and history. We experimentally demonstrate a 5-bit FeFET synapse with symmetric potentiation and depression characteristics, and a 45x tunable range in conductance with a 75 ns update pulse. A circuit macro-model is used to evaluate and benchmark on-chip learning performance (area, latency, energy, accuracy) of the FeFET synaptic core, revealing a 10³ to 10⁶ acceleration in online learning latency over multi-state RRAM-based analog synapses.
367 citations
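The Preisach framework named in the abstract above models hysteresis as a population of bistable "hysterons", each with its own up/down switching thresholds; the net output (a stand-in for remnant polarization) is their average state, so the response at a given voltage depends on the voltage history. A minimal illustrative sketch — the thresholds, units, and hysteron count below are invented, and this is the classical static Preisach construction, not the paper's fitted transient model:

```python
import random

def preisach_response(voltages, n=50, seed=0):
    """Classical scalar Preisach model (illustrative only).

    Each hysteron has an up-switching threshold alpha and a
    down-switching threshold beta <= alpha; its state is +1 or -1.
    The returned value per input is the mean hysteron state, a crude
    proxy for normalized remnant polarization."""
    rng = random.Random(seed)
    hysterons = []
    for _ in range(n):
        a = rng.uniform(0.0, 1.0)          # arbitrary up threshold
        b = a - rng.uniform(0.0, 1.0)      # down threshold, b <= a
        hysterons.append([a, b, -1.0])     # start fully "down"
    out = []
    for v in voltages:
        for h in hysterons:
            if v >= h[0]:
                h[2] = 1.0                 # switch up
            elif v <= h[1]:
                h[2] = -1.0                # switch down
        out.append(sum(h[2] for h in hysterons) / n)
    return out
```

Driving the model fully up, part-way back, and fully down traces a minor loop: the intermediate output depends on which hysterons the excursion flipped, which is the history dependence the paper exploits to store multi-level synaptic weights.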
TL;DR: A method to decompose biochemical networks into subnetworks based on the global geometry of the network is presented and applied to 43 organisms from the WIT database.
Abstract: Motivation: The vastness and complexity of the biochemical networks that have been mapped out by modern genomics calls for decomposition into subnetworks. Such networks can have inherent non-local ...
367 citations
Authors
Showing all 22586 results
| Name | H-index | Papers | Citations |
| --- | --- | --- | --- |
| George Davey Smith | 224 | 2540 | 248373 |
| David Miller | 203 | 2573 | 204840 |
| Patrick O. Brown | 183 | 755 | 200985 |
| Dorret I. Boomsma | 176 | 1507 | 136353 |
| Chad A. Mirkin | 164 | 1078 | 134254 |
| Darien Wood | 160 | 2174 | 136596 |
| Wei Li | 158 | 1855 | 124748 |
| Timothy C. Beers | 156 | 934 | 102581 |
| Todd Adams | 154 | 1866 | 143110 |
| Albert-László Barabási | 152 | 438 | 200119 |
| T. J. Pearson | 150 | 895 | 126533 |
| Amartya Sen | 149 | 689 | 141907 |
| Christopher Hill | 144 | 1562 | 128098 |
| Tim Adye | 143 | 1898 | 109010 |
| Teruki Kamon | 142 | 2034 | 115633 |