Institution
University of Waterloo
Education · Waterloo, Ontario, Canada
About: University of Waterloo is an education organization based in Waterloo, Ontario, Canada. It is known for research contributions in the topics: Population & Context (language use). The organization has 36093 authors who have published 93906 publications receiving 2948139 citations. The organization is also known as: UW & uwaterloo.
Papers published on a yearly basis
Papers
TL;DR: It is discovered that modern neural networks, unlike those from a decade ago, are poorly calibrated, and on most datasets, temperature scaling -- a single-parameter variant of Platt Scaling -- is surprisingly effective at calibrating predictions.
Abstract: Confidence calibration -- the problem of predicting probability estimates representative of the true correctness likelihood -- is important for classification models in many applications. We discover that modern neural networks, unlike those from a decade ago, are poorly calibrated. Through extensive experiments, we observe that depth, width, weight decay, and Batch Normalization are important factors influencing calibration. We evaluate the performance of various post-processing calibration methods on state-of-the-art architectures with image and document classification datasets. Our analysis and experiments not only offer insights into neural network learning, but also provide a simple and straightforward recipe for practical settings: on most datasets, temperature scaling -- a single-parameter variant of Platt Scaling -- is surprisingly effective at calibrating predictions.
1,883 citations
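The temperature-scaling recipe described in the abstract above can be sketched as follows. This is a minimal illustration on synthetic logits and labels, not the authors' implementation; the `fit_temperature` helper, the grid-search range, and the synthetic data are all assumptions made for the example.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the class axis.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def nll(logits, labels, T):
    # Negative log-likelihood after dividing logits by temperature T.
    p = softmax(logits / T)
    return -np.mean(np.log(p[np.arange(len(labels)), labels] + 1e-12))

def fit_temperature(logits, labels, grid=np.linspace(0.5, 5.0, 91)):
    # Temperature scaling is a single-parameter method: pick the T
    # that minimizes NLL on held-out validation logits.
    return min(grid, key=lambda T: nll(logits, labels, T))

# Synthetic, deliberately overconfident validation set (assumption).
rng = np.random.default_rng(0)
labels = rng.integers(0, 10, size=1000)
logits = rng.normal(size=(1000, 10))
logits[np.arange(1000), labels] += 2.0  # make the model mostly correct
logits *= 5.0                           # inflate confidence

T = fit_temperature(logits, labels)
```

Since dividing all logits by the same T > 1 softens every softmax output without changing the predicted class, temperature scaling adjusts confidence while leaving accuracy untouched, which is why it is attractive as a post-processing step.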
TL;DR: GelMA hydrogels could be useful for creating complex, cell-responsive microtissues, such as endothelialized microvasculature, or for other applications that require cell-responsive microengineered hydrogels.
1,871 citations
17 Jul 2017
TL;DR: This article found that depth, width, weight decay, and batch normalization are important factors influencing the confidence calibration of neural networks, and that temperature scaling is surprisingly effective at calibrating predictions.
1,853 citations
TL;DR: It is argued that when it is easy to manipulate and measure a proposed psychological process, a series of experiments demonstrating the proposed causal chain is superior, and that designs that examine the underlying process via moderation can be effective.
Abstract: The authors propose that experiments that utilize mediational analyses as suggested by R. M. Baron and D. A. Kenny (1986) are overused and sometimes improperly held up as necessary for a good social psychological paper. The authors argue that when it is easy to manipulate and measure a proposed psychological process, a series of experiments that demonstrates the proposed causal chain is superior. They further argue that when it is easy to manipulate a proposed psychological process but difficult to measure it, designs that examine the underlying process by utilizing moderation can be effective. It is only when measurement of a proposed psychological process is easy and manipulation of it is difficult that designs relying on mediational analyses should be preferred, and even in these situations careful consideration should be given to the limiting factors of such designs.
1,812 citations
TL;DR: In this paper, a simple realization of the 3D Weyl semimetal phase was proposed, utilizing a multilayer structure, composed of identical thin films of a magnetically doped 3D topological insulator, separated by ordinary-insulator spacer layers.
Abstract: We propose a simple realization of the three-dimensional (3D) Weyl semimetal phase, utilizing a multilayer structure, composed of identical thin films of a magnetically doped 3D topological insulator, separated by ordinary-insulator spacer layers. We show that the phase diagram of this system contains a Weyl semimetal phase of the simplest possible kind, with only two Dirac nodes of opposite chirality, separated in momentum space, in its band structure. This Weyl semimetal has a finite anomalous Hall conductivity and chiral edge states and occurs as an intermediate phase between an ordinary insulator and a 3D quantum anomalous Hall insulator. We find that the Weyl semimetal has a nonzero dc conductivity at zero temperature, but a Drude weight vanishing as ${T}^{2}$, and is thus an unusual metallic phase, characterized by a finite anomalous Hall conductivity and topologically protected edge states.
1,803 citations
Authors
Showing all 36498 results
Name | H-index | Papers | Citations |
---|---|---|---|
John J.V. McMurray | 178 | 1389 | 184502 |
David A. Weitz | 178 | 1038 | 114182 |
David Taylor | 131 | 2469 | 93220 |
Lei Zhang | 130 | 2312 | 86950 |
Will J. Percival | 129 | 473 | 87752 |
Trevor Hastie | 124 | 412 | 202592 |
Stephen Mann | 120 | 669 | 55008 |
Xuan Zhang | 119 | 1530 | 65398 |
Mark A. Tarnopolsky | 115 | 644 | 42501 |
Qiang Yang | 112 | 1117 | 71540 |
Wei Zhang | 112 | 1189 | 93641 |
Hans-Peter Seidel | 112 | 1213 | 51080 |
Theodore S. Rappaport | 112 | 490 | 68853 |
Robert C. Haddon | 112 | 577 | 52712 |
David Zhang | 111 | 1027 | 55118 |