scispace - formally typeset

Terrance DeVries

Researcher at University of Guelph

Publications: 19
Citations: 3831

Terrance DeVries is an academic researcher at the University of Guelph. He has contributed to research on the topics of convolutional neural networks and feature learning. He has an h-index of 11 and has co-authored 19 publications receiving 2,360 citations. His previous affiliations include Facebook.

Papers
Posted Content

Improved Regularization of Convolutional Neural Networks with Cutout.

TL;DR: This paper shows that cutout, a simple regularization technique that randomly masks out square regions of the input during training, can be used to improve the robustness and overall performance of convolutional neural networks.
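The masking step described above can be sketched in a few lines of numpy; this is an illustrative implementation of the general idea (the function name and array layout are assumptions, not code from the paper):

```python
import numpy as np

def cutout(image, mask_size, rng=None):
    """Zero out one randomly placed square region of an image (C, H, W)."""
    rng = np.random.default_rng(rng)
    _, h, w = image.shape
    # Pick a random center; the square is clipped at the image border,
    # so masks near the edge cover a smaller visible area.
    cy = rng.integers(h)
    cx = rng.integers(w)
    y1, y2 = max(cy - mask_size // 2, 0), min(cy + mask_size // 2, h)
    x1, x2 = max(cx - mask_size // 2, 0), min(cx + mask_size // 2, w)
    out = image.copy()
    out[:, y1:y2, x1:x2] = 0.0
    return out
```

Applied to each training image independently, this forces the network to rely on a wider range of image regions rather than any single discriminative patch.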
Posted Content

Learning Confidence for Out-of-Distribution Detection in Neural Networks.

TL;DR: This work proposes a method of learning confidence estimates for neural networks that is simple to implement and produces intuitively interpretable outputs; it also addresses the problem of calibrating out-of-distribution detectors.
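The core training objective can be sketched as follows: the network's prediction is blended toward the true label in proportion to how unconfident it is, and a penalty term discourages it from always reporting low confidence. This is a minimal numpy illustration under that reading of the paper (the function name, one-hot layout, and epsilon are assumptions):

```python
import numpy as np

def confidence_loss(probs, conf, target_onehot, lam=0.1):
    """probs: (N, K) softmax outputs; conf: (N,) confidence in [0, 1]."""
    # Blend the prediction toward the label in proportion to (1 - confidence):
    # an unconfident network effectively "asks for a hint".
    adjusted = conf[:, None] * probs + (1.0 - conf[:, None]) * target_onehot
    nll = -np.sum(target_onehot * np.log(adjusted + 1e-12), axis=1)
    # Penalize low confidence so hints are not free.
    penalty = -np.log(conf + 1e-12)
    return np.mean(nll + lam * penalty)
```

At test time, the learned confidence score itself can then be thresholded to flag likely out-of-distribution inputs.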
Posted Content

Dataset Augmentation in Feature Space

TL;DR: This paper adopts a simpler, domain-agnostic approach to dataset augmentation that operates in the space of context vectors generated by sequence-to-sequence models, demonstrating a technique that is effective for both static and sequential data.
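Augmenting in feature space amounts to generating synthetic feature vectors by interpolating between, or extrapolating away from, same-class neighbors. A minimal sketch of those two operations (function names are illustrative; neighbor selection is left out):

```python
import numpy as np

def interpolate(c_i, c_j, lam=0.5):
    # Move c_i part of the way toward its same-class neighbor c_j.
    return (c_j - c_i) * lam + c_i

def extrapolate(c_i, c_j, lam=0.5):
    # Push c_i further away from its same-class neighbor c_j.
    return (c_i - c_j) * lam + c_i
```

Because the transforms act on learned feature vectors rather than raw inputs, the same augmentation code applies unchanged to images, text, or time series.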
Posted Content

Does Object Recognition Work for Everyone?

TL;DR: The paper analyzes the accuracy of publicly available object-recognition systems on a geographically diverse dataset, designed to have more representative geographical coverage than the image datasets commonly used in object recognition.
Book Chapter DOI

ProxyNCA++: Revisiting and Revitalizing Proxy Neighborhood Component Analysis

TL;DR: This work revisits ProxyNCA and incorporates several enhancements. It finds that low temperature scaling is a performance-critical component and explains why it works, and it discovers that Global Max Pooling generally works better than Global Average Pooling.
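The role of temperature scaling is easiest to see in the loss itself: a proxy-NCA-style loss is a softmax over negative distances to class proxies, and dividing by a small temperature sharpens that softmax. A minimal numpy sketch for a single embedding (function name and array layout are assumptions, not the authors' code):

```python
import numpy as np

def proxynca_loss(x, proxies, label, temperature=0.1):
    """x: (D,) embedding; proxies: (C, D) class proxies; label: true class."""
    # Squared Euclidean distance from the embedding to every proxy.
    d = np.sum((proxies - x) ** 2, axis=1)
    # Low temperature sharpens the softmax over negative distances,
    # increasing the gradient pressure toward the correct proxy.
    logits = -d / temperature
    logits -= logits.max()  # numerical stability
    log_softmax = logits - np.log(np.sum(np.exp(logits)))
    return -log_softmax[label]
```

With a small temperature, even a modest distance gap between the true proxy and the nearest wrong proxy translates into a near-zero loss, which is one way to see why low temperature scaling matters.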