
David Krueger

Researcher at Université de Montréal

Publications -  36
Citations -  4616

David Krueger is an academic researcher at Université de Montréal whose work centers on computer science and deep learning. He has an h-index of 18 and has co-authored 25 publications receiving 3042 citations.

Papers
Proceedings Article

A closer look at memorization in deep networks

TL;DR: The analysis suggests that dataset-independent notions of effective capacity are unlikely to explain the generalization performance of deep networks trained with gradient-based methods, because the training data itself plays an important role in determining the degree of memorization.
Proceedings Article

NICE: Non-linear Independent Components Estimation

TL;DR: Non-linear Independent Components Estimation (NICE), a deep learning framework for modeling complex high-dimensional densities, is proposed; it is based on the idea that a good representation is one in which the data has a distribution that is easy to model.
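
The core building block of NICE is the additive coupling layer, which is trivially invertible and has a unit-triangular Jacobian. Below is a minimal sketch in PyTorch; the class name, layer sizes, and the MLP used as the coupling function are illustrative assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn

class AdditiveCoupling(nn.Module):
    """Additive coupling layer in the spirit of NICE:
    y1 = x1; y2 = x2 + m(x1). The inverse only needs subtraction,
    and the Jacobian is unit triangular, so log|det J| = 0."""
    def __init__(self, dim, hidden=128):
        super().__init__()
        self.half = dim // 2
        # Coupling function m: an arbitrary (non-invertible) network is fine.
        self.m = nn.Sequential(
            nn.Linear(self.half, hidden), nn.ReLU(),
            nn.Linear(hidden, dim - self.half),
        )

    def forward(self, x):
        x1, x2 = x.split([self.half, x.shape[-1] - self.half], dim=-1)
        return torch.cat([x1, x2 + self.m(x1)], dim=-1)

    def inverse(self, y):
        y1, y2 = y.split([self.half, y.shape[-1] - self.half], dim=-1)
        return torch.cat([y1, y2 - self.m(y1)], dim=-1)
```

Stacking several such layers with alternating partitions, plus a final rescaling, yields a model whose log-likelihood is tractable because each coupling layer contributes zero to the log-determinant.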
Posted Content

Out-of-Distribution Generalization via Risk Extrapolation (REx)

TL;DR: This work introduces the principle of Risk Extrapolation (REx), shows conceptually how this principle enables extrapolation, and demonstrates the effectiveness and scalability of REx instantiations on various out-of-distribution (OoD) generalization tasks.
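
One instantiation of REx penalizes how much training risk varies across environments. The sketch below illustrates such a variance-based objective; `env_risks` is assumed to be a list of precomputed per-environment losses, and `beta` is a hypothetical weighting hyperparameter, not a value from the paper.

```python
import torch

def variance_rex_objective(env_risks, beta=10.0):
    """Average risk across training environments plus a penalty on the
    variance of those risks, discouraging solutions whose risk differs
    across environments (a sketch of a variance-based REx objective)."""
    risks = torch.stack(env_risks)  # one scalar risk per environment
    return risks.mean() + beta * risks.var()
```

In use, each element of `env_risks` would be the loss of the shared model on one training environment's batch, and the combined objective would be backpropagated as usual.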
Posted Content

A Closer Look at Memorization in Deep Networks

TL;DR: The authors examine the role of memorization in deep learning, drawing connections to capacity, generalization, and adversarial robustness, and show that deep networks tend to prioritize learning simple patterns first.