scispace - formally typeset

Kilian Q. Weinberger

Researcher at Cornell University

Publications: 241
Citations: 71,535

Kilian Q. Weinberger is an academic researcher at Cornell University whose work focuses on computer science and deep learning. He has an h-index of 76 and has co-authored 222 publications receiving 49,707 citations. His previous affiliations include the University of Washington and Washington University in St. Louis.

Papers
Proceedings Article

Bayesian Optimization with Inequality Constraints

TL;DR: This work presents constrained Bayesian optimization, which places a prior distribution on both the objective and the constraint functions, and evaluates this method on simulated and real data, demonstrating that constrained Bayesian optimization can quickly find optimal and feasible points, even when small feasible regions cause standard methods to fail.
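A common acquisition rule in this setting weights expected improvement by the posterior probability that the constraint is satisfied. The following is a minimal sketch under Gaussian posteriors for a minimization problem; the function names and signatures are illustrative, not the authors' implementation:

```python
from math import erf, exp, pi, sqrt

def pdf(z):
    """Standard normal density."""
    return exp(-0.5 * z * z) / sqrt(2 * pi)

def cdf(z):
    """Standard normal cumulative distribution."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def constrained_ei(mu_f, sigma_f, mu_c, sigma_c, best_feasible, limit=0.0):
    """Expected improvement (minimization) weighted by the probability
    that the constraint value stays below `limit`, given independent
    Gaussian posteriors on the objective (mu_f, sigma_f) and the
    constraint (mu_c, sigma_c)."""
    z = (best_feasible - mu_f) / sigma_f
    ei = (best_feasible - mu_f) * cdf(z) + sigma_f * pdf(z)
    p_feasible = cdf((limit - mu_c) / sigma_c)
    return ei * p_feasible
```

Points predicted to violate the constraint get their expected improvement driven toward zero, which is what steers the search into small feasible regions.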
Proceedings Article

Marginalized Denoising Autoencoders for Domain Adaptation

TL;DR: Because mSDA marginalizes out the noise, it does not require stochastic gradient descent or other iterative optimization to learn its parameters; they are computed in closed form, which speeds up SDAs by two orders of magnitude.
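The closed-form idea can be illustrated with a single-layer NumPy sketch. This simplified version omits the bias column and the nonlinearity used in the full mSDA, and the variable names are ours:

```python
import numpy as np

def mda_layer(X, p):
    """One marginalized denoising autoencoder layer, simplified.
    X: (d, n) data matrix; p: probability of zeroing out each feature.
    Returns the linear mapping W that minimizes the reconstruction loss
    E[||X - W * corrupt(X)||^2] in expectation over the corruption,
    computed in closed form rather than by gradient descent."""
    d = X.shape[0]
    S = X @ X.T                        # scatter matrix
    q = np.full(d, 1.0 - p)            # per-feature survival probability
    Q = S * np.outer(q, q)             # E[corrupt(X) corrupt(X)^T], off-diagonal
    np.fill_diagonal(Q, q * np.diag(S))  # diagonal survives with prob q_i
    P = S * q                          # E[X corrupt(X)^T]: P_ij = S_ij * q_j
    return P @ np.linalg.pinv(Q)       # W = E[P] E[Q]^{-1}
```

Stacking several such layers (with a nonlinearity between them) yields the full stacked architecture; each layer is still a single linear solve.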
Posted Content

Snapshot Ensembles: Train 1, get M for free

TL;DR: Snapshot Ensembles trains a single neural network that converges to several local minima along its optimization path, saving the model parameters at each one by leveraging recent work on cyclic learning rate schedules; the saved snapshots are then ensembled at test time.
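The cyclic schedule at the heart of the method is a shifted-cosine annealing with warm restarts: the learning rate restarts high at the start of each cycle and decays toward zero, and a snapshot is saved at each cycle's end. A minimal sketch (the function name and signature are illustrative):

```python
import math

def snapshot_lr(iteration, total_iters, n_cycles, lr_max):
    """Cyclic cosine-annealed learning rate: within each of n_cycles
    equal-length cycles, the rate falls from lr_max to ~0, then restarts.
    A model snapshot would be saved just before each restart."""
    iters_per_cycle = total_iters // n_cycles
    t = iteration % iters_per_cycle
    return lr_max / 2.0 * (math.cos(math.pi * t / iters_per_cycle) + 1.0)
```

Because each cycle ends at a low learning rate, the network has settled into a (different) local minimum each time, so the snapshots are diverse enough to ensemble, at no extra training cost.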
Proceedings Article

GPyTorch: Blackbox Matrix-Matrix Gaussian Process Inference with GPU Acceleration

TL;DR: In this article, the authors present an efficient and general approach to GP inference based on Blackbox Matrix-Matrix multiplication (BBMM), which uses a modified batched version of the conjugate gradient algorithm to derive all terms for training and inference in a single call.
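The core primitive behind this style of GP inference is solving linear systems against the kernel matrix using only matrix multiplications, which map well onto GPU hardware. A plain conjugate-gradients sketch in NumPy, not GPyTorch's actual batched API, looks like:

```python
import numpy as np

def conjugate_gradients(matmul, b, iters=100, tol=1e-10):
    """Solve K x = b for symmetric positive-definite K, accessing K only
    through the black-box product `matmul` (v -> K @ v). GPyTorch's BBMM
    batches many such products together; this shows one solve."""
    x = np.zeros_like(b)
    r = b - matmul(x)          # residual
    p = r.copy()               # search direction
    rs = r @ r
    for _ in range(iters):
        Kp = matmul(p)
        alpha = rs / (p @ Kp)  # step length along p
        x += alpha * p
        r -= alpha * Kp
        rs_new = r @ r
        if rs_new < tol:       # converged: squared residual is tiny
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x
```

Because the kernel matrix is only ever touched through products, the same routine works with structured or low-rank kernel approximations that never materialize K explicitly.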
Journal ArticleDOI

Convolutional Networks with Dense Connectivity

TL;DR: DenseNet connects each layer to every other layer in a feed-forward fashion, which alleviates the vanishing-gradient problem, strengthens feature propagation, encourages feature reuse, and substantially improves parameter efficiency.
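The connectivity pattern itself is simple to sketch: each layer consumes the concatenation of all preceding feature maps, and its output is appended for every later layer. A toy NumPy version with fully connected "layers" (real DenseNets use convolutions and batch normalization):

```python
import numpy as np

def dense_block(x, layers):
    """Dense connectivity: layer i receives the concatenation of the
    input and the outputs of layers 0..i-1; the block's output is the
    concatenation of everything."""
    features = [x]
    for layer in layers:
        out = layer(np.concatenate(features, axis=-1))
        features.append(out)
    return np.concatenate(features, axis=-1)
```

If each layer adds k new feature channels (the "growth rate"), a block with L layers on a w-channel input outputs w + L*k channels, which is why DenseNets get away with very narrow layers.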