scispace - formally typeset

David Duvenaud

Researcher at University of Toronto

Publications: 124
Citations: 19,439

David Duvenaud is an academic researcher at the University of Toronto. His research focuses on topics including artificial neural networks and estimators. He has an h-index of 53 and has co-authored 119 publications receiving 15,179 citations. His previous affiliations include the University of Cambridge and the University of British Columbia.

Papers
Journal ArticleDOI

Automatic Chemical Design Using a Data-Driven Continuous Representation of Molecules

TL;DR: In this article, a deep neural network was trained on hundreds of thousands of existing chemical structures to learn three coupled functions: an encoder, a decoder, and a predictor. Together these allow new molecules to be generated by efficient exploration and optimization through open-ended spaces of chemical compounds.
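The encoder/decoder/predictor idea in this summary can be sketched in a few lines. The toy below is not the paper's actual architecture: the weights are random stand-ins for trained networks, and a "molecule" is just a fixed-length vector rather than a real SMILES string. It only illustrates the workflow the TL;DR describes: encode a discrete molecule into a continuous latent point, optimize that point against a property predictor, then decode a new candidate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: a "molecule" is a fixed-length vector here (hypothetical).
MOL_DIM, LATENT_DIM = 32, 4

# Random weights standing in for trained encoder/decoder/predictor networks.
W_enc = rng.normal(size=(MOL_DIM, LATENT_DIM)) * 0.1
W_dec = rng.normal(size=(LATENT_DIM, MOL_DIM)) * 0.1
w_pred = rng.normal(size=LATENT_DIM)

def encode(x):
    """Encoder: discrete molecule vector -> continuous latent point."""
    return np.tanh(x @ W_enc)

def decode(z):
    """Decoder: latent point -> normalized distribution over molecule entries."""
    logits = z @ W_dec
    e = np.exp(logits - logits.max())
    return e / e.sum()

def predict_property(z):
    """Predictor: latent point -> scalar property estimate."""
    return float(z @ w_pred)

# Continuous optimization in latent space: nudge z uphill on the predicted
# property (the gradient of this linear predictor is just w_pred), then
# decode a new candidate molecule.
x = np.zeros(MOL_DIM)
x[3] = 1.0
z = encode(x)
for _ in range(50):
    z = z + 0.1 * w_pred
candidate = decode(z)
```

The key design point the paper exploits is that gradient-based optimization, which is impossible over discrete molecular graphs, becomes straightforward once molecules live in a continuous latent space.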
Proceedings Article

Convolutional networks on graphs for learning molecular fingerprints

TL;DR: In this paper, a convolutional neural network that operates directly on graphs is proposed, enabling end-to-end learning of prediction pipelines whose inputs are graphs of arbitrary size and shape.
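The graph-convolution idea behind these neural fingerprints can be sketched as message passing over a molecular graph. Everything below is a hypothetical toy (random weights, a 4-atom ring, made-up feature sizes), not the paper's implementation; it shows the core property that node-wise updates plus a sum over atoms yield a fixed-length fingerprint that is invariant to atom ordering.

```python
import numpy as np

rng = np.random.default_rng(0)
FEAT, FP_DIM, LAYERS = 8, 16, 2

# Toy molecule: 4 atoms in a ring, given as an adjacency matrix,
# with random per-atom feature vectors.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
atom_feats = rng.normal(size=(4, FEAT))

# Random weights standing in for a trained network.
W_self = [rng.normal(size=(FEAT, FEAT)) * 0.1 for _ in range(LAYERS)]
W_out = [rng.normal(size=(FEAT, FP_DIM)) * 0.1 for _ in range(LAYERS + 1)]

def softmax(v):
    e = np.exp(v - v.max())
    return e / e.sum()

def neural_fingerprint(A, H):
    """Sum softmax-smoothed atom contributions across message-passing layers."""
    fp = sum(softmax(h @ W_out[0]) for h in H)
    for l in range(LAYERS):
        # Message passing: combine each atom with the sum of its neighbours.
        H = np.tanh((H + A @ H) @ W_self[l])
        fp = fp + sum(softmax(h @ W_out[l + 1]) for h in H)
    return fp

fp = neural_fingerprint(A, atom_feats)
```

Because every step is either permutation-equivariant (neighbour sums, per-atom transforms) or permutation-invariant (the final sum), relabeling the atoms leaves the fingerprint unchanged, mirroring a key property of the circular fingerprints this architecture generalizes.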
Proceedings Article

Neural ordinary differential equations

TL;DR: In this paper, the authors introduce a new family of deep neural network models called continuous normalizing flows, which parameterize the derivative of the hidden state using a neural network, and the output of the network is computed using a black-box differential equation solver.
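The mechanism in this summary, a neural network that parameterizes the derivative of the hidden state, with outputs computed by an ODE solver, can be sketched in a few lines. The code below is a toy under stated assumptions: random weights stand in for a trained network, and a fixed-step RK4 integrator stands in for the black-box (typically adaptive) solver the paper uses.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 3
W1 = rng.normal(size=(DIM, DIM)) * 0.3
W2 = rng.normal(size=(DIM, DIM)) * 0.3

def f(h, t):
    """dh/dt parameterized by a tiny two-layer network."""
    return np.tanh(h @ W1) @ W2

def odeint_rk4(f, h0, t0=0.0, t1=1.0, steps=20):
    """Fixed-step RK4 integrator standing in for a black-box solver."""
    h, t = h0.copy(), t0
    dt = (t1 - t0) / steps
    for _ in range(steps):
        k1 = f(h, t)
        k2 = f(h + dt / 2 * k1, t + dt / 2)
        k3 = f(h + dt / 2 * k2, t + dt / 2)
        k4 = f(h + dt * k3, t + dt)
        h = h + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += dt
    return h

# The "output of the network" is the ODE solution at the final time.
h0 = np.array([1.0, 0.0, -1.0])
h_coarse = odeint_rk4(f, h0, steps=10)
h_fine = odeint_rk4(f, h0, steps=100)
```

One point this makes concrete: unlike a fixed stack of residual layers, the solver's step count is an accuracy knob rather than part of the model, so coarse and fine integrations approximate the same underlying continuous transformation.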