Jesse Bettencourt

Researcher at University of Toronto

Publications -  11
Citations -  2769

Jesse Bettencourt is an academic researcher at the University of Toronto. The author has contributed to research in the topics Artificial neural network and Differential equation, has an h-index of 8, and has co-authored 10 publications receiving 2,663 citations.

Papers
Proceedings Article

Neural ordinary differential equations

TL;DR: In this paper, the authors introduce a new family of deep neural network models that parameterize the derivative of the hidden state using a neural network, with the output of the network computed by a black-box differential equation solver; they also introduce continuous normalizing flows, a class of generative models built on this idea.
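The core idea in this summary can be sketched in a few lines: a network defines the hidden-state derivative, and a black-box ODE solver computes the output. The sketch below is illustrative only, using SciPy's `solve_ivp` as the black-box solver and a tiny randomly weighted two-layer map standing in for a trained network.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical tiny "network": fixed random weights, for illustration only.
rng = np.random.default_rng(0)
W1 = 0.5 * rng.normal(size=(4, 2))
W2 = 0.5 * rng.normal(size=(2, 4))

def f(t, h):
    # The network parameterizes the hidden-state derivative dh/dt = f(h, t).
    return W2 @ np.tanh(W1 @ h)

h0 = np.array([1.0, 0.0])            # input = initial hidden state h(0)
sol = solve_ivp(f, (0.0, 1.0), h0)   # black-box solver integrates to t = 1
h1 = sol.y[:, -1]                    # output of the "network" = h(1)
```

In place of a fixed stack of discrete layers, the depth of this model is the integration interval, and the solver chooses its own evaluation points.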
Proceedings Article

FFJORD: Free-form Continuous Dynamics for Scalable Reversible Generative Models

TL;DR: This paper uses Hutchinson's trace estimator to give a scalable unbiased estimate of the log-density and demonstrates the approach on high-dimensional density estimation, image generation, and variational inference, achieving the state-of-the-art among exact likelihood methods with efficient sampling.
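Hutchinson's trace estimator, the key ingredient named above, replaces an exact trace with the expectation of a quadratic form over random probe vectors: tr(A) = E[vᵀAv] when E[vvᵀ] = I. A minimal NumPy sketch (a generic matrix stands in here for the Jacobian whose trace FFJORD needs):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(5, 5))  # stand-in for the dynamics Jacobian

def hutchinson_trace(A, n_samples=10_000, rng=rng):
    # Rademacher probe vectors: entries +/-1 with equal probability,
    # so E[v v^T] = I and each v^T A v is an unbiased estimate of tr(A).
    v = rng.choice([-1.0, 1.0], size=(n_samples, A.shape[0]))
    return np.mean(np.einsum("ni,ij,nj->n", v, A, v))

est = hutchinson_trace(A)
```

The payoff is that each sample needs only a vector-Jacobian product rather than the full Jacobian, which is what makes the log-density estimate scale to high dimensions.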
Posted Content

DiffEqFlux.jl - A Julia Library for Neural Differential Equations.

TL;DR: This work demonstrates the ability to incorporate DifferentialEquations.jl-defined differential equation problems into a Flux-defined neural network, and vice versa, and discusses the complementary nature of machine learning models and differential equations.
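The kind of coupling described, a differential-equation solve embedded inside a trainable model, can be illustrated outside Julia as well. The sketch below is a rough Python/SciPy analogue, not DiffEqFlux.jl's actual API: a scalar decay rate `theta` inside an ODE is fit to data with a simple finite-difference gradient descent loop.

```python
import numpy as np
from scipy.integrate import solve_ivp

def model(theta):
    # The "model" is an ODE solve: dh/dt = -theta * h, h(0) = 1; return h(1).
    sol = solve_ivp(lambda t, h: -theta * h, (0.0, 1.0), [1.0],
                    rtol=1e-8, atol=1e-10)
    return sol.y[0, -1]

target = np.exp(-2.0)            # data generated by the true theta = 2.0
theta, lr, eps = 0.5, 5.0, 1e-4  # initial guess, step size, FD step

for _ in range(200):
    loss = (model(theta) - target) ** 2
    # Forward-difference estimate of d(loss)/d(theta).
    grad = ((model(theta + eps) - target) ** 2 - loss) / eps
    theta -= lr * grad
```

Libraries like DiffEqFlux.jl make this coupling practical by differentiating through the solver directly (e.g. via adjoints) rather than by finite differences as in this toy loop.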