
Yoshua Bengio

Researcher at Université de Montréal

Publications: 1146
Citations: 534376

Yoshua Bengio is an academic researcher at Université de Montréal. His research focuses on topics including artificial neural networks and deep learning. He has an h-index of 202 and has co-authored 1033 publications receiving 420313 citations. His previous affiliations include McGill University and the Centre de Recherches Mathématiques.

Papers
Journal ArticleDOI

The Bottleneck Simulator: A Model-Based Deep Reinforcement Learning Approach

TL;DR: The Bottleneck Simulator is a model-based reinforcement learning method that combines a learned, factorized transition model of the environment with rollout simulations to learn an effective policy from few examples.
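
As a rough illustration of the idea in this summary (not the paper's actual algorithm or environment), here is a minimal Python sketch of model-based rollout planning: a stand-in "learned" transition model over a small discrete state space is used to simulate rollouts and pick the action with the best average simulated return. The state/action counts, reward table, and greedy continuation policy are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N_STATES, N_ACTIONS, HORIZON, N_ROLLOUTS = 5, 3, 10, 50

# Stand-in for a learned transition model P(s' | s, a) and reward table R(s, a).
P = rng.dirichlet(np.ones(N_STATES), size=(N_STATES, N_ACTIONS))
R = rng.normal(size=(N_STATES, N_ACTIONS))

def rollout_value(state, first_action):
    """Estimate the return of taking `first_action`, then acting greedily w.r.t. R."""
    total, s, a = 0.0, state, first_action
    for _ in range(HORIZON):
        total += R[s, a]
        s = rng.choice(N_STATES, p=P[s, a])   # simulate one step with the model
        a = int(np.argmax(R[s]))              # simple greedy continuation policy
    return total

def plan(state):
    """Pick the action whose simulated rollouts look best on average."""
    scores = [np.mean([rollout_value(state, a) for _ in range(N_ROLLOUTS)])
              for a in range(N_ACTIONS)]
    return int(np.argmax(scores))

print("chosen action in state 0:", plan(0))
```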
Posted Content

The Benefits of Over-parameterization at Initialization in Deep ReLU Networks

TL;DR: This paper proves desirable theoretical properties of over-parameterized ReLU networks at initialization under He initialization, including a hidden-activation-norm property, and shows that this property holds for a finite-width network even when the number of data samples is infinite.
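
A minimal numerical sketch (not the paper's proof) of the hidden-activation-norm property under He initialization: with weights drawn from N(0, 2/fan_in), the squared norm of the ReLU activations stays roughly constant across layers. The width, depth, and input scale below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
width, depth = 4096, 10
x = rng.normal(size=width)

h = x
for layer in range(depth):
    W = rng.normal(scale=np.sqrt(2.0 / width), size=(width, width))  # He initialization
    h = np.maximum(0.0, W @ h)                                       # ReLU activation
    print(f"layer {layer + 1}: ||h||^2 / ||x||^2 = {np.dot(h, h) / np.dot(x, x):.3f}")
```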
Proceedings ArticleDOI

Weakly Supervised Representation Learning with Sparse Perturbations

TL;DR: This work shows that, given weak supervision from observations generated by sparse perturbations of the latent variables (e.g., images in a reinforcement learning environment where actions move individual sprites), identification is achievable under unknown continuous latent distributions.
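
As a rough sketch of the kind of weak supervision described here, the snippet below generates pairs of observations whose underlying latents differ in only one coordinate (a sparse perturbation) before being passed through an unknown nonlinear mixing. The latent dimension, mixing function, and perturbation scale are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)
latent_dim, obs_dim, n_pairs = 5, 20, 1000

# Unknown nonlinear "mixing" from latents to observations (a random tanh layer).
A = rng.normal(size=(obs_dim, latent_dim))
mix = lambda z: np.tanh(z @ A.T)

pairs = []
for _ in range(n_pairs):
    z = rng.normal(size=latent_dim)
    z_tilde = z.copy()
    idx = rng.integers(latent_dim)           # sparse perturbation: one coordinate changes
    z_tilde[idx] += rng.normal()
    pairs.append((mix(z), mix(z_tilde)))     # the learner only ever sees these pairs

print(len(pairs), "observation pairs generated")
```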
Posted Content

An objective function for STDP.

TL;DR: Spike-based simulations agree with the proposed relationship between spike timing and the temporal change of postsynaptic activity, and show a strong correlation between biologically observed STDP behavior and the behavior obtained when the weight change follows the gradient of the predictive objective function.
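
A rate-based sketch of the proposed relationship: the weight update is proportional to presynaptic activity times the temporal change of postsynaptic activity (roughly Δw ∝ pre · d(post)/dt). The toy activity traces and learning rate are illustrative assumptions, not the paper's spike-based simulations.

```python
import numpy as np

rng = np.random.default_rng(0)
T, lr = 100, 0.01
pre = rng.random(T)                           # presynaptic firing-rate trace
post = np.cumsum(rng.normal(size=T)) * 0.1    # postsynaptic rate trace (random walk)

w = 0.0
for t in range(1, T):
    d_post = post[t] - post[t - 1]            # temporal change of postsynaptic activity
    w += lr * pre[t] * d_post                 # potentiate when post rises, depress when it falls
print("final weight:", w)
```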
Journal ArticleDOI

Locally linear embedding for dimensionality reduction in QSAR.

TL;DR: Locally Linear Embedding (LLE), a local non-linear dimensionality reduction technique that can statistically discover a low-dimensional representation of chemical data, is introduced.
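
As a minimal illustration of applying LLE to high-dimensional molecular descriptors, the sketch below uses scikit-learn's LocallyLinearEmbedding on a random matrix standing in for a real QSAR descriptor table; the neighbour count and target dimension are illustrative choices, not the paper's settings.

```python
import numpy as np
from sklearn.manifold import LocallyLinearEmbedding

rng = np.random.default_rng(0)
descriptors = rng.normal(size=(200, 50))      # 200 compounds x 50 descriptors (placeholder data)

lle = LocallyLinearEmbedding(n_neighbors=10, n_components=2)
embedding = lle.fit_transform(descriptors)    # low-dimensional representation of the chemical data
print(embedding.shape)                        # (200, 2)
```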