SciSpace - formerly Typeset

Yoshua Bengio

Researcher at Université de Montréal

Publications - 1146
Citations - 534376

Yoshua Bengio is an academic researcher at Université de Montréal. He has contributed to research on topics including Artificial neural network and Deep learning. He has an h-index of 202 and has co-authored 1033 publications receiving 420313 citations. His previous affiliations include McGill University and the Centre de Recherches Mathématiques.

Papers

Generalization in Deep Learning

TL;DR: The authors provide theoretical insights into why and how deep learning can generalize well, despite its large capacity, complexity, possible algorithmic instability, non-robustness, and sharp minima.
Book Chapter

Scaling Large Learning Problems with Hard Parallel Mixtures

TL;DR: This work proposes a "hard parallelizable mixture" methodology which yields significantly reduced training time through modularization and parallelization: the training data is iteratively partitioned by a "gater" model in such a way that it becomes easy to learn an "expert" model separately in each region of the partition.
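The gater/expert scheme described above can be illustrated with a toy sketch: a k-means-style gater hard-partitions the data into regions, and an independent linear "expert" is fit on each region, so the experts could in principle be trained in parallel. This is an illustrative assumption-laden sketch, not the paper's exact algorithm; all function and variable names here are invented for illustration.

```python
# Illustrative sketch (NOT the paper's exact method): a k-means-style
# "gater" hard-partitions the training data, and a separate linear
# "expert" is fit on each region. In the paper's setting, each expert's
# training could run on a separate worker in parallel.
import numpy as np

def train_hard_mixture(X, y, n_experts=4, n_iters=5, seed=0):
    rng = np.random.default_rng(seed)
    # Gater state: one centroid per region; experts: per-region regressors.
    centroids = X[rng.choice(len(X), n_experts, replace=False)].copy()
    experts = [None] * n_experts
    for _ in range(n_iters):
        # Gater hard-assigns each point to its nearest centroid.
        assign = np.argmin(
            ((X[:, None, :] - centroids[None]) ** 2).sum(-1), axis=1)
        for k in range(n_experts):
            mask = assign == k
            if not mask.any():
                continue  # empty region: leave this expert untrained
            centroids[k] = X[mask].mean(0)
            # Fit a linear expert (with bias) on this region only; this
            # step is independent per region, hence parallelizable.
            Xk = np.hstack([X[mask], np.ones((mask.sum(), 1))])
            experts[k], *_ = np.linalg.lstsq(Xk, y[mask], rcond=None)
    return centroids, experts

def predict(x, centroids, experts):
    # Route through the gater, then query only that region's expert.
    k = int(np.argmin(((centroids - x) ** 2).sum(-1)))
    return np.append(x, 1.0) @ experts[k]
```

The point of the hard (rather than soft) assignment is that each expert sees only its own partition, so training is modular and the partitions can be learned iteratively alongside the experts.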
Posted Content

The Bottleneck Simulator: A Model-based Deep Reinforcement Learning Approach

TL;DR: The Bottleneck Simulator is proposed: a model-based reinforcement learning method which combines a learned, factorized transition model of the environment with rollout simulations to learn an effective policy from few examples.
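The general model-based idea in this TL;DR — estimate a transition model from limited real experience, then derive a policy from the learned model rather than from more real interaction — can be sketched in a tabular toy setting. This sketch does not reproduce the paper's factorized bottleneck architecture; the environment, names, and the use of value iteration on the learned model (in place of sampled rollouts, for brevity) are all assumptions.

```python
# Hedged toy sketch of model-based RL from few examples: count-based
# estimation of P(s'|s,a) and expected reward, then planning on the
# learned model. This is NOT the paper's Bottleneck Simulator.
import numpy as np

def learn_model(transitions, n_states, n_actions):
    # transitions: list of (s, a, r, s_next) tuples from real experience.
    counts = np.zeros((n_states, n_actions, n_states))
    rewards = np.zeros((n_states, n_actions))
    for s, a, r, s2 in transitions:
        counts[s, a, s2] += 1
        rewards[s, a] += r
    n = counts.sum(-1, keepdims=True)
    # Unvisited (s, a) pairs fall back to a uniform next-state guess.
    P = np.where(n > 0, counts / np.maximum(n, 1), 1.0 / n_states)
    R = rewards / np.maximum(n[..., 0], 1)
    return P, R

def plan_on_model(P, R, gamma=0.9, n_iters=200):
    # Derive a policy entirely inside the learned simulator
    # (value iteration here; the paper instead uses rollout simulations).
    n_states, n_actions, _ = P.shape
    Q = np.zeros((n_states, n_actions))
    for _ in range(n_iters):
        Q = R + gamma * P @ Q.max(axis=1)
    return Q
```

The appeal of this family of methods, as the TL;DR notes, is sample efficiency: once the transition model is learned, arbitrarily many simulated experiences are free.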
Posted Content

Equivalence of Equilibrium Propagation and Recurrent Backpropagation

TL;DR: This work shows that a side network is not required to compute error derivatives, and it supports the hypothesis that in biological neural networks, temporal derivatives of neural activities may encode error signals.
Posted Content

Gated Orthogonal Recurrent Units: On Learning to Forget

TL;DR: A novel recurrent neural network (RNN)-based model is presented that combines the remembering ability of unitary-evolution RNNs with the ability of gated RNNs to effectively forget redundant or irrelevant information in its memory.