Yoshua Bengio

Researcher at Université de Montréal

Publications: 1146
Citations: 534376

Yoshua Bengio is an academic researcher at Université de Montréal. He has contributed to research on topics including artificial neural networks and deep learning. He has an h-index of 202 and has co-authored 1033 publications receiving 420313 citations. His previous affiliations include McGill University and the Centre de Recherches Mathématiques.

Papers
Journal Article (DOI)

Stochastic learning of strategic equilibria for auctions

TL;DR: This article presents a new application of stochastic adaptive learning algorithms to the computation of strategic equilibria in auctions and addresses the problems of tracking a moving target and balancing exploration versus exploitation.
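To make the exploration/exploitation and moving-target issues mentioned above concrete, here is a toy sketch that is not the paper's algorithm: an epsilon-greedy bidder learning in a repeated first-price auction against an opponent whose bids drift over time. All parameters and the auction setup are illustrative assumptions.

```python
import random

random.seed(0)

bids = [round(0.1 * k, 1) for k in range(11)]   # candidate bids in [0, 1]
value = 1.0                                      # learner's private value
q = {b: 0.0 for b in bids}                       # running payoff estimates per bid
epsilon = 0.1                                    # exploration rate
step = 0.05                                      # constant step size so old data decays (moving target)

for t in range(5000):
    # opponent's bid slowly drifts upward, so the best response is a moving target
    opponent = min(1.0, max(0.0, 0.4 + 0.3 * (t / 5000) + random.gauss(0, 0.05)))
    # explore with probability epsilon, otherwise exploit the current estimate
    if random.random() < epsilon:
        b = random.choice(bids)
    else:
        b = max(bids, key=lambda x: q[x])
    payoff = (value - b) if b > opponent else 0.0  # first-price auction payoff
    q[b] += step * (payoff - q[b])                 # stochastic approximation update

print("bid with highest estimated payoff:", max(bids, key=lambda x: q[x]))
```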
Posted Content

Machine Learning for Glacier Monitoring in the Hindu Kush Himalaya

TL;DR: This work uses readily available remote sensing data to build a model that identifies and outlines both clean-ice and debris-covered glaciers from satellite imagery. The authors release their data and develop a web tool that lets experts visualize and correct model predictions, with the ultimate aim of accelerating the glacier mapping process.
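As a minimal sketch of the kind of per-pixel classification such a model performs (not the authors' architecture), the snippet below runs a tiny fully convolutional network over a multispectral satellite patch. The band count, class set, and layer sizes are assumptions chosen only for illustration.

```python
import torch
import torch.nn as nn

class TinyGlacierSegmenter(nn.Module):
    def __init__(self, in_bands: int = 7, n_classes: int = 3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_bands, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, n_classes, kernel_size=1),  # per-pixel class logits
        )

    def forward(self, x):
        return self.net(x)

# One batch of fake Landsat-like patches: (batch, bands, height, width)
patches = torch.randn(2, 7, 64, 64)
logits = TinyGlacierSegmenter()(patches)
print(logits.shape)                 # torch.Size([2, 3, 64, 64])
print(logits.argmax(dim=1).shape)   # per-pixel predicted class map, (2, 64, 64)
```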
Journal Article (DOI)

Sources of Richness and Ineffability for Phenomenally Conscious States

TL;DR: This paper provides an information-theoretic, dynamical-systems perspective on the richness and ineffability of consciousness, arguing that the richness of experience corresponds to the amount of information in a conscious state, while ineffability corresponds to the information lost at different stages of processing.
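A toy numerical illustration of these two quantities (my own, not taken from the paper): richness as the entropy of a state, and ineffability as the information lost when that state passes through a lossy, many-to-one processing stage.

```python
import math
from collections import defaultdict

# Uniform distribution over 8 possible "states": richness = H(S) = 3 bits.
states = list(range(8))
p = {s: 1.0 / len(states) for s in states}
H_S = -sum(q * math.log2(q) for q in p.values())

# A lossy report stage keeps only the low bit of the state (8 states -> 2 reports).
report = {s: s % 2 for s in states}
p_r = defaultdict(float)
for s, q in p.items():
    p_r[report[s]] += q
H_R = -sum(q * math.log2(q) for q in p_r.values())

# For a deterministic map, information lost equals H(S) - H(R).
print(f"richness  H(S) = {H_S:.1f} bits")
print(f"reported  H(R) = {H_R:.1f} bits")
print(f"lost      H(S) - H(R) = {H_S - H_R:.1f} bits")
```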
Posted Content

The Variational Bandwidth Bottleneck: Stochastic Evaluation on an Information Budget

TL;DR: The authors propose the variational bandwidth bottleneck, which, for each example, estimates the value of the privileged information before seeing it and then stochastically decides whether or not to access the privileged input.
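A schematic sketch of that gating idea follows; it is an assumption-laden illustration, not the paper's variational objective. A gate network looks only at the ordinary input X, estimates how valuable the privileged input Z would be, and a Bernoulli draw with that probability decides whether Z is actually read; otherwise a zero placeholder is used. All dimensions and module names are hypothetical.

```python
import torch
import torch.nn as nn

class BandwidthGatedPolicy(nn.Module):
    def __init__(self, x_dim=8, z_dim=4, hidden=32, n_actions=3):
        super().__init__()
        # estimates the value of the privileged input from X alone
        self.value_of_z = nn.Sequential(nn.Linear(x_dim, hidden), nn.ReLU(),
                                        nn.Linear(hidden, 1))
        self.policy = nn.Sequential(nn.Linear(x_dim + z_dim, hidden), nn.ReLU(),
                                    nn.Linear(hidden, n_actions))

    def forward(self, x, z):
        p_access = torch.sigmoid(self.value_of_z(x))  # decided before seeing z
        access = torch.bernoulli(p_access)            # stochastic access decision
        z_used = access * z                           # read z only if the gate opens
        return self.policy(torch.cat([x, z_used], dim=-1)), p_access

x, z = torch.randn(5, 8), torch.randn(5, 4)
logits, p_access = BandwidthGatedPolicy()(x, z)
print(logits.shape, p_access.squeeze(-1))
```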
Journal Article (DOI)

Regeneration Learning: A Learning Paradigm for Data Generation

TL;DR: This paper proposes regeneration learning for data generation: the model first generates Y' (an abstraction/representation of Y) from X and then generates Y from Y'; during training, Y' is obtained from Y through either handcrafted rules or self-supervised learning.
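A structural sketch of that two-stage idea, for illustration only: during training Y' is derived from Y by a handcrafted rule, two mappings X -> Y' and Y' -> Y are fit, and at generation time they are chained. The "abstraction by content words" rule and the lookup-table "models" below are stand-ins I chose, not the paper's.

```python
def abstraction(y: str) -> tuple:
    """Handcrafted rule producing Y' from Y (used at training time only)."""
    stopwords = {"the", "a", "of", "is"}
    return tuple(w for w in y.lower().split() if w not in stopwords)

training_pairs = [
    ("weather query", "the forecast is sunny"),
    ("greeting", "hello there"),
]

# "Training": memorise X -> Y' and Y' -> Y (real models would generalise).
x_to_yprime = {x: abstraction(y) for x, y in training_pairs}
yprime_to_y = {abstraction(y): y for _, y in training_pairs}

def generate(x: str) -> str:
    y_prime = x_to_yprime[x]      # first stage: X -> Y'
    return yprime_to_y[y_prime]   # second stage: Y' -> Y

print(generate("weather query"))  # -> "the forecast is sunny"
```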