
Deep Architectures for Baby AI

Yoshua Bengio
TLDR
Motivation: understanding intelligence, building AI, scaling to large-scale learning of complex functions, and measuring the impact of artificial intelligence on the real world.
Abstract
Motivation: understanding intelligence, building AI, and scaling to large-scale learning of complex functions.



Citations
Journal ArticleDOI

Computer Vision and Natural Language Processing: Recent Approaches in Multimedia and Robotics

TL;DR: This survey provides a comprehensive introduction to the integration of computer vision and natural language processing in multimedia and robotics applications, with more than 200 key references; it presents a unified view of the field and proposes possible future directions.
Journal ArticleDOI

A radial basis probabilistic process neural network model and corresponding classification algorithm

TL;DR: The proposed RBPPNN information-processing mechanism was extended to the time domain and, by learning from time-varying signal training samples, achieved extraction, expression, and association of time-varying signal characteristics, as well as direct classification.
Journal ArticleDOI

Experience-based Causality Learning for Intelligent Agents

TL;DR: This article proposes an experience-based causality learning framework that infers causality in text from experience, by accessing the event information corresponding to the input sentence pair.
References
Journal ArticleDOI

Gradient-based learning applied to document recognition

TL;DR: In this article, a graph transformer network (GTN) is proposed for handwritten character recognition; it can be used to synthesize a complex decision surface that classifies high-dimensional patterns such as handwritten characters.
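For illustration only, and not the GTN described in the paper, the following sketch shows the general idea of gradient-based learning of a decision surface over high-dimensional patterns, assuming a small LeNet-style convolutional classifier in PyTorch; the layer sizes, optimizer settings, and dummy data are all assumptions made for the example.

```python
import torch
from torch import nn

# Illustrative LeNet-style convolutional classifier for 28x28 character images.
# Architecture sizes, optimizer settings, and the dummy batch are assumptions,
# not the GTN of the cited paper.
model = nn.Sequential(
    nn.Conv2d(1, 6, kernel_size=5, padding=2), nn.Tanh(), nn.AvgPool2d(2),
    nn.Conv2d(6, 16, kernel_size=5), nn.Tanh(), nn.AvgPool2d(2),
    nn.Flatten(),
    nn.Linear(16 * 5 * 5, 120), nn.Tanh(),
    nn.Linear(120, 84), nn.Tanh(),
    nn.Linear(84, 10),                     # 10 character classes
)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

images = torch.randn(32, 1, 28, 28)        # dummy batch of high-dimensional patterns
labels = torch.randint(0, 10, (32,))       # dummy class labels

for step in range(100):
    optimizer.zero_grad()
    logits = model(images)
    loss = loss_fn(logits, labels)
    loss.backward()                        # gradient of the classification error
    optimizer.step()                       # gradient-based weight update
```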
Journal ArticleDOI

Learning representations by back-propagating errors

TL;DR: Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector; as a result, the network comes to represent important features of the task domain.
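As a rough illustration of that procedure, here is a minimal sketch of back-propagation for a one-hidden-layer network in NumPy; the network shapes, learning rate, and random data are assumptions made purely for the example.

```python
import numpy as np

# Minimal back-propagation sketch for a one-hidden-layer network.
# All shapes, data, and the learning rate are illustrative assumptions.
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 4))                # 8 input vectors of dimension 4
T = rng.normal(size=(8, 2))                # desired output vectors

W1 = rng.normal(scale=0.1, size=(4, 16))   # input -> hidden weights
W2 = rng.normal(scale=0.1, size=(16, 2))   # hidden -> output weights
lr = 0.1

for step in range(1000):
    # Forward pass: hidden activations and actual output vectors.
    H = np.tanh(X @ W1)
    Y = H @ W2
    # Measure of the difference between actual and desired outputs.
    loss = 0.5 * np.mean(np.sum((Y - T) ** 2, axis=1))

    # Backward pass: propagate the error gradient to each weight matrix.
    dY = (Y - T) / X.shape[0]
    dW2 = H.T @ dY
    dA = dY @ W2.T * (1.0 - H ** 2)        # chain rule through tanh
    dW1 = X.T @ dA

    # Repeatedly adjust the weights to reduce the error measure.
    W1 -= lr * dW1
    W2 -= lr * dW2
```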
Book

Pattern Recognition and Machine Learning

TL;DR: Probability Distributions, Linear Models for Regression, Linear Models for Classification, Neural Networks, Graphical Models, Mixture Models and EM, Sampling Methods, Continuous Latent Variables, and Sequential Data are studied.
Journal ArticleDOI

Pattern Recognition and Machine Learning

Radford M. Neal
01 Aug 2007
TL;DR: This book covers a broad range of topics for regular factorial designs, presents all of the material in a very mathematical fashion, and will surely become an invaluable resource for researchers and graduate students doing research in the design of factorial experiments.