
Showing papers on "Convolutional neural network published in 1997"


Journal ArticleDOI
TL;DR: A hybrid neural network for human face recognition that compares favourably with other methods; the authors analyze its computational complexity and discuss how new classes could be added to the trained recognizer.
Abstract: We present a hybrid neural network for human face recognition which compares favourably with other methods. The system combines local image sampling, a self-organizing map (SOM) neural network, and a convolutional neural network. The SOM provides a quantization of the image samples into a topological space where inputs that are nearby in the original space are also nearby in the output space, thereby providing dimensionality reduction and invariance to minor changes in the image sample, and the convolutional neural network provides partial invariance to translation, rotation, scale, and deformation. The convolutional network extracts successively larger features in a hierarchical set of layers. We present results using the Karhunen-Loève transform in place of the SOM, and a multilayer perceptron (MLP) in place of the convolutional network for comparison. We use a database of 400 images of 40 individuals which contains quite a high degree of variability in expression, pose, and facial details. We analyze the computational complexity and discuss how new classes could be added to the trained recognizer.

2,954 citations
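The SOM quantization step the abstract describes, mapping image samples onto a grid so that nearby inputs land on nearby nodes, can be sketched as a toy implementation. The grid size, learning-rate schedule, and function names below are illustrative assumptions, not the paper's actual code:

```python
import numpy as np

def train_som(samples, grid_shape=(3, 3), epochs=20, lr0=0.5, sigma0=1.0, seed=0):
    """Minimal 2-D self-organizing map: quantizes input vectors onto a
    topological grid so nearby inputs map to nearby grid nodes."""
    rng = np.random.default_rng(seed)
    h, w = grid_shape
    weights = rng.normal(size=(h * w, samples.shape[1]))
    coords = np.array([(i, j) for i in range(h) for j in range(w)], dtype=float)
    n_steps, t = epochs * len(samples), 0
    for _ in range(epochs):
        for x in samples:
            lr = lr0 * (1 - t / n_steps)                 # decaying learning rate
            sigma = sigma0 * (1 - t / n_steps) + 1e-3    # shrinking neighbourhood
            bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
            dist2 = np.sum((coords - coords[bmu]) ** 2, axis=1)
            neigh = np.exp(-dist2 / (2 * sigma ** 2))
            weights += lr * neigh[:, None] * (x - weights)
            t += 1
    return weights, coords

def quantize(samples, weights):
    """Map each sample to the index of its best-matching unit."""
    return np.array([np.argmin(np.linalg.norm(weights - x, axis=1)) for x in samples])
```

In the paper's pipeline the quantized SOM outputs, rather than the raw pixels, feed the convolutional network, which is what gives the partial invariance to small image changes.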


Proceedings ArticleDOI
17 Jun 1997
TL;DR: A new machine learning paradigm called Graph Transformer Networks is proposed that extends the applicability of gradient-based learning algorithms to systems composed of modules that take graphs as inputs and produce graphs as output.
Abstract: We propose a new machine learning paradigm called Graph Transformer Networks that extends the applicability of gradient-based learning algorithms to systems composed of modules that take graphs as inputs and produce graphs as outputs. Training is performed by computing gradients of a global objective function with respect to all the parameters in the system using a kind of back-propagation procedure. A complete check reading system based on these concepts is described. The system uses convolutional neural network character recognizers, combined with global training techniques to provide record accuracy on business and personal checks. It is presently deployed commercially and reads millions of checks per month.

125 citations
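The core training idea, backpropagating one global objective through a chain of modules and updating every module's parameters, can be illustrated with a deliberately tiny example. The `Linear` class, learning rate, and scalar setup are invented for illustration; the paper's modules operate on graphs, not scalars:

```python
import numpy as np

class Linear:
    """A differentiable module: GTN-style systems chain such modules and
    backpropagate a global objective through all of them."""
    def __init__(self, w):
        self.w = float(w)
    def forward(self, x):
        self.x = x
        return self.w * x
    def backward(self, grad_out):
        self.grad_w = grad_out * self.x   # gradient w.r.t. this module's parameter
        return grad_out * self.w          # gradient passed to the upstream module

# Chain two modules; train both on one global squared-error objective.
m1, m2 = Linear(0.5), Linear(0.5)
x, target = 2.0, 3.0
for _ in range(200):
    y = m2.forward(m1.forward(x))
    grad = 2 * (y - target)               # d(loss)/dy for loss = (y - target)^2
    m1.backward(m2.backward(grad))        # chain rule through both modules
    m2.w -= 0.05 * m2.grad_w
    m1.w -= 0.05 * m1.grad_w
```

The point is that no module is trained in isolation: each receives its gradient from the module downstream of it, so the whole pipeline optimizes the single document-level objective.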


Proceedings ArticleDOI
21 Apr 1997
TL;DR: A new machine learning paradigm called multilayer graph transformer network is proposed that extends the applicability of gradient-based learning algorithms to systems composed of modules that take graphs as input and produce graphs as output.
Abstract: We propose a new machine learning paradigm called multilayer graph transformer network that extends the applicability of gradient-based learning algorithms to systems composed of modules that take graphs as input and produce graphs as output. A complete check reading system based on this concept is described. The system combines convolutional neural network character recognizers with graph-based stochastic models trained cooperatively at the document level. It is deployed commercially and reads millions of business and personal checks per month with record accuracy.

49 citations


Patent
Yoshua Bengio, Léon Bottou, Yann LeCun
11 Mar 1997
TL;DR: In this paper, a check reading system based on graph transformer networks is described, which uses convolutional neural network character recognizers, combined with global training techniques to provide record accuracy on business and personal checks.
Abstract: A machine learning paradigm called Graph Transformer Networks extends the applicability of gradient-based learning algorithms to systems composed of modules that take graphs as inputs and produce graphs as output. Training is performed by computing gradients of a global objective function with respect to all the parameters in the system using a kind of back-propagation procedure. A complete check reading system based on these concepts is described. The system uses convolutional neural network character recognizers, combined with global training techniques to provide record accuracy on business and personal checks.

42 citations



Proceedings Article
01 Dec 1997
TL;DR: An inexpensive, video-based, motorized tracking system that learns to track a head, using real-time graphical user inputs or an auxiliary infrared detector as supervisory signals to train a convolutional neural network.
Abstract: We have constructed an inexpensive, video-based, motorized tracking system that learns to track a head. It uses real-time graphical user inputs or an auxiliary infrared detector as supervisory signals to train a convolutional neural network. The inputs to the neural network consist of normalized luminance and chrominance images and motion information from frame differences. Subsampled images are also used to provide scale invariance. During the online training phase, the neural network rapidly adjusts the input weights depending upon the reliability of the different channels in the surrounding environment. This quick adaptation allows the system to robustly track a head even when other objects are moving within a cluttered background.

5 citations
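The input channels the abstract lists, normalized luminance, chrominance, and frame-difference motion, can be sketched roughly as below. The specific normalizations, luminance weights, and channel set here are assumptions for illustration, not the paper's exact choices:

```python
import numpy as np

def tracking_features(frame_rgb, prev_rgb):
    """Build illustrative input channels: normalized luminance, intensity-
    normalized chrominance, and a frame-difference motion map."""
    frame = frame_rgb.astype(float)
    prev = prev_rgb.astype(float)
    # Luminance as an RGB-weighted sum (ITU-R BT.601 weights).
    lum = 0.299 * frame[..., 0] + 0.587 * frame[..., 1] + 0.114 * frame[..., 2]
    prev_lum = 0.299 * prev[..., 0] + 0.587 * prev[..., 1] + 0.114 * prev[..., 2]
    # Chrominance as intensity-normalized r and g (roughly brightness-invariant).
    total = frame.sum(axis=-1) + 1e-8
    chroma_r = frame[..., 0] / total
    chroma_g = frame[..., 1] / total
    # Motion channel from absolute luminance frame differences.
    motion = np.abs(lum - prev_lum)
    # Scale intensity-based channels to [0, 1] for the network input.
    return np.stack([lum / 255.0, chroma_r, chroma_g, motion / 255.0], axis=0)
```

Feeding separate channels like these is what lets the online training phase reweight them individually, e.g. down-weighting chrominance when lighting makes colour unreliable.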


Proceedings ArticleDOI
07 Jul 1997
TL;DR: This paper analyzes the use of artificial neural networks (ANNs) for induction machine identification, with a view to their subsequent use in induction machine control.
Abstract: This paper analyzes the use of artificial neural networks (ANNs) for induction machine identification, with a view to their subsequent use in induction machine control. A multilayer perceptron neural network with one hidden layer is trained with the backpropagation algorithm to identify the induction motor (IM) and obtain an IM neural model. The training process is analyzed under different scenarios (different numbers of hidden-layer neurons, learning rates, and sampling rates) in order to obtain network parameters suitable for practical implementation. Finally, results of the trained networks for different load torques are shown.

4 citations
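The training setup described, a one-hidden-layer MLP fitted by backpropagation to a plant's input-output behaviour, can be sketched on a toy stand-in for the motor. The target function, layer size, and learning rate below are illustrative assumptions; the real system would be trained on sampled motor signals:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy "plant" to identify: a static nonlinear map standing in for the motor.
X = rng.uniform(-1, 1, size=(200, 1))
Y = np.sin(2 * X)

# One-hidden-layer MLP (10 tanh units), trained with plain backpropagation.
W1 = rng.normal(scale=0.5, size=(1, 10)); b1 = np.zeros(10)
W2 = rng.normal(scale=0.5, size=(10, 1)); b2 = np.zeros(1)
lr = 0.1
for _ in range(3000):
    H = np.tanh(X @ W1 + b1)              # hidden layer
    Yhat = H @ W2 + b2                    # linear output layer
    err = Yhat - Y
    # Backpropagate the mean-squared-error gradient through both layers.
    gW2 = H.T @ err / len(X); gb2 = err.mean(axis=0)
    dH = (err @ W2.T) * (1 - H ** 2)      # tanh derivative
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1
```

The paper's study amounts to repeating a loop like this while varying the hidden-layer size, learning rate, and sampling rate, then comparing the resulting models.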