
Showing papers on "MNIST database" published in 2001


Proceedings ArticleDOI
15 Jul 2001
TL;DR: It was shown that a large-scale Rosenblatt perceptron is comparable with the best classifiers tested on the MNIST database, and the influence of the critical parameter, the number of neurons N in the associative neuron layer, is investigated.
Abstract: The Rosenblatt perceptron was used for handwritten digit recognition. Its performance was tested on the MNIST database: 60,000 samples of handwritten digits were used for perceptron training and 10,000 samples for testing. A recognition rate of 99.2% was obtained. The critical parameter of Rosenblatt perceptrons is the number of neurons N in the associative neuron layer. We varied N from 1,000 to 512,000 and investigated the influence of this parameter on the performance of the Rosenblatt perceptron. Increasing N from 1,000 to 512,000 reduces the number of test errors by a factor of 5 to 8. It was shown that a large-scale Rosenblatt perceptron is comparable with the best classifiers tested on the MNIST database (98.9%-99.3%).
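For a concrete picture of the architecture this abstract describes, the sketch below implements a Rosenblatt-style perceptron: a fixed random associative layer of binary threshold units followed by an output layer trained with the classic perceptron rule. The layer size N_ASSOC, the random projection scheme, the training schedule, and the synthetic stand-in data are illustrative assumptions, not the authors' exact configuration.

# Minimal sketch of a Rosenblatt-style perceptron with a random associative
# layer, in the spirit of the setup described above. Sizes and the training
# schedule are illustrative assumptions, not the paper's configuration.
import numpy as np

rng = np.random.default_rng(0)

N_ASSOC = 2000        # associative neurons; the paper varies this from 1,000 to 512,000
N_INPUT = 28 * 28     # MNIST image size
N_CLASSES = 10

# A fixed random projection followed by a hard threshold plays the role of the
# associative layer; its weights are never trained.
W_assoc = rng.standard_normal((N_INPUT, N_ASSOC)) * 0.1
b_assoc = rng.standard_normal(N_ASSOC) * 0.1

def associative_features(x):
    # Binary features from the fixed random associative layer.
    return (x @ W_assoc + b_assoc > 0).astype(np.float64)

# Output layer trained with the perceptron rule (one unit per class).
W_out = np.zeros((N_ASSOC, N_CLASSES))

def train_epoch(images, labels):
    for x, y in zip(images, labels):
        a = associative_features(x)
        pred = np.argmax(a @ W_out)
        if pred != y:
            # Perceptron update: reinforce the true class, penalize the wrong one.
            W_out[:, y] += a
            W_out[:, pred] -= a

def predict(images):
    return np.argmax(associative_features(images) @ W_out, axis=1)

if __name__ == "__main__":
    # Tiny synthetic stand-in for MNIST so the sketch runs without downloads.
    X = rng.random((1000, N_INPUT))
    y = rng.integers(0, N_CLASSES, size=1000)
    for _ in range(5):
        train_epoch(X, y)
    print("training accuracy:", np.mean(predict(X) == y))

Because only the output layer is trained, the cost of increasing N is mostly memory and the matrix product in the associative layer, which is why the paper can push N into the hundreds of thousands.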

36 citations


Proceedings ArticleDOI
01 Sep 2001
TL;DR: A general local learning framework is proposed to alleviate the complexity of classifier design by means of the "divide and conquer" principle and an ensemble method; the method is especially suitable for large-scale real-world classification problems.
Abstract: This paper proposes a general local learning framework to effectively alleviate the complexity of classifier design by means of the "divide and conquer" principle and an ensemble method. The learning framework consists of a quantization layer and an ensemble layer. With GLVQ and MLPs applied to the framework, the proposed method is tested on the MNIST handwritten digit database. The obtained performance is very promising: an error rate of 0.99%, comparable to that of LeNet5, one of the best classifiers on this database. Further, in contrast to LeNet5, our method is especially suitable for large-scale real-world classification problems.
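The two-layer structure described in the abstract, a quantization layer followed by an ensemble of local classifiers, can be sketched as follows. The paper uses GLVQ for quantization and MLPs as local experts; since GLVQ is not available in common libraries, k-means is used here as a simple stand-in, and the partition count and MLP sizes are illustrative assumptions.

# Minimal sketch of the "divide and conquer" idea described above: a
# quantization layer partitions the input space and a small MLP is trained
# for each partition. k-means stands in for GLVQ; all sizes are assumptions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPClassifier

class LocalLearningEnsemble:
    def __init__(self, n_partitions=8, hidden=(64,), seed=0):
        self.quantizer = KMeans(n_clusters=n_partitions, random_state=seed, n_init=10)
        self.hidden = hidden
        self.seed = seed
        self.experts = {}

    def fit(self, X, y):
        # Quantization layer: assign every sample to a region of input space.
        regions = self.quantizer.fit_predict(X)
        # Ensemble layer: one local MLP per region, trained only on its samples.
        for r in np.unique(regions):
            mask = regions == r
            clf = MLPClassifier(hidden_layer_sizes=self.hidden,
                                max_iter=300, random_state=self.seed)
            clf.fit(X[mask], y[mask])
            self.experts[r] = clf
        return self

    def predict(self, X):
        # Route each sample to the expert responsible for its region.
        regions = self.quantizer.predict(X)
        out = np.empty(len(X), dtype=int)
        for r in np.unique(regions):
            mask = regions == r
            out[mask] = self.experts[r].predict(X[mask])
        return out

Routing each sample to a single local expert keeps every MLP small and limits its training set to a local subset of the data, which is the essence of the divide-and-conquer idea and what makes the approach scale to large problems.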

14 citations


Proceedings ArticleDOI
15 Jul 2001
TL;DR: This paper studies the effectiveness of Umeyama's (1999) supervised ICA (SICA) for feature extraction from handwritten characters with two types of control vectors: average patterns (Type-I) and eigen-patterns (Type-II).
Abstract: Recently, independent component analysis (ICA) has been applied not only to problems of blind signal separation, but also to feature extraction from images and sounds. In this paper, we study the effectiveness of Umeyama's (1999) supervised ICA (SICA) for feature extraction from handwritten characters. Two types of control vectors (supervisors) are proposed for SICA: 1) average patterns (Type-I); and 2) eigen-patterns (Type-II). To demonstrate the usefulness of SICA, recognition performance is evaluated on handwritten digits from the MNIST database. The recognition experiments confirm that SICA with both types of control vectors works effectively for feature extraction. In particular, the within-class to between-class variance ratio of SICA features with Type-I control vectors becomes slightly larger than that of conventional ICA.
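The sketch below shows, under stated assumptions, how the two kinds of control vectors mentioned in the abstract could be built from labeled digit images (class-average patterns for Type-I, eigen-patterns via PCA for Type-II), together with a conventional-ICA baseline and a within-/between-class variance measure for comparing feature sets. Umeyama's SICA algorithm itself is not reproduced here; the component counts and helper names are hypothetical.

# Minimal sketch of the two control-vector types and a conventional-ICA
# baseline. SICA itself is not implemented; shapes and counts are assumptions.
import numpy as np
from sklearn.decomposition import PCA, FastICA

def type1_control_vectors(X, y):
    # Type-I: the average pattern of each class.
    classes = np.unique(y)
    return np.stack([X[y == c].mean(axis=0) for c in classes])

def type2_control_vectors(X, n_components=10):
    # Type-II: eigen-patterns, i.e. leading principal components of the data.
    pca = PCA(n_components=n_components)
    pca.fit(X)
    return pca.components_

def conventional_ica_features(X, n_components=10, seed=0):
    # Baseline: features from ordinary (unsupervised) ICA.
    ica = FastICA(n_components=n_components, random_state=seed, max_iter=500)
    return ica.fit_transform(X)

def within_between_variance(features, y):
    # Within-class and between-class variances of a feature representation,
    # the quantities whose ratio the abstract uses to compare feature sets.
    overall_mean = features.mean(axis=0)
    within, between = 0.0, 0.0
    for c in np.unique(y):
        Fc = features[y == c]
        within += ((Fc - Fc.mean(axis=0)) ** 2).sum()
        between += len(Fc) * ((Fc.mean(axis=0) - overall_mean) ** 2).sum()
    return within / len(features), between / len(features)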

10 citations