
Showing papers by "Ethem Alpaydin" published in 1995


Proceedings Article
27 Nov 1995
TL;DR: A computational model based on a partially recurrent feedforward network is proposed and made credible by testing on the real-world problem of recognition of handwritten digits with encouraging results.
Abstract: Completely parallel object recognition is NP-complete. Achieving a recognizer with feasible complexity requires a compromise between parallel and sequential processing, where a system selectively focuses on parts of a given image, one after another. Successive fixations are generated to sample the image, and these samples are processed and abstracted to generate a temporal context in which results are integrated over time. A computational model based on a partially recurrent feedforward network is proposed and made credible by testing on the real-world problem of recognition of handwritten digits, with encouraging results. (An illustrative sketch of this idea follows the entry below.)

11 citations
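
The model described in this abstract combines a feedforward stage with a recurrent context that accumulates evidence across fixations. The sketch below illustrates only that general idea and is not the paper's network: the Elman-style context layer, all layer sizes, and the random-patch fixation policy are assumptions made for this example.

# Minimal sketch of a sequential-attention recognizer (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

PATCH, HIDDEN, CLASSES, FIXATIONS = 7 * 7, 32, 10, 5  # assumed sizes

# Untrained weights: input -> hidden, context (hidden) -> hidden, hidden -> output.
W_in = rng.normal(0.0, 0.1, (HIDDEN, PATCH))
W_ctx = rng.normal(0.0, 0.1, (HIDDEN, HIDDEN))
W_out = rng.normal(0.0, 0.1, (CLASSES, HIDDEN))

def fixate(image):
    # Random 7x7 patch: a stand-in for a learned fixation policy.
    r = rng.integers(0, image.shape[0] - 7 + 1)
    c = rng.integers(0, image.shape[1] - 7 + 1)
    return image[r:r + 7, c:c + 7].ravel()

def recognize(image):
    h = np.zeros(HIDDEN)  # temporal context carried across fixations
    for _ in range(FIXATIONS):
        x = fixate(image)                  # sample the image
        h = np.tanh(W_in @ x + W_ctx @ h)  # integrate sample with context
    logits = W_out @ h
    p = np.exp(logits - logits.max())      # softmax over digit classes
    return p / p.sum()

probs = recognize(rng.random((28, 28)))    # e.g. a 28x28 "digit" image
print(probs.argmax(), probs.round(3))

Note the compromise the abstract argues for: instead of processing the whole image in parallel, each step processes one fixed-size patch, and the tanh context layer carries forward what earlier fixations found.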


Journal ArticleDOI
TL;DR: It is found that perceptrons, when the architecture is suitable, generalise better than local, memory-based kernel estimators, but require longer training and more precise computation.
Abstract: We compare kernel estimators, single- and multi-layered perceptrons, and radial-basis functions on the problems of classifying handwritten digits and speech phonemes. By taking two different applications and employing many techniques, we report a two-dimensional study that makes a domain-independent assessment of these learning methods possible. We consider a feed-forward network with one hidden layer. As examples of the local methods, we use kernel estimators such as k-nearest neighbour (k-nn), Parzen windows, generalised k-nn, and Grow and Learn (Condensed Nearest Neighbour). We also considered fuzzy k-nn due to its similarity to k-nn. As distributed networks, we use the linear perceptron, the pairwise separating linear perceptron, and multi-layer perceptrons with sigmoidal hidden units. We also tested the radial-basis function network, which is a combination of local and distributed networks. Four criteria are used for comparison: correct classification of the test set, network size, learning time, and operational complexity. We found that perceptrons, when the architecture is suitable, generalise better than local, memory-based kernel estimators, but require longer training and more precise computation. Local networks are simple and learn very quickly and acceptably, but use more memory. (An illustrative comparison sketch follows the entry below.)

5 citations
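
The comparison criteria above can be made concrete with a small experiment. The sketch below is a hedged illustration using scikit-learn stand-ins rather than the paper's implementations; the digits dataset, the 3-neighbour setting, and the 32-unit hidden layer are arbitrary assumptions. It contrasts a local, memory-based k-nn with a distributed multi-layer perceptron on test accuracy and training time.

# Illustrative local-vs-distributed comparison (not the paper's setup).
import time
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, model in [
    ("k-nn (local, memory-based)", KNeighborsClassifier(n_neighbors=3)),
    ("MLP (distributed)", MLPClassifier(hidden_layer_sizes=(32,),
                                        max_iter=500, random_state=0)),
]:
    t0 = time.perf_counter()
    model.fit(X_tr, y_tr)   # for k-nn, "fitting" just stores the training set
    train_s = time.perf_counter() - t0
    print(f"{name}: accuracy={model.score(X_te, y_te):.3f}, "
          f"training={train_s:.2f}s")

A run of this kind typically shows the trade-off the abstract reports: the k-nn "trains" almost instantly because fitting merely stores the training set (hence its memory cost), while the MLP trains far longer but yields a compact model.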