Journal ArticleDOI

Combining multiple neural networks by fuzzy integral for robust classification

Sung-Bae Cho, +1 more
- Vol. 25, Iss. 2, pp. 380-384
TLDR
The authors propose a method for multinetwork combination based on the fuzzy integral that nonlinearly combines objective evidence, in the form of a fuzzy membership function, with subjective evaluation of the worth of the individual neural networks with respect to the decision.
Abstract
In the area of artificial neural networks, the concept of combining multiple networks has been proposed as a new direction for the development of highly reliable neural network systems. The authors propose a method for multinetwork combination based on the fuzzy integral. This technique nonlinearly combines objective evidence, in the form of a fuzzy membership function, with subjective evaluation of the worth of the individual neural networks with respect to the decision. The experimental results with the recognition problem of on-line handwriting characters confirm the superiority of the presented method to the other voting techniques.
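The abstract describes combining per-network class evidence with per-network "worth" values via the fuzzy integral. A minimal sketch of that idea, using a Sugeno fuzzy integral over a λ-fuzzy measure: the densities, network outputs, and variable names below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def solve_lambda(g, iters=200):
    """Find lambda satisfying  1 + lam = prod_i(1 + lam * g_i)
    for fuzzy densities g_i in (0, 1); lambda = 0 when the densities sum to 1."""
    g = np.asarray(g, dtype=float)
    s = g.sum()
    if abs(s - 1.0) < 1e-9:
        return 0.0
    f = lambda lam: np.prod(1.0 + lam * g) - 1.0 - lam
    if s > 1.0:                      # non-zero root lies in (-1, 0)
        lo, hi = -1.0 + 1e-12, -1e-12
    else:                            # non-zero root lies in (0, inf)
        lo, hi = 1e-12, 1.0
        while f(hi) < 0.0:           # grow the bracket until the sign flips
            hi *= 2.0
    for _ in range(iters):           # plain bisection
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

def sugeno_integral(h, g):
    """Sugeno fuzzy integral: max over i of min(h_(i), g(A_i)), with the
    evidence h sorted in descending order and g(A_i) built recursively."""
    h = np.asarray(h, dtype=float)
    g = np.asarray(g, dtype=float)
    lam = solve_lambda(g)
    order = np.argsort(h)[::-1]      # strongest evidence first
    h_s, g_s = h[order], g[order]
    g_A = g_s[0]
    best = min(h_s[0], g_A)
    for i in range(1, len(h)):
        g_A = g_A + g_s[i] + lam * g_A * g_s[i]
        best = max(best, min(h_s[i], g_A))
    return best

# Hypothetical example: rows are networks, columns are classes.
outputs = np.array([[0.7, 0.2, 0.1],
                    [0.5, 0.4, 0.1],
                    [0.3, 0.6, 0.1]])
densities = [0.3, 0.4, 0.2]          # assumed per-network "worth" g_i
scores = [sugeno_integral(outputs[:, k], densities)
          for k in range(outputs.shape[1])]
pred = int(np.argmax(scores))        # class with the largest integral wins
```

The decision is the class whose evidence column yields the largest integral, so a network deemed more trustworthy (larger density) can outweigh a bare majority of weaker networks.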


Citations
Journal ArticleDOI

On combining classifiers

TL;DR: A common theoretical framework for combining classifiers which use distinct pattern representations is developed and it is shown that many existing schemes can be considered as special cases of compound classification where all the pattern representations are used jointly to make a decision.
MonographDOI

Combining Pattern Classifiers

TL;DR: A monograph presenting methods and algorithms for combining pattern classifiers.
Journal ArticleDOI

Online and off-line handwriting recognition: a comprehensive survey

TL;DR: The nature of handwritten language, how it is transduced into electronic data, and the basic concepts behind written language recognition algorithms are described.
Journal ArticleDOI

Ensemble based systems in decision making

TL;DR: Conditions under which ensemble based systems may be more beneficial than their single classifier counterparts are reviewed, algorithms for generating individual components of the ensemble systems, and various procedures through which the individual classifiers can be combined are reviewed.

Decision templates for multiple classifier fusion: an experimental comparison

TL;DR: This work presents here a simple rule for adapting the class combiner to the application and shows that decision templates based on integral type measures of similarity are superior to the other schemes on both data sets.
References
Journal ArticleDOI

Adaptive mixtures of local experts

TL;DR: A new supervised learning procedure for systems composed of many separate expert networks, each of which learns to handle a subset of the complete set of training cases, with a gating network deciding which expert handles each case.
Journal ArticleDOI

Neural network ensembles

TL;DR: It is shown that the residual generalization error can be reduced by invoking ensembles of similar networks, improving the performance and training of neural networks for classification.
Journal ArticleDOI

Methods of combining multiple classifiers and their applications to handwriting recognition

TL;DR: On applying these methods to combine several classifiers for recognizing totally unconstrained handwritten numerals, the experimental results show that the performance of individual classifiers can be improved significantly.
Journal ArticleDOI

Computer Processing of Line-Drawing Images

TL;DR: Various forms of line drawing representation are described, different schemes of quantization are compared, and the manner in which a line drawing can be extracted from a tracing or a photographic image is reviewed.
Journal ArticleDOI

Neural Network Classifiers Estimate Bayesian a posteriori Probabilities.

TL;DR: Results of Monte Carlo simulations performed using multilayer perceptron (MLP) networks trained with backpropagation, radial basis function (RBF) networks, and high-order polynomial networks graphically demonstrate that network outputs provide good estimates of Bayesian probabilities.