Book Chapter

Neural Networks for Pattern Recognition

Suresh Kothari, +1 more
01 Jan 1993, Vol. 37, pp. 119-166
TL;DR: The chapter discusses two important directions of research for improving learning algorithms: dynamic node generation, as used by the cascade correlation algorithm, and the design of learning algorithms where the choice of parameters is not an issue.
Abstract
This chapter provides an account of different neural network architectures for pattern recognition. A neural network consists of simple processing elements called neurons; each neuron is connected to some other neurons and possibly to the input nodes. Neural networks provide a simple computing paradigm for performing complex recognition tasks in real time. The chapter categorizes neural networks into three types: single-layer networks, multilayer feedforward networks, and feedback networks. It discusses gradient descent and the relaxation method as the two underlying mathematical themes for deriving learning algorithms. Much research activity centers on learning algorithms because of their fundamental importance in neural networks. The chapter discusses two important directions of research for improving learning algorithms: dynamic node generation, as used by the cascade correlation algorithm, and the design of learning algorithms in which the choice of parameters is not an issue. It closes with a discussion of performance and implementation issues.
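To make the chapter's central theme concrete, the following is a minimal sketch (not taken from the chapter) of gradient descent training for a small multilayer feedforward network; the layer sizes, learning rate, and XOR task are illustrative assumptions.

```python
# Minimal sketch (not the chapter's code): a two-layer feedforward network
# trained by gradient descent on XOR. All hyperparameters are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(scale=0.5, size=(2, 4))   # input -> hidden weights
b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1))   # hidden -> output weights
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for epoch in range(10000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: gradients of the squared error w.r.t. each layer
    err = out - y
    d_out = err * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient descent updates
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(np.round(out, 2))  # should approach [0, 1, 1, 0]
```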


Citations
Journal Article

Systematic benchmarking of microarray data classification: assessing the role of non-linearity and dimensionality reduction

TL;DR: A systematic benchmarking study compares linear versions of standard classification and dimensionality reduction techniques with their non-linear counterparts based on a radial basis function (RBF) kernel, and finds that kernel PCA with a linear kernel gives better results.
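As a rough illustration of the kind of linear-versus-RBF comparison this benchmark describes, here is a hypothetical sketch using scikit-learn's KernelPCA and a linear SVM on synthetic high-dimensional data standing in for microarray profiles; the dataset and all parameter choices are assumptions, not the study's setup.

```python
# Hypothetical sketch of a linear-vs-RBF kernel PCA comparison.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import KernelPCA
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for microarray data: many features, few samples.
X, y = make_classification(n_samples=100, n_features=500,
                           n_informative=20, random_state=0)

for kernel in ("linear", "rbf"):
    pipe = make_pipeline(KernelPCA(n_components=10, kernel=kernel),
                         SVC(kernel="linear"))
    score = cross_val_score(pipe, X, y, cv=5).mean()
    print(kernel, round(score, 3))
```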
Journal Article

Online Learning Solutions for Freeway Travel Time Prediction

TL;DR: This paper proposes a new extended Kalman filter (EKF)-based online-learning approach, the online-censored EKF method, which can be applied online and improves on a delayed approach in which learning takes place only once realized travel times become available.
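The online-learning recursion behind an EKF approach can be sketched generically; the predict/update step below is a standard EKF step, not the paper's censored-EKF formulation, and the process/measurement models f, h, their Jacobians F, H, and the noise covariances Q, R are placeholders one would have to supply.

```python
# Generic EKF predict/update recursion (a sketch, not the paper's method).
import numpy as np

def ekf_step(x, P, z, f, F, h, H, Q, R):
    # Predict: propagate state and covariance through the process model.
    x_pred = f(x)
    P_pred = F(x) @ P @ F(x).T + Q

    # Update: correct with the new measurement z as it arrives online.
    y = z - h(x_pred)                              # innovation
    S = H(x_pred) @ P_pred @ H(x_pred).T + R       # innovation covariance
    K = P_pred @ H(x_pred).T @ np.linalg.inv(S)    # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H(x_pred)) @ P_pred
    return x_new, P_new
```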
Proceedings Article

Improved gene selection for classification of microarrays.

TL;DR: A method for evaluating and improving techniques for selecting informative genes from microarray data is derived and it is shown that this filtered set of genes can be used to significantly improve existing classifiers.
Journal Article

Neural network prediction model for fine particulate matter (PM2.5) on the US-Mexico border in El Paso (Texas) and Ciudad Juárez (Chihuahua)

TL;DR: The RBF network proves to have the shortest training times, combined with greater stability during the prediction stage, characterizing this topology as an ideal solution for environmental applications in place of the widely used and less effective MLP.
Journal Article

Detection of conceptual model rainfall-runoff processes inside an artificial neural network

TL;DR: In this paper, the internal behavior of an artificial neural network rainfall-runoff model is examined and it is demonstrated that specific architectural features can be interpreted with respect to the quasi-physical dynamics of a parsimonious water balance model.
References
Journal Article

Neural networks and physical systems with emergent collective computational abilities

TL;DR: A model of a system having a large number of simple equivalent components, based on aspects of neurobiology but readily adapted to integrated circuits, produces a content-addressable memory which correctly yields an entire memory from any subpart of sufficient size.
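A minimal sketch of the content-addressable memory idea this reference describes: bipolar patterns are stored with a Hebbian outer-product rule and a full pattern is recovered from a corrupted cue by asynchronous threshold updates. The patterns and network size below are illustrative assumptions, not taken from the paper.

```python
# Sketch of a Hopfield-style content-addressable memory.
import numpy as np

patterns = np.array([[1, -1, 1, -1, 1, -1],
                     [1, 1, 1, -1, -1, -1]])
n = patterns.shape[1]
W = sum(np.outer(p, p) for p in patterns).astype(float)  # Hebbian storage
np.fill_diagonal(W, 0)                                    # no self-connections

state = np.array([1, -1, 1, -1, 1, 1])    # noisy version of pattern 0
rng = np.random.default_rng(0)
for _ in range(50):                        # asynchronous threshold updates
    i = rng.integers(n)
    state[i] = 1 if W[i] @ state >= 0 else -1

print(state)                               # settles on the stored pattern
```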
Journal Article

A logical calculus of the ideas immanent in nervous activity

TL;DR: In this article, it is shown that many particular choices among possible neurophysiological assumptions are equivalent, in the sense that for every net behaving under one assumption, there exists another net which behaves under another and gives the same results, although perhaps not in the same time.
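The threshold units underlying this logical calculus can be sketched in a few lines; the AND/OR weight and threshold choices below are illustrative assumptions, not values taken from the 1943 paper.

```python
# Sketch of a McCulloch-Pitts threshold unit: binary inputs, fixed weights,
# output 1 iff the weighted sum reaches the threshold.
def mcp_neuron(inputs, weights, threshold):
    return int(sum(w * x for w, x in zip(weights, inputs)) >= threshold)

AND = lambda a, b: mcp_neuron((a, b), (1, 1), 2)   # illustrative choices
OR  = lambda a, b: mcp_neuron((a, b), (1, 1), 1)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, AND(a, b), OR(a, b))
```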
Journal Article

An introduction to computing with neural nets

TL;DR: This paper provides an introduction to the field of artificial neural nets by reviewing six important neural net models that can be used for pattern classification and exploring how some existing classification and clustering algorithms can be performed using simple neuron-like components.
Journal Article

Neurons with graded response have collective computational properties like those of two-state neurons.

TL;DR: A model for a large network of "neurons" with a graded response (or sigmoid input-output relation) is studied and shown to have collective properties in very close correspondence with the earlier stochastic model based on McCulloch-Pitts neurons.
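A sketch of the graded-response idea: the hard threshold update of the two-state model is replaced by a smooth tanh nonlinearity while the same weight matrix and collective relaxation dynamics are kept. This is not the paper's formulation; the gain and step size are arbitrary illustrative choices.

```python
# Sketch: graded-response relaxation with a smooth tanh nonlinearity.
import numpy as np

def graded_relaxation(W, state, gain=4.0, step=0.2, iters=200):
    state = state.astype(float)
    for _ in range(iters):
        # Blend the current state with the smoothly saturated field.
        state = (1 - step) * state + step * np.tanh(gain * (W @ state))
    return state
```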