Book Chapter

Neural Networks for Pattern Recognition

Suresh Kothari, +1 more
01 Jan 1993, Vol. 37, pp. 119-166
TLDR
The chapter discusses two important directions of research to improve learning algorithms: dynamic node generation, as used by the cascade correlation algorithm; and the design of learning algorithms where the choice of parameters is not an issue.
Abstract
This chapter provides an account of different neural network architectures for pattern recognition. A neural network consists of several simple processing elements called neurons. Each neuron is connected to some other neurons and possibly to the input nodes. Neural networks provide a simple computing paradigm to perform complex recognition tasks in real time. The chapter categorizes neural networks into three types: single-layer networks, multilayer feedforward networks, and feedback networks. It discusses gradient descent and the relaxation method as the two underlying mathematical themes for deriving learning algorithms. Much research activity is centered on learning algorithms because of their fundamental importance in neural networks. The chapter discusses two important directions of research to improve learning algorithms: dynamic node generation, as used by the cascade correlation algorithm; and the design of learning algorithms where the choice of parameters is not an issue. It closes with a discussion of performance and implementation issues.
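As a rough illustration of the gradient-descent theme the abstract mentions, the sketch below trains a single-layer network of sigmoid units with the delta rule. The dataset, learning rate, and layer sizes are illustrative assumptions, not an example taken from the chapter.

```python
import numpy as np

# Minimal gradient-descent (delta-rule) training of a single-layer sigmoid
# network. All data, sizes, and hyperparameters below are illustrative.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))                    # 100 patterns, 4 features
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)   # toy binary target

W = rng.normal(scale=0.1, size=(4, 1))           # weights
b = np.zeros((1, 1))                             # bias
lr = 0.5                                         # learning rate

for epoch in range(200):
    out = sigmoid(X @ W + b)                     # forward pass
    err = out - y                                # error signal
    grad_W = X.T @ (err * out * (1 - out)) / len(X)   # gradient of mean squared error
    grad_b = np.mean(err * out * (1 - out), axis=0, keepdims=True)
    W -= lr * grad_W                             # gradient-descent update
    b -= lr * grad_b

print("final mean squared error:", float(np.mean((sigmoid(X @ W + b) - y) ** 2)))
```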


Citations
Journal Article

Short-Term Load Forecasting With Exponentially Weighted Methods

TL;DR: Five recently developed exponentially weighted methods that have not previously been used for load forecasting are considered, including several exponential smoothing formulations, as well as methods using discount weighted regression, cubic splines, and singular value decomposition.
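A minimal sketch of simple exponential smoothing, the basic building block behind the exponentially weighted methods this TL;DR alludes to; the load series and smoothing parameter are made-up illustrations, not data or settings from the paper.

```python
import numpy as np

def exponential_smoothing(series, alpha):
    """One-step-ahead forecasts via simple exponential smoothing.

    level_t = alpha * y_t + (1 - alpha) * level_{t-1}
    The forecast for t+1 is level_t. `alpha` controls how quickly old
    observations are discounted (the exponential weighting).
    """
    level = series[0]
    forecasts = [level]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
        forecasts.append(level)
    return np.array(forecasts)

# Illustrative hourly load values (arbitrary units), not data from the paper.
load = np.array([310.0, 305.0, 320.0, 340.0, 360.0, 355.0, 330.0, 315.0])
print(exponential_smoothing(load, alpha=0.3))
```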
Book Chapter

Self-organizing maps as substitutes for k-means clustering

TL;DR: This paper briefly reviews different initialization procedures, proposes Kohonen's Self-Organizing Maps as the most convenient method given the proper training parameters, and shows that in the final stages of its training procedure the Self-Organizing Map algorithm is rigorously the same as the k-means algorithm.
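A minimal sketch of the point this TL;DR makes: the online Self-Organizing Map update collapses to an online k-means update once the neighborhood shrinks to the winning unit alone. The data, map size, and learning rates are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(size=(500, 2))          # toy 2-D data
codebook = rng.normal(size=(5, 2))        # 5 units (a 5x1 "map")

def som_step(x, codebook, lr, radius):
    """One online SOM update with a Gaussian neighborhood over unit indices."""
    winner = np.argmin(np.linalg.norm(codebook - x, axis=1))
    idx = np.arange(len(codebook))
    h = np.exp(-((idx - winner) ** 2) / (2 * radius ** 2))   # neighborhood weights
    codebook += lr * h[:, None] * (x - codebook)
    return codebook

# Early training: wide neighborhood (topological ordering of the map).
for x in data:
    codebook = som_step(x, codebook, lr=0.5, radius=2.0)

# Final stage: neighborhood reduced to the winner only, which is exactly an
# online k-means step.
for x in data:
    winner = np.argmin(np.linalg.norm(codebook - x, axis=1))
    codebook[winner] += 0.05 * (x - codebook[winner])
```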
Proceedings Article

Rapidly Selecting Good Compiler Optimizations using Performance Counters

TL;DR: This paper proposes a different approach that uses performance counters to determine good compiler optimization settings: a model is learned off-line and can then be used to choose good settings for any new program.
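A minimal sketch of the general idea of mapping performance-counter profiles to good optimization settings: here a hypothetical nearest-neighbour model over normalized counter vectors stands in for whatever model the paper actually learns off-line; all counter names, programs, and flag sets are invented for illustration.

```python
import numpy as np

# Hypothetical off-line training data: for a few benchmark programs we have a
# normalized performance-counter vector and the flag set that worked best.
# Names and numbers are invented for illustration only.
train_counters = np.array([
    [0.80, 0.05, 0.10],   # e.g. [cache_miss_rate, branch_miss_rate, fp_ratio]
    [0.10, 0.40, 0.05],
    [0.05, 0.05, 0.70],
])
best_flags = ["-O3 -funroll-loops", "-O2", "-O3 -ffast-math"]

def predict_flags(counters):
    """Pick the flag set of the nearest training program in counter space."""
    dists = np.linalg.norm(train_counters - counters, axis=1)
    return best_flags[int(np.argmin(dists))]

# A new, unseen program's counter profile (again invented):
print(predict_flags(np.array([0.75, 0.10, 0.08])))
```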
Journal Article

Hidden Markov models for online classification of single trial EEG data

TL;DR: The classification shows an improvement in the online experiment and in the temporal determination of the minimal classification error, compared to linear classification methods.
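A minimal sketch of how class-conditional HMM likelihoods can drive single-trial classification, using the standard scaled forward algorithm; the two hard-coded HMMs and the observation sequence are illustrative stand-ins, not models fitted to EEG data.

```python
import numpy as np

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM
    (initial distribution pi, transition matrix A, emission matrix B),
    computed with the scaled forward algorithm."""
    alpha = pi * B[:, obs[0]]
    loglik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        loglik += np.log(alpha.sum())
        alpha /= alpha.sum()
    return loglik

# Two illustrative 2-state HMMs standing in for class-conditional models.
pi = np.array([0.6, 0.4])
A1 = np.array([[0.9, 0.1], [0.2, 0.8]]); B1 = np.array([[0.7, 0.3], [0.4, 0.6]])
A2 = np.array([[0.5, 0.5], [0.5, 0.5]]); B2 = np.array([[0.3, 0.7], [0.6, 0.4]])

obs = [0, 0, 1, 0, 0, 1, 1, 0]   # a toy discretized feature sequence
scores = [forward_loglik(obs, pi, A1, B1), forward_loglik(obs, pi, A2, B2)]
print("predicted class:", int(np.argmax(scores)))
```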
Journal Article

The bag-of-frames approach to audio pattern recognition: a sufficient model for urban soundscapes but not for polyphonic music.

TL;DR: This paper proposes to explicitly examine the difference between urban soundscapes and polyphonic music with respect to their modeling with the BOF approach, and reveals critical differences in the temporal and statistical structure of the typical frame distribution of each type of signal.
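A minimal sketch of the bag-of-frames (BOF) idea: per-frame features are pooled as an unordered set and each class is modelled by a single diagonal Gaussian over frames, a deliberately simplified stand-in for the Gaussian mixture models typically used; random vectors replace the MFCC-like features a real system would compute.

```python
import numpy as np

# Bag-of-frames in miniature: ignore temporal order, model the distribution
# of per-frame feature vectors per class, classify by summed log-likelihood.
rng = np.random.default_rng(2)

def fit_gaussian(frames):
    """Fit a single diagonal Gaussian to a (n_frames, dim) feature matrix."""
    return frames.mean(axis=0), frames.var(axis=0) + 1e-6

def loglik(frames, mean, var):
    """Summed log-likelihood of all frames; frame order is irrelevant by design."""
    return np.sum(-0.5 * (np.log(2 * np.pi * var) + (frames - mean) ** 2 / var))

# Illustrative training "recordings" for two classes (not real audio features).
soundscape_frames = rng.normal(loc=0.0, scale=1.0, size=(400, 13))
music_frames = rng.normal(loc=0.5, scale=1.5, size=(400, 13))
models = {"soundscape": fit_gaussian(soundscape_frames),
          "music": fit_gaussian(music_frames)}

test = rng.normal(loc=0.5, scale=1.5, size=(200, 13))   # unseen frames
print(max(models, key=lambda c: loglik(test, *models[c])))
```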
References
Journal Article

Neural networks and physical systems with emergent collective computational abilities

TL;DR: A model of a system having a large number of simple equivalent components, based on aspects of neurobiology but readily adapted to integrated circuits, produces a content-addressable memory which correctly yields an entire memory from any subpart of sufficient size.
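A minimal sketch of the content-addressable memory described here: Hebbian storage of binary patterns in a symmetric weight matrix and asynchronous threshold updates that recover a stored pattern from a corrupted cue; the pattern count, network size, and corruption level are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 64
patterns = rng.choice([-1, 1], size=(3, N))      # three stored +/-1 patterns

# Hebbian storage: sum of outer products, zero self-connections.
W = sum(np.outer(p, p) for p in patterns) / N
np.fill_diagonal(W, 0.0)

def recall(cue, steps=5):
    """Asynchronous updates: each neuron takes the sign of its net input."""
    s = cue.copy()
    for _ in range(steps):
        for i in rng.permutation(N):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Corrupt a stored pattern in a quarter of its positions, then recall it.
cue = patterns[0].copy()
flip = rng.choice(N, size=N // 4, replace=False)
cue[flip] *= -1
print("recovered:", np.array_equal(recall(cue), patterns[0]))
```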
Journal Article

A logical calculus of the ideas immanent in nervous activity

TL;DR: In this article, it is shown that many particular choices among possible neurophysiological assumptions are equivalent, in the sense that for every net behaving under one assumption, there exists another net which behaves under another and gives the same results, although perhaps not in the same time.
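A minimal sketch of the threshold unit this paper introduced: a McCulloch-Pitts neuron fires exactly when its weighted input sum reaches a threshold, which already suffices for basic logic gates; the weights and thresholds below are the usual textbook choices, not taken from the paper.

```python
def mcp_neuron(inputs, weights, threshold):
    """McCulloch-Pitts unit: output 1 iff the weighted sum reaches the threshold."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0

# Logic gates as single threshold units (textbook weight/threshold choices).
AND = lambda a, b: mcp_neuron([a, b], [1, 1], threshold=2)
OR  = lambda a, b: mcp_neuron([a, b], [1, 1], threshold=1)
NOT = lambda a:    mcp_neuron([a],    [-1],   threshold=0)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", AND(a, b), "OR:", OR(a, b))
```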
Journal Article

An introduction to computing with neural nets

TL;DR: This paper provides an introduction to the field of artificial neural nets by reviewing six important neural net models that can be used for pattern classification and exploring how some existing classification and clustering algorithms can be performed using simple neuron-like components.
Journal Article

Neurons with graded response have collective computational properties like those of two-state neurons.

TL;DR: A model for a large network of "neurons" with a graded response (or sigmoid input-output relation) is studied, and its collective properties are shown to be in very close correspondence with those of the earlier stochastic model based on McCulloch-Pitts neurons.
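A minimal sketch of graded-response dynamics of the kind this paper analyzes: units have a sigmoid (here tanh) input-output relation and internal states evolve by du/dt = -u + W g(u) + I, integrated with a simple Euler step; the weight matrix, gain, and step size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 16
W = rng.normal(size=(N, N))
W = (W + W.T) / 2                 # symmetric weights, as the analysis assumes
np.fill_diagonal(W, 0.0)
I = np.zeros(N)                   # external input
gain = 2.0

def g(u):
    return np.tanh(gain * u)      # graded (sigmoid) input-output relation

u = rng.normal(scale=0.1, size=N)  # internal states
dt = 0.05
for _ in range(500):               # Euler integration of du/dt = -u + W g(u) + I
    u += dt * (-u + W @ g(u) + I)

print("converged outputs:", np.round(g(u), 2))
```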