Journal ArticleDOI

Chaotic resonance — methods and applications for robust classification of noisy and variable patterns

TLDR
A theory of stochastic chaos is developed, in which aperiodic outputs with 1/f² spectra are formed by the interaction of globally connected nodes that are individually governed by point attractors under perturbation by continuous white noise.
Abstract
A fundamental tenet of the theory of deterministic chaos holds that infinitesimal variation in the initial conditions of a network that is operating in the basin of a low-dimensional chaotic attractor causes the various trajectories to diverge from each other quickly. This "sensitivity to initial conditions" might seem to hold promise for signal detection, owing to an implied capacity for distinguishing small differences in patterns. However, this sensitivity is incompatible with pattern classification, because it amplifies irrelevant differences in incomplete patterns belonging to the same class, and it renders the network easily corrupted by noise. Here a theory of stochastic chaos is developed, in which aperiodic outputs with 1/f² spectra are formed by the interaction of globally connected nodes that are individually governed by point attractors under perturbation by continuous white noise. The interaction leads to a high-dimensional global chaotic attractor that governs the entire array of nodes. An example is our spatially distributed KIII network that is derived from studies of the olfactory system, and that is stabilized by additive noise modeled on biological noise sources. Systematic parameterization of the interaction strengths corresponding to synaptic gains among nodes representing excitatory and inhibitory neuron populations enables the formation of a robust high-dimensional global chaotic attractor. Reinforcement learning from examples of patterns to be classified using habituation and association creates lower dimensional local basins, which form a global attractor landscape with one basin for each class. Thereafter, presentation of incomplete examples of a test pattern leads to confinement of the KIII network in the basin corresponding to that pattern, which constitutes many-to-one generalization. The capture after learning is expressed by a stereotypical spatial pattern of amplitude modulation of a chaotic carrier wave.
Sensitivity to initial conditions is no longer an issue. Scaling of the additive noise as a parameter optimizes the classification of data sets in a manner that is comparable to stochastic resonance. The local basins constitute dynamical memories that solve difficult problems in classifying data sets that are not linearly separable. New local basins can be added quickly from very few examples without loss of existing basins. The attractor landscape enables the KIII set to provide an interface between noisy, unconstrained environments and conventional pattern classifiers. Examples given here of its robust performance include fault detection in small machine parts and the classification of spatiotemporal EEG patterns from rabbits trained to discriminate visual stimuli.
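The abstract notes that scaling the additive noise optimizes classification "in a manner that is comparable to stochastic resonance," where an intermediate noise level maximizes detection of a subthreshold signal. A minimal sketch of that effect (not the KIII model itself; the threshold detector, signal amplitude, and noise levels below are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 20 * np.pi, 4000)
signal = 0.8 * np.sin(t)      # subthreshold: never crosses the threshold alone
threshold = 1.0

def detection_score(noise_level, trials=20):
    """Mean correlation between the thresholded output and the input signal."""
    scores = []
    for _ in range(trials):
        noisy = signal + noise_level * rng.standard_normal(t.size)
        out = (noisy > threshold).astype(float)
        if out.std() == 0:    # no threshold crossings: nothing was detected
            scores.append(0.0)
        else:
            scores.append(np.corrcoef(out, signal)[0, 1])
    return float(np.mean(scores))

# Detection is poor with too little noise (signal stays subthreshold) and with
# too much noise (crossings become indiscriminate); it peaks in between.
low, mid, high = detection_score(0.05), detection_score(0.5), detection_score(5.0)
```

The non-monotonic dependence of `mid` versus `low` and `high` is the signature of the resonance; in the paper's setting the analogous tuned parameter is the scale of the biologically modeled additive noise.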


Citations
Journal ArticleDOI

Is there chaos in the brain? II. Experimental evidence and related models.

TL;DR: The data and main arguments that support the existence of chaos at all levels from the simplest to the most complex forms of organization of the nervous system are presented.
Journal ArticleDOI

Automatic detection of learner's affect from conversational cues

TL;DR: The reliability of detecting a learner’s affect from conversational features extracted from interactions with AutoTutor, an intelligent tutoring system (ITS) that helps students learn by holding a conversation in natural language, is explored.
Journal ArticleDOI

Origin, structure, and role of background EEG activity. Part 3. Neural frame classification

TL;DR: The size, texture and duration of these AM patterns indicate that spatial patterns of human beta frames may be accessible with high-density scalp arrays for correlation with phenomenological reports by human subjects.
Journal ArticleDOI

Nonlinear brain dynamics as macroscopic manifestation of underlying many-body field dynamics

TL;DR: The feasibility of interpreting neurophysiological data in the context of many-body physics is explored by using tools that physicists have devised to analyze comparable hierarchies in other fields of science using concepts of energy dissipation, the maintenance by cortex of multiple ground states corresponding to AM patterns, and the exclusive selection by spontaneous breakdown of symmetry of single states in sequential phase transitions.
Journal ArticleDOI

Origin, structure, and role of background EEG activity. Part 2. Analytic phase

TL;DR: Estimates of spatiotemporal patterns of phase among beta-gamma oscillations suggest that neocortical dynamics is analogous to the dynamics of self-stabilizing systems, such as a sand pile that maintains its critical angle by avalanches, and a pan of boiling water that maintains its critical temperature by bubbles that release heat.
References
Journal ArticleDOI

Neural networks and physical systems with emergent collective computational abilities

TL;DR: A model of a system having a large number of simple equivalent components, based on aspects of neurobiology but readily adapted to integrated circuits, produces a content-addressable memory which correctly yields an entire memory from any subpart of sufficient size.
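The content-addressable recall described in this TL;DR — yielding an entire memory from any subpart of sufficient size — can be illustrated with a minimal Hopfield network. The pattern size, number of memories, and corruption level below are illustrative assumptions, not taken from the cited paper:

```python
import numpy as np

def train(patterns):
    """Hebbian outer-product rule; zero diagonal so units do not self-excite."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, steps=10):
    """Synchronous updates until the network settles into a fixed point."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1.0       # break ties toward +1
        if np.array_equal(new, state):
            break
        state = new
    return state

rng = np.random.default_rng(1)
patterns = rng.choice([-1.0, 1.0], size=(3, 64))   # three random +/-1 memories
W = train(patterns)

probe = patterns[0].copy()
probe[:16] = 1.0                  # overwrite a quarter of the bits
restored = recall(W, probe)       # the dynamics pull the probe into the basin
```

With only three memories in 64 units the network is well under capacity, so the corrupted probe falls back into the basin of the stored pattern, which is the content-addressable behavior the TL;DR describes.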
Journal ArticleDOI

How brains make chaos in order to make sense of the world

TL;DR: A model to describe the neural dynamics responsible for odor recognition and discrimination is developed and it is hypothesized that chaotic behavior serves as the essential ground state for the neural perceptual apparatus and a mechanism for acquiring new forms of patterned activity corresponding to new learned odors is proposed.
Book ChapterDOI

How does the brain build a cognitive code

TL;DR: In this article, a thought experiment is offered which analyses how a system as a whole can correct errors of hypothesis testing in a fluctuating environment when none of the system's components, taken in isolation, even knows that an error has occurred.