Journal ArticleDOI

A study on a bionic pattern classifier based on olfactory neural system

01 Jan 2006-International Journal of Bifurcation and Chaos (World Scientific Publishing Company)-Vol. 16, Iss: 8, pp 2425-2434
TL;DR: A simulation of a biological olfactory neural system with a KIII set, which is a high-dimensional chaotic neural network designed to simulate the patterns of action potentials and EEG waveforms observed in electrophysiological experiments, is presented.
Abstract: This paper presents a simulation of a biological olfactory neural system with a KIII set, which is a high-dimensional chaotic neural network. The KIII set differs from conventional artificial neural networks by its use of chaotic attractors for memory locations that are accessed by chaotic trajectories. It was designed to simulate the patterns of action potentials and EEG waveforms observed in electrophysiological experiments, and has proved its utility as a model for biological intelligence in pattern classification. An application to recognition of handwritten numerals is presented here, in which the classification performance of the KIII network under different noise levels was investigated.

Summary (2 min read)

1. Introduction

  • Artificial Neural Networks (ANN) form a class of models and methods inspired by the study of biological neural systems.
  • Classical ANN are simplistic models in comparison with biological neural systems.
  • Whereas deterministic chaos is stationary, noise-free, autonomous and low-dimensional, brain chaos is unstable with repeated state transitions, drenched in noise, high-dimensional, and engaged with the environment; it is therefore not autonomous in the sense of being free of perturbation once initiated.
  • The formation of local basins corresponds to the memory of different patterns; a pattern is recognized when the system trajectory enters a certain basin and converges to the attractor in that basin.

2. K Set Model Description

  • The central olfactory neural system is composed of the olfactory bulb (OB), anterior olfactory nucleus (AON) and prepyriform cortex (PC).
  • Xj(t) represents the state variable of the jth neural population, which is connected to the ith, while Wij indicates the connection strength between them.
  • The KIII network describes the whole olfactory neural system: the populations of neurons, local synaptic connections, and long forward and distributed time-delayed feedback loops.
  • Some numerical analysis of the KIII network, using the parameter set in reference [Chang & Freeman, 1998a], is shown in Figs. 2 and 3.
  • This is also an indirect description of the basal chaotic attractor and of the state transitions that take place when the stimulus begins and ends.
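The node-level dynamics summarized above can be sketched in miniature. Each K0 set, the basic unit of the KIII hierarchy, obeys a second-order linear ODE driven by the sigmoid-transformed output of other populations. The sketch below uses representative rate constants from the K-set literature and a simplified form of Freeman's asymmetric sigmoid (omitting its low-amplitude cutoff); these values and the plain Euler integrator are illustrative assumptions, not the exact parameter set of this paper.

```python
import numpy as np

# Representative K-set rate constants (illustrative, not the paper's exact values).
A, B = 0.22, 0.72   # 1/ms
Q_M = 5.0           # sigmoid asymptote parameter

def sigmoid(x, q=Q_M):
    """Simplified form of Freeman's asymmetric sigmoid mapping wave
    amplitude to pulse density; note it gives 0 at the resting state x = 0."""
    return q * (1.0 - np.exp(-(np.exp(x) - 1.0) / q))

def k0_step(x, v, drive, dt=0.01):
    """One Euler step of the K0 equation
         x'' + (A + B) x' + A*B*x = A*B*drive,
    where `drive` is the summed, sigmoid-transformed input from other nodes."""
    acc = A * B * drive - (A + B) * v - A * B * x
    return x + dt * v, v + dt * acc

# Impulse response of an isolated node: a brief input pulse, then decay to rest.
x, v = 0.0, 0.0
trace = []
for t in range(5000):
    drive = 1.0 if t < 100 else 0.0   # 1 ms input pulse
    x, v = k0_step(x, v, drive)
    trace.append(x)

print(max(trace) > 0.0, abs(trace[-1]) < 1e-2)  # rises under input, returns to rest
```

In the full KIII set, such nodes are coupled into KI and KII oscillators through excitatory and inhibitory connections, with the sigmoid nonlinearity applied at each coupling.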

3. Application to Handwritten Numeral Recognition

  • Pattern recognition is an important subject of artificial intelligence and a primary field for the application of ANN.
  • According to their specific requirements, the authors made two modifications to the Hebbian learning rule: (1) they designed two methods for increasing the connection strength, which are described below; (2) they introduced a bias coefficient K to the learning process.
  • The activity of the ith channel is represented by SDαi, the mean standard deviation of the output of the ith mitral node (Mi) over the period of the presentation of input patterns, as defined in Eq. (3).
  • The test data set contains 200 samples in 20 groups of handwritten numeric characters written by 20 different students.
  • Results (Fig. 5) showed that as the noise level increased, the correct classification rate of the KIII network rose to a plateau and then decreased.
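The modified learning rule described above can be sketched as follows. The activity measure SDi is the standard deviation of each mitral node's output over the stimulus period, and the bias coefficient K (the paper uses K = 0.4) raises the threshold so that only channels clearly more active than the average are strengthened. The threshold form `(1 + k) * mean(sd)` and the small fixed increment are plausible readings of the rule, assumed here for illustration rather than taken verbatim from the paper's equations.

```python
import numpy as np

K_BIAS = 0.4  # bias coefficient K; the paper reports K = 0.4

def channel_activity(mitral_out):
    """SD_i: standard deviation of each mitral node's output over the
    input-presentation period (rows = time samples, columns = channels)."""
    return mitral_out.std(axis=0)

def hebbian_update(w, mitral_out, lr=0.05, k=K_BIAS):
    """Biased Hebbian step: strengthen w[i, j] only when BOTH channels'
    activities exceed (1 + k) times the mean activity. The small fixed
    increment `lr` stands in for the paper's gradual weight change."""
    sd = channel_activity(mitral_out)
    active = sd > (1.0 + k) * sd.mean()
    w = w.copy()
    n = len(sd)
    for i in range(n):
        for j in range(n):
            if i != j and active[i] and active[j]:
                w[i, j] += lr
    return w

rng = np.random.default_rng(0)
n = 8
t = np.linspace(0.0, 1.0, 200)
# Toy mitral outputs: channels 0 and 1 oscillate strongly, the rest are quiet.
out = 0.05 * rng.standard_normal((200, n))
out[:, 0] += np.sin(40.0 * t)
out[:, 1] += np.sin(40.0 * t + 1.0)

w = hebbian_update(np.zeros((n, n)), out)
print(w[0, 1] > 0, w[2, 3] == 0)  # only the co-active pair is strengthened
```

With the bias in place, weak or average channels never cross the threshold, so their weights stay untouched across repeated presentations.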

4. Discussion

  • The first point concerns feature extraction in the preprocessing stage.
  • The new algorithm for increasing the connection weights makes the KIII network able to memorize and classify more patterns than before.
  • Also, it is more reasonable to believe that the connection weights, which represent the biological synaptic connections, change gradually in the learning process.
  • It is demonstrated by electrophysiological experiments and computer simulation that additive noise in the KIII network can maintain the KII components at a nonzero point attractor and can stabilize the chaotic attractor landscape formed by learning [Freeman, 1999].
  • In the present research the KIII network is still implemented on a digital computer, which differs fundamentally from the real, analog olfactory neural system.
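The noise-level experiment of Sec. 3 (classification rate rising to a plateau and then falling, Fig. 5) follows a simple protocol that can be outlined in code. In the harness below the KIII network is replaced by a hypothetical nearest-template classifier over random binary "numeral" templates, so the sweep only illustrates the experimental procedure; a plain template matcher will not reproduce the chaotic-resonance plateau that the KIII network exhibits.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-ins for the 10 numeral classes: random 64-pixel templates.
templates = rng.integers(0, 2, size=(10, 64)).astype(float)

def classify(x):
    """Placeholder for the KIII readout: nearest template by Euclidean distance."""
    return int(np.argmin(((templates - x) ** 2).sum(axis=1)))

def accuracy_at_noise(noise_level, trials=200):
    """Classification rate on noisy copies of the templates."""
    correct = 0
    for _ in range(trials):
        label = int(rng.integers(0, 10))
        x = templates[label] + noise_level * rng.standard_normal(64)
        correct += classify(x) == label
    return correct / trials

# Sweep the additive-noise level, mirroring the protocol behind Fig. 5.
for nl in (0.0, 0.5, 1.0, 2.0):
    print(nl, accuracy_at_noise(nl))
```

Swapping the placeholder `classify` for a trained KIII readout would turn this harness into the paper's actual experiment, with the noise level as the single swept parameter.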


Citations
Journal ArticleDOI
TL;DR: The feasibility of interpreting neurophysiological data in the context of many-body physics is explored by using tools that physicists have devised to analyze comparable hierarchies in other fields of science using concepts of energy dissipation, the maintenance by cortex of multiple ground states corresponding to AM patterns, and the exclusive selection by spontaneous breakdown of symmetry of single states in sequential phase transitions.
Abstract: Neural activity patterns related to behavior occur at many scales in time and space from the atomic and molecular to the whole brain. Patterns form through interactions in both directions, so that the impact of transmitter molecule release can be analyzed to larger scales through synapses, dendrites, neurons, populations and brain systems to behavior, and control of that release can be described step-wise through transforms to smaller scales. Here we explore the feasibility of interpreting neurophysiological data in the context of many-body physics by using tools that physicists have devised to analyze comparable hierarchies in other fields of science. We focus on a mesoscopic level that offers a multi-step pathway between the microscopic functions of neurons and the macroscopic functions of brain systems revealed by hemodynamic imaging. We use electroencephalographic (EEG) records collected from high-density electrode arrays fixed on the epidural surfaces of primary sensory and limbic areas in rabbits and cats trained to discriminate conditioned stimuli (CS) in the various modalities. High temporal resolution of EEG signals with the Hilbert transform gives evidence for diverse intermittent spatial patterns of amplitude (AM) and phase modulations (PM) of carrier waves that repeatedly re-synchronize in the beta and gamma ranges in very short time lags over very long distances. The dominant mechanism for neural interactions by axodendritic synaptic transmission should impose distance-dependent delays on the EEG oscillations owing to finite propagation velocities and sequential synaptic delays. It does not. EEGs show evidence for anomalous dispersion: neural populations have a low velocity range of information and energy transfers, and a high velocity range of the spread of phase transitions. This distinction labels the phenomenon but does not explain it. 
In this report we analyze these phenomena using concepts of energy dissipation, the maintenance by cortex of multiple ground states corresponding to AM patterns, and the exclusive selection by spontaneous breakdown of symmetry (SBS) of single states in sequential phase transitions.

227 citations

Proceedings ArticleDOI
21 Sep 2009
TL;DR: A dynamical Theory-of-Mind (ToM) is presented to interpret experimental findings and it is proposed that meaningful knowledge is continuously created, processed, and dissipated in the form of sequences of oscillatory patterns of neural activity described through spatio-temporal phase transitions.
Abstract: Human cognition performs a granulation of the seemingly homogeneous temporal sequences of perceptual experiences into meaningful and comprehendible chunks of fuzzy concepts and complex behavioral schemas, which are accessed during future action selection and decisions. In this work a dynamical Theory-of-Mind (ToM) is presented to interpret experimental findings. In our approach meaningful knowledge is continuously created, processed, and dissipated in the form of sequences of oscillatory patterns of neural activity described through spatio-temporal phase transitions. The proposed approach has been implemented in computational and robotic environments.

74 citations

Journal ArticleDOI
TL;DR: A chaotic neural network entitled KIII, which modeled olfactory systems, applied to an electronic nose to discriminate six typical volatile organic compounds (VOCs) in Chinese rice wines, has a good performance in classification of these VOCs of different concentrations.
Abstract: Artificial neural networks (ANNs) are generally considered as the most promising pattern recognition method to process the signals from a chemical sensor array of electronic noses, which makes the system more bionics. This paper presents a chaotic neural network entitled KIII, which modeled olfactory systems, applied to an electronic nose to discriminate six typical volatile organic compounds (VOCs) in Chinese rice wines. Thirty-two-dimensional feature vectors of a sensor array consisting of eight sensors, in which four features were extracted from the transient response of each TGS sensor, were input into the KIII network to investigate its generalization capability for concentration influence elimination and sensor drift counteraction. In comparison with the conventional back propagation trained neural network (BP-NN), experimental results show that the KIII network has a good performance in classification of these VOCs of different concentrations and even for the data obtained 1 month later than the training set. Its robust generalization capability is suitable for electronic nose applications to reduce the influence of concentration and sensor drift.

73 citations



Journal ArticleDOI
TL;DR: Brain-machine interfaces (BMI) offer a means to understand the downward sequence through correlation of behavior with motor cortical activity, beginning with macroscopic goal states and concluding with recording of microscopic MSA trajectories that operate neuroprostheses.
Abstract: Neocortical state variables are defined and evaluated at three levels: microscopic, using multiple spike activity (MSA); mesoscopic, using local field potentials (LFP) and electrocorticograms (ECoG); and macroscopic, using electroencephalograms (EEG) and brain imaging. Transactions between levels occur in all areas of cortex, upwardly by integration (abstraction, generalization) and downwardly by differentiation (speciation). The levels are joined by circular causality: microscopic activity upwardly creates mesoscopic order parameters, which downwardly constrain the microscopic activity that creates them. Integration dominates in sensory cortices. Microscopic activity evoked by receptor input in sensation induces emergence of mesoscopic activity in perception, followed by integration of perceptual activity into macroscopic activity in concept formation. The reverse process dominates in motor cortices, where the macroscopic activity embodying the concepts supports predictions of future states as goals. These macroscopic states are conceived to order mesoscopic activity in patterns that constitute plans for actions to achieve the goals. These planning patterns are conceived to provide frames in which the microscopic activity evolves in trajectories adapted to the immediate environmental conditions detected by new stimuli. This circular sequence forms the action-perception cycle. Its upward limb is understood through correlation of sensory cortical activity with behavior. Now brain-machine interfaces (BMI) offer a means to understand the downward sequence through correlation of behavior with motor cortical activity, beginning with macroscopic goal states and concluding with recording of microscopic MSA trajectories that operate neuroprostheses. Part 1 develops a hypothesis that describes qualitatively the neurodynamics that supports the action-perception cycle and derivative reflex arc. Part 2 describes episodic, "cinematographic" spatial pattern formation and predicts some properties of the macroscopic and mesoscopic frames by which the embedded trajectories of the microscopic activity of cortical sensorimotor neurons might be organized and controlled.

67 citations


Cites background from "A study on a bionic pattern classif..."

  • ...These state variables then may also serve as variables in analytic equations that express the dynamics revealed by data-driven models in nonlinear differential equations (Freeman 1975/2004) forming K-sets (Kozma and Freeman 2001; Principe et al. 2001; Kozma et al. 2003; Li et al. 2006) and neuropercolation theory (Kozma et al. 2004)....

    [...]


References
Book
01 Jan 1988

8,937 citations

Journal ArticleDOI
The Organization of Behavior [Hebb, 1949].

3,986 citations


"A study on a bionic pattern classif..." refers background in this paper

  • ...When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A’s efficiency, as one of the cells firing B, is increased [Hebb, 1949]....

    [...]

Book
01 Sep 1991
TL;DR: This two-volume set is an authoritative, comprehensive, modern work on computer vision that covers all of the different areas of vision with a balanced and unified approach.
Abstract: From the Publisher: This two-volume set is an authoritative, comprehensive, modern work on computer vision that covers all of the different areas of vision with a balanced and unified approach. The discussion in "Volume I" focuses on image in, and image out or feature set out. "Volume II" covers the higher level techniques of illumination, perspective projection, analytical photogrammetry, motion, image matching, consistent labeling, model matching, and knowledge-based vision systems.

3,571 citations

Book
30 Jan 2012
TL;DR: The dynamics of neural interaction and transmission, including spatial mapping of evoked brain potentials and EEGs to define population state variables, and the use of behavioral correlates to optimize filters for gamma AM pattern classification are discussed.
Abstract: Prologue.- I The dynamics of neural interaction and transmission.- 1. Spatial mapping of evoked brain potentials and EEGs to define population state variables.- 2. Linear models of impulse inputs and linear basis functions for measuring impulse responses.- 3. Rational approximations in the complex plane for Laplace transforms of transcendental linear operators.- 4. Root locus analysis of piecewise linearized models with multiple feedback loops and unilateral or bilateral saturation.- 5. Opening feedback loops with surgery and anesthesia, closing them with noise.- 6. Three degrees of freedom in neural populations: Arousal, learning, and bistability.- 7. Analog computation to model responses based on linear integration, modifiable synapses, and nonlinear trigger zones.- 8. Stability analysis to derive and regulate homeostatic set points for negative feedback loops.- II Designation of contents as meaning, not information.- 9. Multichannel recording to reveal the "code" of the cortex: spatial patterns of amplitude modulation (AM) of mesoscopic carrier waves.- 10. Relations between microscopic and mesoscopic levels shown by calculating pulse probability conditional on EEG amplitude, giving the asymmetric sigmoid function.- 11. Euclidean distance in 64-space and the use of behavioral correlates to optimize filters for gamma AM pattern classification.- 12. Simulating gamma waveforms, AM patterns and 1/f² spectra by means of mesoscopic chaotic neurodynamics.- 13. Tuning curves to optimize temporal segmentation and parameter evaluation of adaptive filters for neocortical EEG.- 14. Stochastic differential equations and random number generators minimize numerical instabilities in digital simulations.- Epilogue: Problems for further development in mesoscopic brain dynamics.- References.- Author Index.

312 citations


"A study on a bionic pattern classif..." refers background or methods in this paper

  • ...…the weights between corresponding nodes, in accordance with the biological increase and decrease of the synaptic connection strengths, which have been evaluated by curve-fitting of solutions to the equations to impulse responses of the olfactory system to electrical stimuli [Freeman, 2000]....

    [...]

  • ...As described in the learning rule [Freeman, 2000a], the period with stimulus patterns is divided into five segments to calculate the nodes’ activity with each segment lasting 40 ms....

    [...]

  • ...…et al., 1997], which introduced “Stochastic Chaos” and made the KIII model free from the sensitivity to variation of parameters and initial conditions, and provided a high-dimensional chaotic system capable of rapid and reliable pattern classification without gradient descent [Freeman, 2000b]....

    [...]

  • ...In recent years, the theory of chaos has been used to understand the mesoscopic neural dynamics, which is at the level of self-organization at which neural populations can create novel activity patterns [Freeman, 2000a]....

    [...]

  • ...…the gain values of the lateral connections, feedforward and feedback loops, were optimized by measuring the olfactory evoked potentials and EEG, simulating their waveforms and statistical properties, and fitting the simulated functions to the data by means of nonlinear regression [Freeman, 2000c]....

    [...]

Journal ArticleDOI
TL;DR: A theory of stochastic chaos is developed, in which aperiodic outputs with 1/f2 spectra are formed by the interaction of globally connected nodes that are individually governed by point attractors under perturbation by continuous white noise.
Abstract: A fundamental tenet of the theory of deterministic chaos holds that infinitesimal variation in the initial conditions of a network that is operating in the basin of a low-dimensional chaotic attractor causes the various trajectories to diverge from each other quickly. This "sensitivity to initial conditions" might seem to hold promise for signal detection, owing to an implied capacity for distinguishing small differences in patterns. However, this sensitivity is incompatible with pattern classification, because it amplifies irrelevant differences in incomplete patterns belonging to the same class, and it renders the network easily corrupted by noise. Here a theory of stochastic chaos is developed, in which aperiodic outputs with 1/f² spectra are formed by the interaction of globally connected nodes that are individually governed by point attractors under perturbation by continuous white noise. The interaction leads to a high-dimensional global chaotic attractor that governs the entire array of nodes. An example is our spatially distributed KIII network that is derived from studies of the olfactory system, and that is stabilized by additive noise modeled on biological noise sources. Systematic parameterization of the interaction strengths corresponding to synaptic gains among nodes representing excitatory and inhibitory neuron populations enables the formation of a robust high-dimensional global chaotic attractor. Reinforcement learning from examples of patterns to be classified using habituation and association creates lower dimensional local basins, which form a global attractor landscape with one basin for each class. Thereafter, presentation of incomplete examples of a test pattern leads to confinement of the KIII network in the basin corresponding to that pattern, which constitutes many-to-one generalization. The capture after learning is expressed by a stereotypical spatial pattern of amplitude modulation of a chaotic carrier wave.
Sensitivity to initial conditions is no longer an issue. Scaling of the additive noise as a parameter optimizes the classification of data sets in a manner that is comparable to stochastic resonance. The local basins constitute dynamical memories that solve difficult problems in classifying data sets that are not linearly separable. New local basins can be added quickly from very few examples without loss of existing basins. The attractor landscape enables the KIII set to provide an interface between noisy, unconstrained environments and conventional pattern classifiers. Examples given here of its robust performance include fault detection in small machine parts and the classification of spatiotemporal EEG patterns from rabbits trained to discriminate visual stimuli.

202 citations


"A study on a bionic pattern classif..." refers background or methods in this paper

  • ...After reinforcement learning to discriminate classes of different patterns, the system forms a landscape of low-dimensional local basins, with one basin for each pattern class [Kozma & Freeman, 2001]....

    [...]

  • ...The values of s = 5 and K = 0.4 are chosen based on the previous experiments of the application of KIII model [Kozma & Freeman, 2001; Principe et al., 2001; Yao & Freeman, 1989]....

    [...]

  • ...…recognition, which simulated an aspect of the biological intelligence, as demonstrated by previous applications of the KIII network to recognition of one-dimensional sequences, industrial data and spatiotemporal EEG patterns [Kozma & Freeman, 2001; Principe et al., 2001; Yao & Freeman, 1989]....

    [...]

  • ...In previous work done by Kozma [Kozma & Freeman, 2001], an optimal noise/signal ratio was found for the best classification performance of KIII, which was named “chaotic resonance” in comparison to stochastic resonance....

    [...]

Frequently Asked Questions (1)
Q1. What have the authors contributed in "A study on a bionic pattern classifier based on olfactory neural system"?

In this paper, the authors proposed a new application example of the KIII network for recognition of handwritten numerals.