Showing papers in "Neural Networks in 2000"
TL;DR: The basic theory and applications of ICA are presented; the goal is to find a linear representation of non-Gaussian data in which the components are statistically independent, or as independent as possible.
8,231 citations
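The highly cited survey summarized above is closely associated with the FastICA fixed-point algorithm. A minimal sketch of that idea, assuming a whitened two-source toy problem (the data, nonlinearity choice, and iteration counts here are illustrative, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two non-Gaussian sources: a sawtooth and uniform noise.
n = 2000
t = np.linspace(0, 8, n)
S = np.c_[np.mod(t, 1.0) - 0.5, rng.uniform(-0.5, 0.5, n)].T  # (2, n)

# Observed data is a linear mixture X = A S; ICA tries to invert A.
A = np.array([[1.0, 0.6], [0.4, 1.0]])
X = A @ S

# Whiten: zero mean, identity covariance.
X -= X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = E @ np.diag(d ** -0.5) @ E.T @ X

# FastICA fixed-point iteration with g(u) = tanh(u), deflation per component.
W = np.zeros((2, 2))
for i in range(2):
    w = rng.standard_normal(2)
    w /= np.linalg.norm(w)
    for _ in range(200):
        wx = w @ Z
        g, gp = np.tanh(wx), 1.0 - np.tanh(wx) ** 2
        w_new = (Z * g).mean(axis=1) - gp.mean() * w
        w_new -= W[:i].T @ (W[:i] @ w_new)   # decorrelate from found rows
        w_new /= np.linalg.norm(w_new)
        converged = abs(abs(w_new @ w) - 1.0) < 1e-8
        w = w_new
        if converged:
            break
    W[i] = w

Y = W @ Z  # recovered components (up to sign and permutation)
```

Each row of `Y` should match one of the original sources up to sign and ordering, which is the inherent ambiguity of the ICA model.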
TL;DR: For cognitive neuroscience to move forward, a more explicit effort is needed to use neurophysiology to constrain how the brain produces human mental functions; two fundamental features are suggested as critical for this effort.
511 citations
TL;DR: It is found that specific neuroanatomical motifs are uniquely associated with high levels of complexity and that they are embedded in the pattern of long-range cortico-cortical pathways linking segregated areas of the mammalian cerebral cortex.
495 citations
TL;DR: In this article, a growing hierarchical self-organizing map is proposed to organize the documents of a real-world collection according to their similarities; the model evolves into a hierarchical structure according to the requirements of the input data during an unsupervised training process.
280 citations
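The growing hierarchical model extends the plain self-organizing map (SOM). A minimal sketch of the underlying SOM training step on toy data, assuming a fixed 4x4 grid (the growing and hierarchy-building machinery of the cited model is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "documents" as 2-D feature vectors drawn from three clusters.
data = np.vstack([rng.normal(c, 0.1, size=(50, 2))
                  for c in ([0, 0], [1, 0], [0, 1])])

# A 4x4 map of weight vectors; `grid` holds each unit's (row, col) position.
rows, cols = 4, 4
weights = rng.uniform(0, 1, size=(rows * cols, 2))
grid = np.array([(r, c) for r in range(rows) for c in range(cols)], float)

for epoch in range(40):
    lr = 0.5 * (1 - epoch / 40)             # decaying learning rate
    sigma = 2.0 * (1 - epoch / 40) + 0.3    # shrinking neighborhood radius
    for x in rng.permutation(data):
        bmu = np.argmin(((weights - x) ** 2).sum(axis=1))  # best-matching unit
        h = np.exp(-((grid - grid[bmu]) ** 2).sum(axis=1) / (2 * sigma ** 2))
        weights += lr * h[:, None] * (x - weights)

# Quantization error: mean distance from each input to its BMU.
qerr = np.mean([np.min(np.linalg.norm(weights - x, axis=1)) for x in data])
```

In the growing hierarchical variant, a map whose quantization error stays too high is expanded, and units representing overly heterogeneous data spawn child maps.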
TL;DR: It is shown that when the genetic encoding is also left free to evolve, artificial evolution selects for exploiting mechanisms of self-organization; Fitness Space is suggested as a framework for conceiving fitness functions in Evolutionary Robotics.
244 citations
TL;DR: A biologically inspired neural network approach to real-time collision-free motion planning of mobile robots or robot manipulators in a nonstationary environment is proposed; its stability is guaranteed by qualitative analysis and the Lyapunov stability theory.
229 citations
TL;DR: New conditions ensuring existence, uniqueness, and global asymptotic stability of the equilibrium point of Hopfield neural network models with fixed time delays or distributed time delays are presented.
201 citations
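The fixed-delay model class referred to here is conventionally written as follows; this is the standard form from the delayed-Hopfield literature in my notation, not necessarily the paper's exact statement:

```latex
\dot{x}_i(t) = -c_i x_i(t)
  + \sum_{j=1}^{n} a_{ij}\, f_j\big(x_j(t)\big)
  + \sum_{j=1}^{n} b_{ij}\, f_j\big(x_j(t-\tau_{ij})\big) + I_i ,
  \qquad i = 1,\dots,n ,
```

with $c_i > 0$ decay rates, $f_j$ the activation functions, and $I_i$ external inputs. The distributed-delay variant replaces the second sum by an integral of the delayed activations against a delay kernel.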
TL;DR: The hierarchical geometric structure of the parameter space of three-layer perceptrons is investigated in order to show the existence of local minima and plateaus, and it is proved that a critical point of the model with H - 1 hidden units always gives many critical points of the model with H hidden units.
197 citations
TL;DR: This paper shows that the adaptive natural gradient method can be extended to a wide class of stochastic models: regression with an arbitrary noise model and classification with an arbitrary number of classes.
188 citations
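The update family underlying this line of work can be sketched as follows; the notation is the standard one from the natural-gradient literature (an assumption on my part, not copied from the paper):

```latex
\theta_{t+1} = \theta_t - \eta_t\, \hat{F}_t^{-1}\, \nabla_\theta \ell(x_t;\theta_t),
\qquad
\hat{F}_{t+1} = (1-\varepsilon_t)\,\hat{F}_t
  + \varepsilon_t\, \nabla_\theta \ell\, \nabla_\theta \ell^{\top},
```

where $\hat{F}_t$ is an online estimate of the Fisher information matrix $F(\theta) = \mathbb{E}\big[\nabla_\theta \log p(x;\theta)\,\nabla_\theta \log p(x;\theta)^{\top}\big]$. The "adaptive" part is maintaining $\hat{F}_t^{-1}$ recursively rather than inverting $\hat{F}_t$ at every step.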
TL;DR: The article contributes to the quest to relate global data on brain and behavior from PET (positron emission tomography) and fMRI to the underlying neural networks, by using computational models of biological neural circuitry based on animal data to predict and analyze the results of human PET studies.
187 citations
TL;DR: The research demonstrates that the proposed network architecture and the associated learning algorithm are quite effective in modeling the dynamics of complex processes and performing accurate MS predictions.
TL;DR: New results concerning the global asymptotic stability (GAS) and absolute stability (ABST) of delay models of continuous-time neural networks are presented; the criteria are milder than previously known ones and apply to neural networks with a broad range of activation functions, assuming neither differentiability nor strict monotonicity.
TL;DR: This paper provides an overview of the different functional brain imaging methods, the kinds of questions these methods try to address, and some of the questions associated with functional neuroimaging data for which neural modeling must be employed to provide reasonable answers.
TL;DR: It is proved that the proposed neural network can converge globally to the solution set of the problem when the matrix involved in the problem is positive semidefinite, and can converge exponentially to a unique solution when the matrix is positive definite.
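The positive-definite case of this convergence claim can be illustrated on the simplest instance, an unconstrained convex quadratic minimized by a gradient flow; this is a minimal sketch of the general principle, not the cited paper's specific network or problem class:

```python
import numpy as np

# Minimize f(x) = 0.5 x^T Q x + c^T x by Euler-integrating the gradient
# flow dx/dt = -(Q x + c); for positive definite Q the flow converges
# exponentially to the unique minimizer x* = -Q^{-1} c.
Q = np.array([[3.0, 1.0], [1.0, 2.0]])   # symmetric positive definite
c = np.array([-1.0, 1.0])
x_star = -np.linalg.solve(Q, c)          # closed-form minimizer

x = np.zeros(2)
dt = 0.05                                # step small enough for stability
for _ in range(2000):
    x = x + dt * (-(Q @ x + c))
```

With a merely positive semidefinite `Q` the flow still converges, but only to the (possibly non-unique) solution set, matching the weaker guarantee stated above.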
TL;DR: A nonlinear canonical correlation analysis (NLCCA) method is formulated here using three feedforward neural networks; it outperformed CCA when the two sets of variables contained correlated nonlinear structures.
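The linear CCA baseline that NLCCA extends can be sketched via whitening and an SVD; the toy data and function below are illustrative assumptions, and the nonlinear method replaces the linear projections with feedforward networks:

```python
import numpy as np

rng = np.random.default_rng(2)

# Two data views sharing one latent signal z, plus independent noise.
n = 500
z = rng.standard_normal(n)
X = np.c_[z + 0.1 * rng.standard_normal(n), rng.standard_normal(n)]
Y = np.c_[rng.standard_normal(n), z + 0.1 * rng.standard_normal(n)]

def cca_correlations(X, Y):
    """Canonical correlations: singular values of the whitened cross-covariance."""
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)

    def inv_sqrt(C):
        d, E = np.linalg.eigh(C)
        return E @ np.diag(d ** -0.5) @ E.T

    n = len(Xc)
    Cxx, Cyy = Xc.T @ Xc / n, Yc.T @ Yc / n
    Cxy = Xc.T @ Yc / n
    M = inv_sqrt(Cxx) @ Cxy @ inv_sqrt(Cyy)
    return np.linalg.svd(M, compute_uv=False)

rho = cca_correlations(X, Y)  # leading canonical correlation is large,
                              # the second is near zero
```

Linear CCA only finds linear projections with maximal correlation, which is why correlated *nonlinear* structure between the views motivates the neural-network formulation.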
TL;DR: A new model of impulsive autoassociative neural networks is formulated and studied, and several fundamental results for such networks, such as global exponential stability and existence and uniqueness of equilibria, are established.
TL;DR: It is shown how fuzzy lattice neurocomputing (FLN) emerges as a connectionist paradigm in the framework of fuzzy lattices (FL-framework) whose advantages include the capacity to deal rigorously with disparate types of data such as numeric and linguistic data, intervals of values, 'missing' and 'don't care' data.
TL;DR: This article implements a factor analysis model for pre-processing of neurobiological data and shows that an approach to separating noise-contaminated data without knowing the number of independent components is effective.
TL;DR: A computational model developed to investigate the role of parallel basal ganglia-thalamocortical loops in solving tasks that rely on working memory predicts the temporal unfolding of neuronal activity in different brain regions, both in the normal case and in the two disease states.
TL;DR: It is shown that new methods for measuring effective connectivity allow us to characterise the interactions between brain regions that underlie the complex interactions among different processing stages of functional architectures.
TL;DR: The Lyapunov function method is utilized to analyze the stability of continuous nonlinear neural networks with delays and to obtain new sufficient conditions ensuring global asymptotic stability independent of the delays.
TL;DR: In this paper, a spectral analysis of short segments reveals peaks in the classical frequency ranges of the alpha (8-12 Hz), theta (3-7 Hz), beta (13-30 Hz) and gamma (30-100 Hz) bands of the EEG and MEG.
TL;DR: A neural model is developed which suggests how parietal and motor cortical mechanisms, such as difference vector encoding, interact with adaptively timed, predictive cerebellar learning during movement imitation and predictive performance.
TL;DR: Training the derivative of a feedforward neural network with the extended backpropagation algorithm is presented and used to solve a class of first-order partial differential equations for input-to-state linearizable or approximately linearizable systems.
TL;DR: It is shown that the least squares and linear Taylor expansion based approach compares favorably with other analytic approaches, and that it is an efficient and economic alternative to the nonanalytic and computationally intensive bootstrap methods.
TL;DR: In this paper, Chen and Aihara's chaotic simulated annealing (CSA) with decaying self-coupling, Wang and Smith's CSA with decaying timestep, and Hopfield network with chaotic noise are compared in different parameter spaces, with optimization performance measured in terms of feasibility, efficiency, robustness and scalability.
TL;DR: The results showed that the use of the proposed ANN controller can lead to 7.5% annual energy savings in the case of a highly insulated passive solar test cell.
TL;DR: A neural model proposes how the laminar circuits of V1 and V2 generate perceptual groupings that maintain sensitivity to the contrasts and spatial organization of scenic cues, and is used to simulate psychophysical and neurophysiological data about perceptual grouping, including various Gestalt grouping laws.
TL;DR: The simulation results show that a fully connected ULN with three nodes is able to display chaotic behavior, and that the parameters of ULNs can be adjusted so that the maximum Lyapunov exponent approaches the target value.
TL;DR: It is shown that the resulting network improves the approximation results reported for continuous mappings and for those exhibiting a finite number of discontinuities.