Showing papers in "Neural Networks in 2010"
TL;DR: This work has demonstrated experimentally the formation of associative memory in a simple neural network consisting of three electronic neurons connected by two memristor-emulator synapses and opens up new possibilities in the understanding of neural processes using memory devices.
840 citations
TL;DR: The iCub is described, which was designed to support collaborative research in cognitive development through autonomous exploration and social interaction and which has attracted a growing community of users and developers.
549 citations
TL;DR: A novel regressor that determines a pair of ε-insensitive up- and down-bound functions by solving two related SVM-type problems, each of which is smaller than the problem in a classical SVR.
382 citations
TL;DR: A minimal architecture for joint action is suggested that focuses on representations, action monitoring and action prediction processes, as well as ways of simplifying coordination, for understanding how coordination can be facilitated by exploiting relations between multiple agents' actions and between actions and the environment.
320 citations
TL;DR: A comprehensive overview of competitive-learning-based clustering methods is given, and two examples demonstrate the use of the clustering methods.
273 citations
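The core update behind competitive-learning clustering is simple to sketch: prototypes compete for each sample and only the winner moves. A minimal winner-take-all sketch, not taken from the overview itself (the function name and all parameters are illustrative):

```python
import numpy as np

def competitive_learning(X, k, lr=0.1, epochs=20, seed=0):
    """Winner-take-all competitive learning: for each presented sample,
    only the closest prototype is moved toward it."""
    rng = np.random.default_rng(seed)
    W = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            x = X[i]
            w = np.argmin(np.linalg.norm(W - x, axis=1))  # winning prototype
            W[w] += lr * (x - W[w])                       # update the winner only
    return W

# Two well-separated blobs; the prototypes should settle near the centres.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.1, (50, 2)), rng.normal(5.0, 0.1, (50, 2))])
protos = competitive_learning(X, k=2)
```

Even if both prototypes start in the same blob, the one marginally closer to the other blob wins all of its samples and drifts there, which is the self-organizing behaviour the overview surveys.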
TL;DR: Theoretical properties of surprise are discussed, in particular how it differs from and complements Shannon's definition of information.
266 citations
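Surprise in this sense is commonly quantified as the KL divergence between posterior and prior beliefs, whereas Shannon information depends only on the probability of the data. A toy sketch under that assumption (the two-hypothesis coin model and all numbers are invented for illustration):

```python
from math import log

def bayesian_surprise(prior, likelihoods):
    """Update a discrete belief with Bayes' rule and return the posterior
    together with the surprise KL(posterior || prior), in bits."""
    unnorm = [p * l for p, l in zip(prior, likelihoods)]
    z = sum(unnorm)
    post = [u / z for u in unnorm]
    kl = sum(q * log(q / p, 2) for q, p in zip(post, prior) if q > 0)
    return post, kl

# Two hypotheses about a coin: fair (P(heads)=0.5) vs biased (P(heads)=0.9).
prior = [0.5, 0.5]
post, s = bayesian_surprise(prior, likelihoods=[0.5, 0.9])  # observed: heads
```

The observed head carries about 0.51 bits of Shannon information (P(heads) = 0.7), yet only about 0.06 bits of surprise, because it barely shifts the belief; this is exactly the kind of divergence between the two notions the paper analyzes.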
TL;DR: This method estimates a likelihood gradient by sampling directly in parameter space, which leads to lower variance gradient estimates than obtained by regular policy gradient methods, and shows that the improvement is largest when the parameter samples are drawn symmetrically.
258 citations
TL;DR: A novel approach to stratified sampling, based on Neyman sampling of the self-organizing map (SOM), is developed, which shows that the SOM-based approach can be used with greater confidence than other approaches, especially in the case of non-uniform datasets, with the benefit of scalability to perform data splitting on large datasets.
195 citations
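The Neyman-allocation step at the heart of such stratified sampling is easy to sketch: each stratum receives samples in proportion to its size times its internal standard deviation. In the paper the strata come from SOM units; here two hand-made strata stand in for SOM clusters (an assumption for illustration):

```python
import numpy as np

def neyman_allocation(strata, n_total):
    """Neyman allocation: sample count per stratum proportional to
    N_h * sigma_h (stratum size times within-stratum std dev)."""
    weights = np.array([len(s) * np.std(s) for s in strata])
    alloc = weights / weights.sum() * n_total
    return np.maximum(1, np.round(alloc).astype(int))

# Hypothetical strata standing in for SOM clusters: one tight, one spread out.
rng = np.random.default_rng(0)
strata = [rng.normal(0.0, 0.1, 500), rng.normal(0.0, 5.0, 500)]
sizes = neyman_allocation(strata, n_total=100)
```

The high-variance stratum absorbs nearly the whole budget, which is why this scheme splits non-uniform datasets more reliably than uniform random sampling.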
TL;DR: It is proposed that activation of this distributed network during coordinated attention enhances the depth of information processing and encoding beginning in the first year of life and with development, joint attention becomes internalized as the capacity to socially coordinate mental attention to internal representations.
184 citations
TL;DR: A novel probabilistic spiking neuron model (pSNM) is proposed and ways of building pSNN for a wide range of applications including classification, string pattern recognition and associative memory are suggested.
157 citations
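The defining feature of such probabilistic spiking neuron models is that synaptic transmission itself is stochastic. A heavily simplified sketch of that idea, not the paper's pSNM (the function name, the leaky integrate-and-fire dynamics, and all parameters are assumptions):

```python
import random

def psn_spike_train(inputs, p_syn=0.8, theta=2.0, leak=0.9, seed=0):
    """Probabilistic spiking neuron sketch: each incoming spike is
    transmitted with probability p_syn, the membrane potential leaks,
    and the neuron fires (and resets) when it crosses theta."""
    rnd = random.Random(seed)
    v, out = 0.0, []
    for x in inputs:  # x = number of incoming spikes this time step
        v = leak * v + sum(1.0 for _ in range(x) if rnd.random() < p_syn)
        if v >= theta:
            out.append(1)
            v = 0.0
        else:
            out.append(0)
    return out

spikes = psn_spike_train([3, 0, 0, 2, 1, 0, 4])
```

Because transmission is probabilistic, the same input train can yield different output trains across runs, which is what makes such models suitable for the string-pattern and associative-memory tasks mentioned above.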
TL;DR: A new modular and integrative sensory information system, inspired by the way the brain performs information processing and in particular pattern recognition, is presented and trained to perform the specific task of person authentication.
TL;DR: This work located domains of bursting synchronization suppression in terms of perturbation strength and time delay, and presents computational evidence that synchronization suppression is easier in scale-free networks than in the more commonly studied global (mean-field) networks.
TL;DR: The hypothesis tested in this study is that infants use information derived from an entity's interactions with other agents as evidence about whether that entity is a perceiver.
TL;DR: It is argued that a 'culture as patterned practices' approach obviates a rigid nature-culture distinction, avoids the problems involved in conceptualizing 'culture' as a homogeneous grouping variable, and suggests that becoming a competent participant in particular practices may affect both subjective and objective measures of behavior and brain activity.
TL;DR: The software package, named SPYCODE, was mainly developed to respond to the above constraints and generally to offer the scientific community a 'smart' tool for multi-channel data processing.
TL;DR: A modification of ν-support vector machines (ν-SVM) for regression and classification is described, and the use of a parametric insensitive/margin model with an arbitrary shape is demonstrated.
TL;DR: A delay&sum readout is introduced, which adds trainable delays to the synaptic connections of output neurons and thereby vastly improves the memory capacity of echo state networks.
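A quick way to see why delayed readout taps help: if the readout may tap the reservoir trace d steps in the past, a recall task at lag k only requires k - d steps of intrinsic memory. A minimal sketch with one fixed, hand-chosen delay, whereas the paper learns per-connection delays (reservoir size, scaling, and the task are all illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
n, T = 30, 2000
# Random reservoir scaled into the echo-state regime (assumed setup).
W = rng.normal(0.0, 1.0, (n, n))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
w_in = 0.2 * rng.normal(0.0, 1.0, n)   # small input scaling keeps dynamics near-linear
u = rng.uniform(-1.0, 1.0, T)

x = np.zeros(n)
S = np.empty((T, n))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])
    S[t] = x

def nmse(features, target):
    """Least-squares linear readout; normalised MSE of the fit."""
    A = np.hstack([features, np.ones((len(features), 1))])
    w, *_ = np.linalg.lstsq(A, target, rcond=None)
    return np.mean((A @ w - target) ** 2) / np.var(target)

# Task: output u(t - k). The plain readout sees S[t]; the delayed readout
# taps the trace d steps back, so it needs only k - d steps of memory.
k, d = 40, 35
target = u[:T - k]                # u(t - k) for t = k .. T-1
plain = nmse(S[k:], target)
delayed = nmse(S[k - d:T - d], target)
```

With a 30-neuron reservoir, lag 40 lies beyond the intrinsic memory span, so the plain readout should fail (NMSE near 1) while the delayed readout, needing only 5 steps of memory, should fit far better.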
TL;DR: A new robust learning algorithm is proposed which produces a sparse kernel model with the capability of learning regularized parameters and kernel hyperparameters and is demonstrated to possess considerable computational advantages.
TL;DR: This paper investigates neural networks with a class of nondecreasing piecewise linear activation functions with 2r corner points and shows that the proposed n-neuron dynamical systems can have, and only have, (2r+1)^n equilibria under some conditions, some of which are locally exponentially stable while the others are unstable.
TL;DR: This work proposes an architecture in which dynamic neural networks create stable states at each stage of a sequence by exploiting neural attractors triggered by a neural representation of a condition of satisfaction for each action.
TL;DR: This paper evaluates techniques for automatic neural network modelling within a Bayesian framework, as applied to six samples containing daily load and weather data for four different countries, and analyzes input selection as carried out by the Bayesian 'automatic relevance determination'.
TL;DR: The improvement in performance of Radial Basis Function Networks obtained with several imputation methods is analyzed for classification tasks with missing values, showing that imputation can overcome the negative impact of missing values to a certain degree.
TL;DR: A new view of ASD is provided: a disorder defined by alterations in developmental trajectories across multiple domains, emerging as the loss of social communication skills in the period between 9 and 24 months.
TL;DR: The formulas proved in this paper are equations of states in statistical estimation because they hold for any true distribution, any parametric model, and any a priori distribution.
TL;DR: This work draws on simulation theory as a framework for imagining others' internal states as a means for imaginative play and raises the question of the veracity of children's beliefs about personified robots.
TL;DR: This work develops an analytical model that allows the calculation of the memory function for continuous-time linear dynamical systems, which can be considered as networks of linear leaky integrator neurons, and investigates memory properties for different types of reservoir.
TL;DR: This paper shows how, in the absence of knowledge to define the potential function manually, this function can be learned online in parallel with the actual reinforcement learning process.
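Potential-based shaping adds F(s, s') = γΦ(s') - Φ(s) to each reward, which speeds up learning without changing the optimal policy. A sketch of the "learn the potential online" idea on a toy chain MDP, where Φ is estimated as a running value function alongside Q-learning (the function name, the chain task, and all hyperparameters are illustrative assumptions, not the paper's setup):

```python
import random

def q_learning_shaped(n=10, episodes=500, alpha=0.3, gamma=0.95, eps=0.2, seed=0):
    """Q-learning on a 1-D chain (goal at the right end) with
    potential-based shaping F = gamma*Phi(s') - Phi(s), where Phi is
    itself learned online as a TD(0) value estimate."""
    rnd = random.Random(seed)
    Q = [[0.0, 0.0] for _ in range(n)]   # actions: 0 = left, 1 = right
    phi = [0.0] * n                      # learned potential
    for _ in range(episodes):
        s = 0
        while s != n - 1:
            if rnd.random() < eps:
                a = rnd.randrange(2)
            else:
                a = 0 if Q[s][0] > Q[s][1] else 1
            s2 = min(n - 1, max(0, s + (1 if a == 1 else -1)))
            r = 1.0 if s2 == n - 1 else 0.0
            shaped = r + gamma * phi[s2] - phi[s]       # shaped reward
            Q[s][a] += alpha * (shaped + gamma * max(Q[s2]) - Q[s][a])
            phi[s] += alpha * (r + gamma * phi[s2] - phi[s])  # potential learned online
            s = s2
    return Q

Q = q_learning_shaped()
policy = [0 if Q[s][0] > Q[s][1] else 1 for s in range(9)]
```

Because the shaping term telescopes along any trajectory, the greedy policy extracted from Q still points toward the goal even though Φ was never supplied by hand.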
TL;DR: This paper applies the method of Lyapunov functions for differential equations with piecewise constant argument of generalized type to a model of recurrent neural networks (RNNs) and obtains sufficient conditions for global exponential stability of the equilibrium point.
TL;DR: A delay partition approach is proposed to derive a delay-dependent condition under which the resulting error system is globally asymptotically stable.
TL;DR: The number of nonzero coefficients in 1-norm SVMs is at most equal to the number of exact support vectors lying on the +1 and -1 discriminating surfaces, which implies that 1-norm SVMs have better sparseness than standard SVMs.