Journal Article

The basins of attraction of a new Hopfield learning rule

Amos Storkey et al.
01 Jul 1999 - Neural Networks, Vol. 12, Iss. 6, pp. 869-876
TL;DR: The nature of the basins of attraction of a Hopfield network is as important as its capacity. A new learning rule is re-introduced that has a higher capacity than the Hebb rule while retaining important functionality, such as incrementality and locality, that the pseudo-inverse rule lacks.
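For orientation, a minimal Python sketch of such an incremental update alongside the Hebb rule, assuming the standard +/-1 formulation of the Storkey rule with local fields h_ij = sum_{k != i,j} W_ik xi_k; the function names are mine, not the paper's, and this is a sketch rather than the paper's implementation.

    import numpy as np

    def hebb_increment(W, xi):
        """Incremental Hebb update for one +/-1 pattern xi (length-n vector)."""
        n = xi.size
        return W + np.outer(xi, xi) / n

    def storkey_increment(W, xi):
        """Incremental Storkey-style update: the Hebb term plus local-field
        correction terms that reduce crosstalk between stored patterns.
        Uses h[i, j] = sum_{k != i, j} W[i, k] * xi[k]."""
        n = xi.size
        s = W @ xi  # s[i] = sum_k W[i, k] * xi[k]
        h = s[:, None] - (np.diag(W) * xi)[:, None] - W * xi[None, :]
        # Delta W[i, j] = (xi_i xi_j - xi_i h_ji - h_ij xi_j) / n
        return W + (np.outer(xi, xi) - xi[:, None] * h.T - h * xi[None, :]) / n

    # usage: W = np.zeros((n, n)); W = storkey_increment(W, xi)

Both updates are incremental (patterns can be presented one at a time), and the Storkey update remains local in the sense that the change to W_ij depends only on quantities available at units i and j, which is the functionality the TL;DR highlights.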
About
This article was published in Neural Networks on 1999-07-01 and has received 85 citations to date. It focuses on the topics: Learning rule & Attraction.


Citations
Journal Article

A novel memristive neural network with hidden attractors and its circuitry implementation

TL;DR: Interestingly, the memristive neural network can generate hyperchaotic attractors without any equilibrium points, and a circuit implementation of the network is presented to demonstrate its feasibility.
Journal Article

Emergent States in Virtual Teams: A Complex Adaptive Systems Perspective

TL;DR: The complex adaptive systems (CAS) perspective is used to integrate the literature on emergent states in virtual teams (VTs). The paper provides an overview of artificial simulation models and of simulation results concerning the emergence of the four states described in the CAS framework, and discusses several ways to improve the accuracy of the simulation models using empirical data collected in real VTs.
Journal Article

Fractional Hopfield Neural Networks: Fractional Dynamic Associative Recurrent Neural Networks

TL;DR: A novel conceptual framework, fractional Hopfield neural networks (FHNN), is proposed in the form of an analog circuit built from a fractor and the fractional steepest-descent approach; its Lyapunov function is constructed, and FHNN is applied to defending against chip cloning attacks for anticounterfeiting.
Journal Article

A bidirectional heteroassociative memory for binary and grey-level patterns

TL;DR: This paper introduces a new bidirectional heteroassociative memory model that uses a simple self-convergent iterative learning rule and a new nonlinear output function; the model can learn online without being subject to overlearning.
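As background (not the paper's model, which uses its own self-convergent learning rule), a minimal Python sketch of classical Kosko-style bidirectional associative recall for +/-1 pattern pairs; all names are mine.

    import numpy as np

    def sgn(v):
        """Sign with ties broken to +1, keeping states in {-1, +1}."""
        return np.where(v >= 0, 1, -1)

    def bam_store(pairs):
        """Hebbian cross-layer weights W = sum_mu outer(x_mu, y_mu)."""
        return sum(np.outer(x, y) for x, y in pairs)

    def bam_recall(W, x, iters=10):
        """Alternate layer updates x -> y -> x until the pair stabilises."""
        for _ in range(iters):
            y = sgn(W.T @ x)
            x = sgn(W @ y)
        return x, y

Recall runs in both directions: probing with x retrieves its paired y, and vice versa, which is what makes the memory heteroassociative.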
Journal Article

Almost periodic attractor of delayed neural networks with variable coefficients

TL;DR: In this paper, several sufficient conditions are obtained for the existence of an almost periodic attractor of delayed neural networks with variable coefficients, based on the theory of exponential dichotomy, the Lyapunov method, and the Halanay inequality technique.
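For context, such results are typically stated for systems of the following form, with almost periodic coefficients c_i, a_ij, b_ij, delays tau_ij, and inputs I_i; this is my reconstruction of the standard setting, and the paper's exact model may differ:

    \dot{x}_i(t) = -c_i(t)\,x_i(t)
      + \sum_{j=1}^{n} a_{ij}(t)\, f_j\big(x_j(t)\big)
      + \sum_{j=1}^{n} b_{ij}(t)\, f_j\big(x_j(t - \tau_{ij}(t))\big)
      + I_i(t), \qquad i = 1, \dots, n.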
References
Journal Article

Neural networks and physical systems with emergent collective computational abilities

TL;DR: A model of a system having a large number of simple equivalent components, based on aspects of neurobiology but readily adapted to integrated circuits, produces a content-addressable memory which correctly yields an entire memory from any subpart of sufficient size.
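A minimal Python sketch of the content-addressable recall described here, assuming Hebbian storage of +/-1 patterns and asynchronous updates; names and details are my choices, not taken from the paper.

    import numpy as np

    def store(patterns):
        """Hebbian weight matrix for a list of +/-1 patterns, zero diagonal."""
        n = patterns[0].size
        W = sum(np.outer(p, p) for p in patterns) / n
        np.fill_diagonal(W, 0.0)
        return W

    def recall(W, probe, sweeps=20, seed=0):
        """Asynchronous sign updates; each sweep visits units in random order.
        The state descends the network's energy function and settles at a
        stored pattern, completing the memory from a partial or noisy probe."""
        rng = np.random.default_rng(seed)
        s = probe.copy()
        for _ in range(sweeps):
            for i in rng.permutation(s.size):
                s[i] = 1 if W[i] @ s >= 0 else -1
        return s

Starting recall from a corrupted copy of a stored pattern illustrates the "entire memory from any subpart of sufficient size" behaviour the abstract describes.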
Book

Probability and random processes

TL;DR: In this book, the authors survey the history and varieties of probability and the laws of chance, and treat their application in the context of Markov chains and the convergence of random variables.
Journal Article

The capacity of the Hopfield associative memory

TL;DR: In this paper, the capacity of the Hopfield associative memory is studied under the restriction that every one of the m fundamental memories must be exactly recoverable.
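The headline asymptotic result of that analysis, stated here from memory for orientation rather than quoted from the paper: for all m fundamental memories to be exact fixed points with probability approaching 1 as the number of neurons n grows,

    m_{\max} \sim \frac{n}{4 \ln n} \qquad (n \to \infty),

with the constant relaxing to 1/(2 ln n) if only a single prescribed memory must be recoverable.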
Journal Article

Statistical neurodynamics of associative memory

TL;DR: A new statistical neurodynamical method is proposed for analyzing the non-equilibrium dynamical behaviors of an autocorrelation associative memory model; the method explains the model's strange behaviors in terms of the unusual shapes of its basins of attraction.
Journal Article

Learning algorithms with optimal stability in neural networks

TL;DR: The authors motivate this proposal and provide optimal-stability learning rules for two different choices of normalisation of the synaptic matrix (J_ij); numerical work is presented that gives the value of the optimal stability for random uncorrelated patterns.
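For reference, the stability parameter this literature optimises is usually defined as follows (standard notation in my rendering, not quoted from the paper): for pattern xi^mu and unit i,

    \Delta_i^{\mu} = \frac{\xi_i^{\mu} \sum_{j \ne i} J_{ij}\, \xi_j^{\mu}}{\lVert \mathbf{J}_i \rVert},
    \qquad \text{maximise } \kappa = \min_{i,\,\mu} \Delta_i^{\mu},

so that every stored pattern is a fixed point with the largest possible margin; the two normalisations mentioned in the TL;DR correspond to different choices of the norm ||J_i||.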