Journal ArticleDOI

Global Asymptotic Stability for a Class of Generalized Neural Networks With Interval Time-Varying Delays

TLDR
It is theoretically proven that some novel delay-independent and delay-dependent stability criteria are more effective than some existing ones either for SNNs or for LFNNs, which is confirmed by some numerical examples.
Abstract
This paper is concerned with global asymptotic stability for a class of generalized neural networks (NNs) with interval time-varying delays, which include two classes of fundamental NNs, i.e., static neural networks (SNNs) and local field neural networks (LFNNs), as their special cases. Some novel delay-independent and delay-dependent stability criteria are derived. These stability criteria are applicable not only to SNNs but also to LFNNs. It is theoretically proven that these stability criteria are more effective than some existing ones either for SNNs or for LFNNs, which is confirmed by some numerical examples.
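Stability criteria of this kind are typically expressed as linear matrix inequalities (LMIs) built from a Lyapunov–Krasovskii functional. As a minimal, hypothetical sketch (not the paper's actual criteria), the classical delay-independent LMI for a linear delayed system ẋ(t) = A x(t) + A_d x(t − τ) can be checked numerically in the scalar case; the helper names below are illustrative only:

```python
def det(m):
    # Laplace-expansion determinant; adequate for the small LMI blocks here.
    n = len(m)
    if n == 1:
        return m[0][0]
    return sum(((-1) ** j) * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(n))

def is_negative_definite(m):
    # Sylvester's criterion for negative definiteness:
    # (-1)^k * det(leading k-by-k minor) > 0 for every k = 1..n.
    n = len(m)
    return all(((-1) ** k) * det([row[:k] for row in m[:k]]) > 0
               for k in range(1, n + 1))

def delay_independent_stable(a, ad, p, q):
    """Check the classical delay-independent Lyapunov-Krasovskii condition
    [[A^T P + P A + Q, P Ad], [Ad^T P, -Q]] < 0 with candidates P > 0, Q > 0.
    Scalar case: a, ad, p, q are plain numbers."""
    lmi = [[2 * a * p + q, p * ad],
           [p * ad, -q]]
    return p > 0 and q > 0 and is_negative_definite(lmi)

# x'(t) = -2 x(t) + 0.5 x(t - tau): the LMI is feasible with P = Q = 1,
# so the origin is globally asymptotically stable for every delay tau >= 0.
print(delay_independent_stable(-2.0, 0.5, 1.0, 1.0))  # True
```

Feasibility of such an LMI for some positive P and Q certifies stability regardless of the delay; the delay-dependent criteria in the paper refine this by exploiting the interval bounds on the time-varying delay.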


Citations
Journal ArticleDOI

A Comprehensive Review of Stability Analysis of Continuous-Time Recurrent Neural Networks

TL;DR: The purpose of this paper is to provide a comprehensive review of the research on stability of continuous-time recurrent neural networks, including Hopfield neural networks, Cohen–Grossberg neural networks, and related models.
Journal ArticleDOI

An improved reciprocally convex inequality and an augmented Lyapunov–Krasovskii functional for stability of linear systems with time-varying delay

TL;DR: An optimal reciprocally convex inequality is proposed and a new Lyapunov–Krasovskii functional is tailored for the use of auxiliary function-based integral inequality for stability of a linear system with a time-varying delay.
Journal ArticleDOI

Stability Analysis for Delayed Neural Networks Considering Both Conservativeness and Complexity

TL;DR: A new Lyapunov–Krasovskii functional with simple augmented terms and delay-dependent terms is constructed, and its derivative is estimated by several techniques, including free-weighting matrix and inequality estimation methods.
Journal ArticleDOI

Overview of recent advances in stability of linear systems with time-varying delays

TL;DR: In this article, the authors provide an overview and in-depth analysis of recent advances in stability of linear systems with time-varying delays, including stability criteria for three cases of a time-varying delay.
Journal ArticleDOI

Global asymptotic stability analysis for delayed neural networks using a matrix-based quadratic convex approach.

TL;DR: This paper is concerned with global asymptotic stability for a class of generalized neural networks with interval time-varying delays by constructing a new Lyapunov–Krasovskii functional which includes integral terms of the form ∫_{t−h}^{t} (h − t + s)^j ẋᵀ(s) R_j ẋ(s) ds (j = 1, 2, 3).
References
Book

Introduction To The Theory Of Neural Computation

TL;DR: This book is a detailed, logically developed treatment that covers the theory and uses of collective computational networks, including associative memory, feedforward networks, and unsupervised learning.
Journal ArticleDOI

Cellular neural networks: applications

TL;DR: Examples of cellular neural networks which can be designed to recognize the key features of Chinese characters are presented and their applications to such areas as image processing and pattern recognition are demonstrated.
Journal ArticleDOI

Computing with neural circuits: a model

TL;DR: A new conceptual framework and a minimization principle together provide an understanding of computation in model neural circuits that represent an approximation to biological neurons in which a simplified set of important computational properties is retained.
Journal ArticleDOI

New conditions for global stability of neural networks with application to linear and quadratic programming problems

TL;DR: In this paper, the authors present new conditions ensuring existence, uniqueness, and global asymptotic stability of the equilibrium point for a large class of neural networks, which are applicable to both symmetric and nonsymmetric interconnection matrices and allow for the consideration of all continuous nondecreasing neuron activation functions.
Journal ArticleDOI

Global exponential stability of generalized recurrent neural networks with discrete and distributed delays

TL;DR: A linear matrix inequality (LMI) approach is developed to establish sufficient conditions for the RNNs to be globally exponentially stable, and the existence and uniqueness of the equilibrium point are proved under mild conditions.