Mauro Forti
Researcher at University of Siena
Publications - 148
Citations - 4333
Mauro Forti is an academic researcher at the University of Siena. He has contributed to research on topics including cellular neural networks and artificial neural networks. He has an h-index of 28 and has co-authored 138 publications receiving 3,881 citations. His previous affiliations include the Polytechnic University of Turin and the University of Florence.
Papers
Journal Article (DOI)
New conditions for global stability of neural networks with application to linear and quadratic programming problems
Mauro Forti, Alberto Tesi, +1 more
TL;DR: In this paper, the authors present new conditions ensuring existence, uniqueness, and global asymptotic stability of the equilibrium point for a large class of neural networks. The conditions apply to both symmetric and nonsymmetric interconnection matrices and allow all continuous nondecreasing neuron activation functions.
Journal Article (DOI)
Global convergence of neural networks with discontinuous neuron activations
Mauro Forti, Paolo Nistri, +1 more
TL;DR: Results from the theory of differential equations with discontinuous right-hand side as introduced by Filippov are employed, and global convergence is addressed by using a Lyapunov-like approach based on the concept of monotone trajectories of a differential inclusion.
Journal Article (DOI)
Global exponential stability and global convergence in finite time of delayed neural networks with infinite gain
TL;DR: This paper introduces a general class of neural networks with arbitrary constant delays in the neuron interconnections, and with neuron activations belonging to the set of discontinuous, monotone increasing, and (possibly) unbounded functions; for this class, stability is shown to be insensitive to the presence of a delay.
Journal Article (DOI)
Generalized Lyapunov approach for convergence of neural networks with discontinuous or non-Lipschitz activations
TL;DR: A general result on global exponential convergence of the neural network solutions toward a unique equilibrium point is proved by means of a Lyapunov-like approach, and new results on global convergence in finite time are established.
Journal Article (DOI)
Generalized neural network for nonsmooth nonlinear programming problems
TL;DR: In this paper, a generalized canonical nonlinear programming circuit (G-NPC) was proposed to solve a general class of nonsmooth nonlinear programming problems, in which the objective function and constraints are assumed to satisfy only the weak condition of being regular functions.