
An overview of recent developments in Lyapunov–Krasovskii functionals and stability criteria for recurrent neural networks with time-varying delays.

Authors :
Zhang, Xian-Ming
Han, Qing-Long
Ge, Xiaohua
Ding, Derui
Source :
Neurocomputing. Nov 2018, Vol. 313, p. 392-401. 10 p.
Publication Year :
2018

Abstract

Global asymptotic stability is essential for the wide application of recurrent neural networks with time-varying delays. The Lyapunov–Krasovskii functional method is a powerful tool for checking the global asymptotic stability of a delayed recurrent neural network. When this method is employed, three steps are necessary to derive a global asymptotic stability criterion: (i) constructing a Lyapunov–Krasovskii functional, (ii) estimating the derivative of the Lyapunov–Krasovskii functional, and (iii) formulating a global asymptotic stability criterion. This paper provides an overview of recent developments in each of these steps. For the first step, some existing Lyapunov–Krasovskii functionals for the stability of delayed recurrent neural networks are dissected. For the second step, the free-weighting-matrix approach, integral inequality approaches and their recent developments, reciprocally convex inequalities, and the S-procedure are analyzed in detail. For the third step, linear-convex and quadratic-convex approaches, together with refinements of the allowable delay set, are reviewed. Finally, some challenging issues are presented to guide future research. [ABSTRACT FROM AUTHOR]
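As a point of reference for the three steps above, the following is a minimal, generic sketch of the kind of setup such surveys consider; the symbols A, W_0, W_1, f, tau(t), h, P, Q, R are illustrative placeholders and are not drawn from the paper itself.

A delayed recurrent neural network is typically modeled as
\[
\dot{x}(t) = -A x(t) + W_0 f(x(t)) + W_1 f\big(x(t-\tau(t))\big), \qquad 0 \le \tau(t) \le h .
\]

Step (i): a simple Lyapunov–Krasovskii functional candidate is
\[
V(x_t) = x^{\top}(t) P x(t) + \int_{t-\tau(t)}^{t} x^{\top}(s) Q x(s)\,\mathrm{d}s
+ h \int_{-h}^{0}\!\int_{t+\theta}^{t} \dot{x}^{\top}(s) R \dot{x}(s)\,\mathrm{d}s\,\mathrm{d}\theta ,
\qquad P, Q, R \succ 0 .
\]

Step (ii): the integral term appearing in \(\dot{V}\) is bounded by an integral inequality, for example the classical Jensen inequality
\[
- h \int_{t-h}^{t} \dot{x}^{\top}(s) R \dot{x}(s)\,\mathrm{d}s
\;\le\;
- \Big(\int_{t-h}^{t} \dot{x}(s)\,\mathrm{d}s\Big)^{\!\top} R \,\Big(\int_{t-h}^{t} \dot{x}(s)\,\mathrm{d}s\Big).
\]

Step (iii): the resulting bound on \(\dot{V}\) is collected into a linear matrix inequality whose feasibility certifies global asymptotic stability. The refinements surveyed in the paper (free-weighting matrices, tighter integral and reciprocally convex inequalities, convexity arguments over the delay set) serve to reduce the conservatism of this basic scheme.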

Details

Language :
English
ISSN :
0925-2312
Volume :
313
Database :
Academic Search Index
Journal :
Neurocomputing
Publication Type :
Academic Journal
Accession number :
131469163
Full Text :
https://doi.org/10.1016/j.neucom.2018.06.038