Stability Analysis for Delayed Neural Networks: Reciprocally Convex Approach
Hongjun Yu, Xiaozhan Yang, Chunfeng Wu and Qingshuang Zeng
Mathematical Problems in Engineering, 2013, vol. 2013, 1-12
Abstract:
This paper is concerned with global stability analysis for a class of continuous neural networks with time-varying delay. The lower and upper bounds of the delay and the upper bound of its first derivative are assumed to be known. By introducing a novel Lyapunov-Krasovskii functional (LKF), delay-dependent stability criteria are derived in terms of linear matrix inequalities (LMIs), which guarantee that the considered neural networks are globally stable. When estimating the derivative of the LKF, instead of applying Jensen's inequality directly, a substep is taken and a slack variable is introduced via the reciprocally convex combination approach; as a result, the reduction in conservatism is shown to be greater than that achieved in the available literature. Numerical examples are given to demonstrate the effectiveness and merits of the proposed method.
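The abstract describes a workflow in which stability is certified by checking feasibility of an LMI. As an illustration only, the sketch below checks a much simpler, classical delay-independent stability LMI for a linear delayed system x'(t) = A x(t) + Ad x(t - tau), not the paper's delay-dependent, reciprocally convex criterion; the matrices A and Ad and the use of cvxpy are assumptions made for this example.

```python
# Minimal sketch (assumed example, not the paper's criterion):
# delay-independent stability LMI for x'(t) = A x(t) + Ad x(t - tau),
# using the Lyapunov-Krasovskii functional V = x'Px + integral of x'Qx.
# Feasibility of P > 0, Q > 0 with the block LMI below certifies stability.
import numpy as np
import cvxpy as cp

A = np.array([[-2.0, 0.0], [0.0, -0.9]])     # example system matrix (assumed data)
Ad = np.array([[-1.0, 0.0], [-1.0, -1.0]])   # example delayed-state matrix (assumed data)
n = A.shape[0]

P = cp.Variable((n, n), symmetric=True)
Q = cp.Variable((n, n), symmetric=True)
eps = 1e-6

# Block LMI: [[A'P + PA + Q, P*Ad], [Ad'*P, -Q]] < 0
lmi = cp.bmat([[A.T @ P + P @ A + Q, P @ Ad],
               [Ad.T @ P, -Q]])

constraints = [P >> eps * np.eye(n),
               Q >> eps * np.eye(n),
               lmi << -eps * np.eye(2 * n)]

prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve(solver=cp.SCS)
print("LMI feasible (stability certified):", prob.status == cp.OPTIMAL)
```

The paper's criteria follow the same pattern but use a more elaborate LKF with delay-dependent terms and reciprocally convex slack variables to tighten the bound on the functional's derivative.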
Date: 2013
Downloads: (external link)
http://downloads.hindawi.com/journals/MPE/2013/639219.pdf (application/pdf)
http://downloads.hindawi.com/journals/MPE/2013/639219.xml (text/xml)
Persistent link: https://EconPapers.repec.org/RePEc:hin:jnlmpe:639219
DOI: 10.1155/2013/639219