EconPapers    
Global Robust Exponential Dissipativity for Interval Recurrent Neural Networks with Infinity Distributed Delays

Xiaohong Wang and Huan Qi

Abstract and Applied Analysis, 2013, vol. 2013, 1-16

Abstract:

This paper is concerned with the robust dissipativity problem for interval recurrent neural networks (IRNNs) with general activation functions, continuous time-varying delays, and infinitely distributed time delays. By employing a new differential inequality, constructing two different kinds of Lyapunov functions, and dropping the requirement that the activation functions be bounded, monotonic, and differentiable, several sufficient conditions are established that guarantee the global robust exponential dissipativity of the addressed IRNNs. These conditions are expressed as linear matrix inequalities (LMIs), which can be easily checked with the LMI Control Toolbox in MATLAB. Furthermore, specific estimates of the positive invariant set and the global exponential attractive set of the addressed system are also derived. Compared with the previous literature, the results obtained in this paper improve and extend earlier global dissipativity conclusions. Finally, two numerical examples are provided to demonstrate the effectiveness of the proposed results.
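The abstract notes that the sufficient conditions are LMIs verifiable with MATLAB's LMI Control Toolbox. As an illustration only (the matrices below are made up, not taken from the paper), checking feasibility of a fixed candidate for a Lyapunov-type LMI such as AᵀP + PA < 0 reduces to testing negative definiteness, which can be sketched with NumPy:

```python
import numpy as np

def is_negative_definite(M, tol=1e-10):
    """Return True if the symmetric part of M is negative definite."""
    M = (M + M.T) / 2.0  # symmetrize to guard against round-off asymmetry
    return bool(np.all(np.linalg.eigvalsh(M) < -tol))

# Hypothetical data for illustration: a stable system matrix A and the
# candidate Lyapunov matrix P = I, plugged into the LMI A^T P + P A < 0.
A = np.array([[-2.0, 1.0],
              [0.0, -3.0]])
P = np.eye(2)
lmi = A.T @ P + P @ A
print(is_negative_definite(lmi))  # prints True: the LMI holds for this A, P
```

This only checks a single candidate P; the toolbox referenced in the abstract instead searches over the free matrix variables of the LMI system, which is a semidefinite programming problem rather than a pointwise eigenvalue test.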

Date: 2013

Downloads: (external link)
http://downloads.hindawi.com/journals/AAA/2013/585709.pdf (application/pdf)
http://downloads.hindawi.com/journals/AAA/2013/585709.xml (text/xml)



Persistent link: https://EconPapers.repec.org/RePEc:hin:jnlaaa:585709

DOI: 10.1155/2013/585709


More articles in Abstract and Applied Analysis from Hindawi
Bibliographic data for series maintained by Mohamed Abdelhakeem.

 
Page updated 2025-03-19
Handle: RePEc:hin:jnlaaa:585709