Global output convergence of Cohen–Grossberg neural networks with both time-varying and distributed delays

Yan Ji, Xuyang Lou and Baotong Cui

Chaos, Solitons & Fractals, 2009, vol. 40, issue 1, 344-354

Abstract: This paper considers the global output convergence of Cohen–Grossberg neural networks with both time-varying and distributed delays. The inputs of the neural networks are allowed to be time-varying, and the activation functions are assumed to be globally Lipschitz continuous and monotonically nondecreasing. Based on M-matrix theory, several sufficient conditions are established to guarantee the global output convergence of this class of neural networks. Neither symmetry of the connection weight matrices nor boundedness of the activation functions is required. The convergence results are useful for solving some optimization problems and for the design of Cohen–Grossberg neural networks with both time-varying and distributed delays. Two examples are given to illustrate the effectiveness of the results.
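The paper's exact system and convergence conditions are not reproduced on this page, but the class of models it studies can be illustrated with a minimal sketch: a two-neuron Cohen–Grossberg network with a bounded time-varying delay, integrated by forward Euler. All concrete choices below (the amplification functions a_i, behaved functions b_i, weight matrix, delay profile, and time-varying input) are hypothetical illustrations, not taken from the article; the activation tanh is globally Lipschitz and monotonically nondecreasing, matching the abstract's assumptions.

```python
import numpy as np

def simulate(T=30.0, h=0.001):
    """Euler simulation of a toy Cohen-Grossberg network
       dx_i/dt = -a_i(x_i) [ b_i(x_i) - sum_j w_ij f(x_j(t - tau(t))) - I_i(t) ]
    with a bounded time-varying delay tau(t) and time-varying input I(t).
    All parameters are illustrative, not the paper's."""
    n = int(T / h)
    x = np.zeros((n + 1, 2))
    x[0] = [0.5, -0.3]                        # constant initial history
    W = np.array([[0.2, -0.1],
                  [0.1, 0.15]])               # delayed connection weights
    f = np.tanh                               # Lipschitz, nondecreasing activation
    for k in range(n):
        t = k * h
        tau = 0.5 + 0.4 * np.sin(t)          # time-varying delay, 0.1 <= tau <= 0.9
        kd = max(0, k - int(tau / h))        # index of the delayed state
        a = 1.0 + 0.5 / (1.0 + x[k] ** 2)    # amplification a_i(x_i) > 0
        b = 2.0 * x[k]                       # well-behaved b_i(x_i)
        I = np.array([1.0, -1.0]) * (1.0 - np.exp(-t))  # time-varying input
        x[k + 1] = x[k] + h * (-a * (b - W @ f(x[kd]) - I))
    return x
```

With these choices the linear term b_i dominates the delayed coupling, so the output f(x(t)) settles to a constant vector as t grows, the kind of global output convergence the paper's M-matrix conditions are designed to guarantee for the general model.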

Date: 2009

Downloads: (external link)
http://www.sciencedirect.com/science/article/pii/S0960077907005826
Full text for ScienceDirect subscribers only



Persistent link: https://EconPapers.repec.org/RePEc:eee:chsofr:v:40:y:2009:i:1:p:344-354

DOI: 10.1016/j.chaos.2007.07.068


Chaos, Solitons & Fractals is currently edited by Stefano Boccaletti and Stelios Bekiros

More articles in Chaos, Solitons & Fractals from Elsevier
Bibliographic data for series maintained by Thayer, Thomas R.

 
Page updated 2025-03-19
Handle: RePEc:eee:chsofr:v:40:y:2009:i:1:p:344-354