
Convergence of a Class of Delayed Neural Networks with Real Memristor Devices

Mauro Di Marco, Mauro Forti, Riccardo Moretti, Luca Pancioni, Giacomo Innocenti and Alberto Tesi
Additional contact information
Mauro Di Marco: Dipartimento di Ingegneria dell’Informazione e Scienze Matematiche, Università di Siena, Via Roma 56, 53100 Siena, Italy
Mauro Forti: Dipartimento di Ingegneria dell’Informazione e Scienze Matematiche, Università di Siena, Via Roma 56, 53100 Siena, Italy
Riccardo Moretti: Dipartimento di Ingegneria dell’Informazione e Scienze Matematiche, Università di Siena, Via Roma 56, 53100 Siena, Italy
Luca Pancioni: Dipartimento di Ingegneria dell’Informazione e Scienze Matematiche, Università di Siena, Via Roma 56, 53100 Siena, Italy
Giacomo Innocenti: Dipartimento di Ingegneria dell’Informazione, Università degli Studi di Firenze, Via S. Marta 3, 50139 Firenze, Italy
Alberto Tesi: Dipartimento di Ingegneria dell’Informazione, Università degli Studi di Firenze, Via S. Marta 3, 50139 Firenze, Italy

Mathematics, 2022, vol. 10, issue 14, 1-20

Abstract: Neural networks with memristors are promising candidates to overcome the limitations of traditional von Neumann machines via the implementation of novel analog and parallel computation schemes based on the in-memory computing principle. Of special importance are neural networks with generic or extended memristor models that are suited to accurately describe real memristor devices. The manuscript considers a general class of delayed neural networks where the memristors obey a relevant and widely used generic memristor model, the voltage threshold adaptive memristor (VTEAM) model. Due to physical limitations, the memristor state variables evolve in a closed compact subset of the state space; therefore, the network can be mathematically described by a special class of differential inclusions named differential variational inequalities (DVIs). By using the theory of DVIs and the Lyapunov approach, the paper proves some fundamental results on the convergence of solutions toward equilibrium points, a dynamic property that is extremely useful in neural network applications to content-addressable memories and real-time signal processing. The conditions for convergence, which hold in the general nonsymmetric case and for any constant delay, are given in the form of a linear matrix inequality (LMI) and can be readily checked numerically. To the authors' knowledge, the obtained results are the only ones available in the literature on the convergence of neural networks with real generic memristors.
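The paper's specific LMI condition is not reproduced in this listing. As a hedged illustration of how an LMI feasibility test of this general kind can be checked numerically, the sketch below uses CVXPY on a standard Lyapunov-type inequality with placeholder matrices A and Q; these matrices and the inequality itself are illustrative assumptions, not the conditions derived in the paper.

```python
# Minimal sketch (assumption: a Lyapunov-type LMI  A^T P + P A + Q < 0,  P > 0),
# checked for feasibility with CVXPY. A and Q are illustrative placeholders,
# not the matrices from the cited paper.
import numpy as np
import cvxpy as cp

n = 3
A = np.array([[-2.0, 0.5, 0.0],
              [0.3, -1.5, 0.2],
              [0.0, 0.4, -1.0]])   # placeholder (nonsymmetric) system matrix
Q = np.eye(n)                       # placeholder positive-definite weight

P = cp.Variable((n, n), symmetric=True)
eps = 1e-6
constraints = [
    P >> eps * np.eye(n),                       # P strictly positive definite
    A.T @ P + P @ A + Q << -eps * np.eye(n),    # LMI strictly negative definite
]
prob = cp.Problem(cp.Minimize(0), constraints)  # pure feasibility problem
prob.solve(solver=cp.SCS)
print("LMI feasible:", prob.status == cp.OPTIMAL)
```

If the solver reports the problem as feasible, a matrix P certifying the (illustrative) convergence condition has been found; this mirrors the kind of routine numerical check the abstract refers to.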

Keywords: convergence; delay; differential variational inequalities (DVIs); linear matrix inequalities (LMIs); Lyapunov method; memristor; neural networks (search for similar items in EconPapers)
JEL-codes: C (search for similar items in EconPapers)
Date: 2022
References: View complete reference list from CitEc
Citations: View citations in EconPapers (1)

Downloads: (external link)
https://www.mdpi.com/2227-7390/10/14/2439/pdf (application/pdf)
https://www.mdpi.com/2227-7390/10/14/2439/ (text/html)

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.


Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:10:y:2022:i:14:p:2439-:d:861830


Mathematics is currently edited by Ms. Emma He

More articles in Mathematics from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager ().

 
Handle: RePEc:gam:jmathe:v:10:y:2022:i:14:p:2439-:d:861830